Tools with a murder license

October 22, 2021, by Kinok

My posts about the failures of medical AI provoked two types of reactions (in the spirit of the glass being half empty or half full).

The glass is half empty:

The glass is half full:

Both are true.

But this interpretation of the risks of medical AI is only the tip of the iceberg, concealing the main problem:

Compare two cases: medical AI and autonomous lethal weapons.

Attempts to restrict the right of autonomous weapons to make independent decisions about killing are largely ritual. Everyone understands that, in the end, the question will simply be shifted to the same plane as it was with homing missiles:

And if, God forbid, a missile mistakenly hits a bus full of schoolchildren, it will be just a tool error for which no one is to blame (no automation is possible without instrumental errors).

But what if the question is posed differently?

Since error-free automation does not exist, the tool can make mistakes. And in medicine, the price of an instrumental error is often the same as in war or in the fight against terrorism: people's lives.

Imagine such a nightmare.

Or another nightmare:

One of Epic's most important algorithms, the Epic Sepsis Model (ESM), is designed to predict the onset of sepsis in hospitalized patients.

Epic claims that predictions made using its model are highly accurate.

And now the results of testing on 38,455 patients, of whom 2,552 suffered from sepsis, have been published: the model performed far worse than advertised, missing a large share of sepsis cases while generating a stream of false alerts.
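To see why such numbers matter, here is a minimal back-of-the-envelope sketch (not from the study or from Epic) of how the base rate in a cohort of this size interacts with a model's error rates. The patient counts come from the post; the sensitivity and specificity values are purely illustrative assumptions.

```python
# Illustrative sketch: how prevalence turns "reasonable-sounding" error rates
# into mostly false alarms. Cohort sizes are from the post; sensitivity and
# specificity below are hypothetical, chosen only for illustration.

n_patients = 38_455   # hospitalized patients in the published test
n_sepsis = 2_552      # of whom developed sepsis

prevalence = n_sepsis / n_patients  # roughly 6.6%

def alert_statistics(sensitivity: float, specificity: float) -> dict:
    """Expected alert counts and precision for assumed error rates."""
    true_pos = sensitivity * n_sepsis
    false_pos = (1 - specificity) * (n_patients - n_sepsis)
    alerts = true_pos + false_pos
    return {
        "alerts": round(alerts),
        "missed_sepsis": round(n_sepsis - true_pos),
        "precision": round(true_pos / alerts, 3),  # share of alerts that are real sepsis
    }

print(f"prevalence: {prevalence:.3f}")
# Hypothetical model with 60% sensitivity and 80% specificity:
# ~8,700 alerts, ~1,000 missed sepsis cases, and only ~18% of alerts are genuine.
print(alert_statistics(sensitivity=0.6, specificity=0.8))
```

The point of the sketch is simply that at this prevalence even a model with decent-looking headline numbers both misses many true cases and buries clinicians in false alarms.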

Another investigation was conducted by STAT.

So why do hundreds of hospitals use the ESM as a patient monitoring tool?

As noted:

His words were commented on as follows:

The comparison between the medical AI case and the autonomous weapons case is not a far-fetched stretch.

I believe that there are only two ways to minimize such errors:


Yours