#38 - Using AI Can Make You Look More Guilty In Court

About this episode

What happens when AI spots a dangerous finding on a scan and the radiologist disagrees? In theory, “human in the loop” sounds like the safeguard that keeps patients safe. In practice, it raises a far more uncomfortable question: when clinicians override AI, are they exercising sound judgment or exposing themselves to legal risk?

We explore how AI image-reading tools are reshaping radiology and why performance metrics like “96% accurate” can be misleading in real clinical settings. False positives and false negatives do not carry the same consequences, and rare diseases can sharply reduce the real-world value of even highly capable models once prevalence and positive predictive value are taken into account. As these systems flag more normal scans, a new form of defensive medicine can emerge—one where repeatedly rejecting AI recommendations begins to feel professionally dangerous, especially when those recommendations are documented in the patient record.
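The prevalence effect described above can be made concrete with a quick Bayes' rule calculation. As an illustration only (the episode's "96% accurate" figure is treated here as both sensitivity and specificity, and the 1% prevalence is an assumed example, not a number from the episode):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: P(disease | positive test), via Bayes' rule."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A hypothetical "96% accurate" model (96% sensitivity and 96% specificity)
# applied to a finding present in 1% of scans:
print(round(ppv(0.96, 0.96, 0.01), 3))  # → 0.195
```

Even with strong headline accuracy, fewer than one in five positive flags would reflect true disease at that prevalence, which is why raw accuracy figures can mislead in low-prevalence clinical settings.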

We also examine a study that placed laypeople in the role of jurors during malpractice scenarios involving missed diagnoses such as brain bleeds and lung cancer. The findings are revealing: when AI detects the pathology and the radiologist does not, jurors are more likely to assign blame. But when both the AI and the radiologist miss the finding, the physician gains little protection. The episode closes with what may actually reduce harm, including better education about the limitations of AI and a clearer understanding of these systems as imperfect clinical decision support—not a flawless second expert beside the clinician.

References:

Bernstein et al. Randomized Study of the Impact of AI on Perceived Legal Liability for Radiologists. NEJM AI.


Credits:

Theme music: Nowhere Land, Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0
https://creativecommons.org/licenses/by/4.0/
