Medical malpractice and AI: 4 response strategies when rejecting an algorithm’s advice backfires

As artificial intelligence’s role in radiology grows, physicians must contemplate how they will respond if advanced AI use results in poor outcomes or litigation, experts cautioned Tuesday.

It seems “inevitable” such models will eventually outperform human providers and become the standard of care. And transparent or interpretable AI offerings with sound rationales, when followed properly, should introduce little risk of malpractice liability, scientists argued in the Journal of the American College of Radiology.

“What might cause physicians considerable anxiety in the age of [advanced artificial intelligence], however, is their intentionally and knowingly rejecting an AAI’s findings or recommendations,” John Banja, PhD, a professor and medical ethicist with Emory University in Atlanta, and co-authors wrote Feb. 1. “In the event of technology-physician disagreement, clinicians should be able to defend themselves with certain strategies such as the following.”

Banja and co-authors offered four strategies to help physicians navigate such situations. Here they are in brief:

1. Deploy an adjudicator algorithm that can help settle a radiologist-AI conflict. Providers might take a page from meteorology, where the best forecasts utilize multiple algorithms to form a consensus opinion. Radiologists also might consider adopting a “tiebreaker model” to settle sticky situations. And aligning with professional standards on AI use, issued by organizations such as the ACR, is also wise and likely to strengthen one’s malpractice defense.  

2. Involve the patient by utilizing an informed consent or shared decision-making approach. Discussing the clinical ambiguity arising from an algorithm, and securing the patient’s understanding, could help avoid plaintiffs later alleging they were left in the dark.

“Some physicians might think this recommendation reflects an admission of professional weakness and fear it could arouse patients’ anxiety,” the authors wrote. “From an ethical perspective, though, there not only seems nothing morally wrong with this strategy, but many patients might admire their physician’s honesty and truthfulness, increase their trust accordingly, and contribute significant observations about how they view the risks and benefits of the treatment options being considered.”

3. Work with AI vendors to develop contractual agreements stipulating how liability will be assigned in the event of a verdict or settlement. Certain adverse events may perplex parties as to who is at fault. Such situations can be “next to impossible” to resolve when multiple designers, hardware manufacturers, coders, programmers, etc., are involved. Previous literature has recommended reaching “comprehensive contracts,” detailing who is responsible in the event of a poor outcome.

“Of course, the devil will be in the details of these arrangements,” Banja and co-authors wrote. “But working them out in advance of any allegation of patient harm might eliminate expensive, protracted, and rancorous debates when a settlement or jury award is announced.”

4. In the case of autonomous models that make their own decisions, ask the vendor to assume total liability for their product’s incorrect decisions. If AI technology is so advanced that the radiologist is completely removed from the equation, “it would be unfair to enjoin them in malpractice proceedings,” the authors argued. They gave the example of the firm IDx, which offers a tool that alerts providers to the possibility of diabetic retinopathy. The company has echoed these concerns and assumes liability for its product’s decisions, but not for any care management issues that subsequently stem from them.

You can read much more of their advice in the full JACR piece.

Marty Stempniak

Marty Stempniak has covered healthcare since 2012, with his byline appearing in the American Hospital Association's member magazine, Modern Healthcare, and McKnight's. Prior to that, he wrote about village government and local business for his hometown newspaper in Oak Park, Illinois. He won Peter Lisagor and Gold EXCEL awards in 2017 for his coverage of the opioid epidemic.
