An artificial intelligence algorithm helped catch findings that physicians missed in clinical chest CT reports, according to a new analysis published Wednesday in Academic Radiology.
Several studies have detailed radiologists' error rates when interpreting such images, driven by factors including the large number of images reviewed and shorter reading times under heavy workloads. Underreporting of findings on CT exams is relatively common, noted experts from several institutions led by the Medical University of South Carolina.
To address this challenge, researchers conducted an experiment to see whether a deep learning algorithm could read rads’ reports and spot any overlooked concerns. Processing 100 chest CT reports using AI, Basel Yacoub, MD, and colleagues found success, with the algorithm showing “superior diagnostic performance” in spotting two particular concerns.
“The findings of this study confirm the significant added value of AI to clinical radiology reporting,” Yacoub, with MUSC’s department of radiology, and co-authors wrote March 10. “In the future, AI platforms may provide additional support to radiologists, particularly in the face of an increasing workload due to a rising number of imaging examinations performed each year.”
To reach their conclusions, Yacoub et al. identified images from 100 consecutive patients previously imaged using noncontrast chest CT in 2019. Radiology reports signed by attending cardiothoracic rads were manually scrutinized for five conditions—pulmonary lesions, emphysema, aortic dilation, coronary artery calcifications, and vertebral compression fractures. The scans were then processed using the prototype AI platform, and two board-certified radiologists subsequently compared physician reports against the algorithm’s findings.
AI showed stronger diagnostic performance in spotting aortic dilation and coronary artery calcifications, while physicians excelled at pinpointing pulmonary lesions and vertebral fractures. Docs and the algorithm performed about the same when locating pulmonary emphysema, the authors noted. Based on their findings, researchers believe such AI-based reading support can help relieve busy physicians, but not replace them. They envision an “expanding role” for such platforms in clinical radiology workflows as they are trained on growing numbers of datasets.
“This role will be focused on supporting radiologists in their practice by providing automated second-reader results made available when reporting the imaging scans,” Yacoub and colleagues wrote. “We expect that adding AI to radiology reporting will translate into improved interpretation and increased confidence with shorter reading time, ultimately reducing burnout among radiologists.”
You can read much more of their results in Academic Radiology. Other institutions contributing to the analysis included Siemens Healthineers, the German Center for Cardiovascular Research, and the universities of Basel, Switzerland, and Frankfurt, Germany, among others.