AI: Radiologist’s friend or foe?

The profession of radiology may rightly regard 2017 as an extended coming-out party for AI within the specialty. At ACR’s annual meeting in May, the keynote speeches all revolved around the changes AI will bring. AI occupied an entire quadrant of space, including a dedicated stage, at the RSNA annual meeting in the fall. Dozens of startups, along with numerous established companies, lined up in vendor booths ready to dazzle you with the next generation of radiology technology.

It is now clear that the question is not whether AI will be part of the radiology landscape. Rather, the question is in what capacity it will be deployed.

Will AI be an adjunct tool designed to make a radiologist’s life easier? This may be a convenient stance for AI companies to take at this stage in the game. Gaining radiologist acceptance is the most direct route to having AI implemented—and the easiest way to gain acceptance in the general medical community.

From a regulatory perspective, it is easier to gain government approval when AI is billed as a tool designed to help the radiologist. For example, AI can be seen as a sort of next-generation CAD software, and FDA approval is much simpler in this context than getting approval for a system designed to run autonomously.

As a tool for the radiologist, the promise of AI is attractive. Imagine an AI system that can automatically characterize and score a prostate lesion by MRI. The results are presented to the radiologist, who decides to either accept or reject the results. Then the radiologist generates a final report based on a combination of his or her judgments and the conclusions of the software.

In this capacity, the role of the radiologist is modified. In the traditional model, the radiologist is responsible for identifying the findings and then subsequently rendering an opinion on what those findings represent. In this example of radiologist-AI collaboration, however, the radiologist is relieved of the duty of making the findings. He or she is responsible for just the interpretation.

The more complex task of interpreting the finding is arguably a better use of the radiologist’s time than the more basic chore of identifying lesions. In fact, more and more studies show AI routinely outperforming humans at identifying lesions, so delegating these tasks to the computer should leave us better off in the future anyway. Another benefit is that offloading lesion identification saves time, allowing a radiologist to review more cases.

Or consider a system that can automatically flag studies with urgent findings, placing them at the top of the queue for the radiologist to review. This improves on a traditional chronological ordering by allowing critical findings to be evaluated right away. A chest x-ray showing a tension pneumothorax can be attended to immediately rather than buried in a list of 100 x-rays read from the top down. This triage system offers the same benefit as a triage nurse determining which patient should be seen first in the emergency room.
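The triage idea above amounts to replacing a first-in, first-out worklist with a priority queue keyed on an AI-assigned urgency score. A minimal sketch, with invented study names and scores purely for illustration (no real PACS or AI API is assumed):

```python
import heapq

class Worklist:
    """A hypothetical triaged worklist: lower urgency number = read sooner."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # preserves arrival order among equal urgencies

    def add(self, study, urgency):
        # Ties on urgency fall back to chronological (arrival) order,
        # so routine studies are still read first-come, first-served.
        heapq.heappush(self._heap, (urgency, self._counter, study))
        self._counter += 1

    def next_study(self):
        return heapq.heappop(self._heap)[2]

wl = Worklist()
wl.add("routine chest x-ray #1", urgency=5)
wl.add("routine chest x-ray #2", urgency=5)
wl.add("chest x-ray: suspected tension pneumothorax", urgency=1)

# The AI-flagged critical study jumps ahead of the 100-deep routine list.
print(wl.next_study())  # → chest x-ray: suspected tension pneumothorax
```

The design point is that the AI only reorders the queue; the radiologist still reads every study, just in a clinically smarter order.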

Many of the headlines enthusing over AI tout its role in this type of diagnostic capacity. However, there are potential benefits to AI in a non-diagnostic role as well.

Would you welcome a system that could learn the best hanging protocol on your PACS for specific types of cases? Even better, how about a system that learns what your preferred hanging protocols are for a specific case? For a chest x-ray on a patient being seen in the emergency room, the most recent prior studies are typically most important to have in assessing interval changes in the acute setting. However, for an ICU patient being monitored for pulmonary edema, in addition to the most recent prior studies, one study from a week prior may be useful to gain a bigger picture regarding overall changes in a patient’s status. This may all be possible with AI.
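The prior-selection behavior described above is essentially a context-dependent rule: which earlier studies to hang depends on the care setting. A hypothetical sketch of that logic, with the rules taken from the two examples in the text (the function and its settings are illustrative, not a real PACS interface):

```python
from datetime import date, timedelta

def select_priors(setting, priors):
    """Pick which prior studies to hang. `priors` is a list of study
    dates, newest first. Rules mirror the examples in the article."""
    if not priors:
        return []
    if setting == "ED":
        # Acute setting: the most recent prior matters most for
        # assessing interval change.
        return [priors[0]]
    if setting == "ICU":
        # Daily monitoring (e.g., pulmonary edema): most recent prior,
        # plus one from about a week back for the overall trend.
        week_old = [d for d in priors if priors[0] - d >= timedelta(days=7)]
        return [priors[0]] + week_old[:1]
    return [priors[0]]

today = date(2018, 1, 8)
priors = [today - timedelta(days=n) for n in (1, 2, 3, 8)]

print(select_priors("ED", priors))   # most recent prior only
print(select_priors("ICU", priors))  # most recent + one ~a week old
```

An AI-driven version would learn these rules per radiologist from observed behavior rather than hard-coding them, but the decision it has to make is the same shape as this function.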

The role of radiologists in this paradigm has been compared to the modern airline pilot, and this comparison may not be too far off. Most of the actual flying of modern large airplanes is handled by two and sometimes three separate autopilots. In some systems, the autopilot can even land the plane by itself. Does this mean the pilot is redundant? Not at all. The pilots are constantly monitoring route and weather conditions and communicating with the control tower, all of which might necessitate a change in plans. The more rudimentary task of physically flying the plane is handled by the computer, which allows the pilot to handle the more complex tasks at hand. This is not dissimilar to the case of radiology, with AI handling the tasks of making the findings and allowing the radiologist to focus full attention on interpretation.

Unless of course, the full potential of AI results in the end game that some radiologists fear.

If AI is able to outperform humans in detecting lesions, is it conceivable that it will also eventually outperform humans on the interpretation aspect? Staying with the prostate MRI example, there are AI systems already well into development that can characterize findings as benign or malignant. It will surely be only a matter of time before the technology can make these characterizations as well as—and ultimately better than—humans. In fact, researchers at UCLA showed that teaching an AI system the characteristics of prostate cancers using clinical, imaging and pathology information can result in an algorithm that predicts cancers at least on par with subspecialist genitourinary radiologists. (See “Detection of Prostate Cancer Based on Multi-Parametric Regional MRI Features,” Nelly Tan, MD, et al., SIIM 2016 scientific session abstract.)

It’s only a matter of time before AI will outperform these subspecialists.

Where does this leave the radiologist? In the short term, the changes that arise from AI will be complementary to the radiologist. As AI is still in its infancy in terms of its clinical role, it will by necessity initially play a supporting role in the radiology world. It can help with workflow by triaging studies with urgent findings and improving how images are presented to the radiologist. As AI continues to become more sophisticated, it will take on more of the work of finding abnormalities, which should increase the radiologist’s capacity. To that end, the ACR has created the Data Science Institute to encourage the development of machine learning in these capacities.

In the long term, the role of the radiologist is less clear. As AI becomes more powerful, the specialized roles of specific algorithms currently available will give way to more generalized systems that can process all subspecialties of radiology, integrating data from medical records, pathology and genomics.

Is it conceivable that AI could handle all of what a radiologist today is currently tasked to do? Possibly, and perhaps even more. But when you think about it, this is something that can potentially be said about any medical specialty—and, even more generally, any profession at all.

Dr. Lee is a clinical associate professor of radiology at Thomas Jefferson University and section chief of neuroradiology at Philadelphia-based Einstein Healthcare Network.