With few exceptions, the most prominent discussions about how and when artificial intelligence will transform radiology have been led by—and largely held within—the academic sector. That’s not surprising, given that academic radiologists are the ones doing the research, blazing the trails and comparing notes. But the headiness of the conversation raises the question: Are community radiology practices doing enough to prepare for, keep up with or otherwise enter the here-and-coming age of AI?
At this point, the only evidence available to help answer that question is anecdotal. However, given that the profession as a whole is up against myriad myths and misconceptions about AI and deep learning (DL), it’s safe to assume that nonacademic radiology practices taking a wait-and-see approach are at higher risk of misreading the tea leaves and, in turn, falling far behind.
Several experts tell RBJ a good place for practices to start is simply separating the real from the theoretical.
“There’s a lot of noise out there, but if tools” that incorporate AI “improve patient care—and they will—all radiologists will want and need to use them,” says David Phelps, MD, president of the 180-radiologist Radiology Associates of North Texas, one of the largest private practices in the U.S. “They would do well to keep that in mind.”
But what practitioners shouldn’t want—or feel compelled to do—is buy into the myth that AI and DL are “some type of magical hocus-pocus or potion” that they must swallow immediately lest they be swallowed by it, asserts Paul Chang, MD, professor of radiology and vice chair of radiology informatics at UChicago Medicine. Chang says that, when it comes to AI and DL, radiologists have embraced the earliest phase of the “Gartner Hype Cycle,” a branded tool that was developed by IT consultancy Gartner as a representation of the stages through which individual technologies go, from conception and introduction to large-scale mainstream adoption.
“It is not unusual to see the healthcare market buy into the hype and the technology very early, because that is just the nature of how things are done,” Chang says. “But we need to be more realistic than that.”
To help with that, here’s a reality check on some AI misperceptions to which no practice is completely impervious.
AI and DL are ready for prime time.
REALITY: Several challenges must be overcome before AI and DL fall under the mainstream technology umbrella.
Lack of available datasets and cases tops the list of formidable hurdles ahead. AI and DL tools utilize algorithms to perform tasks like discerning patterns in images, identifying specific anatomical markers and pointing out details not visible to the human eye. In order for these algorithms to do their job and for AI/DL technology to serve its intended purpose, systems must be fed and trained with a wealth of annotated, vetted data from many thousands of cases and disparate sources.
“Those cases just aren’t there yet,” Phelps says. “Progress has been made in developing and training the algorithms, but the process is going to take time.”
The interoperability required to feed and train the systems is also lacking at the moment, Chang observes. “Feeding and training cannot occur when the data infrastructure is weak, and when systems cannot yet talk to each other,” he explains.
An absence of standards sits equally high on the list of AI/DL implementation challenges, says Alabama radiologist Bibb Allen Jr., MD, who serves as chief medical officer of the ACR’s Data Science Institute. Allen points out that standards enabling easy integration of the data needed to feed and train AI/DL would simplify the process considerably. He envisions interoperability propelling AI/DL forward in much the same way the DICOM standard broke down barriers to image acquisition and transmission.
An article published in the March 2018 edition of JACR further explores the case, standards, and integration issues. In “Artificial Intelligence and Machine Learning in Radiology: Opportunities, Challenges, Pitfalls and Criteria for Success,” James Thrall, MD, and colleagues at Harvard and Mass General write that access to large numbers of proven cases is necessary not only to properly feed and train AI and DL algorithms but also to validate them. Additionally, they claim, standards “still need to be developed that address the curation of images,” because if image data corruption occurs in transmission or storage or during processing, duplicating work and confirming its validity will prove difficult.
Moreover, the authors point to the high variability in imaging protocols between institutions and imaging practices and “even variability in the execution of a given protocol” within an imaging services entity as potential impediments to the development and use of AI in day-to-day radiology practice. “For example,” they write, “subjective analysis of a CT scan may be somewhat tolerant of the timing of contrast material administration, and variable timing can create an ‘apples-to-oranges’ problem for AI programs that rely on quantitative factors.”
Both RSNA and ACR have undertaken initiatives aimed at scaling some of these obstacles. Under the aegis of its Radiology Informatics Committee, RSNA served as a leader in the development of RadLex, a lexicon of radiological terminology, as well as in aggregating structured reporting templates through its Radiology Reporting Initiative. Such moves, write Allen and Geraldine McGinty, MD, MBA, in an article published in the March 2018 edition of JACR, will open the door for radiologists to incorporate data elements used in AI into a “common data elements lexicon and structured reporting tools that will collect the output for AI algorithms into reporting tools.”
AI and DL will steal radiologists’ jobs.
REALITY: AI and DL will help radiologists excel in their clinical work, improve their process efficiencies and better do their part to improve care quality while holding the line on costs.
There can be no denying that AI and DL will change the way radiologists practice, but “it’s not a case of AI and DL against us,” Allen says. “The need for our expertise will always be there, so it’s a case of radiologists with AI and DL, including private practitioners, versus those who don’t have the technology in place. Those who don’t have the technology in place will be at a disadvantage,” and not just from the standpoint of the enhanced diagnostic accuracy that can be achieved through AI and DL.
Both Allen and Phelps believe AI will be of great value to the radiology specialty in that it will facilitate workflow optimization to prioritize cases. “With AI, we would more easily be able to identify situations where patients’ condition would warrant pushing their exam to the head of the queue,” Phelps says, citing as examples pneumothoraces and strokes.
Allen says that, in addition to enabling case prioritization, AI will streamline workflows for private practitioners while simultaneously supporting a higher caliber of patient care. To do so, the technology will conduct discrete assessments that can then be validated and augmented by radiologists and subsequently added to patients’ EHRs.
Thrall et al. deem AI a means through which radiologists can grapple with “observer fatigue,” which is of particular relevance for screening applications in which the likelihood of finding a “true positive” is minimal. Here, they write, AI programs optimized for high negative predictive value could identify “enriched” subsets of cases that likely harbor any existing true-positive cases for early review.
Additionally, the authors point out, AI applications will contribute to better patient care and serve to benefit practitioners by offering a new means of extracting previously unavailable information from images. “The age of big data and DL has spawned the concept of ‘radiomics,’ wherein hundreds of abstract mathematical features of images can be defined or detected through AI correlated with other data on genomics or response to therapy,” they note.
So many obstacles separate AI/DL in theory from AI/DL in practice that there isn’t much the average private practitioner can do with the technology right now.
REALITY: While radiology as a specialty is a long way from widespread AI/DL implementation, private practices and academic departments should begin laying the groundwork for the technology today.
Chang favors a scenario in which clinicians work cooperatively with their organizations’ IT teams, shoring up the IT infrastructure to support not only machine intelligence and DL, but also big data and advanced cloud-based clinical decision support solutions.
Chang also recommends that practices invest in, or at minimum prepare to invest in, beefing up data security and privacy. This, he says, is the only way to accommodate the increased data access and interoperability required to utilize AI, DL, big data and analytics tools without worrying about data stores getting breached.
Phelps urges radiologists to prepare themselves for the time when AI truly takes hold. How? By reading journals, attending conferences, availing themselves of any resources provided by ACR and RSNA and keeping abreast of pilot programs.
Among these programs, Palmetto, Fla.-based Strategic Radiology—of which Radiology Associates of North Texas is one of 26 member practices—is a participant in IBM’s Watson Health Medical Imaging Collaborative. Thirty radiologists from among seven Strategic Radiology practices are assisting in the collaborative’s efforts to advance natural language processing capabilities intended for use in future cognitive medical imaging solutions.
The practitioners’ role involves providing expert annotation of de-identified medical images, which IBM will employ to train Watson to recognize various medical imaging terms found in patients’ electronic medical records (EMRs). Participating radiologists are also sharing feedback on solutions design.
Other practices around the country are starting to use apps cleared by the FDA that range from autosegmenting cardiac MRI images and machine learning-powered CAD for evaluating breast abnormalities to analyzing CT images and notifying providers that a patient might be having a stroke. Further, AI algorithms are already tracking lung cancers via CT as well as tumors and potential liver cancers in MRI and CT scans.
Radiology is on its own in its unique reckoning with the march of AI.
REALITY: Radiology-specific AI has friends in influential places.
The American College of Radiology’s Data Science Institute (ACR DSI), formed in May 2017, is working with radiologists, industry, government entities and other stakeholders toward attaining four goals. These are intended not only to assist the radiology specialty in jumping AI hurdles but also to enhance radiologists’ value as the technology evolves, Allen tells RBJ.
Formulating use cases and workflow integrations that improve patient care ranks among these objectives, as does “protecting patients through leadership roles in the regulatory process with the FDA and other government agencies for verification of algorithms,” Allen says.
Third on the institute’s list: establishing industry relationships by devising and providing verification processes and pathways for clinical integration.
Finally, the institute has set as an objective the education of radiologists, as well as other clinicians and stakeholders, about AI and on ACR’s role in data science for the benefit of all patients.
For its part, the FDA may be signaling that it wants to make it easier for industry to get approval for some AI and DL elements. In early February, the agency approved the marketing of a clinical decision support tool intended to help radiologists more rapidly determine whether a patient has experienced a stroke. Developed by Viz.AI, the product uses a form of AI to analyze CT images of the brain to detect indicators associated with stroke, then sends a text alert to a neurovascular specialist if a suspected large vessel blockage is detected.
According to a statement issued by the FDA, the tool was reviewed under the De Novo premarket review pathway, which is used for medical devices deemed by the agency to pose a low to moderate risk and to have no legally marketed predicate device on which to base a determination of “substantial equivalence.” In the statement, the FDA says this action “creates a new regulatory classification, which means that subsequent computer-aided triage software devices with the same medical imaging intended use may go through” its premarket notification (510(k)) process. In so doing, vendors can obtain marketing authorization for devices by “demonstrating substantial equivalence to a predicate device.”
There will not be much use for AI and DL beyond the clinical realm.
REALITY: Applications for AI and DL in radiology will extend well beyond those used in imaging suites and reading rooms, likely impacting practices’ bottom lines.
Chip Hardesty, chief operating officer of Radiology Ltd. in Tucson, Ariz., foresees a day when AI is embedded in software used on radiology practices’ business side, serving business-predictive purposes. For example, he anticipates the development of billing solutions with an AI component that assesses the likelihood of receiving out-of-pocket payments from patients based on previously identified patterns. This, Hardesty says, will allow any potential collection obstacles to be handled before they become nettlesome issues.
Hardesty also expects private radiology practices will use AI to predict whether a particular patient will keep his or her appointment and, in turn, help address scheduling in a proactive manner.
“If AI can be used in other businesses, it has applications in the business of radiology,” he says. “It may be a long way, but we’ll get there.”