Examining AI’s Impact on Breast Imaging

In medical imaging, artificial intelligence (AI) has been used to assist radiologists in diagnosing various medical conditions, including breast cancer. As AI continues to gain traction, many within the industry expect these technologies to further optimize workflow and help detect breast cancer more accurately and efficiently. 

Screening mammography is still considered the most effective method for detecting breast cancer and advancing patients to appropriate treatment as quickly as possible. But mammograms are not 100 percent accurate and can produce false-positive or false-negative results. Advances in radiologic imaging can play an important role in reducing both error rates, with the potential to improve outcomes and reduce healthcare costs. 

Computer-aided detection (CAD), for example, can help reduce the risk of misdiagnosis and improve radiologists’ detection performance by drawing attention to potential abnormalities. It provides radiologists with information and insight that can guide them to faster, more accurate confirmation of the presence of breast cancer. Clinical practices have also rapidly adopted digital breast tomosynthesis (DBT) in recent years because of its many advantages, including decreased recall rates and significant increases in cancer detection. 

As impactful as both CAD and DBT have been in pushing breast cancer treatment forward, AI and deep learning technologies have the potential to take things to an entirely new level—as long as radiologists are prepared to step up and adapt to this brave new world instead of being afraid or intimidated. 

AI in Action

There are examples all around us of AI being used to improve care for breast cancer patients. For instance, a concurrent-read CAD solution based on deep learning technology was recently introduced that helps radiologists identify potential mammographic abnormalities more quickly. It uses a detection algorithm to automatically analyze each tomosynthesis “slice” and identify suspicious areas. Potentially suspicious areas are then naturally blended onto a 2D synthetic image to provide radiologists with a single enhanced image that is used to navigate the large DBT data set. This allows specialists to read mammogram results more quickly and efficiently. 
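The blending algorithm itself is proprietary and not described in detail here, but the general idea of collapsing per-slice detection results into a single navigational image can be sketched as a per-pixel maximum over slice "suspicion maps." Everything in this example, including the scores and map sizes, is invented for illustration:

```python
# Illustrative sketch only, NOT the actual product's algorithm: collapse a
# stack of per-slice suspicion maps (2D grids of scores in [0, 1]) into a
# single 2D map by taking the per-pixel maximum across slices.

def blend_suspicion_maps(slices):
    """Return one 2D map whose each pixel is the max score across slices."""
    rows, cols = len(slices[0]), len(slices[0][0])
    return [[max(s[r][c] for s in slices) for c in range(cols)]
            for r in range(rows)]

# Three tiny hypothetical 2x2 "slices"; a lesion-like score (0.9)
# appears only in the second slice.
stack = [
    [[0.1, 0.2], [0.1, 0.1]],
    [[0.1, 0.9], [0.2, 0.1]],
    [[0.3, 0.2], [0.1, 0.4]],
]
print(blend_suspicion_maps(stack))  # [[0.3, 0.9], [0.2, 0.4]]
```

Whatever the real blending method, the goal is the same: a single enhanced 2D image that points the radiologist to the relevant slices in the much larger DBT stack.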

As DBT gains momentum, it is also becoming increasingly difficult for radiologists to compare mammography images from year to year; subtle changes in otherwise benign-appearing tissue can be the only sign of a suspicious development. Comparing hundreds of DBT images with the four images of a 2D mammogram is difficult and time consuming, so current DBT systems create a set of synthetic, 2D-like images that are more directly comparable to prior 2D exams. When the prior exam is itself a DBT study, the problem compounds: hundreds of current images must be compared with a similar number of prior images. In addition to the synthetic images created by the mammography system, AI options such as this concurrent-read 3D-CAD solution can help solve this issue by creating a more informative, enhanced 2D synthetic image. 

While many AI systems are geared toward improving the radiologist experience in reading individual exams, others focus on the broader picture. New technology can automatically collect data from each of the mammography systems (2D and 3D) within a facility, or even across multiple facilities in a healthcare system, and then analyze multiple parameters of each image, including system utilization, radiation dose, compression force, pressure, breast density and positioning. The data can be examined at any level, from the system as a whole down to an individual technologist or piece of equipment. From this, users can proactively identify potentially faulty imaging equipment that may be delivering excessive radiation, as well as underutilized equipment, which can uncover workflow issues or even reveal which units staff consider less desirable. This is yet another example of the potential power of AI and of how radiology can push to get the most out of these evolving technologies.
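As a rough illustration of the fleet-level analysis described above, the following sketch flags units whose average radiation dose stands out from the rest of the fleet. The unit names, dose values, and outlier threshold are all assumptions for illustration, not any vendor's actual method:

```python
# Hypothetical sketch: flag mammography units whose mean dose is an
# outlier relative to the fleet. All data and thresholds are invented.
from statistics import mean, stdev

# Hypothetical per-exam log entries: (unit_id, mean glandular dose in mGy)
exam_log = [
    ("unit_A", 1.4), ("unit_A", 1.5), ("unit_A", 1.6),
    ("unit_B", 1.5), ("unit_B", 1.4), ("unit_B", 1.3),
    ("unit_C", 2.6), ("unit_C", 2.8), ("unit_C", 2.7),  # suspiciously high
]

def flag_high_dose_units(log, z_threshold=1.0):
    """Return unit IDs whose mean dose exceeds the fleet mean of unit
    means by more than z_threshold fleet standard deviations."""
    per_unit = {}
    for unit, dose in log:
        per_unit.setdefault(unit, []).append(dose)
    unit_means = {u: mean(d) for u, d in per_unit.items()}
    fleet_mean = mean(unit_means.values())
    fleet_sd = stdev(unit_means.values())
    return [u for u, m in unit_means.items()
            if (m - fleet_mean) / fleet_sd > z_threshold]

print(flag_high_dose_units(exam_log))  # ['unit_C']
```

A production analytics system would of course account for breast thickness, exam type, and other confounders before flagging a unit; the point is only that routine per-exam data, aggregated across a fleet, can surface equipment problems proactively.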

The Power of Collective Intelligence

Despite these advances, it is important to note that even sophisticated technology cannot eliminate reader variability among radiologists interpreting mammography exams. Radiologists may use AI systems to aid their decision-making, but interpreting mammography exams is ultimately subjective, and different radiologists can reach different conclusions for the same patient. A new form of AI called swarm AI, or collective intelligence (CI), could play a role in improving consensus in assessing mammography results. According to a recent study, radiologists using CI reduced both false positives and false negatives when interpreting mammograms (PLoS One. 2015 Aug 12;10(8):e0134269). For each mammogram, the CI system used a predetermined set of rules to aggregate the independent assessments of multiple radiologists into a single decision: whether or not to recall the patient for additional study.  
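The aggregation rule in such a CI system can be as simple as a vote threshold. The following minimal sketch, with invented reads and an assumed majority quorum (the study's actual rules were more involved), shows the basic idea:

```python
# Minimal sketch of one collective-intelligence aggregation rule:
# pool independent recall/no-recall reads from several radiologists
# and recall the patient only if more than a quorum vote to recall.
# The reads and the 50% quorum below are illustrative assumptions.

def aggregate_reads(reads, quorum=0.5):
    """Combine independent binary reads (True = recall) into a single
    decision: recall if more than `quorum` fraction vote to recall."""
    return sum(reads) / len(reads) > quorum

# Five hypothetical radiologists reading the same mammogram:
reads = [True, True, False, True, False]   # 3 of 5 say "recall"
print(aggregate_reads(reads))  # True -> recall for additional study
```

Because the individual errors of independent readers are partly uncorrelated, even a simple pooling rule like this can cancel some of them out, which is the intuition behind the reported reductions in both false positives and false negatives.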

The study findings indicate that CI could potentially improve medical decision-making in a wider range of contexts, including many areas of diagnostic imaging and, more generally, diagnostic decisions that are based on the subjective interpretation of evidence.

Looking Ahead

Future applications of AI in breast imaging are many. Image quality analysis can move to the front lines, where prospective quality and radiation dose alerts can be provided to the technologist during the exam. AI systems can use multiple data points such as CAD, breast density, patient history and risk to distribute cases to interpreting radiologists based on level of difficulty, potentially getting the right case to the right radiologist at the right time of day (e.g., reading the hardest cases early in the day and distributing them evenly to reduce fatigue). Automatic selection of relevant information from prior imaging exams, even from different modalities, as well as retrieval of ancillary data from the patient’s medical record, will become possible. Radiologists typically follow a logical flow from observation to assessment to recommendation, especially in mammography, where there is a very manageable, finite set of imaging tools, findings and follow-up alternatives. This will allow reporting systems to offer predictive report generation based on individual patient data, imaging findings, age- and risk-based follow-up guidelines and machine learning of an individual radiologist’s decision processes.

There is virtually no risk that AI will replace the radiologist in imaging assessment any time soon. But as the technology advances, it is possible that some individual tasks currently performed by radiologists could be handled by AI systems with proper monitoring. Given the convergence of a growing number of imaging procedures, an aging population and a limited number of radiologists, the role of AI systems in assisting with the radiologist workload will only continue to grow in radiology and the broader medical community. And despite the innate ability of computers to analyze vast amounts of data and support human decision-making, until scientists can come up with artificial empathy technology, a computer will never take the place of a doctor who holds the hand of a patient while telling them they have breast cancer. 

Bruce Schroeder, MD, is the founder and medical director of Carolina Breast Imaging Specialists in Greenville, N.C.