The Future Is Now: Massachusetts General Hospital Embraces Deep Learning


Deep learning, artificial intelligence (AI) and automation are gaining momentum in radiology. While some physicians are slow to embrace the trend, fretting over their own job security, others in the industry are inspired by the possibilities. The long-term vision positions AI at the center of momentous change in radiology, pushing the practice of medicine, disease management and physician efficiency forward at a rapid pace.

Massachusetts General Hospital (MGH) in Boston has clearly chosen to pursue deep learning and the benefits AI can bring. Last year, MGH made history by becoming the first medical facility in the world to install an NVIDIA DGX-1 AI supercomputer, “the world’s first purpose-built system for deep learning and AI accelerated analytics, delivering performance equal to 250 conventional servers.”

While the supercomputer is roughly the size of a piece of vintage stereo equipment, MGH believes it has unlimited potential, proving that big things do indeed come in small packages.

“It’s almost like thinking that someone has just deployed electricity, so you’re back in the days of Thomas Edison and now electricity is available,” says Keith Dreyer, DO, PhD, vice chairman and associate professor of radiology at MGH and Harvard Medical School. “You go, ‘Let’s see, we used to scrub clothes by hand, but now we can use a motor and a crank and make a washing machine. Or we could make an air conditioner. Or a television.’ This really is going to change everything.”

Enabling Radiology and Radiologists

James Brink, MD, head of radiology at MGH, says acquiring the supercomputer was the final step of a long process. Researchers from the hospital’s radiology department first approached Brink about machine learning in 2015, and the MGH Center for Clinical Data Science was eventually established within the department.

"Having radiologists be directly involved in this research is so critical because we know the best use cases that are going to be enabling for radiology and radiologists to improve the efficiency of what we do,” Brink says.

NVIDIA delivered the supercomputer to MGH’s Ether Dome, named for the first surgical use of ether anesthesia there in 1846, and the Center for Clinical Data Science and its research team immediately began investigating what this new technology could do. The plan includes testing and implementing new ways to improve the detection, diagnosis, treatment and management of disease by training a deep neural network using MGH’s database of approximately 10 billion medical images. To process that large amount of data, researchers will utilize algorithms crafted by both MGH data scientists and NVIDIA engineers.
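To give a sense of what “training a deep neural network on medical images” involves, here is a minimal, illustrative sketch, not MGH’s actual pipeline, assuming PyTorch is available. The dataset is synthetic and the small convolutional classifier is a stand-in for whatever architecture the researchers choose.

```python
# Minimal sketch of supervised training on labeled images (hypothetical data).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for a real medical-image dataset: 256 grayscale 224x224 "scans"
# with binary labels (e.g., finding present / absent).
images = torch.randn(256, 1, 224, 224)
labels = torch.randint(0, 2, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# Small convolutional classifier; a production system would use a deeper
# architecture and vastly more data.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Flatten(),
    nn.Linear(32 * 14 * 14, 2),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for batch_images, batch_labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_images), batch_labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```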

One of the first ways this technology could make an impact on radiology, Brink and Dreyer agree, is by helping specialists with quantification.

“One thing radiologists typically struggle with is measuring tumor and lymph node sizes and tracking those measurements over time,” Brink says. “Having tools that help with that quantification—which, to date, have been somewhat elusive—would really enable robust tracking over time.”

Dreyer, who serves as the executive director of the Center for Clinical Data Science, adds that there are two categories of quantification that will be affected: the quantification humans already do and the quantification of things humans don’t do currently because it is too difficult or takes up too much time.

“If a patient had lymphoma and I was looking at images before and after treatment and I wanted to say what effect the treatment had, imagine the effort it would take to go through every lymph node and measure the exact size down to the cubic millimeter,” Dreyer says. “One could imagine a system using artificial intelligence that could do something like that for us.”
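The kind of automated quantification Dreyer describes could, for example, take segmentation masks of lymph nodes from pre- and post-treatment scans and report each node’s volume and how it changed. The sketch below is purely illustrative: the segmentation itself is assumed to come from a separate AI model, the data are synthetic, and NumPy and SciPy are assumed.

```python
# Illustrative volume measurement from binary segmentation masks.
import numpy as np
from scipy import ndimage

def node_volumes_mm3(mask: np.ndarray, voxel_spacing_mm: tuple) -> list:
    """Label connected components in a binary mask and return their volumes in mm^3."""
    labeled, count = ndimage.label(mask)
    voxel_volume = float(np.prod(voxel_spacing_mm))  # mm^3 per voxel
    return [(labeled == i).sum() * voxel_volume for i in range(1, count + 1)]

# Synthetic 3D masks standing in for real segmentations (1 = lymph node voxel).
pre = np.zeros((64, 64, 64), dtype=np.uint8)
post = np.zeros_like(pre)
pre[20:30, 20:30, 20:30] = 1   # node spanning 10x10x10 voxels pre-treatment
post[22:28, 22:28, 22:28] = 1  # same node shrunk to 6x6x6 voxels post-treatment

spacing = (1.0, 0.8, 0.8)  # slice thickness and in-plane resolution in mm
pre_vol = node_volumes_mm3(pre, spacing)[0]
post_vol = node_volumes_mm3(post, spacing)[0]
print(f"pre: {pre_vol:.0f} mm^3, post: {post_vol:.0f} mm^3, "
      f"change: {100 * (post_vol - pre_vol) / pre_vol:.1f}%")
```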

MGH also sees potential for improvements in workflow.

“Other use cases may be in prioritizing worklists—having machine learning tools that identify cases that might have a pneumothorax or might have an abscess,” Brink says. “We could move those types of cases to the front of a worklist for review and evaluation by a trained radiologist.”

As Dreyer points out, MGH’s radiologists read more than 2,000 cases a day, “so it would be nice to be able to know which cases have the highest priority—not just the ones that came out of the scanner first or were at the top of my reading list, but the ones that had some pathology there.”
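Worklist prioritization of the kind Brink and Dreyer describe could be as simple as sorting the reading queue by a model’s estimated probability of a critical finding. The snippet below is a sketch under that assumption, not a description of MGH’s system; the accession numbers and scores are hypothetical and the scoring model itself is assumed to exist elsewhere.

```python
# Illustrative re-ordering of a reading worklist by model-estimated urgency.
from dataclasses import dataclass

@dataclass
class Study:
    accession: str
    arrival_order: int
    urgency_score: float  # estimated probability of a critical finding

worklist = [
    Study("ACC-1001", 1, 0.04),
    Study("ACC-1002", 2, 0.91),  # likely pneumothorax -> should jump the queue
    Study("ACC-1003", 3, 0.12),
    Study("ACC-1004", 4, 0.67),
]

# Highest predicted urgency first; arrival order breaks ties so the list
# otherwise preserves first-in, first-out behavior.
prioritized = sorted(worklist, key=lambda s: (-s.urgency_score, s.arrival_order))
for study in prioritized:
    print(study.accession, f"score={study.urgency_score:.2f}")
```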

Deep learning and