Forget Your Gut Feeling. What Does Your Deep Data Tell You?

Increasingly pressured to “show their value,” radiologists have a resource to light the way that earlier generations of physicians couldn’t have imagined: data-driven performance metrics. In the early days of Big Data in radiology, it was enough to glean a basic sense of how well a practice stacked up against peers and competitors. That was fine as far as it went. But today’s data analytics let practices and departments that are willing and able to track value-adding activities take a deeper, more sophisticated dive.

“Data analytics is touching every aspect of the practice of radiology,” says radiologist Rasu Shrestha, MD, MBA, chief strategy officer and EVP at North Carolina-based Atrium Health. “Metrics like turnaround times and RVU counts are important, and we should be tracking them, but we need to go beyond that as well.”

With this in mind, RBJ asked for—and received—in-depth answers to six high-level questions about modern data analytics. What all six Q&A sets have in common is that each supplies a fresh insight or two (or three) into tapping data for its power to prove value and bolster the bottom line.

1. What new ways of crunching practice data are radiology teams using to add or demonstrate their business value?

For 30-radiologist Advanced Radiology in Connecticut, the list of new ways is long. But at the top is mining the content of radiology reports in real time as a means of tracking compliance with various quality measures, says Gerard Muro, MD, chief medical information officer. Such tracking has fostered Advanced Radiology’s success in CMS’s Quality Payment Program, with “significant positive financial implications” that flow from the ability to show value, Muro reports.
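
As a rough illustration of what such report mining can look like, the short Python sketch below scans finalized report text for required elements; the measure names and keyword patterns are hypothetical stand-ins, not Advanced Radiology’s actual rules.

```python
import re

# Hypothetical quality-measure checks: each entry maps a measure to a pattern
# that must appear somewhere in the finalized report text.
MEASURE_PATTERNS = {
    "incidental_nodule_followup": re.compile(r"follow[- ]?up|Fleischner", re.IGNORECASE),
    "radiation_dose_documented":  re.compile(r"dose|DLP|CTDI", re.IGNORECASE),
}

def audit_report(report_text):
    """Return a pass/fail flag for each tracked measure in a single report."""
    return {measure: bool(pattern.search(report_text))
            for measure, pattern in MEASURE_PATTERNS.items()}

# Example: a chest CT report that documents dose but never mentions follow-up
# guidance gets flagged on the first measure.
print(audit_report("6 mm nodule in the right upper lobe. CTDIvol 7.2 mGy."))
# {'incidental_nodule_followup': False, 'radiation_dose_documented': True}
```

Run against every report as it is finalized, checks of this kind surface compliance gaps while the study is still fresh, rather than at quarter’s end.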

Ashley Prosper, MD, of UCLA Health, a member of the ACR’s Commission on Informatics, puts the use of data analytics to enhance scheduling efficiencies in the “impactful and intriguing” box. One radiology group with which she’s familiar leverages metadata to determine the duration of MRI exams performed on individual scanners. Time allotted for each study ranges from 45 to 90 minutes, based on the scanner itself (some are more efficient than others) as well as on technologist-related variables and specific exam requirements. “It’s a brilliant idea, because it eliminates dead time on scanners as well as long waits for patients,” Prosper tells RBJ.
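
A minimal sketch of the underlying arithmetic, assuming exam start and end times can be pulled from scanner or DICOM metadata, might look like this; the scanner names, protocols and 15-minute rounding rule are purely illustrative.

```python
import pandas as pd

# Hypothetical extract of exam metadata: one row per completed MRI exam, with
# start/end timestamps taken from the first and last acquired series rather
# than from the scheduled slot.
exams = pd.DataFrame({
    "scanner":  ["MR1", "MR1", "MR2", "MR2", "MR2"],
    "protocol": ["brain", "brain", "brain", "lumbar spine", "brain"],
    "start":    pd.to_datetime(["2019-03-01 08:05", "2019-03-01 09:10",
                                "2019-03-01 08:02", "2019-03-01 09:20",
                                "2019-03-01 10:45"]),
    "end":      pd.to_datetime(["2019-03-01 08:48", "2019-03-01 09:58",
                                "2019-03-01 08:55", "2019-03-01 10:35",
                                "2019-03-01 11:40"]),
})

exams["minutes"] = (exams["end"] - exams["start"]).dt.total_seconds() / 60

# Median observed duration per scanner and protocol becomes the suggested
# booking slot, rounded up to the next 15-minute increment.
slots = (exams.groupby(["scanner", "protocol"])["minutes"]
              .median()
              .apply(lambda m: 15 * -(-m // 15)))
print(slots)
```

Fed a year of real exam metadata instead of five sample rows, the same few lines yield scanner- and protocol-specific slot lengths that shrink both idle scanner time and patient waits.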

Safwan Halabi, MD, medical director of radiology informatics at Stanford Children’s Health/Lucile Packard Children’s Hospital in Stanford, Calif., singles out a similar emerging application of data analytics in evaluating “modality throughput” and equipment utilization data. Here the goal is “having in place the right number of radiologists at a certain time, for a certain patient population,” as well as the right number of ancillary staff.

Harnessing data analytics in this way can boost business performance in two ways, Halabi says. “It facilitates the performance of high-quality studies, including subspecialty studies, in a short time—which encourages referrals” and indirectly feeds the bottom line, he explains. “Additionally, we’re seeing a lot of insurers and payers pushing patients to imaging centers where they can get the best value-add.” Better scheduling and staffing provide just that, Halabi points out.

Equally worth noting, according to Prosper, is the application of data analytics to assess the impact of “wait days” on missed outpatient MRI appointments across varying demographic and socioeconomic groups. Harvard researchers describe this process in a study published in the May 2018 edition of JACR, “Impact of Delayed Time to Advanced Imaging on Missed Appointments Across Different Demographic and Socioeconomic Factors.” The retrospective study involved 42,727 adult patients who were scheduled for an MRI exam over a 12-month period. “Imaging missed appointments” were defined as “missed scheduled imaging encounters,” while wait days were defined as the number of days from study order to appointment. Stratified by race, income and type of insurance, data were analyzed to assess the relationship between missed appointment rates and wait days. 

The authors concluded that longer waits for advanced imaging significantly increase the likelihood of missed appointments, especially among underrepresented minorities and patients with lower socioeconomic status.
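
For groups that want to run a similar analysis on their own data, the sketch below shows one way to bucket wait days and compare missed-appointment rates across payer types; the file name and column names are hypothetical, and the approach is a simplified stand-in for the study’s full statistical methodology.

```python
import pandas as pd

# Hypothetical appointment-level extract: wait_days is the gap from order to
# scheduled date; missed is True for a "missed scheduled imaging encounter."
# The file and column names are illustrative, not from the JACR study.
appts = pd.read_csv("mri_appointments.csv")  # wait_days, insurance, race, missed

appts["wait_bucket"] = pd.cut(appts["wait_days"],
                              bins=[0, 7, 14, 30, 365],
                              labels=["<=7d", "8-14d", "15-30d", ">30d"],
                              include_lowest=True)

# Missed-appointment rate by wait-time bucket and payer type; the same
# groupby can be repeated for race or other demographic variables.
rates = (appts.groupby(["wait_bucket", "insurance"])["missed"]
              .mean()
              .unstack())
print(rates.round(3))
```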

“Analyses of this type,” Prosper says, “can offer a lot of insight into the changes we might make to increase the value of care provided to patients.”

2. What clinical-performance measures have “crossover” ramifications for business performance—yet still fly under the radar of many radiology practices?

Prosper puts the quality of radiologists’ reports in this category, because the higher the caliber of the radiology report, the easier it is and will be for radiologists to demonstrate their worth under the umbrella of value-based care. “Traditionally, we have paid a lot of attention to RVUs and the quantity of radiologists’ reports but not the quality of those reports and how we have followed up on them,” she says. Prosper advises keeping track of whether all findings are being reported to referring physicians and other appropriate individuals—and whether referrers are subsequently contacted to see if they’re following radiologists’ recommendations.

Clinical performance metrics that merit tracking include diagnostic accuracy—and that holds true even though such accuracy remains difficult to track because there’s no universally adaptable infrastructure for doing so. So maintains Andrew Rosenkrantz, MD, of NYU Langone Medical Center in New York City. 

“Using metrics, we look at process and structure but not accuracy—which is just as important in the context of value-add,” Rosenkrantz says before emphasizing that he and his NYU colleagues are working hard to come up with a way to reverse that tide. 

Meanwhile, in Muro’s view, the metric of radiologist efficiency is a low-hanging fruit that is perennially ripe for the picking. His rationale: Different radiologists have different strengths and skillsets. Analyzing individual strengths to customize workflow has the potential to not only improve radiologists’ efficiency, Muro says, but also enhance the quality of their work as well as their overall satisfaction.

As for clinical performance metrics that impact business performance, Muro notes that various types of imaging studies require the radiologist to comment on very specific items, such as references to relevant findings in clinical trials for billing purposes. In Connecticut, Advanced Radiology has the ability to analyze the content of the reports to ensure that such elements are present, eliminating billing delays.

3. How can radiology practices and departments fold data analytics into their daily operations without hindering workflow?

Giving careful thought to the impact of how data is collected, as well as how data and results will be provided to the end-user, is a start.  “If collecting and accessing the data is going to involve multiple steps, the end result will not be pretty, no matter what type of data you’re talking about,” Prosper says. 

To a certain extent, workflow complications that result from making data analytics a part of daily operations can be avoided by seeking vendors’ assistance. For example, Prosper points out, PACS suppliers might share information about which “plug-in” analytics solutions suit their systems, saving end users a lot of time and steering them away from tools that interfere with workflow without adding enough value to make up for the disruption.

“It’s particularly helpful to have excellent working relationships with vendors that understand a client’s need to leverage data and are supportive of such needs,” Muro notes.

Shrestha, who has served as chair of RSNA’s informatics scientific program committee, agrees. “The last thing radiologists need is another app or dashboard to look at,” he says, as they attempt to garner insights from data while simultaneously moving through their reading lists. Instead, he says, much of the integration of analytics should occur automatically, at the back end. This setup allows high-acuity studies to be prioritized on worklists while also enabling the delivery of analytics-derived insights to the place where care decisions are being made. It also can ensure this information gets sent to other members of the care team, such as nurses, specialists and referring physicians, without delay.
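
As a simple illustration of that back-end approach, the sketch below re-orders an existing reading worklist by an acuity score that an upstream analytics or AI service is assumed to have attached to each study; the field names and scores are hypothetical.

```python
from datetime import datetime

# Hypothetical worklist entries: an upstream service has already attached an
# acuity score (higher = more urgent) to each arriving study.
worklist = [
    {"accession": "ACC1001", "acuity": 2, "received": datetime(2019, 3, 1, 8, 0)},
    {"accession": "ACC1002", "acuity": 5, "received": datetime(2019, 3, 1, 8, 5)},
    {"accession": "ACC1003", "acuity": 5, "received": datetime(2019, 3, 1, 8, 1)},
]

def prioritize(studies):
    """Highest acuity first; ties broken by how long the study has waited."""
    return sorted(studies, key=lambda s: (-s["acuity"], s["received"]))

for study in prioritize(worklist):
    print(study["accession"], study["acuity"])
# The two acuity-5 studies jump ahead of ACC1001, and ACC1003 reads first
# because it has been waiting longer.
```

The radiologist sees nothing new; the list he or she already works from simply arrives in a smarter order.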

“Data insights need to be embedded into the fabric of the workflow throughout the value chain, not left as more silos in a proliferation of silos,” says Shrestha. 

Similarly, Muro advocates integration of data analytics applications into existing applications, such as RISes, image viewers and reporting systems.

4. How can machine learning help radiology practices maximize actionable intelligence gleaned from performance data?

Using AI to augment data analytics-based decision-making capabilities is one way, according to Shrestha. “Artificial intelligence—or more accurately, as I like to call it, augmented intelligence—will not only be a diagnostic tool,” he says. “It will be embedded in the workflow and will,” in turn, “influence the way data analytics are used to make decisions.”

Advanced Radiology, for its part, has initiated a program wherein it’s trialing several different clinically oriented AI applications its principals believe have the potential to improve quality and/or efficiency but come at a cost—making it imperative that their value come under the analytics microscope. For example, one AI application helps the practice’s radiologists identify and report lung nodules.

“Superficially, it seems like a great idea,” Muro states. “However, we need to quantify the true value of such a program. What are the added storage requirements? What are the added responsibilities on the technologists? Is the radiologists’ workflow hindered or improved, and how does that compare to the overall benefits of the application in terms of quality, cost, and efficiency?”

There will soon be numerous such AI applications to consider, Muro points out. “The practices that prosper in the new world of AI,” he underscores, “will be able to leverage data to quickly and accurately assess the true value of these solutions.”

5. Must radiology departments and practices appoint someone to spearhead data-analytics efforts—and, if so, should this person be a radiologist, an administrator or an imaging informaticist?

One word sums up the answer to the first part of this question—and that is, “absolutely.” Otherwise, sources say, efforts to best leverage data analytics will fall by the wayside or, at the very least, fall further down the priority list than they ought to. 

“Strong leadership is very important,” Muro states. “A successful informatics program must effectively communicate goals, needs and expectations among clinical, IT and administrative stakeholders.”

Whether the leader is an administrator, a radiologist or an informaticist doesn’t matter, providing he or she has combined clinical, business and informatics acumen, Muro says. Shrestha expresses a similar view, describing the optimal data-analytics leader as an “alchemist—someone who can hone the right [data] ingredients in the context of algorithms, clinical tasks, imaging center imperatives and healthcare systems as a whole.” He believes such an individual must also have a knack for numbers and an eye for spotting correlations between different data points. 

“The alchemist/data analytics leader could be—but doesn’t necessarily have to be—a radiologist with an informatics background,” Shrestha says. “But the informatics background part is important.”

While the ideal is to identify a single individual to take the lead on data analytics, some practices and departments will have neither that person on board nor the resources to make a new hire. In such cases, the experts agree, group members and staff can put their heads together for a “leadership by team” approach. The prerequisite: All three disciplines—clinical, business and informatics—must be represented in the mix.

6. More than a few radiology departments and practices have yet to embrace analytics, except to the minimal degree needed to meet compliance mandates. What can such groups do right now to more fully get in the game?

Better late than never, and slow and steady wins the race. Both those bromides apply here, the experts seem to agree. Halabi recommends first devising a strategy and a framework. “Pick a metric, or a set of metrics, and create governance around them,” he says. “Know their purpose. It’s much easier to come up with ways to extract and analyze data when the reason it’s being done is clear and you know how the data will be acted upon.”

Concentrating on one or two areas first will bode better for imaging providers of all types than adopting an across-the-board approach. Small, relatively easy projects involving just one or two metrics can serve as the beginning of larger, more complete analytics platforms, sources agree.

“It may be tempting to just dive right in, but there are going to be hiccups at first,” Prosper states. “Handling them will be easier when projects are broken into smaller pieces.”

Meanwhile, there’s no time like the present to build on whatever momentum you’ve got, be it meager or massive, in 2019. 

“We’re continually building our data analytics infrastructure to better collect, organize, and analyze data,” Muro says. “Step by step, we keep producing more and more meaningful and actionable results.”

Julie Ritzer Ross contributed to this article. 
