Devising a Blueprint for Radiology: Standardization

Standardization in radiology can take a dozen different paths, and it is clearly complex—but why is there a need for standardization in the first place? Debra L. Monticciolo, MD, FACR, is vice chair for research at Scott & White Healthcare (Temple, Texas), a nonprofit health system. She is a professor of radiology at the allied Texas A&M Health Science Center College of Medicine and is a subspecialist in mammography. Monticciolo is chair of the ACR® Commission on Quality and Safety. Of course, quality and safety are among the primary reasons that standardization is a talking point for so many who hold stakes in radiology’s future. “You have to go back to the way medical practice developed—in this country and elsewhere,” Monticciolo says. “Physicians go out and practice. They respond to their local environments and their patients’ needs. They develop their own ways of doing things. It’s not that there’s anything wrong with that, but these variations in care, over time, can be amplified when you get to taking care of large numbers of patients systematically. You don’t want things being done in 20 different ways if they can be done in two different ways.” From the standpoints of both safety and quality of care, patients are best served when practice standards are developed and used by all providers, Monticciolo says. She uses an example from the ACR’s effort to develop its Dose Index Registry to protect patients from excess radiation during one or multiple imaging exams. “When we first looked at something as simple as a CT exam of the head with contrast, in the systems that we looked at—at the hospitals that were initially working on this—we discovered that this single exam was being named in 1,200 different ways,” she says. “You can’t do anything if you can’t gather data.
“There needed to be one name for the procedure; then, we could look at how we can do things better.” Curtis Langlotz, MD, PhD, a professor of radiology and vice chair for informatics in the radiology department at the University of Pennsylvania Health System in Philadelphia, has spent years developing a common nomenclature for imaging procedures and processes. He says, “There is a growing consensus across health-care disciplines—not just in radiology—that variability in care is undesirable. We can’t always agree on the optimal level of care, but we can often agree that care at the extremes is suboptimal.” Mitchell D. Schnall, MD, PhD, is a professor of radiology and, since October 2012, chair of the radiology department at the University of Pennsylvania Health System. Schnall also chairs the ACR Imaging Network (ACRIN). He is personally working on quantitative biomarker standardization in clinical trials involving cancer patients, driving home the point that it is those who order imaging exams who are impelling radiology toward standardization. “Other people are asking us for standardization. Our users are starting to ask for it. The oncologists like to see standard and quantitative reports. This is what the customer base is asking for,” Schnall says. Digital technology plays a part, too.

Convergence

Anthony A. Mancuso, MD, is professor and radiology department chair at the University of Florida College of Medicine in Gainesville. Mancuso says that timing has a lot to do with what radiology is undergoing. Standardization represents a convergence of accumulated radiological knowledge and applications made possible by IT and digital communication. “We know what we don’t know,” Mancuso says. Exams that work well for given clinical scenarios are well documented. Digital technology makes standardized protocols and procedures more efficient to implement.
Speech-recognition transcription systems that build structure into the exam-reporting process will soon allow data to be mined from imaging exams as never before, Mancuso adds. “There are tremendous systems out there. You can now embed some principles, in creating a proper reporting structure, that make it very efficient,” he says. In May, the US DHHS announced that a tipping point has been reached: Physicians and health-care systems have responded to federal financial incentives to create electronic health records (EHRs). By the end of 2013, the DHHS estimates, more than half of physicians and 80% of hospitals will be using EHRs. In 2008, prior to the inclusion of financial incentives in the American Recovery and Reinvestment Act of 2009, only 17% of physicians and 9% of hospitals were using EHRs.¹

What to Standardize

The digital medical evolution and the focus on accountable care as an alternative to fee-for-service reimbursement both fit into the efficiency and quality benefits that standardization promises. Despite this, standardizing radiology isn’t going to be easy. It will take a long time, Schnall says. He’s impatient when people discuss standardization in the same breath as DICOM, the set of communications standards allowing imaging equipment, PACS, and RIS to exchange data in a uniform way. “I get annoyed when I hear people say that we can do this because we did DICOM,” Schnall says. “This is so much more complicated than DICOM that the reference does not do it justice. We are talking about practice standards that are fundamentally different. This is very different from DICOM: It’s much harder to define. DICOM was easy.” Which elements in radiology need to be standardized?
“We must standardize everything that affects the outcome for patients,” Schnall says, “from patient preparation to the time between a contrast injection and a scan.” Mancuso agrees; he says that the entire radiology round trip, from order entry to report delivery, needs to be standardized. “Any standard is rule based, so the rules have to be efficient,” Mancuso adds. “Standardization eliminates waste, if it’s done correctly (assuming that it doesn’t add a bureaucratic burden).” Mancuso and his colleagues at the University of Florida College of Medicine have been working on radiology standardization for years. Even so, Mancuso estimates that they are only 50% of the way to where they want to be. They are on the verge of implementing a standardized reporting system, but it isn’t in use yet. They also have no decision-support system in place for referrers, although that’s coming. Standardization is all about refinement and consolidation, Mancuso adds.

Nomenclature

Perhaps nothing about standardization is more important than making sure that each discrete imaging procedure is called by a single name. The name is used to define consistent, predictable protocols that enhance quality and safety, as well as to allow outcomes to be assessed and compared. Mancuso and his staff have developed standard names for imaging procedures (SNIPs). SNIPs indicate modalities and anatomy, and they are used in ordering, scheduling, performing, interpreting, and billing for exams. SNIPs, Mancuso explains, “became our infrastructure for creating the orderable exams that we do. Then, on top of that, there are the specific indications for the imaging protocols.” The protocols can be seen at protocols.xray.ufl.edu/live_protocols/snips/?elements/page_request/snip/update_select_tree/browse_panel. When the clinician orders, he or she selects a SNIP, which links to the exam and the report template for what the clinician wants to know, Mancuso says.
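Conceptually, a SNIP ties one canonical identifier for an orderable exam to its modality, anatomy, and report template, and maps the many local names onto that single record. The following Python sketch illustrates the idea only; every identifier, field name, and alias here is invented for illustration and is not the actual University of Florida SNIP schema.

```python
from dataclasses import dataclass, field

@dataclass
class Snip:
    """Hypothetical canonical record for one orderable exam."""
    snip_id: str            # invented canonical identifier
    modality: str           # e.g., "CT"
    anatomy: str            # e.g., "head"
    contrast: bool          # with or without contrast
    report_template: str    # linked structured-report template (invented name)
    local_aliases: list = field(default_factory=list)  # site-specific exam names

HEAD_CT_CONTRAST = Snip(
    snip_id="SNIP-CT-HEAD-C",
    modality="CT",
    anatomy="head",
    contrast=True,
    report_template="ct_head_with_contrast_v1",
    local_aliases=["CT HEAD W/CONT", "HEAD CT + C", "CT BRAIN WITH IV CONTRAST"],
)

def resolve(order_text, catalog):
    """Map a locally worded order onto its single canonical exam record."""
    for snip in catalog:
        if order_text.strip().upper() in (a.upper() for a in snip.local_aliases):
            return snip
    return None

exam = resolve("head ct + c", [HEAD_CT_CONTRAST])
```

Once every locally worded order resolves to one canonical record, the record can carry the protocol, the report template, and the billing mapping with it, which is the consolidation the article describes.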
His department has just finished translating the SNIP tables and names into ICD-9 billing codes. “We try to do everything at an expert level—and consistently,” Mancuso says. “Everything can’t be standardized, but we can be sure there are proper pathways for the vast majority of things we do.”

RadLex

SNIPs form the University of Florida College of Medicine’s localized and adapted version of a broad-based nomenclature effort that is still gathering speed: RadLex, sponsored by the RSNA. The RadLex Playbook, released in 2011, is a standardized lexicon of procedure steps and possible orders covering 342 procedures. For a head CT exam with contrast, the RadLex Playbook has one entry (instead of the 1,200 names that Monticciolo and her colleagues encountered). Of course, RadLex contains many other head-related CT studies that might have been counted in Monticciolo’s original effort, but each is now named, numbered, and described specifically in RadLex. Each is a discrete procedure. Langlotz has been working on RadLex for nearly a decade. Christopher Sistrom, MD, PhD, the physician/analyst at the University of Florida College of Medicine who developed SNIPs, “was working in coordination with the RadLex Playbook,” Langlotz says. This is not unusual, he adds. “Each practice will have its own list of procedure codes (sometimes called a charge master). The point is not for each practice to use RadLex Playbook codes out of the box, but to map its charge master to the Playbook codes. That way, we can benchmark between practices using comparable codes. The ACR’s Dose Index Registry uses the RadLex Playbook for just that purpose,” he says. Langlotz notes that RadLex creates a foothold for other standardization efforts, but does so quietly. “RadLex is like plumbing: It must function properly, but if you have to think about it, it’s not doing its job.
RadLex serves as an infrastructure and foundation for many other important initiatives, including structured reporting and natural-language understanding. It’s essential, but it’s not something radiologists should need to worry about,” he says. RadLex has already been adopted by institutions, providers, and vendors to a degree that Langlotz did not foresee. He says, “When we conceived of RadLex, almost 10 years ago, I don’t think anyone anticipated the level of adoption we see today.”

Structured Reporting

Until recently, many radiologists used narrative reporting, but this is changing as more adopt structured reporting. There is resistance, of course. Some radiologists view their narrative reports as an art form—easy for referring clinicians to grasp and quick for the experienced radiologist to produce. For some, structured reporting—which involves narrative, per-exam checklists, and perhaps annotated measurements—is cumbersome and time consuming. Structured reporting has, Schnall says, “sort of a bad name, because the many interfaces are not very efficient for radiologists, and they don’t want to lose efficiency. We are working now on how to develop interfaces that don’t cause loss of time. The templates require lots of clicks. They are not efficient interfaces, but we are starting to see those develop, and we will be able to use them efficiently.” He continues, “Standards are very important, whether there is quantification or not. In general, the output of radiology is not data. We provide unstructured text, and everybody uses his or her own descriptive terms. The idea of standardization (in both quality and quantity) is to create databases to predict outcomes for the next patient. That’s the way radiology is headed.” In breast imaging, for example, use of the ACR BI-RADS® Atlas has standardized terminology. Schnall says, “Everybody knows what a 2 means; everybody knows what a 3 means.
I can mine data from a mammography system and know a lot about what they mean for the next patient. I can look at callbacks, and if I buy a mammography machine, I can tell, right away, the effect that it has. Do cancers and biopsies increase? I can’t say that when I buy a new MRI system. I have no standard way to represent that.”

RSNA Advocacy

The RSNA is promoting structured reporting: Named and numbered exams have a specified reporting protocol that is available at www.radreport.org. “The RSNA’s structured-reporting initiative,” Langlotz says, “gives radiologists the tools they need to begin standardizing their reports. More than 200 standardized report templates are available. The templates are based on the RadLex standard nomenclature. To make it possible for radiologists to download the templates directly into their reporting systems, we have created an Integrating the Healthcare Enterprise technical standard for radiology-report templates.” He continues, “We want to give radiologists the tools to smooth their transition to more standardized reporting. The key is to retain or improve the efficiency of radiologists. Standard reports can automatically incorporate information from the imaging modality. Information such as ultrasound measurement, radiation dose, or contrast data need not be redictated, reducing errors and saving time.” National programs such as meaningful use and the Physician Quality Reporting System have also provided “strong incentives for structured reporting,” Langlotz says. “More and more practices are moving to structured reporting. In our practice, we are standardizing ICU chest reports, reporting of abdominal masses affecting solid organs, and PET/CT studies of cancer. Those practices aggressively moving toward structured reporting are finding technical limitations in the reporting systems they use.
One of the goals of the RSNA structured-reporting initiative is to remove those barriers—to create a smooth migration path from narrative reports to structured reports.”

Quantitative Imaging

At ACRIN, Schnall is investigating quantitative imaging, another element of standardization, by developing biomarkers and imaging indicators to describe and treat cancers. This is being done through clinical trials. “One of the great hopes for imaging, going forward, is that it will be a major source of information and guidance for personal therapy,” Schnall says. “There are a lot of things we need to make quantitative in order to predict the outcome of personalized therapy reliably. There are any number of imaging biomarkers to standardize and assess through multicenter trials.” He adds, “We look at our role as taking promising technology and figuring out how to deploy it across multiple institutions, getting equivalent data, and seeing if that technology does have value in one or another diagnostic setting—taking it all the way from proof in principle to multicenter feasibility studies to clinical validation.” Schnall continues, “We’ve got any number of techniques, at various levels of validation, when it comes to quantitative imaging. The technique that is probably furthest along is a version of dynamic-contrast MRI that we use in breast cancer cases in which the cancer will be taken out, but pretreated with chemotherapy to shrink it first. These are larger breast cancers. We have validated the methodology; we’ve shown that this can be measured consistently. We’ve shown that it can predict the pathology, and we have data that we’ll soon be publishing to show that it may predict long-term patient outcomes.” In the long term, Schnall says, radiologists will convert to structured reporting—and on top of that, to quantitative reporting. Software vendors have offered decision-support systems for use by ordering clinicians for years.
As Mancuso reports, vendors haven’t stopped there. They are now offering voice-activated reporting systems that automatically structure radiology reports and that use natural-language identifiers to make mining data from the reports feasible. Natural-language data mining (which allows patient outcomes to be tracked and best-practice imaging tests to be identified and refined further) is possibly the capstone of standardization, as it is now envisioned. Langlotz says, “No matter how much we focus on standard nomenclature and structured reporting, it’s important to remember that much of the report will remain in narrative form, so access to automated natural-language understanding and text-based search tools is critically important.” Ultimately, Mancuso adds, systems providing everything from decision support to structured final reports will have autocorrect elements. “The next thing will be to develop active management systems,” he says, “to self-correct these things on the fly: to say (to providers or to systems in general), ‘The pathway you’re taking is not efficient, and it is costly. Explain, or go back to this approach that seems to work.’”

Adoption by Hospital Systems

Hospital radiology administrators are certainly aware of standardization, structured reporting, and decision support. They contend daily not only with trying to make imaging itself more efficient, but with improving their patient-handling processes as well. Intermountain Healthcare (Salt Lake City, Utah) is a nonprofit system of 22 hospitals, a medical group, and nearly 200 physician clinics, spread over Utah and Southern Idaho. Intermountain Healthcare operates its own health-insurance company. David Monaghan, MHA, is vice president for imaging services at Intermountain Healthcare. He says that the system has more than a dozen employed radiologists, but most of the radiologists it relies on, in six different groups, interpret studies under contract.
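The natural-language mining of narrative reports that Langlotz describes can range from simple pattern matching to full natural-language understanding. As a toy illustration only (the sample report text and the pattern are invented; this is not any vendor’s system), a few lines of Python can already pull measurements out of free text:

```python
import re

# Invented sample of narrative report text.
report = (
    "FINDINGS: There is a 2.3 cm mass in the right kidney. "
    "A 5 mm nodule is noted in the left lower lobe."
)

# Match a number (with optional decimal part) followed by a unit (mm or cm).
MEASUREMENT = re.compile(r"(\d+(?:\.\d+)?)\s*(mm|cm)")

# Turn the free-text matches into structured (value, unit) pairs.
measurements = [(float(value), unit) for value, unit in MEASUREMENT.findall(report)]
# measurements -> [(2.3, 'cm'), (5.0, 'mm')]
```

Real systems must also handle negation, laterality, anatomy, and ambiguous phrasing, which is why the automated natural-language-understanding tools Langlotz mentions matter; the point of the sketch is only that narrative text becomes minable data once its elements are extracted into a consistent structure.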
Monaghan estimates that Intermountain Healthcare performs about 1.5 million imaging exams per year. Intermountain Healthcare already has put in place a single PACS and RIS, Monaghan says, and is now standardizing as many elements as it can of what he calls the radiology value chain. The overriding goals are “predictable and consistent outcomes and a consistent service experience,” he says, “along with data capture. We don’t look at imaging as just radiology; we’re looking at the other specialties that acquire images. There are a lot of redundancies.” Structured reporting is on the wish list and is being promoted, he says. “We’ve seen the future, and we like it. That can be extremely difficult to implement, though, when you are involving different radiology groups. Reporting is still considered the art of the individual physician.” To bring along reluctant radiologists, Intermountain Healthcare is focusing on shared accountability, using a guidance-council approach. Monaghan says, “We are moving toward risk-based population care, so standardization is needed: an integrated delivery model with all the radiologists—and the benefits of consistent outcomes.” What physicians find most convincing, he adds, is having data, so Intermountain Healthcare is focusing there to convince radiologists to do things its way. “It’s almost always having data that sells,” he says. For example, Intermountain Healthcare did a children’s hospital study to determine the lowest radiation level for pediatric head CT exams that would result in an image that could be interpreted, Monaghan says. Having found that safety standard, Intermountain Healthcare now requires its use throughout the network. Monaghan adds, “That one doesn’t have an opt-out feature.”

Immediately Classic Research

Intermountain Healthcare handles equipment service in-house, Monaghan notes.
Even so, it is looking at partner vendors, by modality, to assess whether their service offerings could reduce cost and improve efficiency. “We want select partners in select modalities,” Monaghan says. “We look at that as a standardization gain as well.” Intermountain Healthcare recently moved its imaging appointment schedulers into patient account services so that time could be saved by asking patients about clinical care and billing in a single phone interview. Monaghan says, “That all focuses on patient engagement.” Intermountain Healthcare also is standardizing the use of existing technology through its IT support teams. “We have one RIS, but people were using it differently in the way that they would schedule which process to do first,” he explains. “That can result in different outcomes, and the time stamps can be different. The patient might be in the waiting room for an hour, drinking contrast, but that wouldn’t show up; we are trying to get the processes and the technologists in lockstep.” Standardization involves a great deal of what Monaghan calls “immediately classic research: As soon as we do it, and it works, we start to implement it,” he says. There also are the systemic things that radiology can’t control, but among the areas where it would like to see changes, Monaghan says, the first is access to images from the electronic medical record. “That’s where we start to mess with somebody else’s cheese, and radiology isn’t the highest priority,” Monaghan says.

The Halo Effect

Jim Sapienza, FACHE, MA, MHA, MBA, is systemwide administrator of imaging services for MultiCare Health System (Tacoma, Washington), a not-for-profit provider that operates five hospitals, six emergency departments, and 20 outpatient centers in four Western Washington counties. MultiCare Health System contracts with three radiology groups (about 110 radiologists) to interpret close to a million exams per year, he says.
Sapienza defines four areas of standardization that MultiCare Health System is pursuing: workflow, equipment, protocols and processes, and outcomes (or measurements tied to outcomes). “Ultimately, all this work strives to improve quality control in imaging—to raise the bar,” he says. “Higher quality reduces the cost of care.” A major standardization initiative, he says, has been reducing turnaround time in getting radiology reports back to emergency-department physicians. “We wanted leading indicators, rather than lagging indicators,” he says. “We figured that if radiologists set up workflows to focus on emergency-department turnaround time, they would eventually get around to improving all turnaround times.” This turned out to be the case. Emergency-department turnaround time was reduced to around 20 minutes, and other turnaround times shrank as well. “A study might once have had a week or two as its turnaround time, but now, a nonurgent MRI exam has a turnaround time of two hours or less,” Sapienza says. “They set up workflows. The focus was on emergency-department turnaround time, but that had a halo effect on all turnaround times.” Turnaround-time standardization led to workflow, protocol/process, and outcomes improvements, Sapienza notes, so three of his four areas of standardization were improved by attention to one systemic problem. “Lean training says that without standards, there can be no improvement,” he adds. Another step that MultiCare Health System has taken is the use of reporting templates to reduce the need for editor/transcriptionists. “In 2012, one of our six indicators was the number of self-edits,” Sapienza says. “Some radiologists had been self-editing for years (and finalizing reports themselves). One of the things we did to get to 100% self-editing was to clean up the template and make sure it was a structured exam.” He adds, “We were 90% self-edited, and we gave notice that we wanted to be at 100% in 18 months. 
In seven months, the radiologists went to 99.8%. Now, on average, we might have 10 reports a week that aren’t self-edited. Some ultrasound reports can be daunting because of the data points that need to be entered. They might send those to the editors to make sure that they get checked.”

Standardization’s Downside

There is some fear that standardization will halt advances in medicine (and radiology, in particular) in their tracks. Innovation has revolutionized radiology; nobody wants to think that imposing standards, as necessary as it might be, will halt that. “We don’t want to standardize everything,” Monticciolo says. “We don’t want to do fast-food service; we don’t want cookie-cutter care, but we do want to standardize. We still need individual care.” Schnall notes that modality vendors have always competed based on innovation and new technology more than on price, and standardization might “remove some flexibility,” he says, making it harder for manufacturers to compete. “We don’t want to take away creativity. We don’t want to inhibit the ability to develop new methods. We don’t want to undermine that,” Schnall says. Mancuso also raises the issue. “Medicine is complex, so we don’t want to eliminate creativity or the exercise of wisdom. We don’t want to remove the human element or judgment entirely from medicine. People should be accountable for their best output, but should not feel stifled,” he says. Mancuso seems confident that standardization will be seen as more innovation, not as an impediment to it. “In the past 10 years, the ability to standardize has really set itself out there,” he says. “Productive human beings want to do better. It clearly is a stimulant.”

George Wiley is a contributing writer for Radiology Business Journal.