The Quality Challenge
Defining quality in radiology seems simple: It’s an accurate diagnosis or interpretation, provided in a timely manner, in a clear and easy-to-follow report. The devil is always in the details, though. How do you measure accuracy? Are you consistently measuring your patient’s or referring physician’s experience? What’s a good baseline? How are payors defining quality, and more important, how will they structure reimbursement plans around it? What resources will the practice have to devote to measuring quality and the revised processes that come out of it? How do you determine the return on investment in quality?

Our practice is one of the many radiology practices facing these challenges. Charlotte Radiology, PA, is a private radiology practice in Charlotte, North Carolina, comprising more than 80 physicians. We, like many radiology practices, recognized the importance of quality long ago, but are now taking the necessary steps to define, measure, and communicate quality to our key audiences. Arl Van Moore Jr, MD, president of Charlotte Radiology, says, “We have always felt that we provided a valuable, high-quality service. Now, we have to prove it.”

Charlotte Radiology started its quality initiative 10 years ago, by creating a quality committee that broke down the topic into several critical areas; the committee is working to set benchmarks and strategies for measuring our future success. It is composed of a few key staff members and radiologists representing different imaging modalities, and its job is to ensure that all areas affecting quality outcomes are addressed. Linda Cox, director of quality improvement and risk management for Charlotte Radiology, has been with the practice for 13 years and has worked in quality management for almost 20 years. She says, “Anyone hoping for an easy solution is in for a surprise. The more we looked at how quality is affected by various parts of our practice, the more we realized it’s a part of everyone’s job.
We are working with all departments, in one way or another, as their work relates to quality. Once we have thoroughly outlined how we are addressing, reporting, and measuring quality, we’ll develop a communication strategy to share our benchmarks and successes with our key customers.”

Assessing Clinical Quality

Charlotte Radiology’s physicians have a strong commitment to ensuring the accuracy of their interpretations. They embraced subspecialty radiology early on, allowing for a more focused approach to care. In addition, our technologists are certified in their imaging modalities. “Even with a great team, though, we know we aren’t immune to errors,” Cox says. “We have always looked at the accuracy of reports and analyzed where we can improve, but today, we have multiple tools at our fingertips to help us measure our success and benchmark ourselves against others.”

One of those tools is the ACR® RADPEER™ program. ACR interpretation guidelines state that comparison with older studies should always be done when reading an examination. The RADPEER program allows comparative peer review of the older examination to be performed along with the current study; the results are sent to the ACR for benchmarking and comparison purposes. Charlotte Radiology has actively endorsed this peer-review program for more than three years.

“Currently, we use the data to look for areas and physicians needing improvement,” Cox says. “Sometimes, however, you find that what’s broken isn’t the physician, but rather, a process. RADPEER has helped us to identify several different departmental processes that needed revision, as well as to note an occasional system-hardware issue that needed adjustment.”

One process-improvement opportunity that the practice recognized was the need for the radiologist to identify readily those situations in which timely communication of an unexpected (but not critical) finding could alter the patient’s surgical or medical management.
We made patients’ histories more accessible to the radiologists while they are reading studies, and then we set up a process for notifying referring physicians, by phone, of unexpected findings. “Identifying such problems and developing appropriate solutions have helped improve the timeliness and efficiency of our radiologist workflow, thus improving service to clinicians and care rendered to patients,” Cox explains.

Charlotte Radiology quickly realized the benefits of RADPEER, but had to find a solution to support its infrastructure demands. James Oliver, MD, body radiologist and quality-committee chair for Charlotte Radiology, says, “We considered our options and installed a software program, designed to measure and benchmark radiologists’ work, that also met the requirements for RADPEER.”

Charlotte Radiology voluntarily participated in the ACR’s General Radiology Improvement Database (GRID) program as a pilot group in 2008. The GRID program sets quality benchmarks by collecting key performance indicator (KPI) information from groups across the country. Some examples of collected KPIs are patients’ waiting times by modality, total exam-turnaround times, and time needed for dictated reports to be completed. Other GRID data look at various outcome measures, such as lung-biopsy complications, contrast extravasations, deaths in a radiology department, and patient falls, to name a few.

“Participation in this program added manpower hours to our staff time, as some of the required data must be collected from a manual process,” Cox notes, “but Charlotte Radiology feels it is worth the resources because we recognize the need for uniform data-collection efforts across radiology groups to achieve data results that can be comparable.
The data collected, we hope, will assist radiology groups in establishing reasonable and acceptable benchmarks for their own practices.”

Business-assessment tools such as KPIs are not altogether new to health care; they were adopted as more hospitals and health systems began approaching their industry in a more businesslike style. Private physician groups such as Charlotte Radiology, however, are just starting to look at these tools as ways to help them define and measure their quality indicators. “KPIs were a somewhat new approach for us,” Oliver comments. “We have always paid attention to them in one way or another, but this was a different, more strategic approach than we were used to taking.”

The Quality Crucible

While software programs can tabulate data on outcomes and KPIs, one quality indicator that still calls for a personal approach is customer service. Radiology practices used to compete based on technology. Today, competition is service driven, and service is a major part of quality. Our marketing and practice-relations team at Charlotte Radiology addresses customer service alongside the operations and clinical teams—daily. As Charlotte Radiology has nearly 350 employees, the practice’s leaders recognize that they have regular opportunities to make or break customers’ expectations.

Addressing the most obvious customers first, Charlotte Radiology uses the products of an external company to measure patient satisfaction. Patients at all of our freestanding centers are invited to participate in an emailed survey. Currently, we consistently receive satisfaction ratings above 90%, with a 30% to 40% response rate (higher than the typical survey-response rate of 5% to 10%). To achieve such a high response rate and consistently positive ratings, we brought the operations team into the process and removed the marketing team from it.
All too often, responsibility for patient satisfaction falls on the marketing team, but in reality, it’s the operations team that makes patient satisfaction happen. Its members are the ones working with patients in clinical settings; they are the ones who can make the difference. The operations team took over the survey process and patient follow-up; the clinical teams not only embraced the process, but found new ways to improve the care they offer.

Mark Farmer, director of operations, says, “Because we are getting regular feedback, our teams are consistently assessing results and finding opportunities to improve. We involved employees from each of our sites, and they are using the surveys as a way to take ownership of their centers; because they want the feedback, they are encouraging patients to fill out the surveys. It’s been a great approach for us.” Marketing is still involved in the process, but in the area where it should be: communicating our positive scores to the public.

A more challenging customer to survey is the referring physician, and this is an objective that the practice is approaching one step at a time. We perform our own surveys and assess the data collected by our hospital’s physician surveys as well, but Charlotte Radiology’s challenges are reflected throughout the industry: Rapid growth within our own practice and the growth of the referring community, in the past few years, have had an impact on our referring-physician relationships. The practice’s physicians don’t know the referring physicians as well as they did 15—or even 5—years ago. We are aware that satisfaction is more than getting a fast report; it’s having access to a trusted source when you have a question about that report. Currently, we are working with a practice-relations committee that consists of marketing and public relations, along with a handful of key radiologists.
The committee is developing a service-first initiative both to measure and to improve referring-physician satisfaction and relationships. We address everything from broken processes for call-back reports to helping radiologists end phone calls in a more positive manner. Some of our efforts seem so simple, but often, those are the ones that can make the biggest impact on satisfaction scores.

Imaging Safety

Nothing is more important than ensuring a patient’s safety. ACR accreditations, Mammography Quality Standards Act compliance, FDA certifications, and meeting center-of-excellence standards are just a few of the extra steps taken in the past by Charlotte Radiology and other radiology practices and hospitals. Today, with radiology under the magnifying glass, practices are taking safety to new levels.

Doug Sheafor, MD, is a body radiologist and one of several radiologists serving on Charlotte Radiology’s radiation-safety committee. He says, “A good outcome of the radiation-exposure stories in the news is that both physicians and patients are starting to ask good questions. Even before the radiation scare, our practice engaged in low-dose imaging protocols and worked with area physicians to provide ordering guidelines.”

From participating in the Image Gently campaign to observing protocols to keep dose as low as reasonably achievable to using breast shields, Charlotte Radiology’s physicians are looking for ways to protect their patients from unnecessary radiation exposure. Limiting radiation can be challenging, however, and it sometimes requires more effort from the radiologist when reading the study. “We’ve carefully managed the protocol changes to ensure image quality was maintained; it’s a delicate balance,” Sheafor says. Charlotte Radiology works closely with its key specialty referring physicians to ensure their satisfaction with the revised studies.
It also is exploring a CME lunch-and-learn program for local referring physicians’ offices that are getting some questions from their patients about radiation exposure. As awareness in the community increased, we took a more proactive approach to communicating about our radiation-safety efforts; we want to be seen as the experts on this topic. We have used resources provided by the ACR to help educate our community’s physicians and patients about safe imaging. We developed fliers for physicians’ offices and patients outlining what steps we have taken and the ACR’s list of questions to ask before being scanned. In addition, we developed a section on the topic for our website (www.cr-radiationsafety.com).

Limiting unnecessary radiation extends to ensuring the appropriateness of orders. Charlotte Radiology works with referring offices to ensure that appropriate studies are ordered. “We have a preservices team that checks orders to ensure preauthorizations are in place and accurate,” Farmer says. “In addition, our technologists review orders prior to scanning to ensure the right study has been ordered based on the patients’ diagnosis codes. If there is an error, they work with our radiologists and the referring physician to get the order revised.”

Preventive Care

The benefits of preventive medicine are widely acknowledged, and health-care reform has mandated it as a part of providing quality care. Primary-care physicians will be tasked with ensuring that their patients are following preventive-care protocols, including obtaining radiology exams such as screening mammography. With 12 breast-imaging centers, Charlotte Radiology has placed a major emphasis on mammography for years. Major educational advertising campaigns, comprehensive reminder programs, and community- and physician-outreach resources are a few of the key elements that have helped Charlotte Radiology maintain one of the largest screening-mammography programs in the country.
The added focus from health-care reform has prompted our practice to look at how we can increase our compliance and provide better outcomes data to our referring-physician community. “Our practice is targeting noncompliant patients and assessing better ways to track and report patient-compliance data to our referring physicians,” Farmer says.
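The compliance tracking Farmer describes is, at its core, a matter of comparing each patient’s last screening date against a recommended interval. As a minimal illustrative sketch only (not Charlotte Radiology’s actual system; the function name, patient IDs, and interval are all hypothetical), it might look like this:

```python
from datetime import date, timedelta

# Hypothetical policy: annual screening mammography, with a short grace period.
SCREENING_INTERVAL = timedelta(days=365)
GRACE_PERIOD = timedelta(days=60)

def overdue_patients(last_screenings, today):
    """Return the IDs of patients whose most recent screening exam is
    older than the interval plus grace period, or who were never screened."""
    cutoff = today - (SCREENING_INTERVAL + GRACE_PERIOD)
    return sorted(
        patient_id
        for patient_id, last_exam in last_screenings.items()
        if last_exam is None or last_exam < cutoff
    )

# Example data: one current patient, one overdue, one never screened.
records = {
    "P001": date(2009, 1, 15),   # overdue
    "P002": date(2010, 5, 1),    # current
    "P003": None,                # never screened
}
print(overdue_patients(records, today=date(2010, 7, 1)))  # ['P001', 'P003']
```

In practice, a list like this would feed the kind of reminder programs and referring-physician compliance reports the article mentions, with the data pulled from the RIS rather than a hand-built dictionary.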
The lobby of the Charlotte Breast Center—University, Charlotte, North Carolina.

Five years ago, it was a major competitive advantage if you could turn around a report in 24 hours. Today, the expectation is that it will be ready in less than two hours. Turnaround times, easy-to-follow reports, results-delivery methods, and satisfaction are all factoring into the quality of radiology reporting. Voice recognition, RIS, and PACS have all improved radiology reporting. Practices can track their turnaround times and results in their RIS, and can even report turnaround-time data to referring offices as a value-added service.

Satisfying different customer segments is where things get tricky. Mike Sanchez, director of IT, explains, “Some of our referring physicians want their studies via fax, others want them online, and still others want them to appear in their electronic medical record (EMR) automatically.” As more and more practices go paperless and health-care records go online, however, radiology practices will rely on their IT teams to ensure that a solid infrastructure is in place to handle the HL7 connections needed to link them to multiple EMRs. “With a group our size, and serving so many locations, development of our IT infrastructure is a slower process,” Sanchez says. “We have made major headway in the past 18 months, but the challenge is that no matter how much progress you make, the technology is a moving target, full of improvements and new applications; it’s a constant effort to keep up with it.”

Pay for Performance

In 2007, Medicare took a positive step toward ensuring that patients receive high-quality care from their health-care providers. The Physician Quality Reporting Initiative (PQRI) established a financial incentive for eligible health-care professionals to participate in a voluntary quality-reporting program.
PQRI reporting allows for a 2% bonus in Medicare reimbursements for those studies that qualify for the measures and meet dictation guidelines in 2010. The bonus is scheduled to decrease to 1.5% in 2011, however, and in each following year—until 2014, when reporting becomes mandatory for full Medicare reimbursement.

Larry McIntyre, billing manager, says, “Relatively speaking, the total potential bonus does not account for a large total dollar amount. For Charlotte Radiology, however, the larger goal is to refine the billing/dictation processes proactively over the next few years to enable us to be prepared for the future, when failure to report will result in lower Medicare reimbursement. We expect that once Medicare’s PQRI initiative becomes more of a pay-for-performance program, the various managed-care insurance companies will be close behind with their own quality-based reimbursement initiatives.”

As they would for any new process, health-care providers (including Charlotte Radiology) have faced, and will continue to face, challenges related to the PQRI program. “While we feel we are currently well positioned to participate in PQRI reporting to maximize positive clinical outcomes, radiologist processes, and current financial-incentive potential, the road to our success has not been without its bumps,” McIntyre reports. Ensuring clinical quality from our physicians was the easy part. Ensuring that more than 80 radiologists all met the necessary dictation requirements for successful PQRI reporting—and then enabling our business office to identify and code appropriately for requirements based on the dictated physician report—were both major challenges that we had to overcome.
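McIntyre’s point that the bonus is modest is easy to verify: the incentive is simply the applicable percentage applied to the Medicare allowed charges for qualifying studies. A quick sketch (the dollar figure is hypothetical, not Charlotte Radiology’s actual billings, and only the 2010 and 2011 rates stated in the article are modeled):

```python
def pqri_bonus(medicare_allowed_charges, year):
    """Estimate the PQRI incentive: 2% of Medicare allowed charges for
    qualifying studies in 2010, 1.5% in 2011, per the schedule described
    in the article. Years outside that schedule return 0 here rather
    than guessing at later rates."""
    rates = {2010: 0.020, 2011: 0.015}
    rate = rates.get(year, 0.0)
    return round(medicare_allowed_charges * rate, 2)

# Hypothetical example: $2,000,000 in qualifying Medicare charges.
print(pqri_bonus(2_000_000, 2010))  # 40000.0
print(pqri_bonus(2_000_000, 2011))  # 30000.0
```

Against a practice’s total revenue, tens of thousands of dollars is indeed small, which supports the article’s framing of PQRI as preparation for mandatory reporting rather than a revenue play.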
“Using our current billing system and coding software offered us an advantage, assisting our staff in proper identification of the necessary dictation requirements without letting it become overburdened by the addition of such a large, detailed task to our current daily processes,” McIntyre says. “We have also been successful in working with our radiologists to help standardize our dictation practices, often through the use of dictation macros that maximize their efficiency while still providing all the necessary information required in the dictated report.”

The practice understands and supports the need to provide and document high-quality clinical care in today’s health-care environment. The PQRI is only the tip of the iceberg when it comes to quality initiatives set forth by the insurance companies, but Charlotte Radiology has worked hard not only to accommodate the added workload, but to ensure that our radiologists, staff, and administration are all aware of the benefits of participating (and the future risks of not participating) in these kinds of pay-for-performance programs.

Packaging Quality

While Charlotte Radiology is already sharing some of its quality data with various audiences, from payors to patients to referring physicians, we are aware that a more strategic communication plan is needed. We are looking at how to provide consistent updates and reports to key audiences, and we are assessing which people need what information (and how they should receive it). Payors, for example, might need a regular report outlining key areas that they are measuring for payments, while information provided to patients might come in the form of an advertisement or a website (or might be used more generally as a branding tool).

These choices are not simple. How do you move forward—with moving targets in technology, varying demands from providers, and a host of requirements for reimbursement? “We do it as a team,” Moore says.
“There is no one person in our practice who can take on all the pieces of this puzzle. It’s a group effort, and it will take all of us to succeed.” Moore’s advice to other groups just jumping into this process is to engage leaders from all parts of the practice, so that they don’t reinvent the wheel. Quality is not something new, but it is something that practices now have to measure and communicate.

Katie Robbins is marketing and public-relations director, Charlotte Radiology, PA, Charlotte, North Carolina.