As medicine moves from fee-for-service reimbursement into newer models that aim to reward quality of care over quantity, the issue of what quality looks like in radiology and, even more importantly, what it is worth in contract negotiations is becoming a critical one. Hospitals and healthcare systems facing their own issues with quantifying quality of care for newer outcome-based reimbursement models are starting to look for similar data from the groups they contract with.
Yet, even as interest in measurable quality of care grows, it is challenging to make a case that radiology-specific quality measures should be financially rewarded if they are high, or conversely, penalized if they are low. The connection between fast, accurate diagnostic imaging results and superior patient outcomes is much harder to make than the straightforward primary care correlation between adherence to preventive care and disease management guidelines and healthier patients at lower care costs.
Not surprisingly given this reality, much of the Physician Quality Reporting System (PQRS) data that radiology groups or their hospitals must now report to the Centers for Medicare and Medicaid Services (CMS) consists of measures related to imaging utilization, dose management, or reporting processes (see sidebar). The ultimate quality measure—the contribution of the radiology practice, center, or department to the best possible outcome for the patient—is not as easy to define in a measurable way. Some practices, however, are trying to define and demonstrate quality in ways a bit broader than just the PQRS measures, accreditation, or rankings on patient satisfaction surveys (see sidebar).
Changing the conversation
“We do Press Ganey [the patient satisfaction survey], survey our referring physicians, and track key clinical values, but we need to do a better job of defining what quality is in radiology,” says Alicia Vasquez, president of California Medical Business Services, LLC, which manages Radiant Imaging and its two divisions, The Hill Medical Corporation in Pasadena and Glendora, California, and Arcadia Radiology Medical Group in Arcadia, California. “[On Press Ganey surveys], patients will rank imaging services on whether the center provided free parking at the facility or how quickly the patient could schedule an appointment, not the quality of the physician who read the study. Was he/she subspecialized? Was he/she fellowship trained? Was it done on a scanner that has met ACR criteria?”
The 26-radiologist practice Radiant Imaging and its affiliates are part of Strategic Radiology, a consortium of radiology practices across the country that collaborate on issues such as quality and share best practices through an Agency for Healthcare Research and Quality-designated patient safety organization (PSO). Vasquez applauds Strategic Radiology for deepening the investigation into what constitutes quality in radiology.
As a Strategic Radiology member practice, Radiant Imaging can address quality in a broader, more meaningful way that encompasses all elements of the practice’s quality profile, including clinical quality, image quality, operational quality, patient experience, physician quality, risk management, and compliance, in partnership with other Strategic members and hospitals. “We have to have a singular voice, because if everyone speaks about quality in a different language, we are not going to get the message across,” Vasquez says. “Strategic Radiology has tried very hard to help define what a high-quality radiologist is.”
However, as of yet, Radiant Imaging has not had any national payors provide additional reimbursement for providing measurably higher quality imaging services, nor has Radiant Imaging received additional compensation for professional services for meeting performance targets. The flip side of that, of course, is that Radiant Imaging does not bear any financial risk for not meeting certain quality metrics, but Vasquez knows it is coming.
“[The major national private payors] just don’t have a methodology or mechanism in place to track quality and dovetail that into reimbursement rates,” she says. “The payors have mechanisms to track utilization and other metrics, but they don’t have a manner to track quality and that is unfortunate. Most imaging providers will say the same thing, that they have the best quality. The tough part is to try to measure and articulate that back to the payors.”
What practices are doing
Radiant Imaging is not alone in preparing for a future where quality will figure more concretely in contract negotiations with a direct link to compensation. Another practice that has been active in seeking to define quality in measurable terms, and, yes, be compensated for value-added work such as contributing to physician leadership at a partner hospital or meeting specific performance targets such as turnaround time is Radiology, Inc., in Elkhart, Ind.
“The one thing that is new now is that you are not only going to have to say that you produce quality work, you are going to have to prove it,” Samir B. Patel, MD, of Radiology, Inc., states. “You have to be able to define it and measure it for your clients.”
In an effort to make visible all of the value-added work the radiologists of Radiology, Inc., do for their hospital partners, Patel and his colleagues developed a value-add matrix, and quality is one of the matrix’s four major components of value. As they define it, quality includes concepts of quality assurance and quality control. These are grouped in 10 different categories:
- Items related to accreditation
- Adverse event or root cause analysis
- Conferences as opportunities for multidisciplinary interaction
- Peer review
- Physician quality reporting as a measure of what the practice is doing to meet national quality reporting standards such as PQRS or items related to demonstrating “Meaningful Use” of electronic health record (EHR) technology
- Protocol management to ensure quality protocols based on national standards such as the ACR practice guidelines
- Radiation dose management
- Radiology-pathology correlation
- Structured reporting that incorporates national standards such as PQRS measures and ACR guidelines
- Technologist and staff feedback
According to Patel, the concept of quality has had different meanings in the literature, but their clients are now looking for radiology groups that can attach concrete numbers to the idea of quality. For example, when negotiating with hospital administrators, Radiology, Inc., can now say that in 2013, across all the sites that it services, the practice was able to earn 104 national accreditations and designations from outside evaluators for doing quality work. It also participated in five CMS PQRS initiatives, work not all physicians do. Its radiologists had 1,900 meaningful use-documented face-to-face patient interactions, defying the stereotype that radiology is just a commodity and radiologists don’t deliver direct patient care. Furthermore, the practice peer reviewed more than 5,600 exams, and in over 99% of cases the peer reviewer agreed with the original assessment. In addition, in radiation dose reduction, Radiology, Inc., could show that in just one initiative for CT lung biopsy, it reduced the radiation dose by 63%.
Hospitals have an open ear
While payors may not be ready to pay more for demonstrably higher quality, putting numbers on quality the way Radiology, Inc., is doing is becoming more important in maintaining a secure relationship with hospital partners, Thomas W. Greeson, JD, a partner at law firm ReedSmith’s Falls Church, Virginia, office, agrees. Greeson helps many practices negotiate contracts and he believes that quantifying quality is a way to speak the hospital administrator’s language.
“Hospitals are increasingly looking to contract with radiology groups that can demonstrate that they bring added value to the business relationship, and quality can be a centerpiece of these types of discussions,” Greeson says.
Lisa Mead, RN, MS, CPHQ, Strategic Radiology’s director of Quality and Patient Safety, agrees. “Hospital executives often are removed from the radiology relationship until contract negotiation time, and they rely on the radiology hospital director’s perspective of the ‘quality’ of the radiology services,” she wrote via email. “In most cases, it is the perception of the latest interaction with a radiologist that is top of mind. Radiology practices have an opportunity to keep the attention of the hospital executives by developing a relationship through hospital committee participation and through presentations at medical executive or board meetings to bring attention to the quality care and services provided through the radiology contract.”
To go above and beyond the historic measures of quality in radiology (turnaround time, credentialing and coverage), Mead recommends groups include data around the following metrics:
- sub-specialist availability;
- peer review activities;
- MQSA data;
- patient satisfaction survey results;
- 360 feedback programs;
- referring physician satisfaction survey results;
- participation in committees;
- process improvement project results that focus on efficiency and effectiveness; and
- coordination efforts with the hospital director on any hospital specific program requirements.
However, quality as it is defined in a contract like a professional services agreement is generally limited to concrete deliverables. “Usually, it is in the form of agreed upon performance measures, and the classic performance measure is usually turnaround time and assuring that radiologists are available on site and on call at times that the hospital and the group determine to be appropriate,” Greeson says. “But they also look at how the appropriate stakeholders perceive the quality of the radiology service, so they will agree to regular ongoing performance measures by use of surveys such as patient-satisfaction and physician-satisfaction surveys.”
Baking the other quality measures that may impress a hospital administrator into a contract is tricky, Greeson says, because groups need to be careful that what they agree to deliver is indeed something they have control over. For example, unless a turnaround-time measure starts the clock once the radiologists have the opportunity to review the imaging data before them and provide a high-quality interpretation service, the group may be unable to meet that measure due to factors the hospital controls, such as when the data is made available on the PACS, whether prior studies can be made available, and whether the system stays up during the turnaround time window.
Patel and Radiology, Inc., have an agreement with one hospital client where Radiology, Inc., may earn extra non-RVU-linked income by meeting certain turnaround time targets, and they deal with the technology issue by building a partnership into their professional-service agreement. For example, the hospital agrees to pay for the technology, keep it up and running, and make troubleshooting IT staff available to Radiology, Inc.
Likewise, Radiology, Inc., cannot control what ordering physicians do, but they can agree to track what was ordered and do analysis based on ACR appropriateness criteria to see how many of a particular type of study was ordered appropriately and then provide that data back to the hospital. “We present that information to the hospital and say, ‘Look, while we can’t ratchet this down, we can tell you that what they are ordering is appropriate,’” Patel explains.
In one recent case, Radiology Inc. worked with a hospital partner that was concerned its hospitalists were ordering too many inpatient MRIs, and they wanted to reduce that number. However, in comparing the orders to the ACR appropriateness criteria, it turned out that 100% of the studies ordered met the appropriateness criteria and furthermore, there were acute or actionable findings in half of the studies that were important in the patient’s management.
“[The hospital administrators] now understand that the radiologists are taking an active approach toward utilization management and making sure the studies that are ordered are appropriate, and if there is another less costly study that would be appropriate, we can divert that study,” Patel says.
Paying for professionalism
William Keyes, MD, a neuroradiologist at Inland Imaging in Spokane, Washington, and medical director of quality and safety for Strategic Radiology, believes this aspect of quality—whereby groups become true partners with hospitals and payors in making sure patients receive the right scan at the right time for the right reasons—may ultimately be how quality is defined in contracting terms. Hospitals are being challenged to control care costs while simultaneously improving care and they need the groups they contract with to help them do this because radiologists are critical to the success of electronic decision-support and utilization control.
“This is a big change because in the past radiology was more interested in increasing, not decreasing, volume,” Keyes says.
This cost containment portion of quality radiology service is something Keyes and Mead think someday may be paid for just as other pay-for-performance measures are once payors can appraise the value of this service from the standpoint of contracting. “From a payor’s perspective, while it is critical to bring up quality and value in the contracting process, we still have to build this awareness and show economic savings to capture this audience’s attention,” Mead says.
The current healthcare delivery system in the United States still works on a fee-for-service basis in nearly all scenarios and is only beginning to switch to paying for value and cost-control efforts, adds Greeson. “We are still in a fee-for-service environment with Medicare and third-party payors,” he explains. “When we move to a more bundled payment system for providing services, there may indeed be a new emphasis placed on making sure that those providing services as part of that bundled product are achieving quality goals.”
However, the other and perhaps equally critical measure of quality (besides cost containment) is professional quality, and that is not something that can as easily be paired with a direct value.
Currently, Keyes says, he is not aware of many radiology practices getting paid extra for the hours they devote to educating referring physicians and serving on hospital committees and tumor boards, nor does he think they should be. “That is the responsibility of the radiologist as a physician,” he says.
On the other hand, with continued downward pressure on radiologist RVU-linked income, the need to optimize non-RVU-linked income is very real. Patel and Radiology, Inc., have begun earning incentive payments for medical directorship agreements that are tied to meeting a certain minimum number of hours of service, broken down by line items, such as hours spent on conference, hours spent on peer review, and hours spent on committee work. Radiology, Inc., also earns incentive payments from one hospital for controlling recommendations for additional exams, Patel says.
“In 2013, we did over 9,000 hours of work across our sites that does not have an RVU associated with it, so I certainly hope there will someday be more direct compensation for these value-added activities,” Patel says.
Quality and radiologist compensation
Although Radiology Inc. tracks hours devoted by its radiologists to value-added activities, including all that goes into providing quality service for their contracting partners as well as conducting comprehensive internal performance reviews, so far partner compensation has not been linked to specific quality metrics or hours of value-added service, Patel says. Part of the reason for this is that the value-added matrix is still very new. The other part of the reason is that the group is finding that something as blunt as linking compensation to performance is not really needed when it comes to highly trained professionals like radiologists.
When the matrix was rolled out internally, it was done in a very transparent fashion, and all radiologists were able to see in twice-yearly reports what their colleagues were contributing in terms of hours of value-added service. This may have led to some natural competitive instincts kicking in: For the first six months of 2014, the average amount of time contributed by the radiologists increased 18% when compared to the same time period during the previous year, Patel says.
“Unless you are actually measuring how much time people are investing, it is difficult to ask people to give more,” Patel says. He adds that although he knows of some groups that do link income to quality, it is not something Radiology, Inc., has needed to do.
Keyes says he has not seen any practices that are linking individual radiologist compensation to quality metrics, yet. “That is not to say peer-review accuracy and performance of the radiologist is ignored by any means,” he explains. “It just has not been fully developed to the point of taking action based on the metrics, and that is only fair because you need to develop the metrics from an acceptable standpoint within the radiology group before you take action based on those metrics.”
Vasquez points out that quality is already an intense component of radiologist performance reviews, but determining a fair way to tie compensation to quality is not easy because there is not a single metric that can be held up as the absolute end-all definition of quality. Even peer review and accuracy have their limitations.
“Quality is more than getting the chest X-ray right,” she says. “It is clinical competency plus keeping their CMEs up; customer service to the staff, referring physicians and patients; practice building; and all of these things that make for a great well-rounded radiologist.”
The rub with quality in contracting
Some radiology services undoubtedly are in a better position to deliver on quality commitments than others. “We are not afraid of having to measure quality, because we think we can,” Vasquez says. At the same time, she and Keyes are both astutely aware of how quality as a component in contracting can change the quality conversation from one of industry-wide collaboration to drive improved outcomes to a competition in which there may be winners and losers. One issue they both identified is how quality as it is currently defined could pose a challenge to smaller groups without the resources to offer subspecialty coverage, medical staff participation, utilization management, and all of the other things hospitals—and perhaps someday payors—will look for from radiology practices.
“There are some small community hospitals where having a general radiologist is fine,” Vasquez says. “It would be wonderful if every patient could have subspecialty reads because it may lead to better patient outcomes, but at the same time, a small or rural hospital may not require onsite subspecialized care the same way a Level II trauma center that routinely images a large amount of trauma cases does.”
Greeson says he sees this in the contract negotiations he is part of and it is, in his view, one of the many factors pushing the trend of smaller practices either merging with larger groups or joining together in other, looser collaborative arrangements that allow sharing of data and pooling of resources.
“Specialists in the hospitals are telling their hospital administrators that they want access to subspecialty radiologists to make the service to their patients better,” Greeson says. “That is a quality piece that is easier for some groups to deliver than others.”
In addition, Greeson says, there are some basic legal concerns with quality entering the contracting discussion. For example, if presenting peer-review data, one needs to do it in such a way that it will not invite malpractice scrutiny in the situations where the peer reviewer does not agree with the original assessment. That entails big data sets where it becomes impossible to tease out individual cases in which a radiologist’s original assessment may have been wrong. Bigger practices and teleradiology groups naturally have bigger data sets, so that may be another competitive advantage linked to size.
A related concern is that quality claims brought into contracting in a competitive marketplace cannot be of a nature where they could rise to the legal standard of libeling, slandering, or otherwise interfering with another practice in that same marketplace. Despite the cautions, however, Greeson thinks having quality measures be a part of contracting is a good trend overall because it corrects the idea of radiology as a commodity. “I think it is a good thing that radiology groups understand that they need to be able to demonstrate their value,” he says.
“Looking at yourself in the mirror is healthy. It drives you to be better,” Vasquez adds. “The problem is that not all groups are as engaged to do that. As more and more groups begin the quality conversation, it will raise the level of the water for everyone.”