Best Practices: How the RBMs Score

Growth in imaging utilization has led health plans to apply prior authorization, a strategy dating to the 1980s, to advanced imaging services. RBMs have developed increasingly complex programs to reduce imaging expenses through utilization management, credentialing, channeling to lower-cost providers, and network contracting. Five competitors dominate this marketplace: American Imaging Management (AIM®), Inc (Deerfield, Illinois); CareCore National (Bluffton, South Carolina); HealthHelp® (Houston, Texas); MedSolutions (Franklin, Tennessee); and National Imaging Associates (NIA)/Magellan Health Services, Inc (Columbia, Maryland). By 2009, these companies managed utilization for almost 100 million US covered lives.¹ Today, they hope to expand into the Medicaid and Medicare programs.

RBMs have caused radiology practices to experience declines in volume and increases in administrative costs. In response, the ACR and the RBMA developed best-practice clinical, administrative, and transparency guidelines for RBMs in 2009 and updated them in 2011.² As much as 28% of the savings attributed to RBMs could represent costs shifted to providers,³ and the overall value of RBMs to imaging providers, referrers, and the medical community has been questioned (with speculation that RBMs could increase total health-care costs through denial of appropriate and necessary tests). The intended purposes of the ACR–RBMA guidelines were to provide standards for RBMs and to establish a benchmark against which RBMs could be compared. AIM was the first to recognize this potential, benchmarking itself against these standards in 2010. The RBMA then determined that each RBM should have its compliance measured and scored. To minimize bias, the RBMA hired Fulcrum Strategies (Raleigh, North Carolina), an independent, third-party, physician-practice consulting company.

Methods

The RBMA’s Payor Relations Committee (PRC) created a survey questionnaire that incorporated the ACR–RBMA guidelines and was divided into clinical, administrative, and transparency topic sections. Fulcrum Strategies and the PRC then developed a scoring system for the questionnaire, based on feedback from RBMA members and the RBMs. Points were awarded according to the assigned weight of each standard, and the weighting was determined by collecting comments from the PRC and the RBMs about the perceived importance of each question (5 to 15 points per question). This produced a scoring matrix with a maximum possible score of 300 points, with partial credit possible for each question. Fulcrum Strategies was engaged to complete the evaluation process in order to avoid any specific provider bias, promote the objectivity of the process, and ensure that one person scored every RBM (for consistency).

Fulcrum Strategies sent the RBMs the questionnaire and scoring matrix, requesting that they score themselves. The RBMs were asked to provide additional relevant information, indicating which processes or policies were dictated by payors and what objective, verifiable information could be provided to support their responses. Each RBM filled out the scoring grid and returned it to Fulcrum Strategies, which then called to clarify any unclear responses, seek additional information, and obtain supplemental support. RBMs were awarded the total possible points for questions that fully met the standards and no points for questions that did not meet the standards at all. For questions where an RBM partially met the standard, partial points were awarded; for example, if an RBM provided information showing that a standard was followed 90% of the time, it received 90% of the possible points for that question. Once the process was completed, Fulcrum Strategies assigned a score to each RBM.

In order to obtain a balanced representation (given current business exigencies and the regulatory environment), the RBMA then sent the same survey to a broad cross-section of imaging providers, including hospital-based or hospital-owned imaging centers, radiologist-owned imaging centers, and IDTFs. Each provider surveyed operates multiple imaging locations, with a considerable volume of advanced imaging procedures (in a number of imaging modalities) performed every year. Providers were instructed to score only RBMs with which they had direct interaction and only those questions with which they had direct experience. Fulcrum Strategies collected the providers’ responses, analyzed them, and tabulated the results. The scoring for each RBM represents an average of a minimum of five provider responses.
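The scoring arithmetic described above is simple to reproduce; the following minimal Python sketch (not part of the report) illustrates it, using hypothetical question weights and compliance fractions rather than the report’s actual 19-question matrix.

# Minimal sketch of the scoring scheme described above: each question
# carries a weight (5-15 points, 300 points possible across the matrix),
# full or partial credit is awarded per question, and provider-reported
# scores are averaged across at least five responses per RBM. The weights
# and compliance fractions below are hypothetical placeholders, not
# values from the report's scoring matrix.

def score_rbm(weights, compliance_fractions):
    """Award each question's weight multiplied by its compliance fraction (0.0-1.0)."""
    return sum(w * c for w, c in zip(weights, compliance_fractions))

def average_provider_score(provider_scores, minimum_responses=5):
    """Average provider-reported scores, requiring at least five responses per RBM."""
    if len(provider_scores) < minimum_responses:
        raise ValueError("each RBM score must average at least five provider responses")
    return sum(provider_scores) / len(provider_scores)

# Illustrative example: three questions weighted 15, 10, and 5 points.
weights = [15, 10, 5]
self_reported = [1.0, 0.9, 0.0]  # full credit, 90% compliance, no credit
print(score_rbm(weights, self_reported))             # 24.0 of a possible 30
print(average_provider_score([22, 20, 25, 18, 21]))  # 21.2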
Results

The table presents the combined RBM self-reported scores and the average provider-reported scores, by RBM, for each of the guidelines. RBMs generally gave themselves higher marks, while providers awarded the RBMs significantly fewer points. RBMs might have provided responses based on their policies (or what should happen), while the providers’ scores might represent reality (or what does happen): sometimes, policies are not administered as planned.

The RBMs gave themselves full credit for 11 of the 19 criteria and partial credit for seven criteria. No RBM, however, gave itself any points for sharing savings with referrers or providers (to offset some of the administrative costs imposed by preauthorization requirements). This is a point of significant divergence between what RBMs are doing and what the ACR and the RBMA believe the best practice should be. Some RBMs explained that their processes did not place significant administrative burdens or costs on providers or referrers, but each RBM eventually acknowledged that its processes add at least some additional cost and burden (and stated that these are driven principally by individual payors’ requirements). No RBM is doing anything significant to offset the additional administrative costs and burdens that its programs impose.

Provider scoring shows interesting results, compared with the RBMs’ self-reported scores, concerning whether imaging providers are allowed to obtain preauthorization. The RBMs indicating that this was allowed were not scored accordingly by providers, while those indicating that it was not allowed received points from the providers as if it were allowed. This could indicate confusion about which RBMs allow this practice (and in what situations). Concerning the role of RBMs in educating referring physicians, provider scoring was significantly below self-scoring, showing general dissatisfaction with the RBMs’ performance.
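As a concrete illustration of how the self-reported and provider-reported totals can be compared, the short sketch below (not from the report) computes the kind of relative gap and high-to-low spread cited in the conclusions that follow; all score values are hypothetical placeholders, and expressing the spread as a share of the 300-point maximum is an assumption about how those figures were derived.

# Hypothetical totals used only to illustrate the percentage comparisons;
# the actual per-RBM scores appear in the report's table.

def relative_gap(self_score, provider_score):
    """Fraction by which the provider-reported total falls below the self-reported total."""
    return (self_score - provider_score) / self_score

def compliance_spread(scores, max_points=300):
    """Difference between the highest- and lowest-scoring RBMs, as a share of the maximum."""
    return (max(scores) - min(scores)) / max_points

print(f"{relative_gap(250, 200):.0%}")                        # 20% lower than self-reported
print(f"{compliance_spread([262, 255, 252, 250, 238]):.0%}")  # 8% spread across five RBMs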
Limitations and Conclusions

Rating compliance through surveys and interviews is subjective. While the RBMA took steps to minimize this, the results probably reflect some variability. Provider scores could also vary because of payors’ specifications: utilization-management programs that were initially much the same could have payor-defined customizations that affected provider scoring. The provider scores are the averages of five observations per RBM and might have changed if more observations had been collected, but the surveyed providers represented large numbers of facilities and procedures. The study did not address clinical decision support, which is widely considered a viable alternative to RBMs for ensuring that advanced imaging is utilized correctly.

A major observation is the significant difference between how the RBMs score themselves and how providers score them. Every RBM scored about 20% lower on the provider survey than in its self-reported scores. HealthHelp follows the guidelines most closely overall. MedSolutions, AIM, and NIA constitute a middle group, with scores very close together. CareCore is the lowest-scoring RBM and the one that appears to be most at odds with the guidelines. Between the highest and lowest finishers, however, the difference in compliance is only 8% for self-scoring and 2% for provider scoring. The survey results show that providers do not differentiate greatly among RBMs; this tight grouping of responses probably reflects the competitive RBM marketplace, which has little perceived product differentiation.

In general, the RBMs agree with most of the best practices developed by the ACR and the RBMA. Standards for consistency of program deployment, self-referral, and credentialing are widely accepted by the RBMs. The disparities seem to arise from differing standards imposed by the payors employing the RBMs. Accordingly, this might be common ground where RBMs and radiology providers can work together to educate payors on these best practices and why they should be followed.

Christie James, MS, is chair of the RBMA Payor Relations Committee and is group practice management manager, radiology business services, for the Massachusetts General Physicians Organization in Boston. Larry Buchwalter, JD, is chair of the committee’s RBM ad hoc writing group and CEO of Stilwell Enterprises LLC, Ridgewood, New Jersey. Michael Mabry is executive director of the RBMA. Ron Howrigon is president of Fulcrum Strategies, Raleigh, North Carolina.

This article has been excerpted from Radiology Business Management Association Report on Radiology Benefit Management Companies and their Comparison to ACR and RBMA RBM Best Practices Guidelines, which was prepared by the committee and Fulcrum Strategies and was released on February 17, 2012.


The RBMs Respond

At the request of the RBMA, Radiology Business Journal gave representatives from each RBM company assessed in the report the opportunity to respond (in fewer than 250 words). Their comments follow.

AIM is the leading specialty benefit management company, working with 42 health plans to ensure that clinically appropriate high-tech imaging is performed for 32 million members in 50 states. We manage a total of $8 billion in outpatient health-plan spending. Our clinical guidelines are applied consistently across health-plan clients; however, the administrative execution of the program can vary based on plan preferences and requirements. AIM’s evidence-based clinical guidelines are frequently reviewed and updated by physicians across a number of specialties. In 2012, AIM is working with a leading meaningful-use company to enhance the structure and usability of our guidelines further and to make the ordering process and decisions rendered more transparent for providers.

In addition to clinical appropriateness review, AIM is leveraging self-reported imaging-site capability data, including accreditation, complemented with average unit cost, to support informed decision making around imaging-site selection. Our transparency and engagement program proactively and consistently shares imaging-site information with ordering providers and members and supports high-value imaging-site choices. AIM strives to make the review process convenient and responsive for ordering providers, while retaining clinical credibility. Our success is evident in our December 2011 Provider Survey results: 97% of the 10,000 providers surveyed are satisfied with the AIM processes and tools. As a pioneer in Web order entry, we are now receiving 60% of orders via our convenient 24/7 Web portal; for some health plans, this number has reached close to 80%. We value the input from the RBMA and its providers and will continue to deliver a program that ensures appropriate, safe, and affordable imaging.

Brandon Cady
President and CEO
AIM


The report indicates that rendering radiology sites preferred coverage of a family of codes, as opposed to specific codes. Evidence-based medicine, however, as applied to the physician ordering the imaging study, recommends the most appropriate procedure (or code) for a specific clinical problem. Evidence-based medicine does not recommend a family of procedures or codes unless they are all considered to be of equal value to the patient. It ranks procedures from most appropriate to least appropriate to inappropriate. CareCore National researches and establishes criteria for the most appropriate procedure (CPT® code) for a given medical condition, when that distinction can be made. Thus, our criteria may be different, by individual CPT code, within a family of codes. The current ACR appropriateness criteria also rank procedures or codes by which one is the most appropriate and, for example, distinguish between contrast and noncontrast codes. We disagree with the standard that supports approval for a family of codes, as it is not always consistent with the recommendations of evidence-based medicine.

Our program does permit and encourage rendering radiology sites to change codes, if medically necessary. The program permits this change prior to imaging; during imaging (that is, while the patient is in the imaging facility); or after imaging has been completed. We encourage radiologists to complete studies during the first patient visit and to avoid recommendations to have the patient return for another study, from the same family of codes, on a different day. If, however, a radiology provider requests a change of code, it must be consistent with the evidence-based medical criteria.

Shelley Nan Weiner, MD, FACR
Executive Vice President
CareCore National


The HealthHelp team was so pleased to learn that the RBMA had granted our company the top spot in its first-ever best-practices benchmark report. Each day, we strive to meet the standards and best-practice criteria created by the ACR and the RBMA, and having the highest scores in the overall ratings confirms that HealthHelp continues to lead the RBM industry and encourages us to think of the effect on our providers every day as we design and implement our workflows. We also appreciate the report’s objective and detailed feedback on areas where we—and all RBM companies—can improve. Striving to maintain HealthHelp’s solid relationship with the radiology industry on matters of policy and procedures will continue to be a top goal for 2012.

This year also will be one of growth for HealthHelp. Our established and effective programs in the areas of diagnostic imaging, oncology, cardiology, emergency medicine, and pain management/spine/joint care will continue to ensure that patients get the right tests and treatments at the right times, helping our provider clients to lower costs, improve care quality, and prevent illness. New areas and enhanced features will further expand our ability to assist clients and to have a positive effect on our country’s ever-changing health-care industry.

Cherrill Farnsworth
President and CEO
HealthHelp


As noted in the introduction to your report, the expanding array of imaging technologies has increased both cost and utilization dramatically. That being the case, it is understandable that private insurers, and an increasing number of government payors such as Medicare and Medicaid, have partnered with RBMs to help ensure the appropriate, cost-effective use of imaging services. We applaud your efforts to measure best practices, and while this study is certainly a step in the right direction, we would like to comment on a few concerns that we have with your effort.

First, the practitioners surveyed for the study were primarily rendering providers rather than ordering physicians. Some RBMs, including NIA, concentrate their efforts on ordering or referring providers as a matter of policy, since they have a more complete understanding of the patient’s clinical status and history. Second, the number of providers sampled for this study (five provider groups) is a very small sample from which to obtain a reliable measure of how providers feel about RBMs. Our hope is that in future studies, the sample will be larger and more diverse, thereby providing a more accurate reflection of provider interaction with RBMs that we can reliably use to adjust our processes. Third, in the spirit of bilateral transparency, the custom in health care would be for such survey tools to be crafted, distributed, and compiled by parties whose independence is explicit and unquestioned.

At NIA, we share your interest in ensuring the affordability, efficiency, and (above all) quality of these services, and we look forward to working with you on these issues.

Thomas Dehn, MD, FACR
Executive Vice President and CMO

Michael Pentecost, MD, FACR
Associate CMO
NIA
