Using Patient Registries to Improve Care: PQRS, MQSA and Beyond


Patient data registries have become the cornerstone of accountable care in a variety of medical disciplines, and radiology is no exception. Over the past few years, small and large practices alike have begun to embrace the registry concept, both as a stepping stone toward compliance with government regulations and as a means of fundamentally improving the caliber of care they deliver.

Radiology’s most sophisticated registry option currently is the National Radiology Data Registry™ (NRDR), developed by the American College of Radiology. Judy Burleson, MHSA, senior advisor of quality metrics at the ACR, describes the NRDR as a “warehouse of registries” that compares imaging facilities regionally and nationwide, according to type.

Seven registries fall under the NRDR umbrella, with access to all available through a single front-end online portal. Radiologists and imaging facilities have the option to participate in any or all of these registries: CT colonography registry; Dose Index Registry (DIR); General Radiology Improvement Database (GRID); Interventional Radiology Registry; Lung Cancer Screening Registry (LCSR); National Mammography Database (NMD); and the PQRS Qualified Clinical Data Registry (see descriptions, page 30).

Customizing compliance

Not only does NRDR participation support compliance with the requirements of government quality programs, it also “customizes” the government quality metrics to be specialty-relevant. Notably, the NRDR has, once again for 2016, been approved as a Qualified Clinical Data Registry (QCDR) for the CMS Physician Quality Reporting System (PQRS).

Under the 2016 Medicare Physician Fee Schedule Final Rule, eligible professionals and group practices can meet PQRS reporting requirements by participating in a QCDR, thereby avoiding two key adjustments that would otherwise reduce their payments for service: a 2% PQRS payment adjustment in 2018, and an automatic 4% Value Modifier non-reporting payment adjustment.

To reap these benefits, eligible providers must report at least nine measures covering three National Quality Strategy (NQS) domains for at least 50% of applicable patients (all patients, rather than Medicare patients alone) seen during the 2016 participation period. The requirements also stipulate reporting on at least two outcome measures or, if these are unavailable, a minimum of one outcome measure and one resource-use, patient experience-of-care, efficiency/appropriate-use, or patient-safety measure.
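The reporting criteria above can be sketched as a simple eligibility check. This is an illustrative model only; the `Measure` class, the `kind` labels and the classification scheme are hypothetical simplifications, not the CMS specification.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    domain: str               # NQS domain the measure falls under
    kind: str                 # e.g., "outcome", "resource_use", "experience",
                              # "efficiency", "safety", "process"
    reported_fraction: float  # share of applicable patients reported on

# Measure kinds that may substitute for a second outcome measure
CROSS_CUTTING_KINDS = {"resource_use", "experience", "efficiency", "safety"}

def meets_qcdr_requirements(measures):
    """Check the 2016 PQRS/QCDR criteria as described above (sketch only)."""
    # Only measures reported for >= 50% of applicable patients count
    reported = [m for m in measures if m.reported_fraction >= 0.5]
    if len(reported) < 9:
        return False
    # Must span at least three NQS domains
    if len({m.domain for m in reported}) < 3:
        return False
    outcomes = sum(1 for m in reported if m.kind == "outcome")
    if outcomes >= 2:
        return True
    # Fallback: one outcome plus one cross-cutting measure
    return outcomes == 1 and any(m.kind in CROSS_CUTTING_KINDS for m in reported)
```

A tool like the ACR's QCDR Measure Selection Tool effectively walks practices through the same kind of check against the real measure catalog.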

In a new twist for the 2016 program year, group practices can utilize the QCDR to meet PQRS reporting requirements, with the proviso that at least 50% of the eligible providers participating under the group practice reporting option (GPRO) satisfy the PQRS reporting requirements needed to avoid a negative payment adjustment. Burleson suggests that in all cases, practitioners and radiology groups harness the QCDR Measure Selection Tool available on the ACR’s website to identify the specific measures that will meet the reporting requirements applicable to their situation.

Michael Bohl, FRBMA, CEO, Radiology Group PC, Davenport, Iowa, says his practice “would never have been able to get to nine measures” had it not begun leveraging the QCDR for PQRS compliance purposes. “In the years leading up to 2015, we reported via a claims-based system before transitioning to a qualified registry, but both reporting methods limited us to basic PQRS measures.”

“In early 2015, we were resigned to not being able to report the required nine measures for 2015 because of the limitations of the basic PQRS measures,” Bohl continues. “That was until we enrolled in the ACR’s Qualified Clinical Data Registry.”

The QCDR dramatically expanded the measures available to the practice and allowed Radiology Group to report the nine measures required in 2015. As a result, the practice avoided the 2% PQRS negative payment adjustment in 2017 (based on 2015 data), he reports.

“Keeping us from payment cuts has been a powerful benefit of participating in this particular patient registry and using it to report for PQRS,” concurs Christine Keefe, CPA, CMPE, CFO, Metro Imaging, St. Louis, Mo., which has been using the NRDR since 2014.

Burleson points out that the registries can play a part in attestation for the meaningful use (MU) program as well. One MU set measure calls for reporting specific cases to a specialized registry. A number of NRDR registries, including the GRID and the NMD, can support such reporting.

Additionally, Burleson notes, the NMD can assist imaging service providers in complying with the Mammography Quality Standards Act (MQSA), which requires mammography facilities to meet uniform quality standards. This is because the registry provides comparative information for national and regional benchmarking.

The LCSR, meanwhile, is approved by CMS to allow providers to meet quality reporting requirements to receive Medicare CT lung cancer screening payment. The American Board of Radiology (ABR) has qualified the CTC registry as meeting the criteria for practice quality improvement (PQI), toward the purpose of fulfilling requirements set forth in the ABR Maintenance of Certification Program.

Beyond compliance

The value of patient registries, however, transcends compliance with government quality programs. Most significant, Burleson says, are enhancements in the caliber of patient care. “The primary benefit, as we see it, is overall quality improvement, and that is why the registries were implemented,” she asserts. Clinicians can look at the data to see how they compare to others, as well as what has changed within their own organizations. They can then use what they have learned to alter or modify procedures and practices.

Burleson cites the DIR as an example. Leveraging this registry, clinicians can assess how the radiation dose administered at their particular practice for a specific procedure stacks up against the aggregate, as well as against the doses administered at similar imaging facilities (e.g., academic, community) and at facilities in the same geographic region. Should a review of the data uncover an effective dose substantially different from the average for the exam in question, that would be the cue to look closely at the protocol for that type of exam and determine whether it warrants modification.
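The comparison Burleson describes amounts to checking a practice’s mean dose against a benchmark and flagging large deviations. A minimal sketch, assuming a simple fractional-tolerance rule (the 25% threshold and function name are illustrative, not DIR methodology):

```python
from statistics import mean

def flag_dose_outlier(practice_doses, benchmark_doses, tolerance=0.25):
    """Compare a practice's mean dose for one exam type against a benchmark.

    Returns (flagged, deviation): flagged is True when the practice mean
    deviates from the benchmark mean by more than the fractional tolerance;
    deviation is the signed fractional difference.
    """
    practice_mean = mean(practice_doses)
    benchmark_mean = mean(benchmark_doses)
    deviation = (practice_mean - benchmark_mean) / benchmark_mean
    return abs(deviation) > tolerance, deviation
```

A flagged result would be the cue to review the protocol for that exam type, exactly as described above.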

“Without such a comparison, it is more difficult to see where radiation dosages need to be addressed, and that can have an impact on patient care,” Burleson observes, compliance requirements or no. She adds that the geographic parameter in the DIR was added to eliminate certain population-related variables from the equation, increasing its applicability as a dose appropriateness assessment tool across a diverse group of end-users.

For instance, she says, the effective radiation dose administered to a patient who resides in the Southeastern U.S. and is undergoing a CT of the chest or abdomen may be greater than the dose administered to a patient in the Western U.S. undergoing the identical procedure. Residents of the former region typically are of greater girth than residents of the latter, so practitioners in the Western U.S. would not necessarily want to factor doses administered to heavier patients into their assessment of potential dose adjustments for abdominal or chest CT.

Gregory N. Nicola, MD, vice president, Hackensack Radiology Group, Oradell, N.J., concurs. He notes that in affording radiologists, imaging centers and hospital-based imaging players an avenue for formulating and comparing national benchmarks, registries set the stage for improving the caliber of care delivered to patients. “The term ‘quality’ can be tricky to interpret, and it can be difficult to determine exactly what needs to be done to achieve or improve it unless you’re looking at objective measures, instead of through your own lens,” Nicola asserts. “For instance, without the National Mammography Database in place, who really is to say that the mammography recall rate for a particular practice isn’t high enough—or too high, for that matter—to contribute to quality care?”

He adds that increasing the automation of objective quality metrics through participation in such registries as those that fall under the NRDR umbrella also gives practitioners additional time for conducting internal peer reviews and leveraging internal quality control programs. This, too, has a heavy bearing on caliber of care and the overall patient experience.

Bohl, too, perceives participation in patient data registries as tightly connected to patient care enhancements. Next year, Radiology Group will begin to participate in the lung cancer registry, as well as in the DIR. Consistent utilization of the assessment categories in the LCSR will, among other benefits, aid Radiology Group’s clinicians in identifying deficiencies and facilitating research, both of which Bohl deems prerequisites for high-caliber care. 

For its part, Metro Imaging LLC has seen several notable positive changes on the patient care front through registry participation. A close look at the DIR and a comparison of radiation doses by scanner enabled the practice to optimize doses and “improve doses where they may have been ‘off,’” Keefe notes. Meanwhile, use of the GRID yielded the evidence needed to initiate improvements in study turnaround times.

She adds that boarding the patient registry train also renders it easier to prove value to payors, which will become increasingly important as the healthcare system shifts to value-based payments. Burleson concurs, adding that demonstrating value to hospitals via quality-improvement registry data ranks among the benefits the ACR emphasizes in its list of advantages of using a QCDR.

The broader mission

Then, there is support for the broader mission of population health management and improvement. Just ask Mitchell D. Schnall, MD, PhD, FACR, Eugene D. Pendergrass Professor of Radiology and chair of the department of radiology, Perelman School of Medicine, University of Pennsylvania (Penn Medicine) and group chair, ECOG-ACRIN Cancer Research Group. He sees great potential for the use of registries to better manage and enhance public health by removing variability from the radiology reporting equation.

“The culture of radiology centers on variables in reporting—that is, in the way reports are structured and in how they are interpreted,” Schnall says. “That’s no good for public health because it’s hard to see whether, for example, follow-up did happen or if there are too many indeterminate findings.”

Patient registries in radiology, he explains, “standardize benchmarking as has been done in the surgery specialty,” paving the way for the monitoring of radiology outcomes to reduce the volume of unnecessary procedures and ensure that follow-up occurs when it is truly warranted. This, according to Schnall, is a critical element in population health management.

Penn Medicine has developed its own patient registry database for the management of incidental findings (incidentalomas). The impetus for the move was significant concern about the considerable volume of “generally unnecessary downstream care” being undergone by patients whose incidental findings did not need to be [handled]. That incidentaloma findings appeared to be “falling through the cracks” spurred the initiative as well.

Penn Medicine has created a structure for radiologist reporting of incidentalomas found in the liver, pancreas, kidney and adrenal glands. In every cross-sectional CT, MRI and ultrasound report, radiologists now complete a structured element assessing the presence of incidentalomas: not found; found, but clearly benign; found, with an indeterminate need for follow-up; or suspicious and therefore requiring follow-up.
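A structured element of this kind lends itself to a small closed vocabulary that downstream systems can query. The sketch below is a hypothetical model of the four categories described above; the class names and the `needs_followup` rule are assumptions for illustration, not Penn Medicine’s actual schema.

```python
from dataclasses import dataclass
from enum import Enum

class IncidentalomaAssessment(Enum):
    """The four structured categories described in the article."""
    NOT_FOUND = "not found"
    BENIGN = "found, clearly benign"
    INDETERMINATE = "found, indeterminate need for follow-up"
    SUSPICIOUS = "suspicious, follow-up required"

# Organs covered by the structured element
ORGANS = ("liver", "pancreas", "kidney", "adrenal")

@dataclass
class IncidentalomaElement:
    """One structured assessment per organ, per cross-sectional report."""
    organ: str
    assessment: IncidentalomaAssessment

    def needs_followup(self):
        # Indeterminate findings are flagged for review along with
        # suspicious ones (an illustrative policy choice).
        return self.assessment in (IncidentalomaAssessment.INDETERMINATE,
                                   IncidentalomaAssessment.SUSPICIOUS)
```

Because every report carries the same four-way field, a registry can count flagged findings and confirm that follow-up actually occurred, which is the deficiency-spotting Schnall describes next.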

“An important component of population health management and improvement is seeing where the deficiencies are, and we are able to (pinpoint) them a lot better now,” Schnall states.

The use of Penn Medicine’s registry has expanded, with two other hospital systems—Geisinger Health System and Penn State Hershey Health System—now leveraging it. Penn Medicine also has submitted a proposal to the American College of Radiology Center for Research and Innovation to obtain funding to expand the registry to other healthcare institutions.

Once the radiology specialty embraces structured reporting in general, it will be easier for patient data registries to play a major role in public health management and improvement, Schnall observes. However, he emphasizes, “this is a cultural shift that hasn’t happened yet. It needs time, and it will happen gradually” as other benefits of participation in patient data registries introduced by the ACR and likely other entities are realized.

In a slightly different vein, Bohl attributes at least one operational improvement at his facility to its involvement with the QCDR for PQRS reporting. Participating in this particular patient data registry, he says, is far less time-consuming than the manual processes that would be necessary to configure the practice’s billing system to flag PQRS codes for the purpose of generating insurance claims.

Beginning at the end

Not surprisingly, practitioners and groups will need to follow best practices in getting started with patient registries, and can expect to encounter some challenges along the way. Lisa Mead, RN, MS, CPHQ, president, Crowne Healthcare Advisors, Scottsdale, Ariz., advocates that prospective users of any patient data registry adopt a “planned approach” wherein they first determine what they want to do or accomplish with the data.

For example, the goal might be effecting safety improvements, increasing the caliber of the patient experience, improving the efficiency of services rendered or sharing data in committees or at the individual physician level. “Once you have that, you can look at where the data is coming from and determine whether you have the database programs to extract it,” she says. “The key is to begin with the end in mind.”

Mead also recommends that practices consider “benchmarking against themselves” before trying to benchmark against other entities, as it facilitates the process of working with the data. Just as significantly, she emphasizes the importance of submitting clean, rather than dirty, data to registries. “If something is ‘off,’ it could disturb everything,” defeating the entire purpose of registry participation. A data-cleansing software program may be needed to ensure that submissions are clean. “Don’t just do a data dump,” Mead warns.

Sources say a data scientist is not needed for patient data registry participation, but it will likely be necessary to enlist vendor assistance on certain fronts; for instance, help in figuring out how to extract data from the RIS. “The ACR offers a lot of help, and now we’re gaining traction with vendors as well,” Nicola states. One company, he reports, has developed a product that will electronically capture most registry metrics and transmit them to the NRDR.

As for potential obstacles, Nicola notes that for practices harnessing the QCDR option, CMS requires the submission of data at the patient exam level. This creates a need to build a registry-compatible download from reporting physicians’ electronic health record (EHR) systems when certain measures are selected.

Physicians who are affiliated with large health systems may be able to obtain internal support for such an endeavor, but smaller practices may need to engage their EHR vendor. Nicola himself was forced to approach the IT department at the hospital system with which he is affiliated, and to request experts’ cooperation in “building proprietary downloads out of the EHR,” he recalls.

Bohl says obtaining the data needed to report measures for PQRS also can be problematic because some hospitals can be less than cooperative about sharing it. His group had no trouble on this front, but he is aware of others that were less fortunate. “In these situations, the solution is to frame it as, ‘We want to prove our value, and we need to comply with quality measures; we know you understand the importance because you have to do the same thing, so can you please help us?’” he elaborates.

Meanwhile, the ACR is working to make registry participation easier. A semi-automatic data upload to the NMD has been developed through partnerships with mammography vendors, Burleson says. The ACR is continuing to team up with vendors to test and validate for electronic data submission, she reports.

“Patient data registries remain a focus for the ACR, given their major role in quality” and compliance, Burleson concludes. “There is more to come.”