Can electronic triggers help prevent delays in patient care?

Developing electronic triggers to detect delays in follow-up of abnormal mammographic results offers healthcare providers an “unprecedented opportunity to improve care,” according to a new study published in the Journal of the American College of Radiology. But are the triggers successful enough for clinical use?

To find out, the authors first developed an electronic trigger that flags patient records including 1) abnormal mammographic results and 2) a delay in follow-up. The trigger was then tested using data from more than 350,000 patients seen from January 2010 to May 2015 at Veterans Affairs (VA) facilities throughout the United States.
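The trigger's two criteria can be sketched as a simple record filter. This is a minimal illustration, not the study's actual algorithm: the field names and the 60-day follow-up window are assumptions for the sake of the example.

```python
from datetime import date, timedelta

# Assumed delay threshold; the study's actual follow-up window is not
# specified in this article.
FOLLOW_UP_WINDOW = timedelta(days=60)

def is_flagged(record, today):
    """Flag a record that meets both trigger criteria:
    1) an abnormal mammographic result, and
    2) no follow-up within the allowed window."""
    abnormal = record["mammogram_result"] == "abnormal"
    overdue = (record["follow_up_date"] is None
               and today - record["result_date"] > FOLLOW_UP_WINDOW)
    return abnormal and overdue

# Example: an abnormal result from January with no follow-up by April
record = {"mammogram_result": "abnormal",
          "result_date": date(2015, 1, 5),
          "follow_up_date": None}
print(is_flagged(record, today=date(2015, 4, 1)))  # -> True
```

In a real deployment the same filter would run as a query against the electronic health record, which is why the authors stress that external results must be reliably transcribed into it.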

Overall, out of 400 patient records flagged for review by the trigger and analyzed by the study’s authors, 283 were true positives. This means the trigger detected delays with a positive predictive value (PPV) of 71 percent, well above the predetermined 50 percent threshold for the algorithm to be considered “practical for clinical use.”
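The PPV figure follows directly from the review counts reported in the study: true positives divided by all flagged records reviewed.

```python
# PPV = true positives / all flagged records reviewed
# (283 and 400 are the figures reported in the study)
true_positives = 283
flagged_records = 400

ppv = true_positives / flagged_records
print(round(ppv * 100))  # -> 71, above the 50 percent threshold
```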

“This suggests that triggers are practical for clinical application even when a red flag is identified at a site external to the clinic setting, provided results are reliably received and transcribed into the electronic health record,” wrote Daniel R. Murphy, MD, MBA, department of medicine at Baylor College of Medicine in Houston, and colleagues. “This study lays the groundwork for future testing of a prospective mammography trigger that monitors clinical data in real time and alerts personnel as soon as potential delays are detected.”

Murphy et al. described the high number of delays in patient care as “surprising,” saying the number of patient records identified by their trigger should have, ideally, been much closer to zero. “This suggests failures in communication and coordination of follow-up care between radiologists, primary care providers, and patients,” they wrote.

The authors added that their study did have limitations. For example, “the unique characteristics of VA facilities” might mean the study’s findings would be significantly different had the data come from other facilities throughout the country.