Imaging groups throughout the United States have moved to standardized radiology reports in recent years, and the trend continues to pick up steam. One side effect of this change is that leaders must perform long, labor-intensive manual audits of their teams’ reports to confirm compliance. But what if groups could perform an automated audit, making those pesky manual audits a thing of the past?
According to new research published in the Journal of the American College of Radiology, software using natural language processing and machine learning algorithms can accurately audit radiologist compliance with report templates. The study’s authors audited radiology reports from October 2015 at their facility both automatically, using the new software, and manually, reviewing 25 reports for each of the department’s 42 faculty members.
Overall, the manual audit found a compliance rate of 91.2 percent, with a confidence interval of 89.3 percent to 92.8 percent. The automated audit, meanwhile, found a compliance rate of 92 percent, a figure that falls squarely within that interval.
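For readers curious where an interval like that comes from: the reported range is consistent with a Wilson score interval computed on the manual sample, assuming the audit covered 25 reports for each of the 42 faculty members (1,050 reports in total, a figure inferred here rather than stated by the authors). A minimal sketch:

```python
from math import sqrt

def wilson_interval(p_hat, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (z = 1.96 corresponds to a 95 percent interval)."""
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Assumed sample size: 25 reports x 42 faculty = 1,050 reports
lo, hi = wilson_interval(0.912, 25 * 42)
print(f"{lo:.1%} to {hi:.1%}")  # prints "89.3% to 92.8%"
```

Plugging in the study’s 91.2 percent compliance rate reproduces the published 89.3-to-92.8-percent range, though the authors do not say which interval method they used.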
“Related to the particular issue of faculty compliance with standardized reports, this automated process can save significant labor expenses related to performing such audits,” wrote lead author Lane F. Donnelly, MD, of the department of radiology at Stanford University in Stanford, California, and colleagues. “The information can be used as a quality indicator for radiology dashboards, incentive programs, or practitioner evaluation, such as Ongoing Professional Practice Evaluation. Feedback about individual faculty member compliance with use of standardized reports can be particularly helpful during the period when a department is transitioning from free dictation of reports to use of standardized templates.”
Donnelly et al. also wrote that they have witnessed the benefits of this kind of feedback firsthand: Since the initial implementation of their report templates, faculty compliance climbed from 91.2 percent in October 2015 to 99.5 percent in June 2017.
The authors added that their study had several limitations. For instance, only a single month’s radiology reports were studied, and the software has yet to be tested on a “real-time basis.”