How to use simple checklists to decrease technologist error rates

A simple error-tracking program improved technologists’ image capture at a level 1 trauma center, according to an article published in the Journal of the American College of Radiology.

Digitized imaging has relegated radiologists to the reading room, often far from the technologists who interact with patients and perform the imaging exams. That physical separation has reduced radiologist-technologist communication, leaving technologists with less feedback on their image quality.

Prior studies have shown associations between variations in technologist performance and decreased image quality, so researchers from Children’s Hospital Los Angeles (CHLA) devised a workflow-integrated quality improvement system.

“We developed a standardized template to record quality errors during report dictation,” wrote lead author Justin Glavis-Bloom, MD, a resident physician at CHLA, and colleagues. “This template was incorporated into our dictation system, Nuance PowerScribe 360 version 2.5, and inserted into the body of the report only when the radiologist identified an error he or she chose to document.”

The errors available for selection included the following (a simple sketch of how such a template might be modeled in code appears after the list):

  • Dose/fluoroscopic time not given
  • Patient positioning
  • Imaging, technical, or metallic artifact
  • Magnification
  • Patient motion
  • Exposure
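
The article does not reproduce the template itself, so the following is only a minimal sketch of how its selectable categories could be modeled, assuming an illustrative enum and insert text rather than the actual CHLA PowerScribe macro:

```python
from enum import Enum

# Hypothetical sketch only: the study describes a PowerScribe 360 template that
# inserts a selected error category into the report body. The names and
# structure below are assumptions for illustration, not the CHLA template.
class ImageQualityError(Enum):
    DOSE_NOT_GIVEN = "Dose/fluoroscopic time not given"
    PATIENT_POSITIONING = "Patient positioning"
    ARTIFACT = "Imaging, technical, or metallic artifact"
    MAGNIFICATION = "Magnification"
    PATIENT_MOTION = "Patient motion"
    EXPOSURE = "Exposure"

def error_macro_text(errors: list[ImageQualityError]) -> str:
    """Return the text a radiologist might insert into a report body
    when documenting one or more image quality errors."""
    if not errors:
        return ""
    lines = ["IMAGE QUALITY ERRORS:"] + [f"- {e.value}" for e in errors]
    return "\n".join(lines)

# Example: a radiologist flags motion and exposure problems on one exam.
print(error_macro_text([ImageQualityError.PATIENT_MOTION,
                        ImageQualityError.EXPOSURE]))
```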

After the three-month pilot and data-gathering period, the image quality data were summarized and the most common errors were printed on physical checklists placed on radiography units and posted around the radiology department. Individual technologists' error rates and the specific flawed exams were provided to the technologist supervisor, who reviewed those images with the technologists to facilitate quality improvement. As an added incentive, technologists whose error rates remained below 3 percent for the year were promised gift cards to an unspecified retailer, according to the article.

Researchers found a baseline error rate of 2.7 percent during the three-month pilot period; that rate dropped to 0.9 percent once the improvement initiatives took hold, according to the article. The most common types of errors included exposure, patient positioning, field of view, technique, and overlying artifact.

“In addition, the proportion of technologists with error rates exceeding 3 percent decreased from 28 percent during the initial three-month pilot period to 5 percent during the final six months of 2016, illustrating that the decrease in overall error rate resulted from improvements across technologist staff,” wrote Glavis-Bloom et al.
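
The article reports only the aggregate figures, but the underlying arithmetic is simple to reproduce. The sketch below uses invented counts (not CHLA data) to show how per-technologist error rates and the share of staff above the 3 percent threshold might be tallied:

```python
# Minimal sketch of the error-rate arithmetic described above.
# The counts are invented for illustration; this is not the study dataset.
exam_log = [
    # (technologist_id, exams_performed, exams_with_flagged_errors)
    ("tech_01", 420, 3),
    ("tech_02", 380, 14),
    ("tech_03", 510, 5),
]

THRESHOLD = 0.03  # the 3 percent cutoff used for supervisor review and incentives

error_rates = {tech: errors / exams for tech, exams, errors in exam_log}
overall_rate = sum(e for _, _, e in exam_log) / sum(n for _, n, _ in exam_log)
above_threshold = [t for t, rate in error_rates.items() if rate > THRESHOLD]

print(f"Overall error rate: {overall_rate:.1%}")
print(f"Technologists above {THRESHOLD:.0%}: "
      f"{len(above_threshold)}/{len(exam_log)} "
      f"({len(above_threshold) / len(exam_log):.0%})")
```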

The image retake rate trended slightly upward, increasing from 7 percent during the pilot period to 8 percent in the final months of data collection. While the increase was small, retake rates are just as important to monitor as error rates, according to the authors.

“To ensure that decreases in errors in one domain are not being offset by increases in another, it is important during image QI efforts to regularly track image retake rate to protect patients from increased radiation exposure,” wrote Glavis-Bloom et al.

However, radiologist participation proved inconsistent, the authors wrote. Only 55 percent of radiologists participated consistently, even with a system that captured feedback for technologists within the normal workflow. Still, image quality error rates decreased despite the less-than-perfect engagement, underscoring the importance of integrating the reporting tool into everyday practice.

As a Senior Writer for TriMed Media Group, Will covers radiology practice improvement, policy, and finance. He lives in Chicago and holds a bachelor’s degree in Life Science Communication and Global Health from the University of Wisconsin-Madison. He previously worked as a media specialist for the UW School of Medicine and Public Health. Outside of work you might see him at one of the many live music venues in Chicago or walking his dog Holly around Lakeview.
