COGR peer review: Worth the extra effort

Consensus-oriented group review (COGR) is valuable for radiologists and worth the additional effort compared to other peer review methods, according to a recent study published in the Journal of the American College of Radiology.

COGR peer review involves a group of radiologists meeting regularly to review cases selected at random, with participants recording a consensus on the “acceptability” of the cases using specific software.

H. Benjamin Harvey, MD, JD, of the department of radiology at Massachusetts General Hospital in Boston, and colleagues examined the effectiveness of this peer review method within a large radiology department made up of more than 100 staff radiologists.

More than 11,000 CT, MRI, and ultrasound studies interpreted by 83 different radiologists were peer reviewed from Oct. 1, 2011, to Sept. 30, 2013. On average, each radiologist participated in 112 COGR sessions and had more than 3 percent of his or her own studies peer reviewed over that two-year span.

The authors noted that the discordance rate of the study was 2.7 percent, meaning that a total of 306 studies were labeled “report should change” by the peer review group. The discordance rate was highest for the musculoskeletal division (5.4 percent) and the abdominal division (5.3 percent). It was lowest in the emergency division (0.5 percent).

“The significant divisional differences in discordance rates identified in our study raise concerns for potential sources of underreporting in some divisions,” the authors wrote. “Further research is necessary to elucidate factors underlying discordance rate differences between varying methods of peer review and varying groups participating in the same peer review method.”

Overall, Harvey et al. concluded that COGR is effective, even if it takes more effort than some other peer review models.

“COGR places increased importance on peer-to-peer discussions of cases, making it more cumbersome than RADPEER-style models of peer review,” the authors wrote. “As such, COGR would likely require a departmental commitment of radiologist ‘full-time equivalent’ efforts toward peer review an order of magnitude higher. Given the increasing work volumes and shrinking margins facing many radiology groups, such a commitment may appear daunting. Nevertheless, our department believes that COGR delivers sufficient value to merit this time commitment.”

The authors added that they know some physicians remain unsure about COGR, fearful that their peer review findings could be used in legal battles over malpractice. But, they said, such concerns need not deter adoption as long as facilities plan carefully before implementation.

“The COGR process was consciously designed to ensure that information disclosed during peer review and any subsequent communications would remain protected under our state’s statutory peer review privilege,” the authors wrote. “Any efforts to implement COGR in other radiology departments should be done with a keen awareness of state and national medical peer review laws, so that the peer review activities are afforded all statutory protections available to them.”

The authors noted that their study had multiple limitations. For example, it was limited to a single academic institution, so it remains unclear how well the results represent the medical community as a whole. Also, radiologists were offered financial incentives to participate in the first year of the study.

“Even though the peer review program continues to thrive in the absence of such incentives, it remains unclear whether the cultural shift necessary to have effective COGR review could have been as quickly achieved in their absence,” the authors wrote.

Michael Walter, Managing Editor

Michael has more than 16 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
