Identifying benchmarks for discrepancy rates in preliminary interpretations provided by radiology trainees at an academic institution

J Am Coll Radiol. 2011 Sep;8(9):644-8. doi: 10.1016/j.jacr.2011.04.003.
Purpose: At many academic medical centers, radiology house staff provide preliminary interpretations for imaging studies after hours, the accuracy and timely availability of which are crucial to patient care. Nevertheless, these preliminary interpretations are sometimes discrepant with finalized attending reports. The rate of such discrepancies can provide valuable information for quality improvement. The aim of this study was to identify specific benchmarks for resident discrepancy rates by reviewing all 73,072 on-call reports generated at the authors' institution over 1 year.

Methods: A custom-built interface called Orion was used to track all on-call reports generated in 2010. Reports graded as discrepant with major changes during attending review were automatically identified. The turnaround time (TAT) of all reports was measured. These data were used to identify specific benchmarks for resident performance on call.

Results: A total of 45,608 of 73,072 preliminary dictations (62%) were interpreted by residents; of these, 407 (0.89%) had major discrepancies. Major discrepancy rates varied among individual residents (0.2% to 1.8%), across modalities, and by level of resident training. On the basis of these distributions, major discrepancy benchmarks were established for the overall rate (1.7%) and for the modalities of conventional radiography (1.5%), CT (4.0%), and ultrasound (4.0%). The mean TAT was significantly shorter for the emergency department (46 minutes) than for inpatient services (144 minutes). A benchmark TAT of 1 hour has been adopted for all imaging studies performed through the emergency department.
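The rate computation underlying these results is straightforward and can be sketched as follows. This is an illustrative example only, not the authors' Orion software; the record field names (`interpreter`, `grade`) are assumptions made for the sketch, and the counts mirror the study's overall figures (407 major discrepancies among 45,608 resident reports).

```python
def major_discrepancy_rate(reports):
    """Fraction of resident preliminary reports graded as major discrepancies.

    Each report is a dict; field names here are hypothetical.
    """
    resident = [r for r in reports if r["interpreter"] == "resident"]
    if not resident:
        return 0.0
    majors = sum(1 for r in resident if r["grade"] == "major_discrepancy")
    return majors / len(resident)


# Counts matching the study's overall figures:
reports = (
    [{"interpreter": "resident", "grade": "major_discrepancy"}] * 407
    + [{"interpreter": "resident", "grade": "concordant"}] * (45608 - 407)
)
rate = major_discrepancy_rate(reports)
print(f"{rate:.2%}")  # → 0.89%
```

A departmental benchmark (e.g., the study's 1.7% overall threshold) would then be applied by comparing each resident's rate against it.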

Conclusions: Identifying benchmarks for major discrepancy rates and TAT of preliminary interpretations by radiology trainees is a valuable first step for individual and departmental quality improvement.

MeSH terms

  • Academic Medical Centers*
  • After-Hours Care
  • Benchmarking*
  • Clinical Competence*
  • Diagnostic Errors / statistics & numerical data*
  • Humans
  • Internship and Residency / statistics & numerical data*
  • Quality Assurance, Health Care / statistics & numerical data*
  • Radiology / education
  • Radiology / statistics & numerical data*
  • Reproducibility of Results
  • Software