Examining intra-rater and inter-rater response agreement: a medical chart abstraction study of a community-based asthma care program

BMC Med Res Methodol. 2008 May 9;8:29. doi: 10.1186/1471-2288-8-29.

Abstract

Background: To assess the intra- and inter-rater agreement of chart abstractors from multiple sites involved in the evaluation of an Asthma Care Program (ACP).

Methods: For intra-rater agreement, 110 charts randomly selected from 1,433 patients enrolled in the ACP across eight Ontario communities were re-abstracted by 10 abstractors. For inter-rater agreement, data abstractors reviewed a set of eight fictitious charts. Data abstraction involved information pertaining to six categories: physical assessment, asthma control, spirometry, asthma education, referral visits, and medication side effects. Percentage agreement and the kappa statistic (kappa) were used to measure agreement. Sensitivity and specificity estimates were calculated by comparing results from all raters against the gold standard.
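The agreement statistics named above can be sketched as follows. This is an illustrative sketch only, not the study's analysis code: Cohen's kappa corrects observed agreement for chance agreement, and sensitivity/specificity compare a rater's binary codes (item present/absent) against a gold-standard chart.

```python
# Illustrative sketch (not the authors' code): Cohen's kappa and
# sensitivity/specificity for paired binary abstraction results,
# where 1 = item recorded as present, 0 = absent.

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary codes on the same items."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    p_yes = (sum(rater_a) / n) * (sum(rater_b) / n)           # chance both code "yes"
    p_no = (1 - sum(rater_a) / n) * (1 - sum(rater_b) / n)    # chance both code "no"
    p_e = p_yes + p_no                                        # chance-expected agreement
    return (p_o - p_e) / (1 - p_e)

def sensitivity_specificity(rater, gold):
    """Sensitivity and specificity of one rater against a gold standard."""
    tp = sum(r == 1 and g == 1 for r, g in zip(rater, gold))
    tn = sum(r == 0 and g == 0 for r, g in zip(rater, gold))
    fn = sum(r == 0 and g == 1 for r, g in zip(rater, gold))
    fp = sum(r == 1 and g == 0 for r, g in zip(rater, gold))
    return tp / (tp + fn), tn / (tn + fp)
```

In a design like this one, `gold` would come from the known contents of the fictitious charts, and kappa would be computed per abstraction category (physical assessment, asthma control, and so on) as well as overall.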

Results: Intra-rater re-abstraction yielded an overall kappa of 0.81. Kappa values for the chart abstraction categories were: physical assessment (kappa 0.84), asthma control (kappa 0.83), spirometry (kappa 0.84), asthma education (kappa 0.72), referral visits (kappa 0.59) and medication side effects (kappa 0.51). Inter-rater abstraction of the fictitious charts produced an overall kappa of 0.75, sensitivity of 0.91 and specificity of 0.89. Abstractors demonstrated agreement for physical assessment (kappa 0.88, sensitivity and specificity 0.95), asthma control (kappa 0.68, sensitivity 0.89, specificity 0.85), referral visits (kappa 0.77, sensitivity 0.88, specificity 0.95), and asthma education (kappa 0.49, sensitivity 0.87, specificity 0.77).

Conclusion: Though the data were collected by multiple abstractors, the results show high sensitivity and specificity and substantial to excellent intra- and inter-rater agreement, supporting confidence in the use of chart abstraction for evaluating the ACP.

Publication types

  • Evaluation Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Abstracting and Indexing / methods*
  • Asthma / therapy*
  • Community Health Services*
  • Documentation / methods*
  • Follow-Up Studies
  • Humans
  • Medical Records*
  • Observer Variation*
  • Ontario
  • Outcome and Process Assessment, Health Care / methods*
  • Pilot Projects
  • Quality Control
  • Reproducibility of Results