Examining intra-rater and inter-rater response agreement: a medical chart abstraction study of a community-based asthma care program

Teresa To et al. BMC Med Res Methodol. 2008 May 9;8:29. doi: 10.1186/1471-2288-8-29.

Free PMC article
Abstract

Background: The aim was to assess the intra- and inter-rater agreement of chart abstractors from multiple sites involved in the evaluation of a community-based Asthma Care Program (ACP).

Methods: For intra-rater agreement, 110 charts randomly selected from 1,433 patients enrolled in the ACP across eight Ontario communities were re-abstracted by 10 abstractors. For inter-rater agreement, data abstractors reviewed a set of eight fictitious charts. Data abstraction involved information pertaining to six categories: physical assessment, asthma control, spirometry, asthma education, referral visits, and medication side effects. Percentage agreement and the kappa statistic (κ) were used to measure agreement. Sensitivity and specificity estimates were calculated by comparing results from all raters against the gold standard.
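As an illustration of the agreement measures used, the following Python sketch computes percentage agreement, Cohen's kappa, and sensitivity/specificity against a gold standard for binary abstraction items (1 = item documented in the chart). The data and function names are hypothetical and not taken from the study; they only mirror the calculations described above.

    # Minimal sketch of the agreement statistics, with hypothetical data.

    def percent_agreement(rater_a, rater_b):
        # Proportion of items on which the two abstractions agree.
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    def cohens_kappa(rater_a, rater_b):
        # Chance-corrected agreement for two binary ratings.
        n = len(rater_a)
        p_o = percent_agreement(rater_a, rater_b)      # observed agreement
        p_a1 = sum(rater_a) / n                        # rater A's "yes" rate
        p_b1 = sum(rater_b) / n                        # rater B's "yes" rate
        p_e = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)    # agreement expected by chance
        return (p_o - p_e) / (1 - p_e)

    def sensitivity_specificity(rater, gold):
        # Rater responses compared against the gold-standard chart values.
        tp = sum(r == 1 and g == 1 for r, g in zip(rater, gold))
        tn = sum(r == 0 and g == 0 for r, g in zip(rater, gold))
        fp = sum(r == 1 and g == 0 for r, g in zip(rater, gold))
        fn = sum(r == 0 and g == 1 for r, g in zip(rater, gold))
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical first and second abstractions of the same items.
    first  = [1, 0, 1, 1, 0, 1, 0, 1]
    second = [1, 0, 1, 0, 0, 1, 0, 1]
    gold   = [1, 0, 1, 1, 0, 1, 1, 1]

    print(percent_agreement(first, second))       # 0.875
    print(cohens_kappa(first, second))            # 0.75
    print(sensitivity_specificity(first, gold))   # (sensitivity, specificity)

With these hypothetical responses, observed agreement is 0.875 and chance-expected agreement is 0.5, giving κ = (0.875 - 0.5) / (1 - 0.5) = 0.75; the same calculation applied per abstraction category yields the category-level kappa values reported below.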

Results: Intra-rater re-abstraction yielded an overall kappa of 0.81. Kappa values for the chart abstraction categories were: physical assessment (κ = 0.84), asthma control (κ = 0.83), spirometry (κ = 0.84), asthma education (κ = 0.72), referral visits (κ = 0.59) and medication side effects (κ = 0.51). Inter-rater abstraction of the fictitious charts produced an overall kappa of 0.75, sensitivity of 0.91 and specificity of 0.89. Abstractors demonstrated agreement for physical assessment (κ = 0.88, sensitivity and specificity 0.95), asthma control (κ = 0.68, sensitivity 0.89, specificity 0.85), referral visits (κ = 0.77, sensitivity 0.88, specificity 0.95), and asthma education (κ = 0.49, sensitivity 0.87, specificity 0.77).

Conclusion: Although the data were collected by multiple abstractors, the results show high sensitivity and specificity and substantial to excellent intra- and inter-rater agreement, supporting confidence in the use of chart abstraction for evaluating the ACP.


Figures

Figure 1. Chart abstraction form and sample portion of a fictitious medical chart used for assessing inter-rater reliability.

Figure 2. Ranges of inter-rater agreement among pairs of abstractors. Data are shown as percentage agreement (%), kappa coefficient, and 95% confidence interval for kappa. Abbreviation: Std = Gold Standard.
