Does it matter whom and how you ask? Inter- and intra-rater agreement in the Ontario Health Survey

J Clin Epidemiol. 1997 Feb;50(2):127-35. doi: 10.1016/s0895-4356(96)00314-9.


A large amount of information in the 1990 Ontario Health Survey (OHS) was collected from proxy respondents using questions administered in face-to-face interviews. Can this type of information represent candid self-reported measures of health status? Inter-rater agreement was assessed using Cohen's kappa statistic for responses to questions that were answered both by individuals about themselves and by proxies on their behalf. Intra-rater agreement, assessing the effect of mode of survey administration (in-person interviews versus self-completed written questionnaires) on the responses, was also investigated using the kappa statistic. We conclude that: (1) proxy responses in the OHS for impairments of emotion and pain are not reliable indicators of self-response (kappa < 0.32) because proxy respondents consistently under-report the burden of morbidity; (2) levels of morbidity reported by subjects to interviewer-administered questionnaires may underestimate morbidity, relative to morbidity reported by subjects using self-administered questionnaires completed in privacy. We also hypothesize that the relative magnitudes of inaccuracy introduced by interviewer administration versus proxy reporting depend on the phenomenon being measured. When assessing pain, mode of administration is quantitatively a more important source of disagreement than type of respondent.
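For readers unfamiliar with the statistic used above, the following is a minimal sketch of how Cohen's kappa is computed for two raters (e.g., a subject and a proxy) labelling the same items; it is an illustration of the standard formula kappa = (p_o − p_e) / (1 − p_e), not a reconstruction of the authors' actual analysis.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Perfect agreement yields kappa = 1; agreement no better than chance yields 0.
print(cohens_kappa([1, 1, 2, 2], [1, 1, 2, 2]))  # 1.0
print(cohens_kappa([1, 1, 2, 2], [1, 2, 1, 2]))  # 0.0
```

By this convention, the paper's reported kappa < 0.32 indicates that proxy and self responses agreed only slightly better than chance.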

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Health Status*
  • Health Surveys*
  • Humans
  • Interviews as Topic
  • Observer Variation
  • Ontario
  • Pain Measurement
  • Reproducibility of Results
  • Surveys and Questionnaires