Is the rating result reliable? A new approach to respond to a medical trainee's concerns about the reliability of Mini-CEX assessment

J Formos Med Assoc. 2022 May;121(5):943-949. doi: 10.1016/j.jfma.2021.07.005. Epub 2021 Jul 19.

Abstract

Purpose: Whether the rating results of the mini-clinical evaluation exercise (Mini-CEX) for rating clinical skills are reliable is a matter of great concern to medical trainees. The objectives of this study were to analyze the test-retest reliability, interrater reliability, and internal consistency reliability of the Mini-CEX.

Methods: Three clinical scenarios, each played by a standardized patient and a resident, were developed and videotaped. A group of assessors was recruited to rate the resident's clinical skills in each videotaped clinical scenario using the Mini-CEX with a nine-point grading scale. Each assessor was required (1) to watch the videotaped clinical scenarios in a sequential order, and (2) to rate each medical trainee's clinical skills in each clinical scenario in two rating sessions, with a minimum three-week interval between the first and second Mini-CEX rating sessions.

Results: A total of 38 assessors participated in this study. This study showed that: (1) an assessor produced similar rating results for the same clinical performance, indicating acceptable test-retest reliability (Pearson's correlation coefficients = 0.24-0.76, P values <0.01 to 0.14); (2) assessors gave similar rating results for a medical trainee's clinical performance, indicating good interrater reliability (intraclass correlation coefficients = 0.57-0.83, P values <0.01 to 0.03); and (3) the items unidimensionally reflected a single construct, a medical trainee's clinical skills, indicating excellent internal consistency reliability (Cronbach's alpha = 0.92-0.97).
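
For illustration only (this sketch is not part of the published study): the following minimal Python example shows how the three reliability statistics reported above could be computed from a matrix of Mini-CEX ratings. All data, dimensions, and helper names below are hypothetical, and the interrater statistic is computed here as a two-way random-effects ICC(2,1); the authors' exact statistical procedures may differ.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)

    # Hypothetical Mini-CEX ratings on a nine-point scale:
    # rows = rated encounters (scenario x item), columns = assessors.
    ratings = rng.integers(4, 10, size=(7, 5)).astype(float)

    # 1) Test-retest reliability: correlate one assessor's first- and
    #    second-session scores for the same performances.
    session1 = ratings[:, 0]
    session2 = session1 + rng.normal(0, 0.8, size=session1.shape)  # simulated re-rating
    r, p = pearsonr(session1, session2)
    print(f"Test-retest Pearson r = {r:.2f} (p = {p:.3f})")

    # 2) Interrater reliability: ICC(2,1), two-way random effects, absolute
    #    agreement, computed from the usual ANOVA mean squares.
    def icc_2_1(x):
        n, k = x.shape                      # n targets, k raters
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")

    # 3) Internal consistency: Cronbach's alpha across the Mini-CEX items.
    def cronbach_alpha(items):
        # items: rows = trainees/encounters, columns = Mini-CEX items
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    items = rng.integers(4, 10, size=(10, 6)).astype(float)  # hypothetical 6-item form
    print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
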

Conclusion: This study showed that the Mini-CEX is a reliable assessment tool for rating clinical skills and can be widely used to assess medical trainees' clinical skills.

Keywords: Mini-CEX; Reliability; Workplace-based assessment.

MeSH terms

  • Clinical Competence*
  • Educational Measurement* / methods
  • Humans
  • Reproducibility of Results
  • Videotape Recording