Assessment of CanMEDS roles in postgraduate training: the validation of the Compass

Patient Educ Couns. 2012 Oct;89(1):199-204. doi: 10.1016/j.pec.2012.06.028. Epub 2012 Jul 15.

Abstract

Objective: In medical education the focus has shifted from gaining knowledge to developing competencies. To effectively monitor performance in practice throughout the entire training, a new approach to assessment is needed. This study aimed to evaluate an instrument that monitors the development of competencies during postgraduate training in the setting of general practice training: the Competency Assessment List (Compass).

Methods: The distribution of scores, reliability, validity, responsiveness and feasibility of the Compass were evaluated.

Results: Scores on the Compass ranged from 1 to 9 on a 10-point scale, showing excellent internal consistency, ranging from .89 to .94. Most trainees showed improving ratings during training. Medium to large effect sizes (.31-1.41) were demonstrated when mean scores of three consecutive periods were compared. Content validity of the Compass was supported by the results of a qualitative study using the RAND modified Delphi method. The feasibility of the Compass was also demonstrated.
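The abstract does not name the specific statistics used, but internal consistency is conventionally reported as Cronbach's alpha and between-period effect sizes as Cohen's d. The sketch below illustrates, under that assumption and with made-up example data, how such values are computed:

```python
# Illustrative only: Cronbach's alpha (internal consistency) and
# Cohen's d (effect size) as plausibly used for the Compass scores.
# Function names and data here are hypothetical, not from the paper.

def _var(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents."""
    k = len(items)
    n = len(items[0])
    sum_item_vars = sum(_var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / _var(totals))

def cohens_d(earlier, later):
    """Effect size between two score samples, using the pooled SD."""
    na, nb = len(earlier), len(later)
    pooled_sd = (((na - 1) * _var(earlier) + (nb - 1) * _var(later))
                 / (na + nb - 2)) ** 0.5
    mean_a = sum(earlier) / na
    mean_b = sum(later) / nb
    return (mean_b - mean_a) / pooled_sd

# Hypothetical competency ratings for one trainee cohort:
period_1 = [5.0, 5.5, 6.0, 5.5]
period_2 = [6.0, 6.5, 7.0, 6.5]
print(cohens_d(period_1, period_2))  # positive d: ratings improved
```

An alpha of .89-.94, as reported, indicates the items within each competency scale measure a common construct; a d between .31 and 1.41 across consecutive periods corresponds to medium-to-large improvements in mean ratings.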

Conclusion: The Compass is a competency-based instrument that reliably and validly shows trainees' progress towards the standard of performance.

Practice implications: The programmatic approach of the Compass could be applied in other specialties, provided that the instrument is tailored to the specific needs of that specialty.

Publication types

  • Evaluation Study

MeSH terms

  • Clinical Competence / standards*
  • Competency-Based Education / methods*
  • Education, Medical, Graduate / methods*
  • Feasibility Studies
  • General Practice / education*
  • Humans
  • Netherlands
  • Physicians*
  • Psychometrics
  • Reproducibility of Results
  • Surveys and Questionnaires