Context: The Royal College of Physicians and Surgeons of Canada (RCPSC) CanMEDS framework is being incorporated into specialty education worldwide. However, the literature on how to evaluate trainees in the CanMEDS competencies remains sparse.
Objectives: The goals of this study were to examine the assessment tools used and programme directors' perceptions of how well they evaluate performance of the CanMEDS roles in Canadian postgraduate training programmes.
Methods: We conducted a web-based survey of programme directors of RCPSC-accredited training programmes. The survey consisted of two questions. Question 1 was designed to establish which assessment tools were used to assess each of the CanMEDS roles. Question 2 was intended to assess programme directors' satisfaction with CanMEDS evaluation in their programmes.
Results: A total of 149 of the 280 eligible programme directors participated in the survey. Programme directors used a variety of assessment tools to evaluate trainees in the CanMEDS competencies. Programmes used the most tools to evaluate the Medical Expert (mean = 4.03, standard deviation [SD] = 1.59) and Communicator (mean = 2.36, SD = 1.02) roles, and the fewest tools for the Collaborator (mean = 1.75, SD = 1.10) and Manager (mean = 1.75, SD = 1.18) roles. More than 92% of the programmes used in-training evaluation reports to evaluate all the CanMEDS roles. Programme directors were satisfied with their evaluation of the Medical Expert role, but less so with assessment of the other CanMEDS competencies.
Conclusions: This study demonstrates that Canadian postgraduate training programmes use a variety of assessment tools to evaluate the CanMEDS competencies. Programme directors are neutral or concerned about how the CanMEDS roles other than that of Medical Expert are evaluated in their programmes. Further efforts are required to establish best practice in CanMEDS evaluation.