Objective: In medical education the focus has shifted from gaining knowledge to developing competencies. To effectively monitor performance in practice throughout the entire training, a new approach to assessment is needed. This study aimed to evaluate an instrument that monitors the development of competencies during postgraduate training in the setting of general practice training: the Competency Assessment List (Compass).
Methods: The distribution of scores, reliability, validity, responsiveness and feasibility of the Compass were evaluated.
Results: Compass scores ranged from 1 to 9 on a 10-point scale, with excellent internal consistency ranging from .89 to .94. Most trainees showed improving ratings during training. Medium to large effect sizes (.31-1.41) were demonstrated when mean scores of three consecutive periods were compared. Content validity of the Compass was supported by the results of a qualitative study using the RAND modified Delphi method. The feasibility of the Compass was also demonstrated.
Conclusion: The Compass is a competency-based instrument that reliably and validly shows trainees' progress towards the standard of performance.
Practice implications: The programmatic approach of the Compass could be applied in other specialties, provided that the instrument is tailored to the specific needs of that specialty.
Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.