Perspect Med Educ. 2018 Dec;7(6):362-372.
doi: 10.1007/s40037-018-0481-2.

Validity evidence for programmatic assessment in competency-based education


Harold G J Bok et al. Perspect Med Educ. 2018 Dec.

Abstract

Introduction: Competency-based education (CBE) is now pervasive in health professions education. A foundational principle of CBE is to assess and identify the progression of competency development in students over time. It has been argued that a programmatic approach to assessment in CBE maximizes student learning. The aim of this study is to investigate whether programmatic assessment, i.e., a system of assessment, can be used within a CBE framework to track progression of student learning within and across competencies over time.

Methods: Three workplace-based assessment methods were used to measure the same seven competency domains. We performed a retrospective quantitative analysis of 327,974 assessment data points, drawn from 16,575 completed assessment forms from 962 students over 124 weeks, using both descriptive (visualization) and inferential (modelling) analyses, including multilevel random coefficient modelling and generalizability theory.
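The core of such an analysis is partitioning score variance into a between-student component and residual error. A minimal sketch of that idea, assuming a balanced design with students as the only facet (the function name and toy data are ours, not the authors'; the study's actual models also include method and competency facets and would use dedicated multilevel software):

```python
from statistics import mean

def variance_components(groups):
    """One-way random-effects variance component estimators for a
    balanced design: `groups` is a list of equal-length score lists,
    one per student. Returns (between-student, residual) variances."""
    k = len(groups)                      # number of students
    n = len(groups[0])                   # observations per student
    grand = mean(x for g in groups for x in g)
    group_means = [mean(g) for g in groups]
    ms_between = n * sum((m - grand) ** 2 for m in group_means) / (k - 1)
    ms_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means)
                    for x in g) / (k * (n - 1))
    # ANOVA estimator; truncated at zero if MS_between < MS_within
    sigma2_student = max((ms_between - ms_within) / n, 0.0)
    return sigma2_student, ms_within

# Toy data: three students, three ratings each
s2_student, s2_resid = variance_components([[4, 4, 4], [2, 2, 2], [3, 3, 3]])
print(s2_student, s2_resid)  # here all variance is between students
```

The student component divided by the total gives the share of variance attributable to differences between students, the quantity the study reports.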

Results: Random coefficient modelling indicated that variance due to differences between students was highest (40%). The reliability coefficients of scores from the assessment methods ranged from 0.86 to 0.90. Method and competency variance components were in the small-to-moderate range.
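In generalizability theory, reliability coefficients like these follow directly from the variance components: averaging more observations per student shrinks the error term. A minimal sketch with made-up variance components (only the 40% student share is taken from the study; the error variance and observation count here are illustrative assumptions):

```python
def g_coefficient(var_student, var_error, n_obs):
    """Generalizability coefficient for the mean of n_obs observations
    per student: Erho2 = var_student / (var_student + var_error / n_obs)."""
    return var_student / (var_student + var_error / n_obs)

# Hypothetical components: student variance 0.40 of a unit total,
# error variance 0.60, averaged over 10 observations per student
print(round(g_coefficient(0.40, 0.60, 10), 2))  # ~0.87, within the reported 0.86-0.90 range
```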

Discussion: The current validation evidence provides cause for optimism regarding the explicit development and implementation of a program of assessment within CBE. The majority of the variance in scores appears to be student-related and reliable, supporting the psychometric quality of the scores and their use for both formative and summative purposes.

Keywords: Competency development; Learning curves; Outcome-based education; Performance-relevant information; Programmatic assessment.


Conflict of interest statement

H.G.J. Bok, L.H. de Jong, T. O’Neill, C. Maxey and K.G. Hecker declare that they have no competing interests.

Ethical standards

The Ethical Review Board of the Netherlands Association for Medical Education approved this study (NERB number 884).

Figures

Fig. 1
Schematic overview of competency-based assessment program at the Faculty of Veterinary Medicine, Utrecht University. Mini-CEX mini clinical evaluation exercise, MSF multisource feedback, SA self-assessment, EBCR evidence-based case report, PDP personal development plan
Fig. 2
Development of performance (score) within student over time. The Y‑axis represents the average score of students’ performance per week on a 5-point Likert scale, collapsed per competency domain, per method and per student. The X‑axis represents 124 weeks of clinical training. The error bars represent the standard error (SE)
Fig. 3
Average competency domain score (µ, SE) within student across competency domains, binned into four-week intervals. The Y‑axis represents the average score of students’ performance per competency domain on a 5-point Likert scale, collapsed per method and per student. The error bars represent the standard error (SE)

