Background: Clinical skills examinations (CSEs) are used frequently in medical education. Standard setting for CSEs can employ several methods, with or without prior performance data.
Method: An expert panel provided item-based (Angoff) and group-based (Hofstee) judgments about two central venous catheter insertion performance checklists on three occasions. Judges did not receive baseline performance data on the first occasion but did on occasions two and three. Judges' ratings were used to calculate a minimum passing standard (MPS) for the CSE. Interrater reliabilities and test-retest reliability (stability) were calculated. Passing standards were compared using performance data from a pilot study.
Results: Both methods produced reliable and stable data. Baseline data influenced the judges' decisions. Use of the Angoff method alone yielded lenient MPSs, whereas the Hofstee method alone yielded stringent MPSs.
Conclusions: Standard setting is a critical component of CSEs. Baseline data influence judges' decisions. Averaging Angoff and Hofstee outcomes produced the optimal MPS.
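The two standard-setting computations and their average can be sketched as follows. This is a minimal illustration, not the study's actual procedure: the judge ratings, pilot scores, Hofstee bounds, and grid step are all hypothetical, and the Hofstee cut score is approximated by finding where the observed fail-rate curve comes closest to the diagonal of the judges' acceptable rectangle.

```python
from statistics import mean

def angoff_mps(item_ratings):
    """Angoff: each judge estimates, per checklist item, the probability
    that a borderline examinee performs it correctly; the cut score is
    the mean across judges of the summed item probabilities."""
    return mean(sum(judge) for judge in item_ratings)

def hofstee_mps(scores, min_cut, max_cut, min_fail, max_fail, step=0.5):
    """Hofstee: judges bound the acceptable cut score and fail rate;
    the cut score lies where the cumulative fail-rate curve crosses the
    diagonal of that rectangle (here approximated on a grid)."""
    best_cut, best_gap = min_cut, float("inf")
    cut = min_cut
    while cut <= max_cut:
        fail_rate = sum(s < cut for s in scores) / len(scores)
        # diagonal from (min_cut, max_fail) down to (max_cut, min_fail)
        diag = max_fail - (max_fail - min_fail) * (cut - min_cut) / (max_cut - min_cut)
        if abs(fail_rate - diag) < best_gap:
            best_cut, best_gap = cut, abs(fail_rate - diag)
        cut += step
    return best_cut

# Hypothetical data: 3 judges x 5 checklist items, plus pilot scores.
ratings = [[0.8, 0.7, 0.9, 0.6, 0.8],
           [0.7, 0.8, 0.8, 0.7, 0.9],
           [0.9, 0.6, 0.8, 0.8, 0.7]]
scores = [2.0, 3.0, 3.5, 4.0, 4.0, 4.5, 5.0]

angoff = angoff_mps(ratings)                        # tends lenient alone
hofstee = hofstee_mps(scores, 2.0, 4.5, 0.1, 0.4)   # tends stringent alone
mps = (angoff + hofstee) / 2                        # averaged compromise
```

Averaging the two outcomes, as the conclusion suggests, tempers the leniency of Angoff alone and the stringency of Hofstee alone.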