Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study

BMJ Open. 2019 Sep 26;9(9):e032550. doi: 10.1136/bmjopen-2019-032550.


Objectives: The study aimed to compare candidate performance between traditional best-of-five single-best-answer (SBA) questions and very-short-answer (VSA) questions, in which candidates must generate their own answers of between one and five words. The primary objective was to determine if the mean positive cue rate for SBAs exceeded the null hypothesis guessing rate of 20%.

Design: This was a cross-sectional study undertaken in 2018.

Setting: 20 medical schools in the UK.

Participants: 1417 volunteer medical students preparing for their final undergraduate medicine examinations (total eligible population across all UK medical schools approximately 7500).

Interventions: Students completed a 50-question VSA test, followed immediately by the same test in SBA format, using a novel digital exam delivery platform which also facilitated rapid marking of VSAs.

Main outcome measures: The main outcome measure was the mean positive cue rate across SBAs: the percentage of students answering the SBA format of a question correctly after answering the VSA format incorrectly. Internal consistency, item discrimination and the pass rate under Cohen standard setting were also evaluated for both VSAs and SBAs, and a cost analysis of VSA marking was performed.
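The positive cue rate defined above can be computed per question from paired VSA/SBA outcomes. The sketch below is illustrative only, assuming a simple list-of-booleans representation per student; it is not the study's actual analysis code or dataset.

```python
# Hypothetical sketch: positive cue rate for one question, given
# paired per-student VSA and SBA outcomes (True = correct).
# Data and representation are assumptions, not the study's dataset.

def positive_cue_rate(vsa_correct, sba_correct):
    """Percentage of students answering the SBA correctly after
    answering the same question's VSA incorrectly."""
    cued = sum(1 for v, s in zip(vsa_correct, sba_correct) if not v and s)
    vsa_wrong = sum(1 for v in vsa_correct if not v)
    return 100.0 * cued / vsa_wrong if vsa_wrong else 0.0

# Example: four students on one question
vsa = [False, False, True, False]   # VSA outcomes
sba = [True, False, True, True]     # SBA outcomes, same question
print(round(positive_cue_rate(vsa, sba), 1))  # → 66.7
```

Under pure guessing on a best-of-five SBA, this rate would be expected to sit near 20%, which is the null hypothesis the study tests against.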

Results: The study was completed by 1417 students. Mean student scores were 21 percentage points higher for SBAs than for VSAs. The mean positive cue rate was 42.7% (95% CI 36.8% to 48.6%), significantly exceeding the 20% guessing rate (one-sample t-test: t=7.53, p<0.001). Internal consistency was higher for VSAs than for SBAs, and median item discrimination was equivalent. The estimated marking cost was £2655 ($3500), requiring 24.5 hours of clinician time (1.25 s per student per question).
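The reported clinician time follows from the per-item marking rate. A back-of-envelope check, using only the figures stated in the abstract:

```python
# Arithmetic check of the reported marking time, using the abstract's
# figures: 1417 students, 50 questions, 1.25 s per student per question.
students, questions, secs_per_item = 1417, 50, 1.25
total_hours = students * questions * secs_per_item / 3600
print(round(total_hours, 1))  # → 24.6
```

This lands at roughly 24.6 hours, consistent with the ~24.5 hours reported.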

Conclusions: SBA questions can give a false impression of students' competence. VSAs appear to have greater authenticity and can provide useful information regarding students' cognitive errors, helping to improve learning as well as assessment. Electronic delivery and marking of VSAs is feasible and cost-effective.

Keywords: Assessment; MEDICAL EDUCATION & TRAINING; applied medical knowledge.

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Academic Performance
  • Clinical Competence*
  • Cross-Sectional Studies
  • Decision Making
  • Education, Medical, Undergraduate / methods*
  • Educational Measurement / methods*
  • Humans
  • Knowledge
  • Learning*
  • Schools, Medical*
  • Students, Medical*
  • Surveys and Questionnaires
  • United Kingdom