Scoring the objective structured clinical examination using a microcomputer

Med Educ. 1989 Jul;23(4):376-80. doi: 10.1111/j.1365-2923.1989.tb01563.x.

Abstract

The objective structured clinical examination (OSCE) is being used increasingly to assess students' clinical competence in a variety of controlled settings. The OSCE consists of multiple stations presenting a variety of clinically relevant problems (e.g. examining simulated patients, diagnosing X-rays, etc.). Generally, three types of performance data are collected: answers to multiple-choice or true/false questions, written short answers, and performance check-lists completed by observers. In most OSCEs these student performance measures are scored by hand, which is time-consuming, increases the probability of mistakes and reduces the amount of data available for analysis. This paper describes a method of computer-scoring OSCEs with over 100 students using statistical and test-scoring software regularly used for multiple-choice examinations. During the examination, students, markers and raters code answers and performance data directly on optical mark-sheets, which are read into the computer using an optical mark reader. The resultant computer data can be efficiently scored and rescored, grouped into different types of subscales, weighted to reflect questions' relative importance, and easily printed in a variety of report formats.
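The scoring workflow described above (keyed answers, item weights reflecting relative importance, and grouping into subscales) can be sketched in modern terms. The snippet below is a minimal, hypothetical illustration in Python, not the original software: the paper used commercial test-scoring packages on a microcomputer, and all names, keys and weights here are invented for demonstration.

```python
# Hypothetical sketch of OSCE computer scoring: mark-sheet responses are
# compared against an answer key, weighted by item importance, and
# aggregated into subscale and total scores. All data are illustrative.

ANSWER_KEY = {"q1": "A", "q2": "T", "q3": "C"}             # MCQ / true-false key
WEIGHTS = {"q1": 2.0, "q2": 1.0, "q3": 1.5}                # relative importance
SUBSCALES = {"history": ["q1", "q2"], "imaging": ["q3"]}   # item groupings

def score_student(responses):
    """Return (total, per-subscale) weighted scores for one student's
    mark-sheet responses, given the key, weights and subscales above."""
    item_scores = {
        q: WEIGHTS[q] if responses.get(q) == key else 0.0
        for q, key in ANSWER_KEY.items()
    }
    subscale_scores = {
        name: sum(item_scores[q] for q in items)
        for name, items in SUBSCALES.items()
    }
    return sum(item_scores.values()), subscale_scores

# One student: q1 and q3 correct, q2 incorrect.
total, subs = score_student({"q1": "A", "q2": "F", "q3": "C"})
```

Because the key, weights and subscale definitions are data rather than code, the same responses can be rescored under revised keys or regrouped into different subscales without re-reading the mark-sheets, which is the efficiency the abstract highlights.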

MeSH terms

  • Clinical Competence*
  • Education, Medical, Undergraduate*
  • Microcomputers*