Medical students create multiple-choice questions for learning in pathology education: a pilot study

BMC Med Educ. 2018 Aug 22;18(1):201. doi: 10.1186/s12909-018-1312-1.

Abstract

Background: Medical students facing high-stakes exams want study resources that have a direct relationship with their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education; although answering them can promote surface learning strategies, writing MCQs requires both in-depth content knowledge and sophisticated analytical thinking. Therefore, we piloted an MCQ-writing task in which students developed MCQs for their peers to answer.

Methods: Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Students created two MCQs for each of four topic areas and the MCQs were answered, rated and commented on by their classmates. Questions were rated for cognitive complexity and a paper-based survey was administered to investigate whether this activity was acceptable, feasible, and whether it promoted desirable learning behaviours in students.

Results: Students were able to create cognitively challenging MCQs: 313/421 (74%) of the MCQs which we rated required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use, and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise.

Conclusions: Our study showed that the MCQ-writing task was feasible and engaged students in self-evaluation and in synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other's MCQs. Given the evidence that the task did promote deep learning, this mode of teaching is worth continuing if the task can be made more acceptable to students.

Keywords: Assessment for learning; Bloom’s taxonomy; Medical students; Multiple-choice questions; Peer-instruction; PeerWise; Student-generated MCQ.

MeSH terms

  • Attitude
  • Education, Medical, Undergraduate*
  • Educational Measurement / methods*
  • Humans
  • Learning
  • New Zealand
  • Pathology / education*
  • Pilot Projects
  • Self-Evaluation Programs
  • Students, Medical*