Protocol for a mixed-methods evaluation of a massive open online course on real world evidence

BMJ Open. 2018 Aug 13;8(8):e025188. doi: 10.1136/bmjopen-2018-025188.


Introduction: An increasing number of Massive Open Online Courses (MOOCs) are being used to train learners at scale in various healthcare-related skills. However, many challenges in course delivery require further understanding, for example, the reasons for high MOOC dropout rates, the low social interaction recorded between learners and the unclear impact of a course facilitator's presence on course engagement. There is a need to generate further evidence on these barriers to MOOC course delivery to enable enhanced course learning design. The proposed mixed-methods evaluation of the MOOC was determined based on the MOOC's aims and objectives and the methodological approaches used to evaluate this type of course. The MOOC evaluation will help appraise the effectiveness of the MOOC in delivering its intended objectives. This protocol aims to describe the design of a study evaluating learners' knowledge, skills and attitudes in a MOOC about data science for healthcare.

Methods and analysis: Study participants will be recruited from learners who have registered for the MOOC. On registration, learners will be given an opportunity to opt into the study and complete informed consent. Following completion of the course, study participants will be contacted to complete semistructured interviews. Interviews will be transcribed and coded using thematic analysis, with data analysed using two evaluation models: (1) the reach, effectiveness, adoption, implementation, maintenance framework and (2) the Kirkpatrick model, drawing data from pre-course and post-course surveys and post-MOOC semistructured interviews. The primary goal of the evaluation is to appraise participants' knowledge, skills and attitudes after taking the MOOC.

Ethics and dissemination: Ethics approval for this study was obtained from Imperial College London through the Education Ethics Review Process (EERP) (EERP1617-030). A summary of the research findings will be reported through a peer-reviewed journal and will be presented at an international conference.

Keywords: continuing education; data science; information science; massive open online course; real world data; real world evidence.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Education, Distance / methods*
  • Education, Professional / methods*
  • Educational Measurement
  • Humans
  • Program Evaluation
  • Research Design