Objectives: Objective Structured Clinical Examinations (OSCEs) allow assessment of, and provide feedback to, medical students. Clinical examiners and standardised patients (SPs) typically complete itemised checklists and global scoring scales, which have known shortcomings. In this study, we applied machine learning (ML) to label communication skills and interview content in OSCE transcripts, and compared several ML methodologies on performance and transferability.
Methods: One hundred and twenty-one transcripts of two OSCE scenarios were manually annotated per utterance across 19 communication skills and content areas. Utterances were converted to two types of numeric sentence vector representation and paired with three types of ML algorithm. First, ML models (MLMs) were evaluated using five-fold cross-validation on all transcripts of one scenario, generating precision, recall, and their harmonic mean, the F1 score. Second, MLMs were trained on all 101 scenario 1 transcripts and tested for transferability on 20 scenario 2 transcripts.
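The per-label evaluation described above can be sketched as follows. This is a minimal illustration only, assuming scikit-learn: the toy utterances, the single binary label, the TF-IDF vectoriser, and the logistic regression classifier are placeholders, not the study's actual GenSen sentence vectors or biLSTM models.

```python
# Hypothetical sketch of five-fold cross-validated F1 scoring for one
# binary utterance label (the study repeated this across 19 labels).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Toy transcript utterances annotated for one illustrative label,
# e.g. "open question" (1) vs. not (0).
utterances = [
    "what brings you in today",
    "tell me more about the pain",
    "how has this affected your daily life",
    "can you describe your symptoms",
    "what are your main concerns",
    "how do you feel about that",
    "what else is worrying you",
    "tell me about your family history",
    "walk me through a typical day",
    "what do you think is going on",
    "take two tablets twice daily",
    "your blood pressure is normal",
    "i will order a chest x-ray",
    "the test came back negative",
    "please sit on the examination couch",
    "i am going to listen to your chest",
    "we will see you again in two weeks",
    "the rash should clear in a few days",
    "this medication may cause drowsiness",
    "thank you for coming in today",
]
labels = np.array([1] * 10 + [0] * 10)

# Convert utterances to numeric vectors (a stand-in for the paper's
# sentence vector representations).
features = TfidfVectorizer().fit_transform(utterances)

# Five-fold cross-validation, scoring each held-out fold with F1,
# the harmonic mean of precision and recall.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
f1_per_fold = cross_val_score(
    LogisticRegression(), features, labels, cv=cv, scoring="f1"
)
print(f"mean F1 across folds: {f1_per_fold.mean():.2f}")
```

In the study itself, this procedure would yield one mean F1 score per label, with the median and range over the 19 labels reported in the Results.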
Results: Five-fold cross-validation demonstrated relatively high mean F1 scores: median 0.87, range 0.53-0.98, across all 19 labels. Transferability testing showed lower but still substantial performance: median F1 0.76, range 0.46-0.97. The combination of a bi-directional long short-term memory (biLSTM) neural network with GenSen numeric sentence vector representations was associated with higher F1 scores in both performance and transferability testing (P < .005).
Conclusions: We report the first application of ML to student-SP OSCEs. Several MLMs automatically labelled OSCE transcripts for a range of interview content and some clinical communication skills, with certain model-representation combinations achieving higher performance and transferability than others. Optimised MLMs could provide automated and accurate assessment of OSCEs, with the potential to track student progress and identify areas for further practice.
© 2020 John Wiley & Sons Ltd and The Association for the Study of Medical Education.