Purpose: Fostering the ability to organize and use medical knowledge to guide data collection, make diagnostic decisions, and defend those decisions is at the heart of medical training. However, these abilities are not systematically examined before graduation. This study examined the diagnostic justification (DXJ) ability of medical students shortly before graduation.
Method: All senior medical students in the Classes of 2011 (n = 67) and 2012 (n = 70) at Southern Illinois University were required to take and pass a 14-case, standardized patient examination prior to graduation. For nine cases, students were required to write a free-text response indicating how they used patient data to move from their differential to their final diagnosis. Two physicians graded each DXJ response. DXJ scores were compared with traditional standardized patient examination (SCCX) scores.
Results: The average intraclass correlations between raters' rankings of DXJ responses were 0.75 and 0.64 for the Classes of 2011 and 2012, respectively. Student DXJ scores were consistent across the nine cases. SCCX and DXJ scores led to the same pass/fail decision in a majority of cases; where the two scores diverged, however, students would most often fail on the DXJ score but pass on the SCCX score. Common DXJ errors are described.
Conclusions: Commonly used standardized patient examination component scores (history/physical examination checklist, findings, differential diagnosis, final diagnosis) are not direct, comprehensive measures of DXJ ability. Critical deficiencies in DXJ ability may therefore go undetected.