Background: Cardiac examination is an essential aspect of the physical examination. Previous studies have shown poor diagnostic accuracy, but most used audio recordings alone, precluding correlation with visible observations. To our knowledge, competency across the full training spectrum, from medical students (MSs) to faculty, has not been tested.
Methods: A validated 50-question, computer-based test was used to assess 4 aspects of cardiac examination competency: (1) cardiac physiology knowledge, (2) auditory skills, (3) visual skills, and (4) integration of auditory and visual skills using computer graphic animations and virtual patient examinations (actual patients filmed at the bedside). We tested 860 participants: 318 MSs, 289 residents (225 internal medicine and 64 family medicine), 85 cardiology fellows, 131 physicians (50 full-time faculty, 12 volunteer clinical faculty, and 69 private practitioners), and 37 others.
Results: Mean scores improved from MS1-2 to MS3-4 (P = .003) but did not improve or differ significantly among MS3, MS4, internal medicine residents, family medicine residents, full-time faculty, volunteer clinical faculty, and private practitioners. Only cardiology fellows scored significantly better (P<.001), and they were the best in all 4 subcategories of competency, whereas MS1-2 were the worst in the auditory and visual subcategories. Participants demonstrated low specificity in identifying systolic murmurs (0.35) and low sensitivity in identifying diastolic murmurs (0.49).
Conclusions: Cardiac examination skills do not improve after MS3 and may decline after years in practice, which has important implications for medical decision making, patient safety, cost-effective care, and continuing medical education. Improving cardiac examination competency will require training in simultaneous audio and visual examination for both trainees and faculty.