Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography

J Am Coll Cardiol. 2006 May 2;47(9):1796-802. doi: 10.1016/j.jacc.2005.12.053. Epub 2006 Apr 17.

Abstract

Objectives: Improvement in performance, as measured by metric-based procedural errors, must be demonstrated if virtual reality (VR) simulation is to be used as a valid means of assessing and improving proficiency in procedure-based medical skills.

Background: The Food and Drug Administration requires completion of VR simulation training for physicians learning to perform carotid stenting.

Methods: Interventional cardiologists (n = 20) participating in the Emory NeuroAnatomy Carotid Training program underwent an instructional course on carotid angiography and then performed five serial simulated carotid angiograms on the Vascular Interventional System Trainer (VIST) VR simulator (Mentice AB, Gothenburg, Sweden). Of the subjects, 90% completed the full assessment. Procedure time (PT), fluoroscopy time (FT), contrast volume, and composite catheter handling errors (CE) were recorded by the simulator.

Results: An improvement was noted in PT, contrast volume, FT, and CE between the subjects' first and last simulations (all p < 0.05). The internal consistency of the VIST VR simulator, as assessed with the standardized coefficient alpha, was high (range 0.81 to 0.93) for all metrics except FT (alpha = 0.36). Test-retest reliability was high for CE (r = 0.9, p = 0.0001).
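
As an illustration of the reliability measures reported above (not the authors' analysis code), the following minimal Python sketch shows how a standardized coefficient alpha across the five serial simulated cases and a test-retest Pearson correlation are typically computed; the subject-by-case error counts are hypothetical and are included only to make the example runnable.

```python
import numpy as np
from scipy import stats

# Hypothetical data: rows = subjects, columns = the five serial simulated
# cases; values are composite catheter handling error counts.
scores = np.array([
    [12, 10,  9,  7, 6],
    [15, 13, 11, 10, 8],
    [ 9,  8,  8,  6, 5],
    [14, 12, 10,  9, 7],
])

# Standardized coefficient alpha, based on the average inter-case
# correlation r_bar over k cases:
#   alpha_std = k * r_bar / (1 + (k - 1) * r_bar)
k = scores.shape[1]
corr = np.corrcoef(scores, rowvar=False)       # k x k correlation matrix between cases
r_bar = corr[np.triu_indices(k, 1)].mean()     # mean off-diagonal correlation
alpha_std = k * r_bar / (1 + (k - 1) * r_bar)

# Test-retest reliability: Pearson correlation between two administrations,
# here the first and last simulated case for each subject.
r, p = stats.pearsonr(scores[:, 0], scores[:, -1])

print(f"standardized alpha = {alpha_std:.2f}")
print(f"test-retest r = {r:.2f} (p = {p:.3f})")
```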

Conclusions: A learning curve with improved performance was demonstrated on the VIST simulator. This study represents the largest collection of such data to date in carotid VR simulation and is the first report to establish the internal consistency of the VIST simulator and its test-retest reliability across several metrics. These metrics are fundamental benchmarks in the validation of any measurement device. Composite catheter handling errors represent measurable dynamic metrics with high test-retest reliability that are required for the high-stakes assessment of procedural skills.

MeSH terms

  • Cardiology / education*
  • Carotid Arteries / diagnostic imaging*
  • Catheterization
  • Clinical Competence
  • Computer Simulation*
  • Contrast Media
  • Education, Medical, Continuing*
  • Fluoroscopy
  • Humans
  • Learning
  • Middle Aged
  • Radiography, Interventional*
  • Stents
  • User-Computer Interface*

Substances

  • Contrast Media