Integrating usability testing and think-aloud protocol analysis with "near-live" clinical simulations in evaluating clinical decision support
- PMID: 22456088
- DOI: 10.1016/j.ijmedinf.2012.02.009
Abstract
Purpose: Usability evaluations can improve the usability and workflow integration of clinical decision support (CDS). Traditional usability testing, which pairs scripted scenarios with think-aloud protocol analysis, provides a useful but incomplete assessment of how new CDS tools interact with users and clinical workflow. "Near-live" clinical simulations are a newer usability evaluation method that more closely mimics clinical workflow and allows a complementary evaluation of both CDS usability and its impact on workflow.
Methods: This study employed two phases of testing a new CDS tool that embedded clinical prediction rules (an evidence-based medicine tool) into primary care workflow within a commercial electronic health record. Phase I applied usability testing involving "think-aloud" protocol analysis of 8 primary care providers encountering several scripted clinical scenarios. Phase II used "near-live" clinical simulations of 8 providers interacting with video clips of standardized trained patient actors enacting the clinical scenario. In both phases, all sessions were audiotaped and had screen-capture software activated for onscreen recordings. Transcripts were coded using qualitative analysis methods.
Results: In Phase I, the impact of the CDS on navigation and workflow was associated with the largest volume of negative comments (accounting for over 90% of user-raised issues), while the overall usability and the content of the CDS drew the most positive comments. However, usability had a positive-to-negative comment ratio of only 0.93, reflecting mixed perceptions of the CDS's usability. In Phase II, encounters with simulated patients lasted approximately 12 min, and 71% of clinical prediction rule activations occurred after half of the visit had already elapsed. Upon activation, providers accepted the CDS tool pathway in 82% of the instances in which it was offered and completed all of its elements in 53% of all simulation cases. Only 12.2% of encounter time was spent using the CDS tool. Two predominant clinical workflows, accounting for 75% of all simulated cases, characterized the sequence of provider interactions with the CDS; these workflows showed significant variation in the temporal sequence of potential CDS activation.
Conclusions: This study successfully combined "think-aloud" protocol analysis with "near-live" clinical simulations in a usability evaluation of a new primary care CDS tool. Each phase of the study provided complementary observations on problems with the new onscreen tool and was used to refine both its usability and its workflow integration. Synergistic use of "think-aloud" protocol analysis and "near-live" clinical simulations provides a robust assessment of how CDS tools would behave in live clinical environments and allows enhanced early redesign to increase clinician utilization. The findings suggest the importance of using complementary testing methods before releasing CDS for live use.
Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
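The Results above are built from simple tallies over coded transcripts and timed recordings (e.g., the positive-to-negative comment ratio per theme and the share of encounter time spent in the CDS). The study does not publish its coding scheme or analysis scripts; the following is a minimal sketch of that kind of tally, using entirely hypothetical coded data and illustrative field names.

```python
# Minimal sketch of the summary tallies reported in the Results.
# All data below are hypothetical placeholders, not study data.
from collections import Counter

# Hypothetical Phase I codes: one (theme, valence) pair per user-raised comment.
phase1_comments = [
    ("navigation", "negative"), ("workflow", "negative"),
    ("usability", "positive"), ("usability", "negative"),
    ("content", "positive"),
]

counts = Counter(phase1_comments)
for theme in sorted({t for t, _ in phase1_comments}):
    pos = counts[(theme, "positive")]
    neg = counts[(theme, "negative")]
    ratio = pos / neg if neg else float("inf")
    print(f"{theme}: positive-to-negative comment ratio = {ratio:.2f}")

# Hypothetical Phase II timing: share of the encounter spent in the CDS pathway.
encounter_minutes = 12.0   # approximate mean visit length from the abstract
cds_minutes = 1.5          # hypothetical time spent in the CDS tool
print(f"CDS share of encounter time: {cds_minutes / encounter_minutes:.1%}")
```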
Similar articles
- Usability testing of Avoiding Diabetes Thru Action Plan Targeting (ADAPT) decision support for integrating care-based counseling of pre-diabetes in an electronic health record. Int J Med Inform. 2014 Sep;83(9):636-47. doi: 10.1016/j.ijmedinf.2014.05.002. PMID: 24981988.
- "Think aloud" and "Near live" usability testing of two complex clinical decision support tools. Int J Med Inform. 2017 Oct;106:1-8. doi: 10.1016/j.ijmedinf.2017.06.003. PMID: 28870378.
- Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned. JMIR Hum Factors. 2015 Sep;2(2):e14. doi: 10.2196/humanfactors.4537. PMID: 27025540.
- Struggling to bring clinical prediction rules to the point of care: missed opportunities to impact patient care. J Comp Eff Res. 2012 Sep;1(5):421-9. doi: 10.2217/cer.12.51. PMID: 24236419. Review.
- Design of decision support interventions for medication prescribing. Int J Med Inform. 2013 Jun;82(6):492-503. doi: 10.1016/j.ijmedinf.2013.02.003. PMID: 23490305. Review.
Cited by
- User-Centered Design and Usability of Voxe as a Pediatric Electronic Patient-Reported Outcome Measure Platform: Mixed Methods Evaluation Study. JMIR Hum Factors. 2024 Sep;11:e57984. doi: 10.2196/57984. PMID: 39298749.
- Data-Driven Hypothesis Generation in Clinical Research: What We Learned from a Human Subject Study? Med Res Arch. 2024 Feb;12(2). doi: 10.18103/mra.v12i2.5132. PMID: 39211055.
- Oncologists' Perceptions of a Digital Tool to Improve Cancer Survivors' Cardiovascular Health. ACI Open. 2019 Jul;3(2):e78-e87. doi: 10.1055/s-0039-1696732. PMID: 39149692.
- Eye tracking insights into physician behaviour with safe and unsafe explainable AI recommendations. NPJ Digit Med. 2024 Aug;7(1):202. doi: 10.1038/s41746-024-01200-x. PMID: 39095449.
- The Impact of Expectation Management and Model Transparency on Radiologists' Trust and Utilization of AI Recommendations for Lung Nodule Assessment on Computed Tomography: Simulated Use Study. JMIR AI. 2024 Mar;3:e52211. doi: 10.2196/52211. PMID: 38875574.