Multi-method laboratory user evaluation of an actionable clinical performance information system: Implications for usability and patient safety

J Biomed Inform. 2018 Jan;77:62-80. doi: 10.1016/j.jbi.2017.11.008. Epub 2017 Nov 13.

Abstract

Introduction: Electronic audit and feedback (e-A&F) systems are used worldwide for care quality improvement. They measure health professionals' performance against clinical guidelines, and some systems suggest improvement actions. However, little is known about optimal interface designs for e-A&F, in particular how to present suggested actions for improvement. We developed a novel theory-informed system for primary care (the Performance Improvement plaN GeneratoR; PINGR) that covers the four principal interface components: clinical performance summaries; patient lists; detailed patient-level information; and suggested actions. As far as we are aware, this is the first report of an e-A&F system with all four interface components.

Objectives: (1) Use a combination of quantitative and qualitative methods to evaluate the usability of PINGR with target end-users; (2) refine existing design recommendations for e-A&F systems; (3) determine the implications of these recommendations for patient safety.

Methods: We recruited seven primary care physicians to perform seven tasks with PINGR, during which we measured their on-screen behaviour and eye movements. Participants subsequently completed usability questionnaires and were interviewed in depth. Data from the different methods were integrated to gain a more complete understanding of usability issues, to allow findings to enhance and explain one another, and to triangulate results to increase validity.

Results: Participants committed a median of 10 errors (range 8-21) when using PINGR's interface, and completed a median of five out of seven tasks (range 4-7). Errors violated six usability heuristics: clear response options; perceptual grouping and data relationships; representational formats; unambiguous description; visually distinct screens for confusable items; and workflow integration. Eye movement analysis revealed that the integration of components largely supported effective user workflow, although the modular design of the clinical performance summaries unnecessarily increased cognitive load. Interviews and questionnaires indicated that PINGR is user-friendly, and that improved information prioritisation could further promote useful user action.

Conclusions: By comparing our results with the wider usability literature, we refine a previously published set of interface design recommendations for e-A&F. These recommendations have significant implications for patient safety with respect to user engagement, actionability, and information prioritisation. Our results also support the adoption of multi-method approaches in usability studies to maximise issue discovery and the credibility of findings.

Keywords: Clinical audit; Clinical decision support; Clinical governance; Clinical quality improvement; Clinical quality management; Medical audit; User interface design.

Publication types

  • Evaluation Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Decision Support Systems, Clinical / instrumentation*
  • Eye Movement Measurements / psychology
  • Feasibility Studies
  • Humans
  • Medical Records Systems, Computerized
  • Patient Care Management / methods
  • Patient Safety*
  • Primary Health Care
  • Quality Improvement
  • Software Design
  • Task Performance and Analysis
  • User-Computer Interface*