Appl Clin Inform. 2013 Jun 19;4(2):276-92. doi: 10.4338/ACI-2012-09-RA-0034. Print 2013.

Usability characteristics of self-administered computer-assisted interviewing in the emergency department: factors affecting ease of use, efficiency, and entry error

D B Herrick et al. Appl Clin Inform. 2013.

Abstract

Objective: Self-administered computer-assisted interviewing (SACAI) gathers accurate information from patients and could facilitate Emergency Department (ED) diagnosis. As part of an ongoing research effort whose long-range goal is to develop automated medical interviewing for diagnostic decision support, we explored usability attributes of SACAI in the ED.

Methods: Cross-sectional study at two urban, academic EDs. Convenience sample recruited daily over six weeks. Adult, non-level I trauma patients were eligible. We collected data on ease of use (self-reported difficulty, researcher documented need for help), efficiency (mean time-per-click on a standardized interview segment), and error (self-report age mismatched with age derived from electronic health records) when using SACAI on three different instruments: Elo TouchSystems ESY15A2 (finger touch), Toshiba M200 (with digitizer pen), and Motion C5 (with digitizer pen). We calculated descriptive statistics and used regression analysis to evaluate the impact of patient and computer factors on time-per-click.
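(For illustration only, a minimal sketch of the kind of regression described above, written in Python with statsmodels; the column names and values below are hypothetical assumptions, not the study's dataset.)

    # Illustrative sketch only: hypothetical data and variable names, not the study dataset.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per participant; time_per_click in seconds (hypothetical values).
    df = pd.DataFrame({
        "time_per_click": [3.2, 4.8, 2.9, 5.1, 3.6, 4.1, 2.7, 6.0, 3.9, 4.4],
        "age":            [34, 62, 28, 70, 45, 51, 30, 68, 41, 57],
        "female":         [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
        "home_internet":  [1, 0, 1, 0, 1, 1, 1, 0, 1, 0],
        "hs_graduate":    [1, 1, 1, 0, 1, 1, 1, 0, 1, 1],
        "touch_screen":   [1, 0, 1, 0, 0, 1, 1, 0, 1, 0],  # 1 = finger touch, 0 = digitizer pen
    })

    # Linear regression of time-per-click on patient and device factors.
    model = smf.ols(
        "time_per_click ~ age + female + home_internet + hs_graduate + touch_screen",
        data=df,
    ).fit()
    print(model.summary())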

Results: 841 participants completed all SACAI questions. Few (<1%) thought using the touch computer to ascertain medical information was difficult. Most (86%) required no assistance. Participants needing help were older (54 ± 19 vs. 40 ± 15 years, p<0.001) and more often lacked internet at home (13.4% vs. 7.3%, p = 0.004). On multivariate analysis, female sex (p<0.001), White (p<0.001) and other (p = 0.05) race (vs. Black race), younger age (p<0.001), internet access at home (p<0.001), high school graduation (p = 0.04), and touch screen entry (vs. digitizer pen) (p = 0.01) were independent predictors of decreased time-per-click. Participant misclick errors were infrequent, but, in our sample, occurred only during interviews using a digitizer pen rather than a finger touch-screen interface (1.9% vs. 0%, p = 0.09).
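(Again for illustration only: a minimal sketch of the kinds of bivariate comparisons reported above, a two-sample t-test and Fisher's exact test, using scipy.stats on hypothetical counts and values rather than the study's data.)

    # Illustrative sketch only: hypothetical values, not the study data.
    import numpy as np
    from scipy import stats

    # Age among participants who needed help vs. those who did not (Welch t-test).
    age_needed_help = np.array([70, 55, 48, 62, 35, 66])
    age_no_help     = np.array([40, 38, 29, 45, 52, 33])
    t_stat, p_age = stats.ttest_ind(age_needed_help, age_no_help, equal_var=False)

    # Misclick errors by interface, compared with Fisher's exact test on a
    # hypothetical 2x2 table of [errors, no errors] counts.
    table = [[4, 206],   # digitizer pen
             [0, 210]]   # finger touch screen
    odds_ratio, p_misclick = stats.fisher_exact(table)

    print(f"age comparison: t = {t_stat:.2f}, p = {p_age:.3f}")
    print(f"misclick errors by interface: p = {p_misclick:.3f}")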

Discussion: Our results support the ease of interaction between ED patients and SACAI. Demographic factors associated with a need for assistance or with slower interviews could serve as important triggers for offering human support for SACAI interviews during implementation.

Conclusion: Understanding human-computer interactions in real-world clinical settings is essential to implementing automated interviewing as a means toward the larger long-term goal of enhancing clinical care, diagnostic accuracy, and patient safety.

Keywords: Medical history taking; clinical decision-support systems; computer assisted diagnosis; point of care systems; triage.


Figures

Fig. 1. Self- and observer-reported level of difficulty for the SACAI interview. Self-reported answers about perceived difficulty are arrayed as vertical bars from “easy” to “very hard”. Black bar segments represent participants who needed minimal or no help from research assistants (n = 767); gray bar segments represent those who needed at least moderate help (n = 38).* (*Total n = 805 reflects missing data on 36 subjects whose interviews were terminated before research assistants could report whether the participant required assistance.)
Fig. 2. Research assistant-reported reasons for patients’ requiring any assistance (n = 38).
Fig. 3. Prevalence of three targeted symptoms by randomly assigned question-and-answer format (‘yes/no’ question versus 4, 6, 9, or 12 on-screen choices). Patients were asked review-of-systems questions about the presence of “runny nose,” “sore throat,” or “earache,” presented in one of five possible on-screen formats, assigned at random. Shown are the proportions reporting each of these symptoms, arrayed by the number of response options presented on the screen. Patients were more likely to report each of the three symptoms when presented in binary (yes/no) format.
