JAMA Netw Open. 2019 Mar 1;2(3):e191514. doi: 10.1001/jamanetworkopen.2019.1514.

Comparison of a Prototype for Indications-Based Prescribing With 2 Commercial Prescribing Systems


Pamela M Garabedian et al. JAMA Netw Open. 2019.

Abstract

Importance: The indication (reason for use) for a medication is rarely included on prescriptions despite repeated recommendations to do so. One barrier has been the way existing electronic prescribing systems have been designed.

Objective: To evaluate the efficiency, error rate, and user satisfaction of a new computerized provider order entry prototype for the outpatient setting that allows clinicians to initiate prescribing with the indication, in comparison with the prescribing modules of 2 leading electronic health record systems.

Design, setting, and participants: This quality improvement study used usability tests in which internal medicine physicians, residents, and physician assistants entered prescriptions electronically, including the indication, for 8 clinical scenarios. Tool assignment order was randomized, with prescribers asked to use the prototype for 4 of the scenarios and their usual system for the other 4. Time on task, number of clicks, and order details were captured. User satisfaction was measured using posttask ratings and a validated system usability scale. Participants practiced in the outpatient practices of 2 health systems. Usability tests were conducted between April and October 2017.

Main outcomes and measures: Usability (efficiency, error rate, and satisfaction) of indications-based computerized provider order entry prototype vs the electronic prescribing interface of 2 electronic health record vendors.

Results: Thirty-two participants (17 attending physicians, 13 residents, and 2 physician assistants) used the prototype to complete 256 usability test scenarios. The mean (SD) time on task with the prototype was 1.78 (1.17) minutes. For the 20 participants who used vendor 1's system, completing a prescription took a mean (SD) of 3.37 (1.90) minutes; for the 12 participants using vendor 2's system, it took a mean (SD) of 2.93 (1.52) minutes. Across all scenarios, participants using the prototype needed significantly fewer clicks than those using vendor 1's system (mean [SD], 18.39 [12.62] vs 46.50 [27.29]; difference, 28.11; 95% CI, 21.47-34.75; P < .001) and those using vendor 2's system (20.10 [11.52] vs 38.25 [19.77]; difference, 18.14; 95% CI, 11.59-24.70; P < .001). A blinded review of the order details revealed medication errors (eg, drug-allergy interactions) in 38 of 128 prescribing sessions using a vendor system vs 7 of 128 with the prototype.
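The headline comparisons above can be checked from the reported summary figures alone. A minimal sketch in Python (summary arithmetic only; the paired per-session data behind the confidence intervals and P values are not available here, and the vendor 2 click difference computed from the rounded means differs from the published 18.14 by 0.01 because of rounding):

```python
# Error rates: sessions with a medication error, out of 128 sessions each.
vendor_error_rate = 38 / 128      # vendor systems
prototype_error_rate = 7 / 128    # indications-based prototype

# Mean clicks per scenario, from the published (rounded) summary statistics.
clicks_vendor1, clicks_proto_v1 = 46.50, 18.39  # vendor 1 subgroup (n = 20)
clicks_vendor2, clicks_proto_v2 = 38.25, 20.10  # vendor 2 subgroup (n = 12)

print(f"Vendor error rate:    {vendor_error_rate:.1%}")    # 29.7%
print(f"Prototype error rate: {prototype_error_rate:.1%}")  # 5.5%
print(f"Click reduction vs vendor 1: {clicks_vendor1 - clicks_proto_v1:.2f}")
print(f"Click reduction vs vendor 2: {clicks_vendor2 - clicks_proto_v2:.2f}")
```

The vendor 1 difference matches the published 28.11 exactly; the vendor 2 difference comes out to 18.15 from the rounded means, versus the reported 18.14 computed from the unrounded data.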

Conclusions and relevance: Reengineering prescribing to start with the drug indication allowed indications to be captured in an easy and useful way, which may be associated with saved time and effort, reduced medication errors, and increased clinician satisfaction.


Conflict of interest statement

Conflict of Interest Disclosures: Ms Garabedian reported grants from the Agency for Healthcare Research and Quality (AHRQ) during the conduct of the study. Ms Volk reported grants from AHRQ during the conduct of the study and funding to review Medaware adverse drug reaction screening software outside the submitted work. Dr Salazar reported grants from AHRQ during the conduct of the study. Ms Forsythe reported grants from AHRQ during the conduct of the study. Dr Galanter reported grants and personal fees from Brigham and Women's Hospital during the conduct of the study. Dr Bates reported grants from AHRQ during the conduct of the study; consulting for EarlySense, which makes patient safety monitoring systems; receiving cash compensation from CDI (Negev), Ltd, a nonprofit incubator for health information technology startups; equity from ValeraHealth, which makes software to help patients with chronic diseases; equity from Clew, which makes software to support clinical decision making in intensive care; and equity from MDClone, which takes clinical data and produces deidentified versions of it. Dr Bates' financial interests have been reviewed by Brigham and Women's Hospital and Partners HealthCare in accordance with their institutional policies. Dr Schiff reported grants from AHRQ during the conduct of the study and funding to review Medaware adverse drug reaction screening software outside the submitted work. No other disclosures were reported.

Figures

Figure 1. Screenshot From the Indications-Based Prescribing Prototype of the Gonorrhea Case Scenario
A prescriber enters the indication in the search bar (or selects it from a preexisting problem list—not shown on the screen). The prototype then suggests drugs of choice with alternatives and drugs that are not recommended based on patient factors (eg, allergies), insurance formulary requirements, and evidence-based guidelines. After a drug is selected, the order details screen appears with most fields prepopulated with default options for dosing and frequency based on the indication and patient factors. Completing the order details adds the ordered drugs to the RxCart for final confirmation (next screen not shown here).
Figure 2. Usability Test Results of Time on Task and Clicks
Results of the usability testing on the prototype (32 participants), vendor 1 (20 participants), and vendor 2 (12 participants) are shown for time on task and number of clicks. Although the prototype results shown pool all participants, statistical tests compared the participants who used each vendor with their own performance on the prototype. H pylori indicates Helicobacter pylori; error bars, 95% confidence intervals. a P < .05. b P < .01.
Figure 3. Usability Test Results of Access to Outside Reference Source
The percentage of participants who accessed an outside reference source during the ordering tasks is shown for each diagnosis for the prototype, vendor 1, and vendor 2. H pylori indicates Helicobacter pylori.
