
Allscripts ePrescribe v17.1

Meaningful Use 2

Summative Usability Test Report

A User-Centered Design (UCD) Activity

www.allscripts.com Copyright © 2013 Allscripts Healthcare Solutions, Inc.


Copyright Notice

Copyright © 2013 Allscripts Healthcare Solutions, Inc.


Contents

1 Executive Summary ......................................................................... 9

2 Introduction ................................................................................. 15

3 Method ....................................................................................... 16

3.1 Participants ................................................................................. 16

3.2 Study Design ............................................................................... 19

3.3 Tasks ........................................................................................ 19

3.4 Procedures ................................................................................. 20

3.5 Test Location ............................................................................... 21

3.6 Test Environment .......................................................................... 21

3.7 Test Forms and Tools .................................................................... 22

3.8 Participant Instructions ................................................................... 22

3.9 Usability Metrics ........................................................................... 23

3.10 Data Scoring ............................................................................. 23

4 Clinical Test Results ...................................................................... 25

5 Chapter §170.314(a)(1) Computerized Provider Order Entry (CPOE) Results ................ 25

5.1 Task Mapping .............................................................................. 25

5.2 Task Participants and Instruction ....................................................... 27

5.3 Data Analysis and Reporting ............................................................ 28

5.4 Discussion of the Findings ............................................................... 28

RISK ANALYSIS .................................................................................. 29

EFFECTIVENESS ................................................................................ 29

EFFICIENCY ...................................................................................... 29

SATISFACTION ................................................................................... 29

MAJOR FINDINGS ............................................................................... 29

AREAS FOR IMPROVEMENT ................................................................. 30

6 Chapter §170.314(a)(2) Drug-Drug, Drug-Allergy Checks – Interventions Results .......... 31

6.1 Task Mapping .............................................................................. 31

6.2 Task Participants and Instruction ....................................................... 32

6.3 Data Analysis and Reporting ............................................................ 32

6.4 Discussion of the Findings ............................................................... 33

RISK ANALYSIS .................................................................................. 33

EFFECTIVENESS ................................................................................ 34

EFFICIENCY ...................................................................................... 35

SATISFACTION ................................................................................... 35

MAJOR FINDINGS ............................................................................... 35


AREAS FOR IMPROVEMENT ................................................................. 36

7 Chapter §170.314(a)(6) Medication List Results .................................... 37

7.1 Task Mapping .............................................................................. 37

7.2 Task Participants and Instruction ....................................................... 38

7.3 Data Analysis and Reporting ............................................................ 38

7.4 Discussion of the Findings ............................................................... 40

RISK ANALYSIS .................................................................................. 41

EFFECTIVENESS ................................................................................ 41

EFFICIENCY ...................................................................................... 42

SATISFACTION ................................................................................... 42

MAJOR FINDINGS ............................................................................... 42

AREAS FOR IMPROVEMENT ................................................................. 43

8 Chapter §170.314(a)(7) Medication Allergy List Results .......................... 44

8.1 Task Mapping .............................................................................. 44

8.2 Task Participants and Instruction ....................................................... 44

8.3 Data Analysis and Reporting ............................................................ 45

8.4 Discussion of the Findings ............................................................... 46

RISK ANALYSIS .................................................................................. 46

EFFECTIVENESS ................................................................................ 47

EFFICIENCY ...................................................................................... 47

SATISFACTION ................................................................................... 47

MAJOR FINDINGS ............................................................................... 47

AREAS FOR IMPROVEMENT ................................................................. 48

9 Chapter §170.314(b)(3) Electronic Prescribing ..................................... 49

9.1 Task Mapping .............................................................................. 49

9.2 Task Participants and Instruction ....................................................... 49

9.3 Data Analysis and Reporting ............................................................ 50

9.4 Discussion of the Findings ............................................................... 50

RISK ANALYSIS .................................................................................. 51

EFFECTIVENESS ................................................................................ 51

EFFICIENCY ...................................................................................... 51

SATISFACTION ................................................................................... 51

MAJOR FINDINGS ............................................................................... 52

AREAS FOR IMPROVEMENT ................................................................. 52

10 Configuration Test Results .............................................................. 53

11 Chapter §170.314(a)(2)(ii)(A) Drug-Drug Interaction Checks – Adjustments Results ...... 53

11.1 Task Mapping ........................................................................... 53

11.2 Task Participants and Instruction ..................................................... 54

11.3 Data Analysis and Reporting .......................................................... 54

11.4 Discussion of the Findings ............................................................ 55

RISK ANALYSIS .................................................................................. 55


EFFECTIVENESS ................................................................................ 55

EFFICIENCY ...................................................................................... 55

SATISFACTION ................................................................................... 55

MAJOR FINDINGS ............................................................................... 56

AREAS FOR IMPROVEMENT ................................................................. 56

12 System Satisfaction ....................................................................... 57

12.1 About System Usability Scale (SUS) Scores ....................................... 57

12.2 Clinical System Satisfaction Results ................................................. 57

12.3 Configuration System Satisfaction Results ......................................... 57

13 Appendices .................................................................................. 58

13.1 Appendix 1: Recruiting Screener ..................................................... 59

13.2 Appendix 2: NDA and Informed Consent Form .................................... 63

13.3 Appendix 3: Moderator Guides ....................................................... 64

13.4 Appendix 4: Task Detail Memory Aide .............................................. 69

13.5 Appendix 5: System Usability Scale Questionnaire ............................... 70

13.6 Appendix 6: Incentive Receipt and Acknowledgment Form ...................... 71


ePrescribe v17.1

Summative Usability Test Report

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test

Reports

Date of Usability Test: 8/15/2013 – 9/7/2013

Date of Report: 9/13/2013

Report Prepared By: User-View, Inc.

Jennifer Mauney and Janey Barnes,

Human Factors Specialists

User-View

[919.697.5329]

[[email protected]]

[1109 Holland Ridge Dr.

Raleigh, NC 27603]


1 Executive Summary

A usability test of Allscripts ePrescribe v17.1 was conducted remotely by User-View, Inc. from 8/15/2013 to 9/7/2013. The primary purpose of this summative usability test is to provide objective evidence that the application, including both the clinical (physician- and nurse-facing) and configuration (configuration specialist-facing) user interfaces, can be used in a safe, efficient, and effective manner with regard to four of the eight prioritized certification criteria:

• §170.314(a)(1) Computerized provider order entry – Not Applicable

• §170.314(a)(2) Drug-drug, drug-allergy interaction checks

• §170.314(a)(6) Medication list

• §170.314(a)(7) Medication allergy list

• §170.314(a)(8) Clinical decision support (CDS) – Not Applicable

• §170.314(b)(3) Electronic prescribing

• §170.314(b)(4) Clinical information reconciliation – Not Applicable

• §170.314(a)(16) Electronic medication administration record – Not

Applicable

Eleven (11) physicians, 10 nurses, and 13 configuration specialists matching the

target demographic criteria served as participants in the usability test. All

participants were current users of the ePrescribe system (either clinical or

configuration interfaces or both). Each participant performed simulated but

representative tasks specific to their user role.

This study collected performance data on two tasks typically conducted on the

ePrescribe system. Tasks and subtasks were created and mapped to the prioritized

Meaningful Use Certification Criteria. The two clinical tasks were:

1. Add, change and review the medication list, order and change a

medication, review and act upon drug-drug and drug-allergy warnings,

and electronically send the prescription to the pharmacy.

2. Add, change and review medication allergies.

This study also collected performance data on one configuration task. It was:

1. Locate and change the drug-drug/drug-allergy severity level.

All usability testing sessions performed for this study were conducted remotely.

During the 45-minute (clinical), 30-minute (configuration), or 1-hour (clinical and configuration) usability tests, each participant was greeted by the moderator and asked to fill out an invoice form, after which the moderator asked the participant to

review and sign an informed consent/release form (included in


Appendix 2: NDA and Informed Consent Form); they were instructed that they could

withdraw at any time. Participants had prior experience with the ePrescribe system.

The moderator introduced the test and instructed participants to complete a series of

tasks (given one at a time) using the application. During the test, the moderator

timed each task and, along with the data logger, recorded user performance data on

paper and electronically. The moderator did not give the participant assistance in how

to complete the task unless the participant stated that s/he was done with the task.

Participant screens and audio during the sessions were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Effectiveness

o Percentage of tasks successfully completed within the allotted time

without assistance (Pass)

o Percentage of tasks successfully completed with one assist from the

moderator (Pass with help)

o Percentage of task failures (Fail)

o Types of errors

• Efficiency

o Task Time

o Click Path Notes

o Types of errors

• Participant’s verbalizations

• System Satisfaction

o Participant’s satisfaction rating of the system
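The measures listed above map naturally onto one record per participant and task. The following sketch is purely illustrative and is not the study's actual logging format; every field name here is an assumption introduced for the example.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskRecord:
    """Hypothetical per-task record for one participant (illustrative only)."""
    participant_id: str                 # de-identified ID, e.g. "CL01"
    task: str                           # e.g. "Order a New Medication"
    outcome: str                        # "Pass", "Pass with help", or "Fail"
    task_time_min: Optional[float]      # None when no time was taken (failed task)
    errors: List[str] = field(default_factory=list)  # types of errors observed
    click_path_notes: str = ""          # deviations from the optimal path
    verbalizations: str = ""            # participant comments during the task
```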

All participant data was de-identified so that no correspondence could be made between the identity of the participant and the data collected. Following the conclusion of the

testing, participants were asked to complete a post-test questionnaire and were

compensated for their time. All clinical participants (physicians and nurses) were

compensated $100 for their time. All configuration participants were compensated

$75 for their time. All participants who took part in both the clinical and the

configuration part of the study were compensated $175 for their time.

Various recommended metrics, in accordance with the examples set forth in the NIST

Guide to the Processes Approach for Improving the Usability of Electronic Health

Records, were used to evaluate the usability of the application. Performance data for

the clinical tasks is summarized in Table 1.


Table 1: Clinical Test Summary

All Participants: Task | n | % Pass | % Pass with help | % Fail | % Pass + Pass with help

Computerized Physician Order Entry (CPOE)
Order a New Medication | 21 | 100% | 0% | 0% | 100%
Change the Medication | 21 | 95% | 5% | 0% | 100%

Electronic Prescribing
Electronically send the prescription | 21 | 100% | 0% | 0% | 100%

Drug-Drug / Drug-Allergy / Drug-Dosage Interaction Alerts
Review and act upon drug-drug alerts in Medication List | 21 | 71% | 0% | 29% | 71%
Review and act upon drug-allergy alerts in CPOE | 21 | 95% | 0% | 5% | 95%

Medication List
Add to the Medication List (patient-reported medication) | 21 | 57% | 10% | 33%* | 67%*
Change the Medication List (1) | 21 | 95% | 0% | 5% | 95%
Change the Medication List (2) | 20 | 90% | 0% | 10% | 90%
Review the Medication List | 21 | 76% | 10% | 14%** | 86%**

Medication Allergy List
Change existing patient's drug allergy | 20 | 60% | 20% | 20% | 80%
Add New drug allergy | 20 | 80% | 15% | 5% | 95%
Review Allergy List | 20 | 75% | 25% | 0% | 100%

* Failure of this task for five of the seven participants was due to an artifact of testing.

** Failure of this task for two of the three participants was due to an artifact of testing.

Eleven (11) physicians and ten (10) nurses completed the System Usability Scale

(SUS) questionnaire at the end of their session. The SUS is a reliable and valid

measure of system satisfaction. Sauro


(http://www.measuringusability.com/sus.php, accessed 3/14/2013) reports that the average SUS score from 500 studies across various products (e.g., websites, cell phones, enterprise systems) and across different industries is 68. A SUS score above 68 is considered above average and anything below 68 is below average.

User-View encourages teams not to focus on the comparison to the cross-industry average SUS of 68 reported by Sauro. Instead, we encourage teams to use the SUS as a measure for tracking their own usability improvements in the application as changes are made.
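For reference, the SUS is scored with Brooke's standard rule (a property of the instrument itself, not something specific to this report): each odd-numbered item contributes (rating − 1), each even-numbered item contributes (5 − rating), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch:

```python
def sus_score(ratings):
    """Compute a System Usability Scale score from ten item ratings (each 1-5).

    Odd-numbered items are positively worded and contribute (rating - 1);
    even-numbered items are negatively worded and contribute (5 - rating).
    The 0-40 total is scaled to 0-100 by multiplying by 2.5.
    """
    if len(ratings) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(ratings, start=1))
    return total * 2.5

# A neutral response pattern (all 3s) scores 50; individual scores are then
# averaged across participants to give figures like the 73.3 reported below.
print(sus_score([3] * 10))  # 50.0
```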

The Clinical Module scored an average of 73.3 (SD=16.6). Nurses (n=10) rated

the system as 79 (SD=13.4), and physicians (n=10) rated the system as 68.2

(SD=18.1).

In addition to the performance data, the following qualitative observations were

made regarding the ePrescribe Clinical Module.

• Major Findings and Areas for Improvement

o Computerized Provider Order Entry:

• No critical use errors were identified or observed as part of this scenario.

• Participants were able to efficiently and effectively complete the subtasks for Computerized Provider Order Entry.

o Drug-Drug and Drug-Allergy Interaction Checks

• As expected, clinical participants described frustration with

over-alerting in the usability test and in their own practice.

Many of the Meaningful Use 2 Safety Enhanced Design

criteria involve alerts and messages that interrupt the

workflow. As such, much of the alerting experienced during

the usability test session was an artifact of testing.

Configuration was deliberately set to show both low and

high severity alerts to the test participants. Depending on

the scenario, both critical and non-critical alerts appeared. The non-critical alerts were perceived as over-

alerting, whereas critical alerts were perceived as valuable

information. The Allscripts team is well aware of the

industry’s known issue with alert fatigue. The team would

like to call for and be involved with the development and

sharing of industry best practices and guidelines regarding

safety-enhanced design that impacts alerts and alert

fatigue.

• One alert caused confusion among participants. Part of the confusion could have been caused by the test workflow, which was later found to be uncommon for many participants. This alert is designed to decrease the number of warnings users can receive; however, some participants were confused about how this alert worked, which could cause serious issues. In addition, participants from the nurses' user group who received one or more warnings that could follow this alert workflow were not always able to quickly scan and identify the different warnings and the diagnostic information within them. Allscripts continues to work closely with customers to assure user-friendly designs and, for these issues, will:

• Use language in the alert/warning that clarifies to users how the alert/warning works,

• Improve content layout to optimize for visual scanning and information processing.

o Medication List:

• Adding a patient-reported medication to the patient’s

medication list is a feature that is not commonly used by

many of the participants. The unfamiliarity with this

subtask caused an increased failure rate and is considered

an artifact of testing.

• One usage error identified in this test was a participant's use of an incorrect button to try to add a patient-reported medication to the patient's medication list. The label on this button caused confusion. The team is investigating how best to optimize the labeling of the button.

• Participants’ unfamiliarity with reviewing past medications

(historical medications that patients are no longer taking) was most commonly associated with the inability to

complete this task. These failures would be considered an

artifact of testing since these participants do not commonly

do this type of task in their practice.

o Medication Allergy List:

• A few participants were unfamiliar with entering two or

more reactions for a medication allergy. One participant

who was unfamiliar with entering two allergic reactions for a

medication entered only one reaction: the weaker reaction

instead of the stronger reaction. This usage error is being

addressed by the team by developing designs that will allow

users to more clearly understand how to enter two or more

allergic reactions.


• One participant entered reactions for the wrong medication

allergy. The consequences of this can be serious. The

current mitigation strategy is to clearly display the

medication name while choosing the allergic reactions.

o Electronically Send the Prescription:

• Participants stated how easy this task was to complete.

• No critical use errors were identified or observed as part of

this scenario.

• Participants were able to efficiently and effectively complete

the subtask for electronically sending the prescription.

Table 2 provides a summary of the performance data collected on the Configuration

task.

Table 2: Configuration Task Summary

All Participants: Task | n | % Pass | % Pass with help | % Fail | % Pass + Pass with help

Adjust Severity Levels of Drug Interactions
Locate and Change Drug-Drug Severity Level | 13 | 92% | 8% | 0% | 100%

Based on SUS ratings from 13 configuration participants, the configuration area

scored an average of 73.5 (SD=16.2) in system satisfaction.

In addition to the performance data, the following qualitative observations were

made:

• Major findings

o No critical use errors were identified or observed as part of this

scenario.

o Participants were able to efficiently and effectively complete the subtask for locating and changing the drug-drug severity level.


2 Introduction

The Allscripts system tested for this study was Allscripts ePrescribe v17.1. Designed to present medical information to healthcare providers in ambulatory settings, the application aims to support electronic prescribing, areas related to electronic prescribing, and documentation. The Allscripts ePrescribe configuration area provides configuration specialists with access to back-end tools for site and department configuration. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the application. To this end, measures of effectiveness, efficiency, and user satisfaction, such as pass/fail rates, errors and error types, task time, and System Usability Scale (SUS) scores, were captured during the usability testing.


3 Method

3.1 Participants

Twenty-one (21) clinical participants took part in the clinical usability test sessions.

Thirteen (13) configuration participants took part in the configuration test sessions.

Participants were recruited through mass emails and by word-of-mouth from other

participants. For the test purposes, end-user characteristics were identified and

translated into a participant screener. The screener was then used to verify that

potential participants met the necessary requirements to be in the study; the

screener is provided in Appendix 1: Recruiting Screener.

Participants in the clinical usability test sessions included physicians, nurses, and

medical assistants. Clinical participants had to be current users of the system and

treat patients as part of their current role. Providers with prescribing privileges were

scheduled as Physician end users. Nurses and Medical Assistants were scheduled as

Nurse end users. Configuration participants had to have experience performing

configuration tasks with the system as part of their current role and understand how

to change the severity levels of drug-drug interaction warnings. The participants for

the configuration tasks were made up of five (5) IT specialists, four (4) physicians,

and four (4) nurses. Configuration specialists who participated in usability test

sessions were not involved in the design or development of the configuration system

evaluated as part of this study.

Recruited participants had a mix of backgrounds and demographic characteristics.


Table 3 and Table 4 list participants by characteristics, including job title and

professional activity. Participant names were replaced with Participant IDs so that an

individual’s data could not be associated with individual identities.
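A minimal sketch of this kind of de-identification step, assuming a simple sequential ID scheme like the CL/Conf prefixes seen in Tables 3 and 4; the function name and inputs are hypothetical and not the study's actual procedure:

```python
def assign_participant_ids(names, prefix):
    """Map participant names to sequential de-identified IDs (e.g. CL01, CL02, ...).

    The name-to-ID map would be kept separate from the performance data so that
    collected data carries only the de-identified ID.
    """
    return {name: f"{prefix}{i:02d}" for i, name in enumerate(names, start=1)}

# Hypothetical usage: clinical participants receive "CL" IDs, configuration participants "Conf" IDs.
print(assign_participant_ids(["<clinical participant A>", "<clinical participant B>"], prefix="CL"))
# {'<clinical participant A>': 'CL01', '<clinical participant B>': 'CL02'}
```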


Table 3: Clinical Participants’ Characteristics

P-ID Job Role User Group

CL01 MD/Family Medicine Physician

CL02 NCMA/Psychiatry Nurse

CL03 RN/Family Practice Nurse

CL04 MA/Psychiatry Nurse

CL05 NP/Family Practice Physician

CL06 MD/Internal Medicine Physician

CL07 MD/Psychiatry Physician

CL08 NP/Psychiatry Physician

CL09 MD/Internal Medicine Physician

CL10 MD/Internal Medicine Physician

CL11 MD/Immunology Nurse

CL12 NP/Psychiatry Physician

CL13 MA/Internal Medicine Nurse

CL14 MA/Rheumatology Nurse

CL15 LPN/Psychiatry Nurse

CL16 LVN/Psychiatry Nurse

CL17 MD/Psychiatry Physician

CL18 MA/Cardiology Nurse

CL19 LPN/Psychiatry Nurse

CL20 MD/Psychiatry Physician

CL21 RMA/Cardiology Nurse

Table 4: Configuration Participants’ Characteristics

P-ID Job Role User Group

Conf01 MD/System Administrator Configuration

Conf02 MA/System Administrator Configuration

Conf03 MD/System Administrator Configuration

Conf04 MD/System Administrator Configuration

Conf05 MD/Office Manager Configuration

Conf06 IT/Systems Administrator Configuration

Conf07 IT/Systems Analyst Configuration

Conf08 IT/Systems Analyst Configuration

Conf09 IT/Clinical Applications Manager Configuration

Conf10 Physician/System Administrator Configuration

Conf11 MPH/System Administrator Configuration

Conf12 LVN/System Administrator Configuration

Conf13 IT/Quality Improvement Configuration


Clinical participants were scheduled for 45-minute remote sessions. There were at least 30 minutes between sessions for the moderator and data logger to debrief and to reset systems to the proper test conditions. Configuration participants were scheduled for 30-minute remote sessions with at least 15 minutes between sessions. Participants who took part in both clinical and configuration sessions were scheduled for a one (1) hour session. A spreadsheet was used to keep track of the participant schedule and included each participant's demographic characteristics.

3.2 Study Design

Overall, the objective of this test was to uncover areas where the application

performed well – that is, effectively, efficiently, and with satisfaction – and areas

where the application failed to meet the needs of the participants. The data from this

test may serve as a baseline for future tests with an updated version of the same

module and/or comparison with other comparable modules provided the same tasks

are used. In short, this testing serves both to record or benchmark current usability and to identify areas where improvements must be made.

During the usability test, participants interacted with the application. Each participant

was provided with the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and analysed for each

participant:

• Number of tasks successfully completed within the allotted time without

assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system.

Additional information about the various measures can be found in Section 3.9 on

Usability Metrics.

3.3 Tasks

Tasks were designed based on the Meaningful Use Stage 2 certification criteria and were selected for their frequency of use and criticality of function. Tasks were

constructed that would be realistic and representative of the kinds of activities a user

might do with the ePrescribe system. As part of the task construction, tasks were


prioritized in accordance with the risk associated with use errors. NISTIR 7804: Form

for Expert Review Items 1A through 1H was used to create and prioritize tasks based

on design areas related to known use errors. Tasks with an “*” in the list below were

identified as the highest priority with respect to risk based on this exercise.

The clinical tasks, ordered and marked in priority order, were:

1. Order medications based on the information provided,*

2. Review and act upon drug/drug and drug/allergy interactions,*

3. Review allergies and make updates based on the information provided,*

4. Change a medication order, electronically prescribe medication,

5. Review medications and make updates based on the information provided.

The configuration tasks, ordered and marked in priority order, were:

1. Change the severity of a drug-drug interaction alert.*

3.4 Procedures

Upon the start of an online session, participants were greeted; their identity was

verified and matched with a name on the participant schedule. Participants were then

assigned a participant ID. Each participant reviewed and agreed to the terms of an

informed consent and release form (See Appendix 2: NDA and Informed Consent

Form).

To ensure that the test ran smoothly, two human factors specialists, the moderator

and the data logger, participated in the clinical and configuration modules of the test.

The usability test team was made up of experienced usability practitioners, all of

whom hold advanced degrees in human factors and have worked in human factors for

over a decade each.

The moderator administered instructions and tasks. The moderator also obtained task

times, obtained post-task rating data, and took notes on participant comments. A

second person served as the data logger and took notes on task success, path

deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible, making as few errors and deviations as possible,

• With minimal assistance; the moderator was allowed to give guidance and clarification on tasks, and at most one instruction on use.

For each task the participants were given oral instructions and a written copy of the

task. Task timing began once the moderator finished reading the question. The task


timer was stopped once the participant indicated they had completed the task.

Scoring is discussed below in Section 3.10: Data Scoring.

Following the session, the moderator gave the participant the post-test questionnaire

(i.e., the System Usability Scale; see Appendix 5: System Usability Scale

Questionnaire) and thanked each individual for his/her participation.

Participants' demographic information, task success rate, time on task, errors,

deviations, verbal responses, and post-test questionnaire answers were recorded into

a spreadsheet.

3.5 Test Location

During the remote testing, the moderator and the data logger were each at her own personal office, and each participant was either in their personal office or in a quiet room.

3.6 Test Environment

ePrescribe is used in a healthcare office or facility. Remote usability testing was conducted in personal offices or quiet spaces within a healthcare facility.

Twenty-one (21) clinical sessions and thirteen (13) configuration sessions were

conducted remotely via WebEx. Participants were instructed to call into an audio

conference and log in to a WebEx meeting. Control of the moderator's computer was

passed to the participant and sessions were moderated using the materials and

methods described in later chapters. All sessions were audio and video recorded.

For the test sessions, the computer used was a Lenovo T410 laptop running a

Microsoft Windows operating system. The data logger used either an ASUS G1S

laptop running a Microsoft Windows operating system or a MacBook Pro running OS X

10.8.4. Participants interacted using the keyboard and the mouse.

The Lenovo T410 laptop has a 14.1 inch display with 1440 x 900 resolution. The

resolution was set at 1440 x 900 pixels. The ASUS laptop has a 15.4 inch display with

1680 x 1050 resolution. The MacBook Pro has a 17" display set to 1920 x 1200

resolution. The application was set up by Allscripts according to the vendor’s

documentation describing the system set-up and preparation. The application itself

was running on a hosted web-based solution using a test database on a WAN

connection. Technically, the system performance (i.e., response time) was

representative of what actual users would experience in a field implementation.


3.7 Test Forms and Tools

During the usability test, various documents and instruments were used, including:

1. Informed Consent

2. Moderator’s Guide

3. Task Detail Memory Aide

4. Post-Test Questionnaire

5. Incentive Receipt and Acknowledgment Form.

Examples of these documents can be found in Appendices 2-6, respectively. The Moderator's Guide was devised to capture the required data.

The participant’s interaction with the application was captured and recorded digitally

with screen capture software running on the test machine. A WebEx Recorder captured the screen, and verbal comments were recorded with a microphone.

3.8 Participant Instructions

The moderator read the following instructions aloud to each participant (also see the

full moderator’s guide in Appendix 3: Moderator Guides):

Thank you for participating in this study. Our session today will last 30 minutes/45 minutes/1 hour.

During that time you will take a look at an electronic health record system. The product you will be

using today is Allscripts ePrescribe.

We did not have any involvement in its creation. We are from an independent consulting company.

Companies hire our company to conduct activities with people like you who use their products,

services, and websites. This is a great chance for you to give the Allscripts team feedback about the

application we are going to look at today, so please be honest with your opinions.

I will ask you to complete a few activities using this system and answer some questions. We are

interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and

how we could improve it. Some activities might seem simple to you. Other activities might seem

difficult. And there will be some activities that you will not be able to complete. I am telling you this

because I want you to remember that we are not testing you. We are testing the application.

You will be asked to complete these tasks on your own trying to do them as quickly as possible with

the fewest possible errors or deviations. Do not do anything more than asked. When you are doing

these activities, I am not going to interact or talk to you while you are completing the activity. I do

want you to talk aloud about what you are doing and thinking. You will say things like I am clicking

<say the place you clicked>, this is what I expected or I did not expect to see this. I will be taking

notes about what you are doing.

Because I am not going to be talking with you while you do the activities, I want you to make it clear

to me when you are done with an activity by saying "I'm done." There are a number of reasons you

might be done.


(1) Done because you completed the activity.

(2) Done because you have tried, you know you have not completed the activity, but you are not

going to try anything else.

(3) Done because you feel like if you got a hint or asked a question you could finish the activity.

We are recording the audio and screenshots of our session today. All of the information that you

provide will be kept confidential and your name will not be associated with your comments at any

time.

Following the procedural instructions, participants were asked demographic

questions. Details regarding each task will be discussed in the specific chapter

sections. All tasks can be referenced in the Clinical and Configuration Moderator

Guides found in the Appendix 3: Moderator Guides.

3.9 Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of

Electronic Health Records, EHRs and clinical modules should support a process that

provides a high level of usability for all users. The goal is for users to interact with the

system effectively, efficiently, and with an acceptable level of satisfaction. To this

end, metrics for effectiveness, efficiency and user satisfaction were captured during

the usability test. The goals of the test were to assess:

1. Effectiveness (safety-enhanced design) by measuring participant success rates

(pass/pass with help, fail) and usage errors, including:

a. Identification and explanation of the root cause of any potential patient

safety risks resulting from usage errors in usability task performance,

b. Identification of potential mitigations for identified usage errors in

usability task performance.

2. Efficiency by measuring participant task time and task click path notes,

including identification of potential solutions for identified inefficiencies

observed in usability task performance.

3. System satisfaction by administering the System Usability Scale (SUS).

3.10 Data Scoring

Table 5 details how tasks were scored, errors evaluated, and the time data collected.


Table 5: Details of How Observed Data Were Scored

Effectiveness: Task Success
A task was counted as a "Success" or "Pass" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis.
A task was counted as a "Success with Help" or "Pass with Help" if the participant was able to achieve the correct outcome, with minimal assistance, within the time allotted on a per-task basis.
If the defined threshold time for inactivity (30 seconds) passed, the moderator would stop the participant and give guidance.
The number of successes and successes with help were combined to form Task Success.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, performed the task incorrectly, or was helped more than one time before successful completion, the task was counted as a "Failure." No task times were taken for errors.
The total number of errors was calculated for each task. Not all deviations were counted as errors.
On a qualitative level, an enumeration of errors and error types was collected.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occurred if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. Task deviations were recorded and reported. Optimal paths (i.e., procedural steps) were recorded when constructing tasks.

Efficiency: Task Time
Each scenario was timed from when the moderator said "Begin" until the participant said "Done" for the scenario. If he or she failed to say "Done," the time was stopped when the participant stopped performing the last subtask. Only the times for scenarios where all subtasks were successfully completed were included in the average task time analysis. Average time per scenario and its standard deviation were calculated for each scenario.

Satisfaction: Task Rating
To measure participants' satisfaction with the system, the testing team administered the System Usability Scale (SUS) post-test questionnaire. The SUS is a reliable and valid measure of system satisfaction. In order to assess system-level satisfaction, as opposed to feature-level satisfaction, and as is common practice with the SUS, we administered the questionnaire at the end of the session. See the full System Usability Scale questionnaire in Appendix 5: System Usability Scale Questionnaire.
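As a hedged illustration of the scoring rules in Table 5 (not the study's actual analysis script; the function and variable names are assumptions), the sketch below aggregates per-participant task outcomes and times into the percentages and time statistics reported in the results tables, with only fully successful attempts contributing to the task time average:

```python
from statistics import mean, stdev

def summarize(outcomes, times):
    """Aggregate one task's outcomes into the measures reported in the results tables.

    outcomes: one of "Pass", "Pass with help", or "Fail" per participant who attempted.
    times:    task times (minutes) for fully successful scenarios only, per Table 5.
    """
    n = len(outcomes)

    def pct(label):
        return round(100 * outcomes.count(label) / n)

    return {
        "n": n,
        "% Pass": pct("Pass"),
        "% Pass with help": pct("Pass with help"),
        "% Fail": pct("Fail"),
        "% Pass + Pass with help": pct("Pass") + pct("Pass with help"),
        "Mean time (min)": round(mean(times), 1) if times else None,
        "SD time (min)": round(stdev(times), 1) if len(times) > 1 else None,
    }

# Hypothetical data: 20 passes and 1 pass-with-help out of 21 attempts,
# with times logged for 7 fully successful scenarios.
print(summarize(["Pass"] * 20 + ["Pass with help"], times=[14, 12, 18, 13, 11, 15, 16]))
```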


4 Clinical Test Results

Each Clinical Results chapter of the report presents the results associated with usability

test activities conducted with physician and nurse participants. The primary purpose of

this summative usability test is to provide objective evidence that the ePrescribe user

interface can be used in a safe, efficient, and effective manner with regard to the four

prioritized certification criteria. As such this section of the report presents a separate

chapter for each ePrescribe technology capability submitted for testing related to clinical

activities.

5 Chapter §170.314(a)(1) Computerized Provider Order Entry (CPOE) Results

Although ePrescribe is not certifying on §170.314(a)(1) Computerized Provider Order

Entry (CPOE), ordering and changing a medication is an essential task used with the

ePrescribe system, so these tasks were included in the study.

5.1 Task Mapping


Table 6 maps the Computerized Provider Order Entry criteria to usability test tasks to

aid verification that the report contains all required test scenarios for ePrescribe

module capability submitted for testing. Green colored font is used within the

certification criteria and within the steps for successful task completion to aid

verification that the usability test tasks address the details of the specified criteria.


Table 6. Computerized Provider Order Entry Criteria Mapped to Usability Test Tasks

§ 170.314(a)(1) Computerized Provider Order Entry: Enable a user to electronically record, change, and access the following order types: (i) Medications; (ii) Laboratory; and (iii) Radiology/imaging.

Task: Order and change a specific medication based on the information provided. Combined with:

• § 170.314(a)(2) Drug-drug, drug-allergy interaction checks – (i) Intervention
• § 170.314(a)(6) Medication List
• § 170.314(b)(3) Electronic Prescribing

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete the task, participants were required to:

• Task 1.1: Record another doctor's prescription
• Task 1.2: Drug-Drug Interaction Checks
• Task 1.3: Change Lipitor (atorvastatin calcium) prescription from 20mg daily to 40mg, 3 times daily after meals
• Task 1.4: Drug-Dosage Interaction Checks
• Task 1.5: Change the prescription of Lipitor 40mg, 3 times daily after meals to Lipitor 40mg, 1 time a day
• Task 1.6: Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days
• Task 1.7: Drug-Allergy Interaction Checks
• Task 1.8: Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days
• Task 1.9: Electronically send order to the pharmacy
• Task 1.10: Check what medications the patient has previously been on (review med list over multiple encounters)
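For readers tracing criteria to tasks, the mapping in Table 6 can also be summarized as a simple lookup structure. This is an illustrative aid only, not an artifact of the study; the structure and names below are assumptions:

```python
# Hypothetical summary of Table 6: the CPOE criterion, the related criteria that
# share the same clinical scenario, and the subtasks scored for CPOE in Section 5.3.
cpoe_task_mapping = {
    "criterion": "§ 170.314(a)(1) Computerized Provider Order Entry",
    "combined_with": [
        "§ 170.314(a)(2) Drug-drug, drug-allergy interaction checks - (i) Intervention",
        "§ 170.314(a)(6) Medication List",
        "§ 170.314(b)(3) Electronic Prescribing",
    ],
    "subtasks_scored_for_cpoe": {
        "Task 1.6": "Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days",
        "Task 1.8": "Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days",
    },
}
```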

5.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of their

daily work, physicians and nurses (on behalf of a physician) attempted this task.

Physician and nurse data were combined because neither the task nor the user characteristics differ between these user roles.

Participants were given the following instruction:

For Mr. Smith’s sinus infection, please prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10

days.

Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days.


5.3 Data Analysis and Reporting

Each of the following subtasks was used to assess task performance:

• Task 1.6: Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days

• Task 1.8: Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days

Twenty-one (21) participants attempted the scenario. Task performance was not

differentiated by provider type (physician or nurse, on behalf of the physician).

Therefore, the data was not separated by user role. Table 7 provides usability test

results for each subtask in CPOE.

Table 7. Usability Test Results for Each Subtask in the Computerized Provider Order Entry Task.

Subtask | n Attempted | % Pass | % Pass with Help | % Fail | % Pass + Pass with Help
Order a New Medication | 21 | 100% | 0% | 0% | 100%
Change the Medication | 21 | 95% | 5% | 0% | 100%

Task Time (min): Mean 14 (SD 4), n contributing to mean = 7; values include all ten subtasks in the task time.

Click Path Notes (areas impacting efficiency): Font size can be optimized for scanning.

As indicated in the table:

• 100% (21 of 21) of participants were successful in ordering the medication (Pass + Pass with help).

• 100% (21 of 21) of participants were successful in changing the medication (Pass + Pass with help).

5.4 Discussion of the Findings

The following sections discuss the results organized around a risk analysis of use, test

performance and error rates. The risk analysis of use includes identification of use

errors and user interface design issues as well as classification of severity based on

the consequence of the error. Use errors and user interface design issues that

resulted in subtask failures, that are known industry risk issues, and errors and issues

related to aspects of the user interface that are configured per customer site are

considered more severe compared to noncritical system usability issues related to

efficiency. As such the discussion of more serious errors and issues is provided in the


Risk Analysis section and the associated mitigation strategy is provided in the Areas

for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors and issues

stemming from task failures are also discussed in the Effectiveness section as

effectiveness was measured with task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency

section. Associated recommendations are provided in the Areas for Improvement

section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction

for System Usability Scale (SUS) findings.

The Major Findings section provides a brief summary of the findings related to

identified use errors, effectiveness, and efficiency.

RISK ANALYSIS

No critical use errors were identified or observed as part of this scenario.

EFFECTIVENESS

Performance of all subtasks was above the 95% success criterion.

EFFICIENCY

Noncritical system usability issues that did not result in use errors were identified

and provide an opportunity for improvement related to the efficiency of CPOE. A

user interface design element that contributed to poor efficiency was small font.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.

MAJOR FINDINGS

No critical use errors were observed during this task.


AREAS FOR IMPROVEMENT

Areas for improvement related to effectiveness and efficiency include:

• Consider use of a font size that optimizes scanning.


6 Chapter §170.314(a)(2) Drug-Drug, Drug-Allergy Checks – Interventions Results

6.1 Task Mapping

Table 8 below maps Drug-Drug, Drug-Allergy Interaction Checks criteria to usability

test tasks to aid verification that the report contains all required test scenarios for

this EHR capability submitted for testing. Green colored font is used within the

certification criteria and within the steps for successful task completion to aid

verification that the usability test tasks address the details of the specified criteria.

Table 8. Drug-Drug and Drug-Allergy Checks – Criteria Mapped to Usability Test Tasks.

§ 170.314(a)(2) Drug-drug, drug-allergy interaction checks. (i) Interventions. Before a medication order is completed and acted upon during computerized provider order entry (CPOE), interventions must automatically and electronically indicate to a user drug-drug and drug-allergy contraindications based on a patient's medication list and medication allergy list. (ii) Adjustments. (A) Enable the severity level of interventions provided for drug-drug interaction checks to be adjusted. (B) Limit the ability to adjust severity levels to an identified set of users or available as a system administrative function.

The assessment of the drug-drug and drug-allergy interaction criteria was distributed across:

• § 170.314(a)(1) Computerized Provider Order Entry
• § 170.314(a)(6) Medication List
• § 170.314(b)(3) Electronic Prescribing

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete this task, participants were required to complete each of the following subtasks:

• Task 1.1: Record another doctor's prescription
• Task 1.2: Review and act upon Drug-Drug Interaction Checks
• Task 1.3: Change Lipitor (atorvastatin calcium) prescription from 20mg daily to 40mg, 3 times daily after meals
• Task 1.4: Review and act upon Drug-Dosage Interaction Checks
• Task 1.5: Change the prescription of Lipitor 40mg, 3 times daily after meals to Lipitor 40mg, 1 time a day
• Task 1.6: Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days
• Task 1.7: Review and act upon Drug-Allergy Interaction Checks
• Task 1.8: Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days
• Task 1.9: Electronically send order to the pharmacy
• Task 1.10: Check what medications the patient has previously been on (review med list over multiple encounters)


6.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of their

daily work, physicians and nurses (on behalf of a physician) attempted this task.

Physician and nurse data were combined based on the fact that neither the task nor

the user characteristics differ based on these user roles.

Participants were given the following instructions related to Drug-Drug or Drug-Allergy alerts:

If any alerts/warnings come up during the session, please let me know what the alert or warning is telling you and how you would typically handle the alert in your practice, but please wait and let me tell you how to handle the alert for the situation.

6.3 Data Analysis and Reporting

Doctors and nurses interacted with Drug-Drug and Drug-Allergy alerts in the context

of the CPOE and Medication List tasks. Only activities associated with the drug-drug

and drug-allergy criterion are reported in this chapter. See associated chapters for

results and discussion that were combined with this test scenario.

Twenty-one (21) participants attempted the CPOE tasks. Task performance was not

differentiated by provider type (physician or nurse, on behalf of physician).

Therefore, the data was not separated by user role. Table 9 provides usability test

results for each subtask associated with Drug-Drug and Drug-Allergy interaction

checks.

Table 9. Usability Test Results for Each Subtask in the Drug-Drug and Drug-Allergy Checks - Interventions Task.

Subtask: Review and act upon drug-drug alerts in the Medication List
• N that Attempted Task: 21
• Task Success: 71% Pass, 0% Pass with Help, 29% Fail, 71% Pass + Pass with Help
• Task Time (min): Mean 14 (SD 4), n contributing to mean = 7 (values include all ten sub-tasks in the task time)
• Notes, Areas Impacting Efficiency: Font size can be optimized for scanning

Subtask: Review and act upon drug-allergy alerts in CPOE
• N that Attempted Task: 21
• Task Success: 95% Pass, 0% Pass with Help, 5% Fail, 95% Pass + Pass with Help


As indicated in the table:

• 71% (15 of 21) of participants successfully (Pass) reviewed and properly acted upon Drug-Drug alerts.

• 95% (20 of 21) of participants successfully (Pass) reviewed and properly acted upon Drug-Allergy alerts.

The derivation of these success measures and of the reported task-time statistics is illustrated in the sketch below.
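The following minimal Python sketch shows how the success percentages and the mean (SD) task time reported in Table 9 can be reproduced from per-participant outcomes; the outcome and timing values are illustrative placeholders, not the study's raw data.

```python
from statistics import mean, stdev

# Placeholder per-participant outcomes for one subtask: "pass", "pass_with_help", or "fail".
# These are illustrative values only, not the study's raw data.
outcomes = ["pass"] * 15 + ["fail"] * 6            # 15 of 21 passed the drug-drug subtask
task_times_min = [14, 8, 20, 12, 14, 18, 12]       # times (minutes) for the n = 7 contributing participants

n_attempted = len(outcomes)
pct_pass = 100 * outcomes.count("pass") / n_attempted
pct_pass_with_help = 100 * outcomes.count("pass_with_help") / n_attempted
pct_fail = 100 * outcomes.count("fail") / n_attempted
pct_pass_plus_help = pct_pass + pct_pass_with_help

print(f"Pass: {pct_pass:.0f}%  Pass with Help: {pct_pass_with_help:.0f}%  "
      f"Fail: {pct_fail:.0f}%  Pass + Pass with Help: {pct_pass_plus_help:.0f}%")
print(f"Task time: mean {mean(task_times_min):.0f} min (SD {stdev(task_times_min):.0f})")
```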

6.4 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, that reflect known industry risk issues, or that relate to aspects of the user interface configured per customer site are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors, and issues stemming from task failures are also discussed in the Effectiveness section, as effectiveness was measured by task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction for System Usability Scale (SUS) findings.

The Major Findings section provides a brief summary of the findings related to identified use errors, effectiveness, and efficiency.

RISK ANALYSIS

Alert fatigue is a known issue across the industry and was validated during usability testing. It is a serious issue that plagues providers. Throughout the usability test session, participants interacted with a number of alerts; in some cases the participant carefully reviewed the alert, and in other cases the participant routinely dismissed it.


Many of the Meaningful Use 2 Safety Enhanced Design criteria involve alerts and messages that interrupt the workflow. As such, much of the alerting experienced during the usability test session was an artifact of testing. The configuration was purposely not set to control the severity or level of alerts presented to the participants, so depending on the scenario both critical and non-critical alerts appeared. Participants expressed frustration regarding over-alerting both in the usability test session and in their daily practice.

Participants were confused by one alert. Part of the confusion could have been caused by the test workflow, which was later found to be an uncommon workflow for many participants. The alert is designed to decrease the number of warnings users can receive; however, some participants misunderstood how this alert worked, and this misunderstanding could lead to potentially serious consequences. The root cause of this issue is that the instructions for the alert may have been confusing and were not easy to scan quickly.

The participants from the doctors' user group had the same number of failures for this subtask as the participants from the nurses' user group. However, after dealing with the alert, if a warning was then displayed, participants from the doctors' user group were able to understand and use the warnings, whereas participants from the nurses' user group struggled with some of them. The difference between the two user groups is most likely based on medical knowledge of drugs, including drug-drug interactions and overdoses. Participants from the nurses' user group were not always able to correctly identify the type of warning, the number of warnings, and the appropriate action to take, which could lead to potentially serious consequences. The root cause of this issue is the difficulty of quickly scanning and identifying the different warnings and the diagnostic information within them.

EFFECTIVENESS

Performance of one subtask fell below the 95% success criterion: reviewing and acting upon the drug-drug alerts.

For the drug-drug alerts, five of the six participants who failed this task were confused by the initial alert. The other participant who failed this task thought that if it were a critical alert, the system would not have let her send the prescription on to the pharmacy (the next task). This could be an artifact of testing if her own system is set up such that critical alerts require override reasons. The test system setup did not require any override reasons, and prescriptions were always represented as being sent to the pharmacy.


In some situations, participants were presented with a warning. The participants from the doctors' user group were able to understand and use the warnings; however, some participants from the nurses' user group struggled with some of the warnings. These participants were not always able to correctly identify the type of warning, the number of warnings, and the appropriate action to take.

Although the performance of the drug-allergy subtask met the 95% success criterion, one participant missed the drug-allergy warning due to alert fatigue.

EFFICIENCY

Noncritical system usability issues that did not result in use errors were identified, along with opportunities for improvement related to the efficiency of the drug-drug and drug-allergy alerts. Improvements could be made to the font size to optimize visual scanning.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.

MAJOR FINDINGS

Performance related to drug-allergy alerts was at the 95% success criterion, but performance related to drug-drug alerts was below the 95% success criterion.

Alert fatigue, validated during usability testing, highlights an industry-wide risk concern. Many of the Meaningful Use 2 Safety Enhanced Design criteria involve alerts and messages that interrupt the workflow. Depending on the scenario, both critical and non-critical alerts appeared. The non-critical alerts were perceived as over-alerting, whereas the critical alerts were sometimes perceived as valuable information.

One alert caused confusion for some participants. This alert is designed to decrease the number of warnings users can receive; however, some participants misunderstood how it worked, and this misunderstanding could lead to potentially serious consequences. All but one participant who failed this task were also unfamiliar with the previous task's workflow. Thus, this could be an artifact of testing.

Participants from the nurses' user group were not always able to quickly scan and identify the different warnings, the number of warnings, and the diagnostic information within the warnings, whereas participants from the doctors' user group were able to quickly scan the warnings and identify all of this information. This was believed to be due to the difference in the medical knowledge of the two user groups.

AREAS FOR IMPROVEMENT

The Allscripts team is well aware of the industry's known issue with alert fatigue. The team currently addresses alert fatigue by allowing sites to configure what level(s) of alerts are presented. The Allscripts teams would like to call for, and be involved with, the development and sharing of industry best practices, guidelines, and templates regarding safety-enhanced design that impacts patient safety associated with alerts and alert fatigue.

Allscripts continues to work closely with customers to ensure user-friendly designs.

The current mitigation strategy for the alert that caused confusion for

participants when being notified about duplicate medications or drug-drug

interactions is:

• Use language in the alert that clarifies to the users how the alert works,

• Optimize for visual scanning.

The current mitigation strategy for screens where multiple warnings are

presented is:

• Consider improved content layout to optimize visual scanning and

information processing.

In addition, improvements to efficiency can be made to the font size to

optimize visual scanning.


7 Chapter §170.314(a)(6) Medication List Results

7.1 Task Mapping

Table 10 maps Medication List criteria to usability test tasks to aid verification that

the report contains all required test scenarios for this ePrescribe module capability

submitted for testing. Green colored font is used within the certification criteria and

within the steps for successful task completion to aid verification that the usability

test tasks address the details of the specified criteria.

Table 10. Medication List Criteria Mapped to Usability Test Tasks.

§ 170.314(a)(6) Medication List. Enable a user to electronically record, change, and access a patient's active medication list as well as medication history: (i) Ambulatory setting. Over multiple encounters; or (ii) Inpatient setting. For the duration of an entire hospitalization.

Task: Review medications and make updates based on the information provided. Also, check what medications the patient has previously been on. Combined with:

• § 170.314(a)(1) Computerized Provider Order Entry
• § 170.314(a)(2) Drug-drug, drug-allergy interaction checks - (i) Intervention
• § 170.314(b)(3) Electronic Prescribing

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete this task, participants were required to complete each of the following subtasks:

• Task 1.1: Record another doctor's prescription (a patient-reported medication)
• Task 1.2: Review and act upon Drug-Drug Interaction Checks
• Task 1.3: Change Lipitor (atorvastatin calcium) prescription from 20mg daily to 40mg, 3 times daily after meals
• Task 1.4: Review and act upon Drug-Dosage Interaction Checks
• Task 1.5: Change the prescription of Lipitor 40mg, 3 times daily after meals to Lipitor 40mg, 1 time a day
• Task 1.6: Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days
• Task 1.7: Review and act upon Drug-Allergy Interaction Checks
• Task 1.8: Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days
• Task 1.9: Electronically send order to the pharmacy
• Task 1.10: Check what medications the patient has previously been on (review med list over multiple encounters)
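To illustrate the record/change/access requirement over multiple encounters, the following is a minimal, hypothetical Python sketch of a medication list that preserves per-encounter history; the class and field names are assumptions for this sketch and do not reflect the ePrescribe data model.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MedicationEntry:
    # One prescription or patient-reported medication, tied to the encounter in which it was recorded.
    drug: str
    sig: str
    encounter_id: str
    active: bool = True

@dataclass
class MedicationList:
    entries: List[MedicationEntry] = field(default_factory=list)

    def record(self, drug: str, sig: str, encounter_id: str) -> None:
        self.entries.append(MedicationEntry(drug, sig, encounter_id))

    def change(self, drug: str, new_sig: str, encounter_id: str) -> None:
        # Discontinue the active entry and record the revised prescription, preserving history.
        for entry in self.entries:
            if entry.drug == drug and entry.active:
                entry.active = False
        self.record(drug, new_sig, encounter_id)

    def history(self) -> List[Tuple[str, str, str, str]]:
        # Access the medication history over multiple encounters, including discontinued entries.
        return [(e.encounter_id, e.drug, e.sig, "active" if e.active else "discontinued")
                for e in self.entries]

meds = MedicationList()
meds.record("Lipitor (atorvastatin calcium)", "20mg, 1 time daily", encounter_id="encounter-1")
meds.change("Lipitor (atorvastatin calcium)", "40mg, 1 time daily", encounter_id="encounter-2")
print(meds.history())
```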


7.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of their

daily work, physicians and nurses (on behalf of a physician) attempted this task.

Physician and nurse data were combined based on the fact that neither the task nor

the user characteristics differ based on these user roles.

Participants were given the following instructions:

Your patient has come in for a lab check and a sinus infection. When you ask if his current

medications have changed, Mr. Smith informs you he is now seeing a neurologist for his TIAs

(transient ischemic attacks) and that the neurologist has put him on Plavix 75mg (Clopidogrel

Bisulfate), 1 tablet daily. Please update his medical record.

You had looked at Mr. Smith’s lab results and know his Lipitor needs to be increased. Mr.

Smith is currently taking Lipitor (atorvastatin calcium) 20mg daily. Change Mr. Smith’s

Lipitor prescription to 40mg, 3 times daily after meals.

You meant to prescribe Lipitor 1 time a day, change the prescription to Lipitor 40mg, 1 time

a day.

If you wanted to see what medications Mr. Smith has previously been on, please show me how you

would do that.

7.3 Data Analysis and Reporting

Each of the following subtasks was used to assess task performance:

• Task 1.1: Record another doctor's prescription (patient-reported medication),
• Task 1.3: Change Lipitor (atorvastatin calcium) prescription from 20mg daily to 40mg, 3 times daily after meals,
• Task 1.5: Change the prescription of Lipitor 40mg, 3 times daily after meals to Lipitor 40mg, 1 time a day,
• Task 1.10: Check what medications the patient has previously been on.

Twenty-one (21) participants attempted the scenario. Task performance was not differentiated by provider type (physician and nurse). Therefore, the data was not separated by user role.


Table 11 provides usability test results for each subtask in the Medication List task.


Table 11. Usability Test Results for Each Subtask in the Medication List Task.

Subtask: Add to the Medication List (record a patient-reported medication)
• N that Attempted Task: 21
• Task Success: 57% Pass, 10% Pass with Help, 33% Fail*, 67% Pass + Pass with Help*
• Task Time (min): Mean 14 (SD 4), n contributing to mean = 7 (values include all ten sub-tasks in the task time)
• Notes, Areas Impacting Efficiency: Users would most likely be faster performing this task if they could specify that a medication was prescribed by another provider early in any of the workflows that support this task.

Subtask: Change the Medication List (1)
• N that Attempted Task: 21
• Task Success: 95% Pass, 0% Pass with Help, 5% Fail, 95% Pass + Pass with Help

Subtask: Change the Medication List (2)
• N that Attempted Task: 20
• Task Success: 95% Pass, 0% Pass with Help, 5% Fail, 95% Pass + Pass with Help

Subtask: Review the Medication List
• N that Attempted Task: 21
• Task Success: 76% Pass, 10% Pass with Help, 14% Fail**, 86% Pass + Pass with Help**

* Failure of this task for five of seven participants was due to an artifact of testing.
** Failure of this task for two of three participants was due to an artifact of testing.

As indicated in the table:

• 67% of participants (14 of 21) successfully added a patient-reported medication to a medication list (Pass + Pass with help),*
• 95% of participants (20 of 21) successfully changed a medication in the medication list (Pass + Pass with help),
• 95% of participants (19 of 20) successfully changed a medication in the medication list a second time (Pass + Pass with help),
• 86% of participants (18 of 21) successfully reviewed the medications in the medication list (Pass + Pass with help).**

7.4 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, that reflect known industry risk issues, or that relate to aspects of the user interface configured per customer site are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors, and issues stemming from task failures are also discussed in the Effectiveness section, as effectiveness was measured by task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction for System Usability Scale (SUS) findings.

The Major Findings section provides a brief summary of the findings related to identified use errors, effectiveness, and efficiency.

RISK ANALYSIS

When adding a medication prescribed by another provider to the patient's medication list, one participant was confused by the label of a button and tried to use it, rather than the appropriate button, to add the medication. As a result, the medication was not added to the patient's medication list. Button confusion could pose serious consequences; the root cause of this error is the button's label.

EFFECTIVENESS

Performance of the subtask Add to the Medication List (record a patient-reported medication) fell below the 95% success criterion. Some participants had difficulty adding a medication that another doctor had prescribed and the patient reported to the medication list. All but two of the participants who failed this task specified that they did not use, or had never used, this functionality in the ePrescribe system for their practice. Some participants who correctly performed this task also specified that they did not use, or had never used, this functionality in the ePrescribe system for their practice. Since these participants who failed the task were unfamiliar with how to use the feature and had never been trained in it, or had forgotten their training, these failures should be considered an artifact of testing.

Of the two remaining participants who failed to add a medication prescribed by another provider to the patient's medication list, one was confused by the label of a button and the other was given two hints to complete the task.

Performance of the subtask Review the Medication List fell below the 95% success criterion. Two participants who were unable to complete this task had never performed it on their own system; therefore, these failures would be considered an artifact of testing. One participant did go to the correct screen but was unable to identify any previous medications the patient had been on.

EFFICIENCY

Noncritical system usability issues that did not result in use errors were identified, along with opportunities for improvement related to the efficiency of the Medication List. Allowing users to specify that a medication was prescribed by another provider early in any workflow that supports this task would improve efficiency.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.

MAJOR FINDINGS

Adding a patient-reported medication to the patient's medication list is a feature that is not commonly used by many of the participants. Unfamiliarity with this subtask increased the failure rate and is considered an artifact of testing. Of the two participants who did not indicate they were unfamiliar with this subtask and failed to add a medication to a patient's medication list, one failed because she was given two hints and one failed due to button confusion. Button confusion could pose serious consequences; the root cause of this error is the button's label.

Participants were able to make changes to the medication lists, and failures to review past medications were most commonly associated with unfamiliarity with the task. These failures would be considered an artifact of testing. One participant did go to the correct screen but was unable to identify any previous medications the patient had been on.


AREAS FOR IMPROVEMENT

The current mitigation strategy for the button that caused confusion when adding another provider's prescriptions to a patient's medication list is to examine and optimize the labeling of this button.

In addition, allowing users to specify that a medication was prescribed by another provider early in any workflow that supports this type of task is expected to increase users' efficiency when performing this task.


8 Chapter §170.314(a)(7) Medication Allergy List Results

8.1 Task Mapping

Table 12 maps Medication Allergy List criteria to usability test tasks to aid verification

that the report contains all required test scenarios for this ePrescribe module

capability submitted for testing. Green colored font is used within the certification

criteria and within the steps for successful task completion to aid verification that the

usability test tasks address the details of the specified criteria.

Table 12. Medication Allergy List Criteria Mapped to Usability Test Tasks.

§170.314(a)(7) Medication Allergy List. Enable a user to electronically record, change, and access a patient's active medication allergy list as well as medication allergy history: (i) Ambulatory setting. Over multiple encounters; or (ii) Inpatient setting. For the duration of an entire hospitalization.

Task: Review allergies and make updates based on the information provided.

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete the task, participants were required to:

• Task 2.1: Change patient's existing drug allergy, add reaction (anaphylaxis) to Aldactone
• Task 2.2: Add allergy to Bactrim
• Task 2.3: Review Allergy List

8.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of their

daily work, physicians and nurses (on behalf of a physician) attempted this task.

Physician and nurse data were combined based on the fact that neither the task nor

the user characteristics differ based on these user roles.


Participants were given the following instruction:

Your next patient has stated that she recently found out 3 days ago she was allergic to Bactrim after taking it for an upper respiratory infection. She stated she developed a mild skin rash.

Also, your patient used to have a rash when she took Aldactone (spironolactone), but she tells you she is developing hives along with the rash when taking Aldactone (spironolactone). Please update the patient's chart.

Please look to see all the allergies this patient has ever had documented in her chart.

8.3 Data Analysis and Reporting

Each of the following subtasks was used to assess task performance:

• Task 2.1: Add reaction (hives) to Aldactone

• Task 2.2: Add allergy to Bactrim
• Task 2.3: Review patient's Allergy List

Twenty (20) participants attempted the scenario. Task performance was not

differentiated by provider type (physician and nurse). Therefore, the data was not

separated by user role. Table 13 below provides usability test results for each

subtask in the Medication Allergy List task.

Table 13. Usability Test Results for Each Subtask in the Medication Allergy List Task.

Subtask: Change patient's existing drug allergy
• N that Attempted Task: 20*
• Task Success: 60% Pass, 20% Pass with Help, 20% Fail, 80% Pass + Pass with Help
• Task Time (min): Mean 4 (SD 1.5), n contributing to mean = 15
• Notes, Areas Impacting Efficiency: Optimizing screen layout may help quicken scanning and data entry

Subtask: Add new drug allergy
• N that Attempted Task: 20*
• Task Success: 80% Pass, 15% Pass with Help, 5% Fail, 95% Pass + Pass with Help

Subtask: Review Allergy List
• N that Attempted Task: 20*
• Task Success: 75% Pass, 25% Pass with Help, 0% Fail, 100% Pass + Pass with Help

* Due to technical difficulties during a session, one participant was unable to complete this task within the available time.


As indicated in the table:

• 80% (16 of 20) of the participants successfully changed the severity of the

allergy reaction (Pass + Pass with help),

• 95% (19 of 20) of the participants successfully added a new allergy of Bactrim

along with the reaction of rash (Pass + Pass with help),

• 100% (20 of 20) of the participants successfully reviewed the patient's allergy list (Pass + Pass with help).

8.4 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, that reflect known industry risk issues, or that relate to aspects of the user interface configured per customer site are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors, and issues stemming from task failures are also discussed in the Effectiveness section, as effectiveness was measured by task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction for System Usability Scale (SUS) findings.

The Major Findings section provides a brief summary of the findings related to identified use errors, effectiveness, and efficiency.

RISK ANALYSIS

Some participants were unable to determine how to enter two reactions for one medication allergy, so they entered just one reaction. One of these participants thought she had entered the stronger reaction, but she actually entered the weaker reaction; she entered Rash and not Hives. Entering a less significant allergic reaction can have serious consequences. The root cause of this issue is that it was not clear to all participants how more than one reaction to a medication allergy is entered into the system.

Another participant entered allergic reactions for the wrong medication allergy. The consequences of this can be serious. The root cause of this issue is that the participant was not given clear feedback concerning which medication allergy s/he was entering reactions for.

EFFECTIVENESS

Performance of one subtask fell below the 95% success criterion: changing the patient's existing drug allergy. A few participants did not know how to enter more than one allergic reaction, including one participant who entered the weaker reaction instead of the stronger reaction.

EFFICIENCY

Noncritical system usability issues that did not result in use errors were identified, along with opportunities for improvement related to the efficiency of the medication allergy list. Improvements could be made to optimize the page layout.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.

MAJOR FINDINGS

Performance related to one of the medication allergy list subtasks was below the 95% success criterion. A few participants did not know how to enter more than one allergic reaction, including one participant who entered the weaker reaction instead of the stronger reaction. Entering a less significant allergic reaction can have serious consequences. The root cause of this issue is that it was not clear to all participants how more than one reaction to a medication allergy is entered into the system.

One participant entered allergic reactions for the wrong medication allergy. The consequences of this can be serious. The root cause of this issue is that the participant was not given clear feedback concerning which medication allergy s/he was entering reactions for.


AREAS FOR IMPROVEMENT

The current mitigation strategy is:

• If a medication allergy is being edited, clearly display the medication name,

• Allow users to more easily choose additional reactions to add to a medication

allergy.

In addition, improvements can be made to optimize page layout for quicker

scanning and data entry.
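To make the mitigation direction concrete, the following hypothetical Python sketch shows one way an allergy entry can carry multiple reactions and always echo the medication being edited; it is illustrative only and does not reflect the ePrescribe user interface or data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AllergyEntry:
    # A single medication allergy that can hold more than one documented reaction.
    medication: str
    reactions: List[str] = field(default_factory=list)

    def add_reaction(self, reaction: str) -> None:
        # Additional reactions are appended rather than overwriting the existing one.
        if reaction not in self.reactions:
            self.reactions.append(reaction)

    def summary(self) -> str:
        # Echo the medication name so it is always clear which allergy is being edited.
        return f"Editing allergy: {self.medication} | reactions: {', '.join(self.reactions) or 'none recorded'}"

aldactone = AllergyEntry("Aldactone (spironolactone)", reactions=["rash"])
aldactone.add_reaction("hives")   # both reactions are retained
print(aldactone.summary())
```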


9 Chapter §170.314(b)(3) Electronic Prescribing

9.1 Task Mapping

Table 14 maps the Electronic Prescribing criteria to usability test tasks to aid

verification that the report contains all required test scenarios for this ePrescribe

module capability submitted for testing. Green colored font is used within the

certification criteria and within the steps for successful task completion to aid

verification that the usability test tasks address the details of the specified criteria.

Table 14. Electronic Prescribing Criteria Mapped to Usability Test Tasks.

§170.314(b)(3) Electronic Prescribing. Enable a user to electronically create prescriptions and prescription-related information for electronic transmission in accordance with: (i) The standard specified in §170.205(b)(2); and (ii) At a minimum, the version of the standard specified in §170.207(d)(2).

Task: Send the prescription electronically to the pharmacy (ePrescribe). Combined with:

• § 170.314(a)(1) Computerized Provider Order Entry
• § 170.314(a)(2) Drug-drug, drug-allergy interaction checks - (i) Intervention
• § 170.314(a)(6) Medication List

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete the task, participants were required to:

• Task 1.1: Record another doctor's prescription
• Task 1.2: Drug-Drug Interaction Checks
• Task 1.3: Change Lipitor (atorvastatin calcium) prescription from 20mg daily to 40mg, 3 times daily after meals
• Task 1.4: Drug-Dosage Interaction Checks
• Task 1.5: Change the prescription of Lipitor 40mg, 3 times daily after meals to Lipitor 40mg, 1 time a day
• Task 1.6: Prescribe Cefdinir, 300mg, 1 Capsule twice daily for 10 days
• Task 1.7: Drug-Allergy Interaction Checks
• Task 1.8: Change the medication to Azithromycin, 250mg, 1 Tablet for 5 days
• Task 1.9: Electronically send order to the pharmacy
• Task 1.10: Check what medications the patient has previously been on (review med list over multiple encounters)

9.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of their

daily work, physicians and nurses (on behalf of a physician) attempted this task.


Physician and nurse data were combined based on the fact that neither the task nor

the user characteristics differ based on these user roles.

Participants were given the following instruction:

Show me how you would electronically send this order to the pharmacy.

9.3 Data Analysis and Reporting

Each of the following subtasks was used to assess task performance:

• Task 1.9: Electronically send order to the pharmacy

Twenty-one (21) participants attempted the electronic prescribing task. Task

performance was not differentiated by provider type (physician or nurse, on behalf of

physician). Therefore, the data was not separated by user role. Table 15 provides

usability test results for this subtask.

Table 15. Usability Test Results Associated with Electronic Prescription.

Subtask: Electronically send the prescription
• N that Attempted Task: 21
• Task Success: 100% Pass, 0% Pass with Help, 0% Fail, 100% Pass + Pass with Help
• Task Time (min): Mean 14 (SD 4), n contributing to mean = 7 (values include all ten sub-tasks in the task time)
• Notes, Areas Impacting Efficiency: No areas impacting efficiency were identified.

As indicated in the table:

• 100% (21 of 21) of participants successfully (Pass + Pass with help)

electronically sent the order to the pharmacy.

9.4 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, that reflect known industry risk issues, or that relate to aspects of the user interface configured per customer site are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors, and issues stemming from task failures are also discussed in the Effectiveness section, as effectiveness was measured by task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction for System Usability Scale (SUS) findings.

The Major Findings section provides a brief summary of the findings related to identified use errors, effectiveness, and efficiency.

RISK ANALYSIS

No critical use errors were observed as part of this usability task.

EFFECTIVENESS

Performance of this subtask was above the 95% success criterion.

EFFICIENCY

No noncritical system usability issues affecting the efficiency of electronic prescribing were identified.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.


MAJOR FINDINGS

No critical user errors were observed during this task.

AREAS FOR IMPROVEMENT

No additional opportunities for improvement were identified.


10 Configuration Test Results

The Configuration Test Results chapter of the report presents the results associated with

usability test activities conducted with configuration specialist participants. The primary

purpose of this summative usability test is to provide objective evidence that the system

configuration user interface can be used in a safe, efficient, and effective manner with

regard to the certification criteria associated with configuration.

11 Chapter §170.314(a)(2)(ii)(A) Drug-Drug Interaction Checks – Adjustments Results

11.1 Task Mapping

Table 16 maps the Drug-Drug, Drug-Allergy Interaction Checks – Adjustments criteria

to usability test tasks to aid verification that the report contains all required test

scenarios for this EHR capability submitted for testing. Green colored font is used

within the certification criteria and within the steps for successful task completion to

aid verification that the usability test tasks address the details of the specified

criteria.

Table 16. Drug-Drug, Drug-Allergy Interaction Checks - Adjustments Criteria Mapped to Usability Test Tasks.

§ 170.314(a)(2)(ii) Drug-Drug, Drug-Allergy Interaction Checks – Adjustments. (A) Enable the severity level of interventions provided for drug-drug interaction checks to be adjusted. (B) Limit the ability to adjust severity levels to an identified set of users or available as a system administrative function.

Note: This system can only limit the ability to adjust severity levels at the individual level by assigning the individual an administrator role. Both Allscripts and some of the system administrators who participated in this test confirmed this. Thus, the criterion was not included in the usability test.

Drug-Drug, Drug-Allergy Interaction Check – Adjustments criteria were evaluated through the following task.

Task: Adjust Severity Levels of Drug Interactions.

Only activities (bolded) associated with the criteria will be discussed in this chapter. To successfully complete the tasks, participants were required to:

• Task 3.1: Locate and Change Drug-Drug Interaction Severity Level
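For illustration, the adjustment criterion can be sketched as follows: the drug-drug interaction severity threshold can be changed, but only by users in an administrator role, and alerts below the threshold are suppressed. The role name, severity scale, and functions in this Python sketch are assumptions, not the ePrescribe implementation.

```python
# Ordered severity scale (assumed for this sketch).
SEVERITY_ORDER = ["minor", "moderate", "major"]

class InteractionSettings:
    def __init__(self):
        self.threshold = "minor"   # default: show all drug-drug interaction alerts

    def set_threshold(self, user_role, new_level):
        # (B) Only an identified set of users (here, administrators) may adjust severity levels.
        if user_role != "administrator":
            raise PermissionError("Only administrators may adjust interaction severity levels.")
        if new_level not in SEVERITY_ORDER:
            raise ValueError(f"Unknown severity level: {new_level}")
        self.threshold = new_level

    def should_alert(self, interaction_severity):
        # (A) Alerts are shown only when the interaction meets or exceeds the configured threshold.
        return SEVERITY_ORDER.index(interaction_severity) >= SEVERITY_ORDER.index(self.threshold)

settings = InteractionSettings()
settings.set_threshold("administrator", "major")   # Task 3.1: show only the most severe interactions
print(settings.should_alert("moderate"))           # False: suppressed
print(settings.should_alert("major"))              # True: still surfaced to the prescriber
```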


11.2 Task Participants and Instruction

Based on user characteristics, typical workflow, and tasks performed as part of

their daily work, administrators and IT specialists attempted this task. IT

specialists' data and administrators' data were combined based on the fact that

neither the task nor the user characteristics differ based on the source of the

participants. Participants were given the following instruction:

Prescribing doctors are complaining about being notified too often for drug-drug

interactions. Change the system settings so that only the most severe drug-drug

interactions alerts will pop up for the doctors.

11.3 Data Analysis and Reporting

The following task was used to assess task performance:

• Task 3.1: Change drug-drug interaction setting to Major.

Thirteen (13) participants, consisting of four (4) physicians, four (4) nurses, and five (5) IT specialists with administrative rights, attempted the scenario. Task performance was not differentiated by participant background. Therefore, the data was not separated by the source of the participant. Table 17 provides usability test results associated with the configuration task.

Table 17. Usability Test Results Associated with Adjust Severity Levels.

Subtask: Locate and change drug-drug interaction severity level
• N that Attempted Task: 13
• Task Success: 85% Pass, 15% Pass with Help, 0% Fail, 100% Pass + Pass with Help
• Task Time (min): Mean 2.5 (SD 1.3), n contributing to mean = 13
• Notes, Areas Impacting Efficiency: Page layout can be optimized for quicker scanning

As indicated in the table:

• 100% (13 of 13) of participants successfully located and changed the Drug-Utilization Review severity level (Pass + Pass with help).


11.4 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, that reflect known industry risk issues, or that relate to aspects of the user interface configured per customer site are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section.

Based on our definition of effectiveness metrics, performance, use errors, and issues stemming from task failures are also discussed in the Effectiveness section, as effectiveness was measured by task success and failure.

Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section.

Satisfaction was evaluated at the system level. See Section 12: System Satisfaction for System Usability Scale (SUS) findings.

RISK ANALYSIS

No critical use errors were observed as part of this usability task.

EFFECTIVENESS

Performance of this subtask was above the 95% success criterion.

EFFICIENCY

Noncritical system usability issues that did not result in use errors were identified and provide an opportunity for improvement related to the efficiency of configuration. A user interface design element that contributed to poor efficiency was the design of the page layout.

SATISFACTION

Satisfaction was evaluated at the system level. See Section 12: System

Satisfaction for System Usability Scale (SUS) findings.


MAJOR FINDINGS

No critical user errors were observed during this task.

AREAS FOR IMPROVEMENT

Improvements can be made to optimize page layout for quicker scanning.


12 System Satisfaction

12.1 About System Usability Scale (SUS) Scores

Participants completed the System Usability Scale (SUS) questionnaire at the end of their session. The SUS is a reliable and valid measure of system satisfaction. Sauro (http://www.measuringusability.com/sus.php, accessed 3/14/2013) reports that the average SUS score from 500 studies across various products (e.g., websites, cell phones, enterprise systems) and across different industries is 68. A SUS score above 68 is considered above average, and anything below 68 is below average.

User-View encourages teams not to focus on the comparison to the cross-industry average SUS of 68 reported by Sauro. Instead, we encourage teams to use the SUS as a measure of their own usability improvement in the application as changes are made.
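For reference, the standard SUS scoring procedure (odd-numbered items score as the response minus 1, even-numbered items as 5 minus the response; the sum is multiplied by 2.5 to give a 0-100 score) can be sketched as follows. The example responses are illustrative only, not participant data from this study.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses.")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items are negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative responses only (not study data): yields 72.5, near this study's clinical mean of 73.3.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 3]))
```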

12.2 Clinical System Satisfaction Results

Eleven (11) physicians and ten (10) nurses completed the SUS questionnaire at the end of their session. The system scored an average of 73.3 (SD=16.6). Nurses (n=10) rated the system as 79 (SD=13.4), and physicians (n=11) rated the system as 68.2 (SD=18.1).

12.3 Configuration System Satisfaction Results

Based on SUS ratings from 13 configuration specialists, the configuration area scored

an average of 73.5 (SD=16.2).


13 Appendices

The following appendices include supplemental data for this usability test report.

Following is a list of the appendices provided:

1. Recruiting Screener,

2. Non-Disclosure Agreement (NDA) and Informed Consent Form,

3. Moderator’s Guides,

4. Task Detail Memory Aide,

5. System Usability Scale Questionnaire,

6. Incentive Receipt and Acknowledgment Form.


13.1 Appendix 1: Recruiting Screener

The purpose of a screener is to ensure that the participants selected represent the target user population as closely as possible.


13.2 Appendix 2: NDA and Informed Consent Form

Volunteer, Non-Disclosure, and Video Consent Form

I voluntarily agree to participate in an evaluation being conducted by User-View, Inc. of Raleigh, North Carolina. This evaluation is designed to provide feedback regarding an Allscripts module.

During the evaluation, I understand that I may learn information that is confidential to User-View or its clients. I agree to treat all confidential information received during this evaluation in accordance with this non-disclosure agreement. Accordingly, I will not disclose confidential information to any third parties.

I authorize User-View to keep, preserve, use in any manner and dispose of the findings from this evaluation, including my feedback and opinions expressed. User-View will not associate my name or company name with the results of this evaluation.

I give my permission for User-View to make video and audio records of me during this evaluation. I understand that these recordings can be used only for evaluation purposes and can be used for no other purpose without my knowledge and consent.

I understand that my participation is completely voluntary and that I may leave at any time.

_________________ _________________________ _______
Name (Please Print) Signature Date


13.3 Appendix 3: Moderator Guides

Allscripts ePrescribe EHR Usability Test

Moderator's Guide

Administrator ________________________
Data Logger ________________________
Date _____________________________ Time _________
Participant # ________
Location ____________________________

Orientation (5 minutes)

Hello. Welcome, my name is _____________ and I'm a moderator; also we have ______________ online with us, she is the data logger.

Just so we are all on the same page... We have you here today, not to look at still images of screens but to actually perform tasks with a prototype system in order to evaluate the extent to which the application meets Meaningful Use certification criteria. The first thing I would like to do is to show you this consent form to participate. <<show consent form – provide highlights of form >> Do you have any questions?

<<Show Invoice Form>> (must check before each participant) Can you please write your address here? <<start recording - shows welcome PPT Slide 2>> Any questions? OK, so I will have it on the record that you provided consent.

This is what you agreed to on the consent form – I show it here so research teams that might go back and view this recording are under the same expectations. --- I am going to read this introduction to you because I want to be sure that I don’t miss anything.

Thank you for participating in this study. Our session today will last 30 minutes/45

minutes/1 hour. During that time you will take a look at an electronic health record system. The product you will be using today is Allscripts ePrescribe. We did not have any involvement in its creation. We are from an independent consulting company. Companies hire our company to conduct activities with people like you that use

their products, services, and websites. This is a great chance for you to give the Allscripts


team feedback about the application we are going to look at today, so please be honest with your opinions. I will ask you to complete a few activities using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful

to you, and how we could improve it. Some activities might seem simple to you. Other activities might seem difficult. And there will be some activities that you will not able to complete. I am telling you this because I want you to remember that we are not testing you. We are testing the application. You will be asked to complete these tasks on your own trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked.

When you are doing these activities, I am not going to interact or talk to you while you are completing the activity. I do want you to talk aloud about what you are doing and thinking. You will say things like I am clicking <say the place you clicked>, this is what I expected or I did not expect to see this. I will be taking notes about what you are doing. Because I am not going to be talking with you while you do the activities, I want you to

make it clear to me when you are done with an activity by saying "I'm done." There are a number of reasons you might be done. (1) Done because you completed the activity. (2) Done because you have tried, you know you have not completed the activity, but you are not going to try anything else. (3) Done because you feel like if you got a hint or asked a question you could finish the activity.

We are recording the audio and screenshots of our session today. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Do you have any questions or concerns before we begin??

What is your job title? How long have you been in your profession?

What is your specialty? How many years you have used ePrescribe EHR? During the session, we may ask you to perform clinically questionable tasks. That is

because we need to stress the system. If any alerts/warnings come up during the session, please let me know what the alert or warning is telling you and how you would typically handle the alert in your practice but please wait and let me tell you how to handle the alert for the situation.

Task 1: Record Medication, Change Medication, DxD, CPOE Order med, ePrescribe


“Mr. Smith has come in for a lab check and a sinus infection. When you ask if his current

medications have changed, Mr. Smith informs you he is now seeing a neurologist for his TIAs (transient ischemic attacks) and that neurologist has put him on Plavix 75mg

(Clopidogrel Bisulfate), 1 tablet daily. Please update his medical record.”

Here is your cheat sheet for this activity. Do you have any questions about what I am asking you to do? Please remember to talk out loud as you complete the task and tell me when you are done. Pull the chart of:________________________________________ and orient yourself to the patient, then proceed with the task. START TIME:________________________________________

Task | Error | Discussion | Comments

Task 1.1: Record another doctor’s prescription.

Task 1.2: <<Drug-Drug Interaction Check: Aspirin & Plavix – ignore>>

Task 1.3: “You had looked at Mr. Smith’s lab results and know his Lipitor needs to be increased. Mr. Smith is currently taking Lipitor (atorvastatin calcium) 20mg daily. Change Mr. Smith’s Lipitor prescription to 40mg, 3 times daily after meals.”

Task 1.4: <<Drug-Dosage Interaction Checker>>

Task 1.5: “Oops, you meant to prescribe Lipitor 1 time a day; change the prescription to Lipitor 40mg, 1 time a day.”

Task 1.6: “For Mr. Smith’s sinus infection, please prescribe Cefdinir, 300mg, 1 capsule twice daily for 10 days.”

Task 1.7: <<Drug-Allergy Interaction Check – Cefdinir and Penicillin>>

Task 1.8: “Change the medication to Azithromycin, 250mg, 1 tablet for 5 days.”

Task 1.9: “Show me how you would electronically send this order to the pharmacy.”

Task 1.10: “If you wanted to see what medications Mr. Smith has previously been on, please show me how you would do that.”

Task End Time:________________________________________

Task 2: Record Medication Allergy, Change Medication Allergy

“Your next patient, Mrs. Mitchell, states that she found out 3 days ago that she is allergic to Bactrim after taking it for an upper respiratory infection; she developed a mild skin rash. Also, Mrs. Mitchell used to have a rash when she took Aldactone (spironolactone), but she tells you she is now developing hives along with the rash when taking Aldactone (spironolactone). Please update the patient’s chart.”

Here is your cheat sheet for this activity. Do you have any questions about what I am asking you to do? Again, please remember to talk out loud as you complete the task and tell me when you are done. Pull the chart of:________________________________________ and orient yourself to the patient, then proceed with the task. START TIME:________________________________________

Task | Error | Discussion | Comments

Task 2.1: Add reaction (hives) to Aldactone.

Task 2.2: Add allergy to Bactrim.

Task 2.3: “Please look to see all the allergies Mrs. Mitchell has had documented in her chart.”

Task End Time:________________________________________

Administer the SUS

Configuration Usability Test

“Prescribing doctors are complaining about being notified too often for drug-drug interactions. Change the system settings so that only the most severe drug-drug interaction alerts will pop up for the doctors.”

Do you have any questions about what I am asking you to do? Remember to talk out loud as you complete the task and tell me when you are done. START TIME:________________________________________

Task | Error | Discussion | Comments

Task 3.1: Change Drug-Drug Interaction setting to Major.

“If a physician is logged in as a ‘physician’ and not a ‘system administrator’, can they make similar adjustments to the system?”

Yes _____ No _____

Task End Time:________________________________________

Administer the SUS
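
To clarify the behavior this configuration task exercises, here is a minimal, hypothetical sketch of a severity-threshold filter for drug-drug interaction alerts. Everything in it (the Severity levels and the should_alert function) is invented for illustration and does not represent the actual Allscripts ePrescribe settings or implementation; it only shows the general idea that alerts below a configured severity are suppressed.

# Hypothetical sketch of a severity-threshold alert filter.
# NOT the Allscripts ePrescribe implementation; names and levels
# are invented for illustration only.
from enum import IntEnum

class Severity(IntEnum):
    MINOR = 1
    MODERATE = 2
    MAJOR = 3  # "most severe" in this sketch

def should_alert(interaction_severity: Severity, threshold: Severity) -> bool:
    # Show the alert only when the interaction is at or above the
    # configured threshold; everything below it is suppressed.
    return interaction_severity >= threshold

# With the threshold set to MAJOR (as in Task 3.1), only the most
# severe drug-drug interactions still pop up:
threshold = Severity.MAJOR
print(should_alert(Severity.MODERATE, threshold))  # False - suppressed
print(should_alert(Severity.MAJOR, threshold))     # True  - shown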

13.4 Appendix 4: Task Detail Memory Aide

Usability Test Scenarios - ePrescribe

Scenario 1

1. Update Mr. Smith’s medical record with Plavix 75mg 1 tablet daily.

2. Change current prescription of Lipitor (atorvastatin calcium) 20mg daily to Lipitor (atorvastatin calcium) 40mg, 3 tablets daily after meals, days: 30, qty: 90, refills: 0.

3. Prescribe Cefdinir (Omnicef) 300mg 1 Capsule Twice Daily for 10 days.

4. Send in the e-prescription.

5. Review the patient’s past medications.

Scenario 2

1. Add new drug allergy information – Bactrim (mild skin rash).

2. Change the reaction on the patient’s existing drug allergy – Aldactone (hives and rash).

3. Review patient’s existing and past drug allergies.

Scenario 3

1. Change the system settings so that only the most severe drug-drug interaction alerts will pop up.

13.5 Appendix 5: System Usability Scale Questionnaire

Please read each statement below and indicate how you feel about it on a scale of 1 to 5, ranging from Strongly Disagree to Strongly Agree. Please submit your responses by clicking the Submit button at the end of this form. If you have any questions, please don't hesitate to ask. Thank you!

Participants were told to base their answers only on the system used in the usability session.

Strongly Disagree   1   2   3   4   5   Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in this system were well integrated

6. I thought there was too much inconsistency in this system

7. I would imagine that most people would learn to use this system very quickly

8. I found the system very cumbersome to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system
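
For reference, responses to the ten items above are typically converted to a single 0–100 score using the standard SUS formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. The short Python sketch below is a generic illustration of that standard formula, not the scoring tooling actually used in this study; the example ratings are hypothetical.

# Standard SUS scoring - a generic sketch, not this study's tooling.
# `responses` holds the ten 1-5 ratings in questionnaire order.
def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:            # odd items (1, 3, 5, 7, 9)
            total += r - 1
        else:                     # even items (2, 4, 6, 8, 10)
            total += 5 - r
    return total * 2.5            # 0 (worst) to 100 (best)

# Example with hypothetical ratings:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 1, 5, 2]))  # 80.0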

13.6 Appendix 6: Incentive Receipt and Acknowledgment Form

INVOICE DATE: ______________________ (today’s date)

TO: User-View, Inc., 1109 Holland Ridge Dr., Raleigh, NC 27603

FROM: ______________________________________ (Participant)

ADDRESS: ______________________________________
______________________________________
______________________________________

PHONE: ______________________________________

EMAIL: ______________________________________

REFERRAL – Did someone refer you to participate in this study? If yes, please provide that person’s name so we can thank them. _____________________

Note: We will need your Social Security Number if you earn $600 or more in one year.

Participant Signature: ______________________________________

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

REFERENCE: Allscripts Evaluation

TOTAL AMOUNT PAID: $ _______