
Test Results Summary for 2014 Edition EHR Certification

14‐2412‐R‐0077‐PRA Version 1.1, February 28, 2016

ONC HIT Certification Program

Test Results Summary for 2014 Edition EHR Certification

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name: PracticeSuite
Product Version: EHR‐17.0.0
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information

Developer/Vendor Name: PracticeSuite, Inc.
Address: 37600 Central Court Suite #260, Newark, CA 94560
Website: www.practicesuite.com
Email: [email protected]
Phone: (510) 284‐2425
Developer/Vendor Contact: Samuel Peirce

Part 2: ONC‐Authorized Certification Body Information

2.1 ONC‐Authorized Certification Body Information

ONC‐ACB Name: InfoGard Laboratories, Inc.
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783‐0810
ONC‐ACB Contact: Adam Hardcastle

This test results summary is approved for public release by the following ONC‐Authorized Certification Body Representative:

Adam Hardcastle, EHR Certification Body Manager
ONC‐ACB Authorized Representative Function/Title
Signature and Date: 2/28/2016

©2016 InfoGard. May be reproduced only in its original entirety, without revision


2.2 Gap Certification

The following identifies criterion or criteria certified via gap certification

§170.314
(a)(1)   (a)(17)   (d)(5)   (d)(9)
(a)(6)   (b)(5)*   (d)(6)   (f)(1)
(a)(7)   (d)(1)    (d)(8)
*Gap certification allowed for Inpatient setting only

        No gap certification

2.3 Inherited Certification

The following identifies criterion or criteria certified via inherited certification

§170.314
      (a)(1)       (a)(14)       (c)(3)       (f)(1)
      (a)(2)       (a)(15)       (d)(1)       (f)(2)
      (a)(3)       (a)(16) Inpt. only       (d)(2)       (f)(3)
      (a)(4)       (a)(17) Inpt. only       (d)(3)       (f)(4) Inpt. only
      (a)(5)       (b)(1)       (d)(4)       (f)(5) Optional & Amb. only
      (a)(6)       (b)(2)       (d)(5)
      (a)(7)       (b)(3)       (d)(6)       (f)(6) Optional & Amb. only
      (a)(8)       (b)(4)       (d)(7)
      (a)(9)       (b)(5)       (d)(8)       (g)(1)
      (a)(10)      (b)(6) Inpt. only       (d)(9) Optional       (g)(2)
      (a)(11)      (b)(7)       (e)(1)       (g)(3)
      (a)(12)      (c)(1)       (e)(2) Amb. only       (g)(4)
      (a)(13)      (c)(2)       (e)(3) Amb. only

        No inherited certification


Part 3: NVLAP‐Accredited Testing Laboratory Information

3.1 NVLAP‐Accredited Testing Laboratory Information

ATL Name: InfoGard Laboratories, Inc.
Accreditation Number: NVLAP Lab Code 100432‐0
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783‐0810
ATL Contact: Milton Padilla
For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm

3.2 Test Information

Report Number: 14‐2412‐R‐0077 Version 1.6
Test Date(s): September 9 through December 2, 2014
Location of Testing: InfoGard and Vendor Site

3.2.1 Additional Software Relied Upon for Certification

Additional Software   Applicable Criteria   Functionality provided by Additional Software
        No additional software required

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Milton Padilla, EHR Test Body Manager
ONC‐ACB Authorized Representative Function/Title
Signature and Date: 2/28/2016


3.2.2 Test Tools

Test Tool                                                                Version
Cypress                                                                  2.4.1
ePrescribing Validation Tool                                             1.0.4
HL7 CDA Cancer Registry Reporting Validation Tool                        n/a
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool             n/a
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool   n/a
HL7 v2 Laboratory Results Interface (LRI) Validation Tool                1.7.0
HL7 v2 Syndromic Surveillance Reporting Validation Tool                  1.7.0
Transport Testing Tool                                                   179
Direct Certificate Discovery Tool                                        3.0.2

        No test tools required

3.2.3 Test Data

        Alteration (customization) to the test data was necessary and is described in Appendix A
        No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted

Criterion #        Standard Successfully Tested

(a)(8)(ii)(A)(2)
      §170.204(b)(1) HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain
      §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide

(a)(13)
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
      §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

(a)(15)(i)
      §170.204(b)(1) HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain
      §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide

(a)(16)(ii)
      §170.210(g) Network Time Protocol Version 3 (RFC 1305)
      §170.210(g) Network Time Protocol Version 4 (RFC 5905)


        None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested

Newer Version        Applicable Criteria

        No newer version of a minimum standard was tested

3.2.5 Optional Functionality

Criterion #        Optional Functionality Successfully Tested

(a)(4)(iii)
      Plot and display growth charts

(b)(1)(i)(B)
      Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(1)(i)(C)
      Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(b)(2)(i)(A)
      §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(2)(ii)(B)
      Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(2)(ii)(C)
      Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(b)(7)(i)
      §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

Common MU Data Set (15)
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
      §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT‐4)

(e)(1)(i)
      Annex A of the FIPS Publication 140‐2: AES‐128‐CBC with SHA‐1

(e)(1)(ii)(A)(2)
      §170.210(g) Network Time Protocol Version 3 (RFC 1305)
      §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
      Annex A of the FIPS Publication 140‐2: AES‐128‐CBC with SHA‐1


      No optional functionality tested

3.2.6 2014 Edition Certification Criteria* Successfully Tested

TP** TD*** (Test Procedure and Test Data version numbers; listed without row alignment): 1.7.1 2.4.1; 1.2; 1.2 1.4 1.5; 1.4 1.3 1.3; 1.4 1.3 1.2; 1.2 1.2; 1.3 1.3; 1.2 1.4; 1.3 1.8 1.5; 1.3 1.2 1.6; 1.2 1.3; 1.2; 1.5 1.3 1.3.0; 1.3 1.3.0; 1.7 1.4; 1.4 1.6; 1.4 1.2; 1.3 1.4; 1.4 1.7.0; 1.8a 2.0; 1.4 1.7 1.3; 1.7.1 2.4.1 1.2; 1.7.1 2.4.1

Common MU Data Set (15)
      Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)

Common MU Data Set (15)
      Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD‐10‐PCS)

(f)(3)
      Ambulatory setting only – Create syndrome‐based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)

Criteria #   Version        Criteria #   Version

      (a)(1)       (c)(3)
      (a)(2)       (d)(1)
      (a)(3)       (d)(2)
      (a)(4)       (d)(3)
      (a)(5)       (d)(4)
      (a)(6)       (d)(5)
      (a)(7)       (d)(6)
      (a)(8)       (d)(7)
      (a)(9)       (d)(8)
      (a)(10)      (d)(9) Optional
      (a)(11)      (e)(1)
      (a)(12)      (e)(2) Amb. only
      (a)(13)      (e)(3) Amb. only
      (a)(14)      (f)(1)
      (a)(15)      (f)(2)
      (a)(16) Inpt. only       (f)(3)
      (a)(17) Inpt. only       (f)(4) Inpt. only
      (b)(1)       (f)(5) Optional & Amb. only
      (b)(2)
      (b)(3)       (f)(6) Optional & Amb. only
      (b)(4)
      (b)(5)(A)    (g)(1)
      (b)(6) Inpt. only       (g)(2)
      (b)(7)       (g)(3)
      (c)(1)       (g)(4)
      (c)(2)

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested:
        Ambulatory
        Inpatient
        No CQMs tested

Ambulatory CQMs

CMS ID Version   CMS ID Version   CMS ID Version   CMS ID Version
2        90          136         155
22       117 v2      137         156
50       122         138 v2      157
52       123 v2      139 v2      158
56       124         140         159
61       125         141         160
62       126         142         161
64       127 v2      143         163
65       128         144         164
66       129         145         165 v2
68 v3    130         146         166 v3
69 v2    131         147 v2      167
74       132         148         169
75       133         149         177
77       134         153         179
82       135         154         182

Inpatient CQMs

CMS ID Version   CMS ID Version   CMS ID Version   CMS ID Version
9        71          107         172
26       72          108         178
30       73          109         185
31       91          110         188
32       100         111         190
53       102         113
55       104         114
60       105         171

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested
(a)(1) (a)(9) (a)(16) (b)(6)
(a)(3) (a)(11) (a)(17) (e)(1)
(a)(4) (a)(12) (b)(2) (e)(2)
(a)(5) (a)(13) (b)(3) (e)(3)
(a)(6) (a)(14) (b)(4)
(a)(7) (a)(15) (b)(5)

        Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested
(a)(1) (a)(9) (a)(16) (b)(6)
(a)(3) (a)(11) (a)(17) (e)(1)
(a)(4) (a)(12) (b)(2) (e)(2)
(a)(5) (a)(13) (b)(3) (e)(3)
(a)(6) (a)(14) (b)(4)
(a)(7) (a)(15) (b)(5)

        Automated Measure Calculation was not tested

3.2.9 Attestation

Attestation Forms (as applicable)   Appendix
        Safety‐Enhanced Design*   B
        Quality Management System**   C
        Privacy and Security   D

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product


Appendix A: Alteration of Test Data

Criteria   Explanation
b(1)       Determined that modified Test Data had equivalent level of robustness to NIST Test Data
b(2)       Determined that modified Test Data had equivalent level of robustness to NIST Test Data
b(4)       Determined that modified Test Data had equivalent level of robustness to NIST Test Data
b(7)       Determined that modified Test Data had equivalent level of robustness to NIST Test Data
e(1)       Determined that modified Test Data had equivalent level of robustness to NIST Test Data


Appendix B: Safety Enhanced Design


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports NISTIR 7742

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager
Phone: +1-510-284-2423
E-mail: [email protected]
Address: 37600 Central Court Suite #260 Newark, CA 94560

1. EXECUTIVE SUMMARY
2. INTRODUCTION
3. METHOD
   1. PARTICIPANTS
   2. STUDY DESIGN
   3. TASKS
   4. PROCEDURES
   5. TEST LOCATION
   6. TEST ENVIRONMENT
   7. TEST FORMS AND TOOLS
   8. PARTICIPANT INSTRUCTIONS
   9. USABILITY METRICS
4. DATA SCORING
5. RESULTS
   1. DATA ANALYSIS AND REPORTING
6. MAJOR FINDINGS
7. AREAS OF IMPROVEMENT


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and

validate the usability of the current user interface,

and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider matching the target
demographic criteria served as the participant and used the EHRUT in
simulated, but representative tasks.

This study collected performance data on 8 tasks typically conducted
on an EHR:

• Select the patient
• Enter and store orders for medications
• Enter and store orders for laboratory
• Enter and store orders for radiology/imaging
• Modify orders for medications
• Modify orders for laboratory
• Modify orders for radiology/imaging
• Access orders for medications, laboratory, and radiology/imaging in an ambulatory setting

During the 30 minute one-on-one usability test, each participant was
greeted by the administrator and asked to review and sign an informed
consent/release form (included in Appendix 3); they were instructed that
they could withdraw at any time. Participants had prior experience with
the EHR. The administrator


introduced the test, and instructed participants to complete a series of

tasks (given one at a time) using the EHRUT. During the testing, the

administrator timed the test and, along with the data logger(s), recorded

user performance data on paper and electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a post-

test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Measures by task (values are means; deviations are reported as Observed/Optimal):

Task | N | Task Success | Path Deviation (Obs/Opt) | Task Time | Time Deviation (Obs/Opt) | Errors | Rating (5=Easy)
Select the patient | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5
Enter and store orders for medications | 3 | 100% | 1.37 (11/8) | 110s | 1.66 (150s/90s) | 0% | 3
Enter and store orders for laboratory | 2 | 100% | 1 (3/3) | 46s | 1.5 (46s/30s) | 0% | 4
Enter and store orders for radiology/imaging | 2 | 100% | 1 (3/3) | 41s | 1.36 (41s/30s) | 0% | 4
Modify orders for medications | 3 | 100% | 1.5 (9/6) | 40s | 1.7 (40s/23s) | 0% | 3
Modify orders for laboratory | 2 | 100% | 1 (3/3) | 37s | 1.02 (37s/36s) | 0% | 4
Modify orders for radiology/imaging | 2 | 100% | 1 (3/3) | 37s | 1.02 (37s/36s) | 0% | 4
Access orders for medications, laboratory, and radiology/imaging in an ambulatory setting | 1 | 100% | 1.2 (6/5) | 40s | 1.8 (40s/22s) | 0% | 5
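The deviation figures in the table are simple observed/optimal quotients. A minimal Python sketch of that calculation, using the first two table rows as input (the function name is illustrative, not part of the report's method):

```python
from statistics import mean

def task_metrics(observed_steps, optimal_steps, observed_secs, optimal_secs):
    """Compute the observed/optimal deviation ratios for one task."""
    return {
        "path_deviation": observed_steps / optimal_steps,
        "time_deviation": observed_secs / optimal_secs,
    }

# First two table rows: (observed steps, optimal steps, observed s, optimal s)
tasks = [(5, 5, 20, 15), (11, 8, 150, 90)]
ratios = [task_metrics(*t) for t in tasks]
mean_path = mean(r["path_deviation"] for r in ratios)
```

Note the table prints these quotients to two decimals (e.g. 150/90 = 1.666... appears as 1.66), which accounts for the small rounding differences.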

The System Usability Scale score for subjective satisfaction with the
system, based on performance of these tasks, was 82.5.
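A SUS score such as the 82.5 reported here is conventionally computed from the ten questionnaire items: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the raw sum is scaled by 2.5. A minimal sketch of that standard formula; the response values shown are hypothetical, not the participant's actual answers:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring for ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered negatively.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5  # scales the 0-40 raw sum to the 0-100 SUS range

# Hypothetical response set; a strongly positive pattern scores near 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```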

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• Every task tested and measured using summative testing
methods was completed by the participants within the
allocated target task time.


• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to

navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of

practice management, EHR and medical billing software. The usability

testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the
current user interface, and provide evidence of usability in the EHR
Under Test (EHRUT). To this end, measures of effectiveness,
efficiency and user satisfaction, such as time on task, path deviation,
and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT(s). Participants in

the test were providers from private practices. In addition, participants had no direct connection to the development of

or organization producing the EHRUT(s). Participants were not from the

testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including


demographics, professional experience, computing experience and

user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

Part ID | Gender | Age | Education | Occupation/role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
P01 | Male | 47 | MD | Provider | 15 years | Intermediate | Intermediate | NA
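The ID substitution described above (participant names replaced with sequential IDs such as P01 before analysis) can be sketched as follows; the helper function and roster are illustrative, not drawn from the study:

```python
def assign_participant_ids(names):
    """Replace participant names with sequential IDs (P01, P02, ...) so
    collected data cannot be tied back to individual identities."""
    return {name: f"P{i:02d}" for i, name in enumerate(names, start=1)}

# Illustrative roster, not the actual participant list.
id_map = assign_participant_ids(["Participant A"])
```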

1 participant (matching the demographics in the section on
Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the

participant schedule, and included each participant’s demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves as both a means to record or


benchmark current usability, but also to identify areas where

improvements must be made.

During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Select the patient
• Enter and store orders for medications
• Enter and store orders for laboratory
• Enter and store orders for radiology/imaging
• Modify orders for medications
• Modify orders for laboratory
• Modify orders for radiology/imaging


• Access orders for medications, laboratory, and radiology/imaging in an ambulatory setting.

Tasks were selected based on their frequency of use, criticality of
function, and those that may be most troublesome for users. Tasks
should always be constructed in light of the study objectives.

4. PROCEDURES

Upon arrival, participants were greeted; their identity was verified and
matched with a name on the participant schedule. Participants were then
assigned a participant ID. Each participant reviewed and signed an


informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in
this test: the usability administrator and the data logger. The usability
testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task,

errors, deviations, verbal responses, and post-test questionnaire were

recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All of the safety instruction

and evacuation procedures were valid, in place, and visible to the

participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility.

In this instance, the testing was conducted in the PracticeSuite, Inc. office. For

testing, participants used a laptop running the Windows 7 operating system.

The participants used a mouse and keyboard when interacting with the EHRUT.

PracticeSuite EHR 17.0.0 was displayed on a 15” LCD screen at 1600x900 resolution with True Color (32-bit) color settings.


The application was set up by the vendor according to the vendor’s documentation describing the

system set-up and preparation. The application itself was running

in the cloud using a test database. Technically, the system performance

(i.e., response time) was representative of what actual users would

experience in a field implementation. Additionally, participants were

instructed not to change any of the default system settings (such as

control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent

2. Moderator’s Guide

3. Post-test Questionnaire

4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture

required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each

participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you, we are testing the system; therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in


case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations


3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

were evaluated, and time data were analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used for task times in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2 that allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 30 seconds then allotted task time performance was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.
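The scoring rules described in the table above can be summarized in a short Python sketch. This is illustrative only; the function names and the sample data are hypothetical and do not come from the study itself.

```python
import math
import statistics

def success_rate(successes, attempts):
    """Task success: successes divided by attempts, expressed as a percentage."""
    return 100.0 * successes / attempts

def allotted_time(optimal_time_s):
    """Target task time: expert (optimal) time multiplied by 2 as a buffer."""
    return 2 * optimal_time_s

def path_deviation_ratio(observed_steps, optimal_steps):
    """Steps in the observed path divided by optimal steps (1.0 = no deviation)."""
    return observed_steps / optimal_steps

def time_stats(task_times_s):
    """Mean, standard deviation, and standard error for successful-task times."""
    mean = statistics.mean(task_times_s)
    sd = statistics.stdev(task_times_s)        # sample standard deviation
    se = sd / math.sqrt(len(task_times_s))     # standard error of the mean
    return mean, sd, se

# Hypothetical examples of each rule:
print(success_rate(3, 3))           # 100.0
print(allotted_time(30))            # 60 -- a 30 s expert task is allotted 60 s
print(path_deviation_ratio(9, 6))   # 1.5
mean, sd, se = time_stats([37, 41, 46, 40])
print(round(mean, 1), round(sd, 2), round(se, 2))  # 41.0 3.74 1.87
```

Nothing here is specific to the EHRUT; it simply restates the arithmetic the report applies to the observed data.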

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table

[x]).


Task | N | Task Success | Path Deviation (Observed/Optimal) | Task Time, Mean | Task Time Deviation (Observed/Optimal) | Errors | Task Rating (5=Easy)

Select the patient. | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5

Enter and store orders for medications. | 3 | 100% | 1.37 (11/8) | 110s | 1.66 (150s/90s) | 0% | 3

Enter and store orders for laboratory. | 2 | 100% | 1 (3/3) | 46s | 1.5 (46s/30s) | 0% | 4

Enter and store orders for radiology/imaging. | 2 | 100% | 1 (3/3) | 41s | 1.36 (41s/30s) | 0% | 4

Modify orders for medications. | 3 | 100% | 1.5 (9/6) | 40s | 1.7 (40s/23s) | 0% | 3

Modify orders for laboratory. | 2 | 100% | 1 (3/3) | 37s | 1.02 (37s/36s) | 0% | 4

Modify orders for radiology/imaging. | 2 | 100% | 1 (3/3) | 37s | 1.02 (37s/36s) | 0% | 4

Access orders for medications, laboratory, and radiology/imaging in an ambulatory setting. | 1 | 100% | 1.2 (6/5) | 40s | 1.8 (40s/22s) | 0% | 5

The results from the System Usability Scale (SUS) scored the subjective satisfaction with the system, based on performance with these tasks, at 82.5.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was

completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed

tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive,

even though the participants were able to navigate the user interface within the

acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability,” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in this system were well integrated

6. I thought there was too much inconsistency in this system

7. I would imagine that most people would learn to use this system very quickly

8. I found the system very cumbersome to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

Each item was rated on a five-point scale, from 1 (Strongly disagree) to 5 (Strongly agree).
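Brooke's scoring rule (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5) can be sketched as follows. The response set shown is hypothetical, not a participant's actual answers.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten 1-5 responses."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical response set for illustration.
print(sus_score([5, 2, 4, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

A uniformly favorable response set ([5, 1, 5, 1, ...]) yields the maximum score of 100.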


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports NISTIR 7742

Product: PracticeSuite

Version: EHR 17.0.0

Adopted UCD Standard Name: NISTIR 7741

Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014

Date of Report: 08/24/2014

Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager

Phone: +1-510-284-2423

E-mail: [email protected]

Address: 37600 Central Court Suite #260, Newark, CA 94560

1. EXECUTIVE SUMMARY

2. INTRODUCTION

3. METHOD

   1. PARTICIPANTS

   2. STUDY DESIGN

   3. TASKS

   4. PROCEDURES

   5. TEST LOCATION

   6. TEST ENVIRONMENT

   7. TEST FORMS AND TOOLS

   8. PARTICIPANT INSTRUCTIONS

   9. USABILITY METRICS

4. DATA SCORING

5. RESULTS

   1. DATA ANALYSIS AND REPORTING

6. MAJOR FINDINGS

7. AREAS OF IMPROVEMENT


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and

validate the usability of the current user interface,

and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider

matching the target demographic criteria served as the participant

and used the EHRUT in simulated, but representative tasks.

This study collected performance data on 4 tasks typically conducted

on an EHR:

• Select the patient.

• Enter allergies and medication orders.

• Verify drug-drug and drug-allergy interactions generated automatically.

• Adjust severity level of interventions.

During the 30-minute one-on-one usability test, each participant was greeted by the

administrator and asked to review and sign an informed consent/release form

(included in Appendix 3); they were instructed that they could withdraw at any time.

Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to

complete a series of tasks (given one at a time) using the EHRUT.

During the testing, the administrator timed the test and, along with the

data logger(s), recorded user performance data on paper and

electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a

post-test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Task | N | Task Success | Path Deviation (Observed/Optimal) | Task Time, Mean | Task Time Deviation (Observed/Optimal) | Errors | Task Rating (5=Easy)

Select the patient. | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5

Enter allergies and medication orders. | 3 | 100% | 1.18 (13/11) | 97s | 1.2 (97s/80s) | 0% | 3

Verify drug-drug and drug-allergy interactions generated automatically. | 1 | 100% | 1 (1/1) | 7s | 1.4 (7s/5s) | 0% | 4

Adjust severity level of interventions. | 3 | 100% | 1.14 (8/7) | 41s | 1.07 (41s/38s) | 0% | 4

The results from the System Usability Scale scored the subjective

satisfaction with the system based on performance with these tasks to

be 80.

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• Every task tested and measured using summative testing

methods was completed by the participants within the

allocated target task time.

• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to


navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of

practice management, EHR, and medical billing software. The usability

testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the

current user interface, and provide evidence of usability in the EHR

Under Test (EHRUT). To this end, measures of effectiveness,

efficiency, and user satisfaction, such as time on task, path deviation, and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT. Participants in

the test were providers from private practices. In addition, participants had no direct connection to the development of

or organization producing the EHRUT(s). Participants were not from the

testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including


demographics, professional experience, computing experience and

user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

Part ID | Gender | Age | Education | Occupation/Role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs

P01 | Male | 47 | MD | Provider | 15 years | Intermediate | Intermediate | NA

1 participant (matching the

demographics in the section on Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the

participant schedule, and included each participant’s demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves as both a means to record or


benchmark current usability, but also to identify areas where

improvements must be made.

During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Select the patient.

• Enter allergies and medication orders.

• Verify drug-drug and drug-allergy interactions generated automatically.

• Adjust severity level of interventions.

Tasks were selected based on their frequency of use, criticality of function, and potential to be most troublesome for users.

Tasks

should always be constructed in light of the study objectives.


4. PROCEDURES

Upon arrival, participants were greeted; their identity was verified and

matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an


informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in

this test, the usability administrator and the data logger. The usability

testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task,

errors, deviations, verbal responses, and post-test questionnaire were

recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All of the safety instruction

and evacuation procedures were valid, in place, and visible to the

participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility.

In this instance, the testing was conducted in the PracticeSuite, Inc. office. For

testing, participants used a laptop running the Windows 7 operating system.

The participants used a mouse and keyboard when interacting with the EHRUT.

PracticeSuite EHR 17.0.0 was displayed on a 15” LCD screen at 1600x900 resolution with True Color (32-bit) color settings.


The application was set up by the vendor according to the vendor’s documentation describing the

system set-up and preparation. The application itself was running

in the cloud using a test database. Technically, the system performance

(i.e., response time) was representative of what actual users would

experience in a field implementation. Additionally, participants were

instructed not to change any of the default system settings (such as

control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent
2. Moderator’s Guide
3. Post-test Questionnaire
4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture

required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each

participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations


3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

evaluated, and the time data analyzed.

Measures and Scoring Rationale

Effectiveness:

Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert (optimal) performance on a task was 30 seconds, the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.
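The success-rate and allotted-time rules above can be sketched in code. This is purely an illustration; the function and variable names are mine, not part of the report or any tool it used.

```python
# Sketch of the success-rate and allotted-time rules described above.
# All names here are illustrative; the report does not define an API.

def success_rate(successes, attempts):
    """Percentage of attempts that ended in a 'Success'."""
    return 100.0 * successes / attempts

def allotted_time(expert_times):
    """Target task time: the mean of several expert (optimal)
    measurements, doubled to leave a buffer for non-expert participants."""
    optimal = sum(expert_times) / len(expert_times)
    return 2 * optimal

# Example: expert performance measured three times at about 30 seconds
print(allotted_time([29, 30, 31]))  # -> 60.0
print(success_rate(3, 3))           # -> 100.0
```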

Effectiveness:

Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency:

Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.
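The path-deviation ratio described above (observed steps divided by optimal steps) can be illustrated with a small sketch; the step names and function are hypothetical, not taken from the study materials.

```python
# Illustrative computation of the path-deviation ratio described above:
# the number of observed steps divided by the number of optimal steps.
# Step names are invented for the example.

def path_deviation(observed_steps, optimal_steps):
    """A ratio above 1.0 means the participant took extra steps."""
    return len(observed_steps) / len(optimal_steps)

optimal = ["open chart", "open orders", "select medication", "save"]
observed = ["open chart", "open history", "open orders",
            "select medication", "save"]  # one wrong detour
print(round(path_deviation(observed, optimal), 2))  # -> 1.25
```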


Efficiency:

Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.
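The task-time statistics named above (mean, standard deviation, standard error) can be computed with the standard library; this is a generic sketch, not the study's actual analysis script.

```python
# Sketch of the task-time statistics described above, using only the
# Python standard library. Names are illustrative.
from math import sqrt
from statistics import mean, stdev

def task_time_stats(times):
    """times: seconds for successful completions of one task."""
    m = mean(times)
    sd = stdev(times)           # sample standard deviation
    se = sd / sqrt(len(times))  # standard error of the mean
    return m, sd, se

m, sd, se = task_time_stats([55, 59, 63])
print(round(sd, 1))  # -> 4.0
```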

Satisfaction:

Task Rating

Participants’ subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table

[x]).


Task                                    N   Success   Path Deviation    Task Time   Time Deviation    Errors   Rating
                                            Mean      (Obs./Optimal)    Mean        (Obs./Optimal)             (5=Easy)

Select the patient                      1   100%      1 (5/5)           20s         1.3 (20s/15s)     0%       5
Enter allergies and medication
  orders for medications                3   100%      1.18 (13/11)      97s         1.2 (97s/80s)     0%       3
Verify drug-drug and drug-allergy
  contraindications generated
  automatically                         1   100%      1 (1/1)           7s          1.4 (7s/5s)       0%       4
Adjust severity level of
  interactions                          3   100%      1.14 (8/7)        41s         1.07 (41s/38s)    0%       4

The results from the System Usability Scale (SUS) scored the subjective satisfaction with the system, based on performance with these tasks, at 80.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was

completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed

tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive,

even though the participants were able to navigate the user interface within the

acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability” known as the System Usability Scale, or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Each of the following ten items was rated on a scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in this system were well integrated

6. I thought there was too much inconsistency in this system

7. I would imagine that most people would learn to use this system very quickly

8. I found the system very cumbersome to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system
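Brooke's scoring procedure, referenced above, can be sketched as follows. This is the standard SUS formula from the literature (odd items contribute rating minus 1, even items contribute 5 minus rating, and the sum is scaled by 2.5), not code from this report.

```python
# Standard SUS scoring (Brooke, 1996): for the ten items rated 1-5,
# odd-numbered items score (rating - 1), even-numbered items score
# (5 - rating); the sum is multiplied by 2.5 to give a 0-100 score.

def sus_score(ratings):
    if len(ratings) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(ratings, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive response pattern
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```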


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports NISTIR 7742

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager
Phone: +1-510-284-2423
E-mail: [email protected]
Address: 37600 Central Court Suite #260, Newark, CA 94560

1. EXECUTIVE SUMMARY
2. INTRODUCTION
3. METHOD
   1. PARTICIPANTS
   2. STUDY DESIGN
   3. TASKS
   4. PROCEDURES
   5. TEST LOCATION
   6. TEST ENVIRONMENT
   7. TEST FORMS AND TOOLS
   8. PARTICIPANT INSTRUCTIONS
   9. USABILITY METRICS
4. DATA SCORING
5. RESULTS
   1. DATA ANALYSIS AND REPORTING
6. MAJOR FINDINGS
7. AREAS OF IMPROVEMENT


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and

validate the usability of the current user interface,

and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider

matching the target demographic criteria served as the participant

and used the EHRUT in simulated, but representative tasks.

This study collected performance data on 4 tasks typically conducted

on an EHR:

• Select the patient

• Enter and store orders for medications

• Modify orders for medications

• Access orders for medications

During the 30-minute one-on-one usability test, each participant was greeted by the

administrator and asked to review and

sign an informed consent/release form (included in Appendix 3); they

were instructed that they could withdraw at any time. Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to

complete a series of tasks (given one at a time) using the EHRUT.

During the testing, the administrator timed the test and, along with the

data logger(s) recorded user performance data on paper and

electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified; no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a post-

test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Task                                    N   Success   Path Deviation    Task Time   Time Deviation    Errors   Rating
                                            Mean      (Obs./Optimal)    Mean        (Obs./Optimal)             (5=Easy)

Select the patient                      1   100%      1 (5/5)           20s         1.3 (20s/15s)     0%       5
Enter and store orders for
  medications                           1   100%      1 (8/8)           59s         1.4 (59s/42s)     0%       3
Modify orders for medications           1   100%      1.5 (6/4)         61s         1.9 (61s/32s)     0%       3
Access orders for medications           1   100%      1.5 (3/2)         24s         1.7 (24s/14s)     0%       5

The results from the System Usability Scale (SUS) scored the subjective satisfaction with the system, based on performance with these tasks, at 82.5.

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• Every task tested and measured using summative testing

methods was completed by the participants within the

allocated target task time.

• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to

navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of

practice management, EHR, and medical billing software. The usability

testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the

current user interface, and provide evidence of usability in the EHR

Under Test (EHRUT). To this end, measures of effectiveness,

efficiency, and user satisfaction, such as time on task, path deviations, and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT(s). Participants in

the test were providers from private practices. In addition, participants had no direct connection to the development of

or organization producing the EHRUT(s). Participants were not from the

testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including

demographics, professional experience, computing experience and


user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

Part ID: P01
Gender: Male
Age: 47
Education: MD
Occupation/role: Provider
Professional Experience: 15 years
Computer Experience: Intermediate
Product Experience: Intermediate
Assistive Technology Needs: No

1 participant (matching the

demographics in the section on Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the

participant schedule, and included each participant’s demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves both as a means to record or

benchmark current usability and as a means to identify areas where

improvements must be made.


During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system
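One way to picture the per-participant record implied by the list above is a small data structure. This is purely a sketch of the measures collected; the study actually logged these in a spreadsheet, and every name here is hypothetical.

```python
# Hypothetical record of the per-participant measures listed above.
# The study logged these in a spreadsheet; this structure is a sketch only.
from dataclasses import dataclass, field

@dataclass
class TaskResult:
    task: str
    success: bool          # completed within time, without assistance
    time_seconds: float    # timed from "Begin" to "Done"
    errors: int
    path_deviation: float  # observed steps / optimal steps
    rating: int            # post-task ease-of-use rating, 1-5

@dataclass
class ParticipantRecord:
    participant_id: str    # e.g. "P01"; names are never stored
    tasks: list = field(default_factory=list)
    verbalizations: list = field(default_factory=list)
    sus_ratings: list = field(default_factory=list)  # 10 items, 1-5

rec = ParticipantRecord("P01")
rec.tasks.append(TaskResult("Select the patient", True, 20.0, 0, 1.0, 5))
print(len(rec.tasks))  # -> 1
```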

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Select the patient

• Enter and store orders for medications

• Modify orders for medications

• Access orders for medications.

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

Tasks

should always be constructed in light of the study objectives.

4. PROCEDURES


Upon arrival, participants were greeted; their identity was verified and

matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an

informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in

this test, the usability administrator and the data logger. The usability

testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task,

errors, deviations, verbal responses, and post-test questionnaire were

recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All safety instructions

and evacuation procedures were valid, in place, and visible to the

participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility.

In this instance, the testing was conducted in the PracticeSuite, Inc. office. For

testing, a laptop running the Windows 7 operating system was used.

The participants used a mouse and keyboard when interacting with the EHRUT.

The PracticeSuite EHR 17.0.0 used a 15" LCD screen at 1600x900 resolution with True Color (32-bit) settings. The application was set up by the vendor according to the vendor’s documentation describing the system set-up and preparation. The application itself was running in the cloud using a test database. Technically, the system performance (i.e., response time) was representative of what actual users would

experience in a field implementation. Additionally, participants were

instructed not to change any of the default system settings (such as

control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent
2. Moderator’s Guide
3. Post-test Questionnaire
4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture

required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each

participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations

3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings

Page 64: ONC HIT Program Test Results Summary for 2014 Edition EHR ... · Test Results Summary for 2014 Edition EHR Certification 14‐2412‐R‐0077‐PRA Version 1.1, February 28, 2016


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert (optimal) performance on a task was 30 seconds, the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.
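The Task Success scoring above reduces to simple arithmetic: a success percentage per task, an allotted time of twice the expert benchmark, and an observed/optimal time ratio. A minimal sketch of those computations (function names and sample values are illustrative, not the study's actual tooling):

```python
def success_rate(successes, attempts):
    """Percentage of attempts that ended in an unassisted, in-time success."""
    return 100.0 * successes / attempts

def allotted_time(expert_times_s):
    """Target task time: mean of several expert measurements, doubled to
    leave a buffer for participants not trained to expert performance."""
    optimal = sum(expert_times_s) / len(expert_times_s)
    return 2 * optimal

def efficiency_ratio(observed_s, optimal_s):
    """Observed task time divided by optimal (expert) time; 1.0 = expert pace."""
    return observed_s / optimal_s

print(success_rate(1, 1))            # every attempt succeeded -> 100.0
print(allotted_time([30, 30, 30]))   # 30s expert benchmark -> 60s allotted
print(efficiency_ratio(45, 30))      # 1.5
```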

Effectiveness:

Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency:

Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.
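The deviation measure just described divides the number of observed steps by the number of optimal steps. A small illustrative sketch (the step counts are hypothetical):

```python
def path_deviation(observed_steps, optimal_steps):
    """Ratio of steps the participant took to steps in the optimal path.

    1.0 means the participant followed the optimal path exactly; values
    above 1.0 indicate wrong screens, menu items, or links along the way.
    """
    return observed_steps / optimal_steps

# Hypothetical example: 6 observed steps against a 4-step optimal path.
print(path_deviation(6, 4))  # 1.5
```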


Efficiency:

Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.
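The averages and variance measures described above can be sketched with the standard library: `statistics.stdev` gives the sample standard deviation, and the standard error divides it by the square root of n. The sample times below are hypothetical, not the study data:

```python
import statistics

def time_summary(success_times_s):
    """Mean, sample standard deviation, and standard error of the mean
    for the task times of successfully completed tasks."""
    n = len(success_times_s)
    mean = statistics.mean(success_times_s)
    sd = statistics.stdev(success_times_s)   # sample standard deviation
    se = sd / n ** 0.5                       # standard error of the mean
    return mean, sd, se

mean, sd, se = time_summary([20, 24, 22, 26])
print(round(mean, 1), round(sd, 2), round(se, 2))
```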

Satisfaction:

Task Rating

Participants’ subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table

[x]).


Task | N | Task Success Mean (SD) | Path Deviation (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviation (Observed/Optimal) | Errors Mean (SD) | Task Rating (5=Easy) Mean (SD)
Select the patient. | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5
Enter and store orders for medications. | 1 | 100% | 1 (8/8) | 59s | 1.4 (59s/42s) | 0% | 3
Modify orders for medications. | 1 | 100% | 1.5 (6/4) | 61s | 1.9 (61s/32s) | 0% | 3
Access orders for medications. | 1 | 100% | 1.5 (3/2) | 24s | 1.7 (24s/14s) | 0% | 5

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be 82.5.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed

tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive,

even though the participants were able to navigate the user interface within the

acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability” known as the System Usability Scale, or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Each item is rated on a five-point scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
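Brooke's published scoring for the ten items can be sketched as follows: odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is scaled by 2.5 to give a 0–100 score. The response vector below is hypothetical (it is not the study participant's actual answers), chosen so that it happens to reproduce the reported overall score of 82.5:

```python
def sus_score(responses):
    """System Usability Scale score per Brooke (1996).

    `responses` is a list of ten 1-5 ratings in questionnaire order.
    Odd-numbered items contribute (rating - 1); even-numbered items
    contribute (5 - rating); the sum is scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical response set, not the recorded data:
print(sus_score([5, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # 82.5
```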


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports NISTIR 7742

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager Phone: +1-510-284-2423 E-mail: [email protected] Address: 37600 Central Court Suite #260 Newark, CA 94560

1. EXECUTIVE SUMMARY ... 2
2. INTRODUCTION ... 6
3. METHOD ... 6
   1. PARTICIPANTS ... 6
   2. STUDY DESIGN ... 7
   3. TASKS ... 8
   4. PROCEDURES ... 8
   5. TEST LOCATION ... 11
   6. TEST ENVIRONMENT ... 11
   7. TEST FORMS AND TOOLS ... 12
   8. PARTICIPANT INSTRUCTIONS ... 12
   9. USABILITY METRICS ... 13
4. DATA SCORING ... 15
5. RESULTS ... 16
   1. DATA ANALYSIS AND REPORTING ... 16
6. MAJOR FINDINGS ... 18
7. AREAS OF IMPROVEMENT ... 18


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and

validate the usability of the current user interface,

and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider

matching the target demographic criteria served as participant

and used the EHRUT in simulated, but representative tasks.

This study collected performance data on 4 tasks typically conducted

on an EHR:

• Select the patient

• Enter and store allergies

• Modify stored allergies

• Access modified allergies

During the 30-minute one-on-one usability test, each participant was greeted by the

administrator and asked to review and

sign an informed consent/release form (included in Appendix 3); they

were instructed that they could withdraw at any time. Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to

complete a series of tasks (given one at a time) using the EHRUT.

During the testing, the administrator timed the test and, along with the

data logger(s) recorded user performance data on paper and

electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a post-

test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Task | N | Task Success Mean (SD) | Path Deviation (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviation (Observed/Optimal) | Errors Mean (SD) | Task Rating (5=Easy) Mean (SD)
Select the patient. | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5
Enter and store allergies. | 1 | 100% | 1.6 (8/5) | 45s | 1.9 (45s/23s) | 0% | 3
Modify recorded allergies. | 1 | 100% | 1 (6/6) | 41s | 1.8 (41s/22s) | 0% | 3
Access modified allergies. | 1 | 100% | 1 (2/2) | 18s | 1.3 (18s/13s) | 0% | 5

The results from the System Usability Scale scored the subjective

satisfaction with the system based on performance with these tasks to

be: 82.5.

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to

navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of

practice management, EHR, and medical billing software. The usability

testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the

current user interface, and provide evidence of usability in the EHR

Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time on task, path deviations, and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT(s). Participants in

the test were providers from private practices. In addition, participants had no direct connection to the development of

or organization producing the EHRUT(s). Participants were not from the

testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including

demographics, professional experience, computing experience and


user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

Part ID | Gender | Age | Education | Occupation/role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
P01 | Male | 47 | MD | Provider | 15 years | Intermediate | Intermediate | NA

1 participant (matching the

demographics in the section on Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the

participant schedule, and included each participant’s demographic information.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made.


During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Select the patient

• Enter and store allergies

• Modify stored allergies

• Access modified allergies

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

Tasks

should always be constructed in light of the study objectives.

4. PROCEDURES


Upon arrival, participants were greeted; their identity was verified and

matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an


informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in

this test, the usability administrator and the data logger. The usability

testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task,

errors, deviations, verbal responses, and post-test questionnaire were

recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All of the safety instruction

and evacuation procedures were valid, in place, and visible to the

participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in the PracticeSuite Inc. office. For testing, a laptop running the Windows 7 operating system was used, and the participants used a mouse and keyboard when interacting with the EHRUT.

The PracticeSuite EHR 17.0.0 was displayed on a 15” LCD screen at 1600x900 resolution with True Color (32-bit) settings. The application was set up by the vendor


according to the vendor’s documentation describing the

system set-up and preparation. The application itself was running in the cloud using a test database. Technically, the system performance (i.e., response time) was representative of what actual users would

experience in a field implementation. Additionally, participants were

instructed not to change any of the default system settings (such as

control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent
2. Moderator’s Guide
3. Post-test Questionnaire
4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture

required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you may withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 4 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations


3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert (optimal) performance on a task was 30 seconds, the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness:

Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency:

Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency:

Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction:

Task Rating

Participants’ subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.
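The observed/optimal ratios described in the scoring table above can be sketched as follows. The function names are this sketch's own, and the sample values are taken from the "Enter and store allergies" row of the results table (8 observed vs. 5 optimal steps; 45s observed vs. 23s optimal time).

```python
# Illustrative sketch of the observed/optimal ratios described in the scoring
# table above. Function names are hypothetical; values come from the
# "Enter and store allergies" task in the results table.

def path_deviation_ratio(observed_steps, optimal_steps):
    # Ratio of steps actually taken to steps on the optimal path.
    return observed_steps / optimal_steps

def time_deviation_ratio(observed_seconds, optimal_seconds):
    # Ratio of observed task time to expert (optimal) task time.
    return observed_seconds / optimal_seconds

print(round(path_deviation_ratio(8, 5), 1))    # 1.6, matching the report
print(round(time_deviation_ratio(45, 23), 2))  # 1.96; reported as 1.9
```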

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table

[x]).


Task                         N   Task Success   Path Deviation       Task Time   Task Time Deviation   Errors      Task Rating
                                 Mean (SD)      (Observed/Optimal)   Mean (SD)   (Observed/Optimal)    Mean (SD)   (5=Easy) Mean (SD)

Select the patient.          1   100%           1 (5/5)              20s         1.3 (20s/15s)         0%          5
Enter and store allergies.   1   100%           1.6 (8/5)            45s         1.9 (45s/23s)         0%          3
Modify recorded allergies.   1   100%           1 (6/6)              41s         1.8 (41s/22s)         0%          3
Access modified allergies.   1   100%           1 (2/2)              18s         1.3 (18s/13s)         0%          5

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be 82.5.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive, even though the participants were able to navigate the user interface within the acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a "low-cost usability scale that can be used for global assessments of systems usability," known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke's paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).
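As a minimal sketch of that computation, the following applies Brooke's standard SUS scoring: odd-numbered items contribute (score − 1), even-numbered items contribute (5 − score), and the sum is scaled by 2.5 to a 0-100 range. The example responses are hypothetical, not the participant's actual answers.

```python
# Standard SUS scoring (Brooke, 1996): odd-numbered item contributions are
# (score - 1), even-numbered are (5 - score); the sum is scaled by 2.5 to 0-100.
# The responses below are hypothetical examples, not this study's data.

def sus_score(responses):
    """responses: list of 10 ratings, 1-5, in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```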

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system

Each item was rated on a 5-point scale from 1 (Strongly disagree) to 5 (Strongly agree).


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports (NISTIR 7742)

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user-centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager
Phone: +1-510-284-2423
E-mail: [email protected]
Address: 37600 Central Court Suite #260, Newark, CA 94560

1. EXECUTIVE SUMMARY
2. INTRODUCTION
3. METHOD
   1. PARTICIPANTS
   2. STUDY DESIGN
   3. TASKS
   4. PROCEDURES
   5. TEST LOCATION
   6. TEST ENVIRONMENT
   7. TEST FORMS AND TOOLS
   8. PARTICIPANT INSTRUCTIONS
   9. USABILITY METRICS
4. DATA SCORING
5. RESULTS
   1. DATA ANALYSIS AND REPORTING
6. MAJOR FINDINGS
7. AREAS OF IMPROVEMENT


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and

validate the usability of the current user interface,

and provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider

matching the target demographic criteria served as participant

and used the EHRUT in simulated, but representative tasks.

This study collected performance data on 6 tasks typically conducted

on an EHR:

• Activate/Inactivate CDS (Clinical Decision Support)

• Configure/Enable clinical decision support for a particular system user.

• Select the patient

• Enter problem, allergy, medication, lab and vital signs and store data into EHR.

• Verify that CDS interventions are triggered automatically in the EHR.

• Access diagnostic and therapeutic reference information from the EHR.

During the 30-minute one-on-one usability test, each participant was greeted by the

administrator and asked to review and

sign an informed consent/release form (included in Appendix 3); they

were instructed that they could withdraw at any time. Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to

complete a series of tasks (given one at a time) using the EHRUT.

During the testing, the administrator timed the test and, along with the

data logger(s) recorded user performance data on paper and

electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a post-

test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Task                                          N   Task Success   Path Deviation       Task Time   Task Time Deviation   Errors      Task Rating
                                                  Mean (SD)      (Observed/Optimal)   Mean (SD)   (Observed/Optimal)    Mean (SD)   (5=Easy) Mean (SD)

Activate/Inactivate CDS Rule
(Clinical Decision Support Rule)              1   100%           1 (4/4)              15s         1.25 (15s/12s)        0%          5

Enable/Disable clinical decision support
for a particular system user role.            1   100%           1 (5/5)              17s         1.13 (17s/15s)        0%          5

Select the patient.                           1   100%           1 (5/5)              20s         1.3 (20s/15s)         0%          5

Enter problem, allergy, medication, lab and
vital signs and store data into EHR.          1   100%           1.7 (25/14)          261s        1.66 (261s/157s)      0%          3

Verify that CDS interventions are triggered
automatically in the EHR.                     1   100%           1 (1/1)              5s          1 (5s/5s)             0%          5

Access diagnostic and therapeutic reference
information from the EHR.                     1   100%           1 (1/1)              5s          1 (5s/5s)             0%          5

The results from the System Usability Scale scored the subjective

satisfaction with the system based on performance with these tasks to

be: 77.5.

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.


• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to

navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of

practice management, EHR, and medical billing software. The usability

testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the

current user interface, and provide evidence of usability in the EHR

Under Test (EHRUT). To this end, measures of effectiveness,

efficiency, and user satisfaction, such as time on task, path deviation, and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT. Participants in

the test were providers from private practices. In addition, participants had no direct connection to the development of

or organization producing the EHRUT(s). Participants were not from the

testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including

demographics, professional experience, computing experience and


user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

Part ID: P01
Gender: Male
Age: 47
Education: MD
Occupation/role: Provider
Professional Experience: 15 years
Computer Experience: Intermediate
Product Experience: Intermediate
Assistive Technology Needs: No

1 participant (matching the

demographics in the section on Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the

participant schedule, and included each participant's demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves as both a means to record or

benchmark current usability, but also to identify areas where

improvements must be made.


During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Activate/Inactivate CDS (Clinical Decision Support)

• Configure/Enable clinical decision support for a particular system user.

• Select the patient

• Enter problem, allergy, medication, lab and vital signs and store data into EHR.

• Verify that CDS interventions are triggered automatically in the EHR.

• Access diagnostic and therapeutic reference information from the EHR.

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

Tasks

should always be constructed in light of the study objectives.


4. PROCEDURES

Upon arrival, participants were greeted; their identity was verified and

matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an


informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in

this test, the usability administrator and the data logger. The usability

testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task,

errors, deviations, verbal responses, and post-test questionnaire were

recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All of the safety instruction

and evacuation procedures were valid, in place, and visible to the

participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility.

In this instance, the testing was conducted in the PracticeSuite Inc. office. For

testing, a laptop running the Windows 7 operating system was used.

The participants used a mouse and keyboard when interacting with the EHRUT.

The PracticeSuite EHR 17.0.0 used a 15" LCD screen with a resolution of 1600x900 and True Color (32-bit) color settings. The application was set up by the vendor


according to the vendor’s documentation describing the

system set-up and preparation. The application itself was running

in the cloud using a test database. Technically, the system performance

(i.e., response time) was representative of what actual users would

experience in a field implementation. Additionally, participants were

instructed not to change any of the default system settings (such as

control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent 2. Moderator’s Guide 3. Post-test Questionnaire 4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture

required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each

participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.


Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations

3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

evaluated, and the time data analyzed.

Measures Rationale and Scoring Effectiveness:

Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times in the Moderator's Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 30 seconds, the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness:

Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were recorded for failed tasks.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency:

Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency:

Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction:

Task Rating

Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.
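The time analysis described above (average time per task plus variance measures over successfully completed tasks) can be sketched as follows; the timing values are illustrative, not the study's raw data.

```python
# Sketch of the task-time variance measures described above: mean, sample
# standard deviation, and standard error over successful-task times.
# The times below are illustrative values, not the study's raw timing data.
import math
import statistics

times = [20, 45, 41, 18]  # seconds, one successfully completed attempt per task

mean = statistics.mean(times)
sd = statistics.stdev(times)       # sample standard deviation
se = sd / math.sqrt(len(times))    # standard error of the mean

print(round(mean, 1), round(sd, 1), round(se, 1))  # 31.0 14.0 7.0
```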

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table

[x]).


Task                                          N   Task Success   Path Deviation       Task Time   Task Time Deviation   Errors      Task Rating
                                                  Mean (SD)      (Observed/Optimal)   Mean (SD)   (Observed/Optimal)    Mean (SD)   (5=Easy) Mean (SD)

Activate/Inactivate CDS Rule
(Clinical Decision Support Rule)              1   100%           1 (4/4)              15s         1.25 (15s/12s)        0%          5

Enable/Disable clinical decision support
for a particular system user role.            1   100%           1 (5/5)              17s         1.13 (17s/15s)        0%          5

Select the patient.                           1   100%           1 (5/5)              20s         1.3 (20s/15s)         0%          5

Enter problem, allergy, medication, lab and
vital signs and store data into EHR.          1   100%           1.7 (25/14)          261s        1.66 (261s/157s)      0%          3

Verify that CDS interventions are triggered
automatically in the EHR.                     1   100%           1 (1/1)              5s          1 (5s/5s)             0%          5

Access diagnostic and therapeutic reference
information from the EHR.                     1   100%           1 (1/1)              5s          1 (5s/5s)             0%          5

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be 77.5.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive, even though the participants were able to navigate the user interface within the acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Participants rate each statement on a five-point scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in this system were well integrated

6. I thought there was too much inconsistency in this system

7. I would imagine that most people would learn to use this system very quickly

8. I found the system very cumbersome to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system
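Brooke’s computation, referenced above, follows a fixed rule: odd-numbered (positively worded) items contribute their score minus 1, even-numbered (negatively worded) items contribute 5 minus their score, and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that rule, for illustration only and not part of the certified report:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (positively worded) contribute (score - 1);
    even-numbered items (negatively worded) contribute (5 - score).
    The summed contributions are scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers yield 50.0; a maximally positive pattern yields 100.0.
print(sus_score([3] * 10))                         # 50.0
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))   # 100.0
```

A score of 77.5 or 82.5, as reported in this document, therefore reflects the scaled sum of the ten item contributions, not a simple average of the raw ratings.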


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports (NISTIR 7742)

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager
Phone: +1-510-284-2423
E-mail: [email protected]
Address: 37600 Central Court Suite #260, Newark, CA 94560

1. EXECUTIVE SUMMARY
2. INTRODUCTION
3. METHOD
   1. PARTICIPANTS
   2. STUDY DESIGN
   3. TASKS
   4. PROCEDURES
   5. TEST LOCATION
   6. TEST ENVIRONMENT
   7. TEST FORMS AND TOOLS
   8. PARTICIPANT INSTRUCTIONS
   9. USABILITY METRICS
4. DATA SCORING
5. RESULTS
   1. DATA ANALYSIS AND REPORTING
6. MAJOR FINDINGS
7. AREAS OF IMPROVEMENT


1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 1 healthcare provider matching the target demographic criteria served as participant and used the EHRUT in simulated, but representative tasks.

This study collected performance data on 2 tasks typically conducted on an EHR:

• Select the patient

• Enter and store orders for prescriptions

During the 30 minute one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 3); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task. Participant screens, head shots and audio were recorded for subsequent analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


Task | N | Task Success Mean (SD) | Path Deviation (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviation (Observed/Optimal) | Errors Mean (SD) | Task Rating (5=Easy) Mean (SD)

Select the patient | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5

Enter and store orders for prescriptions | 1 | 100% | 1.3 (13/10) | 59s | 1.2 (71s/56s) | 0% | 3

The results from the System Usability Scale scored the subjective satisfaction with the system based on performance with these tasks to be 82.5.

In addition to the performance data, the following qualitative observations were made:

- Major findings

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed tasks without many erroneous detours or deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication order is still excessive, even though the participants were able to navigate the user interface within the acceptable number of steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory software. Designed to present medical information to healthcare providers in private practices, the EHRUT consists of practice management, EHR and medical billing software. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency and user satisfaction, such as time on task, path deviation and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT(s). Participants in the test were providers from private practices. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received.

Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occupation/role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs

P01 | Male | 47 | MD | Provider | 15 years | Intermediate | Intermediate | NA

1 participant (matching the demographics in the section on Participants) was recruited and participated in the usability test. Participants were scheduled for 30 minute sessions with 5 minutes in between each session for debrief by the administrator(s) and data logger(s), and to reset systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule, and included each participant’s demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made.


During the usability test, participants interacted with 1 EHR. Each participant used the system in the same location, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

• Select the patient.

• Enter and store orders for prescriptions.

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users. Tasks should always be constructed in light of the study objectives.

4. PROCEDURES

Upon arrival, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an informed consent and release form (See Appendix 3). A representative from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in this test, the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.

• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Scoring is discussed below in Section 3.9.

Following the session, the administrator gave the participant the post-test questionnaire (e.g., the System Usability Scale, see Appendix 5), compensated them for their time, and thanked each individual for their participation.

Participants' demographic information, task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaire were recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a table, computer for the participant, and recording computer for the administrator. Only the participant and administrator were in the test room. All observers and the data logger worked from a separate room where they could see the participant’s screen and face shot, and listen to the audio of the session. To ensure that the environment was comfortable for users, noise levels were kept to a minimum with the ambient temperature within a normal range. All of the safety instructions and evacuation procedures were valid, in place, and visible to the participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in the PracticeSuite Inc. office. For testing, a laptop running the Windows 7 operating system was used. The participants used a mouse and keyboard when interacting with the EHRUT. The PracticeSuite, EHR 17.0.0 used a 15” LCD screen with a resolution of 1600x900 and True Color 32-bit color settings. The application was set up by the vendor according to the vendor’s documentation describing the system set-up and preparation. The application itself was running in the cloud using a test database. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-test Questionnaire
4. Acknowledgment Form

The Moderator’s Guide was devised so as to be able to capture required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own following the instructions very closely. Please note that we are not testing you, we are testing the system; therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.


Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given time (10 minutes) to explore the system and make comments. Once this task was complete, the administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations

3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2 to allow some time buffer, because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 30 seconds, then the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.
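The two rules above (success expressed as a percentage of attempts, and an allotted time of twice the mean expert measure) can be sketched as follows; the numbers are hypothetical, not taken from this report:

```python
def success_rate(successes, attempts):
    """Percentage of attempts on a task that ended in a 'Success'."""
    return 100.0 * successes / attempts

def allotted_time(optimal_times):
    """Target task time: mean of expert (optimal) measures, doubled
    to leave a buffer for non-expert participants."""
    mean_optimal = sum(optimal_times) / len(optimal_times)
    return 2 * mean_optimal

# Hypothetical values: 1 success in 1 attempt; expert runs near 30 seconds.
print(success_rate(1, 1))           # 100.0
print(allotted_time([28, 30, 32]))  # 60.0
```

With a single participant, as in this study, the success percentage for each task is necessarily either 0% or 100%.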

Effectiveness: Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.
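The observed/optimal ratio described above reduces to a single division; a small sketch with hypothetical step counts:

```python
def path_deviation_ratio(observed_steps, optimal_steps):
    """Ratio of observed path length to optimal path length.

    A ratio of 1.0 means the participant followed the optimal path
    exactly; larger values indicate detours from that path.
    """
    if optimal_steps <= 0:
        raise ValueError("optimal path must contain at least one step")
    return observed_steps / optimal_steps

# Hypothetical counts: a 5-step task done in 5 steps,
# and a 10-step task done in 13 steps.
print(path_deviation_ratio(5, 5))    # 1.0
print(path_deviation_ratio(13, 10))  # 1.3
```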


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.
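The mean and variance measures described above can be sketched as follows; the task times are hypothetical, and the sample standard deviation (n − 1 denominator) is assumed:

```python
import math

def time_stats(times):
    """Mean, sample standard deviation, and standard error of task times."""
    n = len(times)
    mean = sum(times) / n
    if n < 2:
        return mean, 0.0, 0.0
    variance = sum((t - mean) ** 2 for t in times) / (n - 1)
    sd = math.sqrt(variance)
    return mean, sd, sd / math.sqrt(n)

# Hypothetical times (seconds) from four successful completions of one task.
mean, sd, se = time_stats([18, 20, 22, 20])
print(round(mean, 1), round(sd, 2), round(se, 2))  # 20.0 1.63 0.82
```

With a single participant per task, as in this study, the standard deviation is undefined (or zero), which is why the tables above report a single time per task.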

Satisfaction: Task Rating

Participants’ subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.”

Table [x]. Details of how observed data were scored.

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses.

The usability testing results for the EHRUT are detailed below (see Table [x]).


Task | N | Task Success Mean (SD) | Path Deviation (Observed/Optimal) | Task Time Mean (SD) | Task Time Deviation (Observed/Optimal) | Errors Mean (SD) | Task Rating (5=Easy) Mean (SD)

Select the patient | 1 | 100% | 1 (5/5) | 20s | 1.3 (20s/15s) | 0% | 5

Enter and store orders for prescriptions | 1 | 100% | 1.3 (13/10) | 59s | 1.2 (71s/56s) | 0% | 3

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be 82.5.


6. MAJOR FINDINGS

• Every task tested and measured using summative testing methods was completed by the participants within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive, even though the participants were able to navigate the user interface within the acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a “low-cost usability scale that can be used for global assessments of systems usability” known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke’s paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

Participants rate each statement on a five-point scale from 1 (Strongly disagree) to 5 (Strongly agree):

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in this system were well integrated

6. I thought there was too much inconsistency in this system

7. I would imagine that most people would learn to use this system very quickly

8. I found the system very cumbersome to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system


EHR Usability Test Report of PracticeSuite, version EHR 17.0.0

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports (NISTIR 7742)

Product: PracticeSuite
Version: EHR 17.0.0
Adopted UCD Standard Name: NISTIR 7741
Adopted UCD Standard Description: This standard provides NIST guidance for those developing electronic health record (EHR) applications who need to know more about processes of user centered design (UCD). UCD ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building a usable user interface and a better user experience.

Adopted UCD Standard Citation URL: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

Date of Usability Test: 08/24/2014
Date of Report: 08/24/2014
Report Prepared By: PracticeSuite Inc.

Deepesh Damodaran, Project Manager
Phone: +1-510-284-2423
E-mail: [email protected]
Address: 37600 Central Court Suite #260, Newark, CA 94560

1. EXECUTIVE SUMMARY ......................................................................................................................................... 2 2. INTRODUCTION ...................................................................................................................................................... 6 3. METHOD ................................................................................................................................................................... 6

1. PARTICIPANTS ................................................................................................................................................... 6 2. STUDY DESIGN ................................................................................................................................................... 7 3. TASKS ................................................................................................................................................................... 8 4. PROCEDURES ...................................................................................................................................................... 8 5. TEST LOCATION ............................................................................................................................................... 11 6. TEST ENVIRONMENT ...................................................................................................................................... 11 7. TEST FORMS AND TOOLS .............................................................................................................................. 12 8. PARTICIPANT INSTRUCTIONS ...................................................................................................................... 12 9. USABILITY METRICS ....................................................................................................................................... 13

4. DATA SCORING ..................................................................................................................................................... 15 5. RESULTS ................................................................................................................................................................. 16

1. DATA ANALYSIS AND REPORTING ............................................................................................................. 16 6. MAJOR FINDINGS ................................................................................................................................................. 18 7. AREAS OF IMPROVEMENT ................................................................................................................................. 18

Test Results Summary for 2014 Edition EHR Certification, 14‐2412‐R‐0077‐PRA, Version 1.1, February 28, 2016

1. EXECUTIVE SUMMARY

A usability test of PracticeSuite, version EHR 17.0.0 ambulatory software was

conducted on 08/24/2014 by PracticeSuite Inc. The purpose of this test was to validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 1 healthcare provider matching the target demographic criteria served as the participant and used the EHRUT in simulated but representative tasks.

This study collected performance data on 4 tasks typically conducted

on an EHR:

• Select the patient
• View and reconcile problem list
• View and reconcile allergy list
• View and reconcile medication list

During the 30-minute one-on-one usability test, each participant was greeted by the

administrator and asked to review and

sign an informed consent/release form (included in Appendix 3); they

were instructed that they could withdraw at any time. Participants had prior experience with the EHR.


The administrator introduced the test, and instructed participants to

complete a series of tasks (given one at a time) using the EHRUT.

During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and

electronically. The administrator

did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent

analysis.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations

• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be

made from the identity of the participant to the data collected. Following

the conclusion of the testing, participants were asked to complete a post-

test questionnaire. Various recommended metrics, in accordance with the examples set

forth in the NIST Guide to the Processes Approach for Improving the

Usability of Electronic Health Records, were used to evaluate the

usability of the EHRUT. Following is

a summary of the performance and rating data collected on the EHRUT.


Task                                 N    Task Success    Path Deviation       Task Time    Task Time Deviations    Errors       Task Ratings
                                          Mean (SD)       (Observed/Optimal)   Mean (SD)    (Observed/Optimal)      Mean (SD)    Mean (SD), 5=Easy
Select the patient                   1    100%            1 (5/5)              20s          1.3 (20s/15s)           0%           5
View and reconcile problem list      1    100%            1 (5/5)              35s          1.15 (35s/30s)          0%           3
View and reconcile allergy list      1    100%            1 (5/5)              37s          1.15 (37s/32s)          0%           3
View and reconcile medication list   1    100%            1 (6/6)              60s          1.4 (60s/41s)           0%           5

The System Usability Scale score for subjective satisfaction with the system, based on performance of these tasks, was 82.5.

In addition to the performance data, the following qualitative observations

were made:

- Major findings

• All tasks tested and measured using summative testing methods were completed by the participant within the allocated target task time.

• The participants were able to navigate the user interface to

accomplish the listed tasks without many erroneous detours or

deviations from the optimal path.

- Areas for improvement

• The number of steps needed to create or enter a medication

order is still excessive, even though the participants were able to

navigate the user interface within the acceptable number of

steps.


2. INTRODUCTION

The EHRUT tested for this study was PracticeSuite, version EHR 17.0.0 ambulatory

software. Designed to present medical information to healthcare

providers in private practices, the EHRUT consists of practice management, EHR, and medical billing software. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time on task, path deviations, and errors, were captured during the usability testing.

3. METHOD

1. PARTICIPANTS

A total of 1 participant was tested on the EHRUT. Participants in the test were providers from private practices. In addition, participants had no direct connection to the development of the EHRUT or to the organization producing it. Participants were not from the testing or supplier organization. Participants were given the opportunity

to have the same orientation and level of training as the actual end users

would have received.

Recruited participants had a mix of backgrounds and demographic

characteristics conforming to the recruitment screener. The

following is a table of participants by characteristics, including

demographics, professional experience, computing experience and


user needs for assistive technology. Participant names were

replaced with Participant IDs so that an individual’s data cannot be

tied back to individual

identities.

#    Part ID    Gender    Age    Education    Occupation/role    Professional Experience    Computer Experience    Product Experience    Assistive Technology Needs
1    P01        Male      47     MD           Provider           15 years                   Intermediate           Intermediate          NA

1 participant (matching the demographics in the section on Participants) was recruited and participated in the usability test.

Participants were scheduled for 30 minute sessions with

5 minutes in between each session for debrief by the

administrator(s) and data logger(s), and to reset systems to proper

test conditions. A spreadsheet was used to keep track of the participant schedule and included each participant's demographics.

2. STUDY DESIGN

Overall, the objective of this test was to uncover areas where the

application performed well – that is, effectively, efficiently, and with

satisfaction – and areas where the application failed to meet the

needs of the participants. The data from this test may serve as a

baseline for future tests with an updated version of the same EHR

and/or comparison with other EHRs provided the same tasks are

used. In short, this testing serves both to record and benchmark current usability and to identify areas where improvements must be made.


During the usability test, participants interacted with 1 EHR. Each

participant used the system in the same location, and was provided with

the same instructions. The system was evaluated for effectiveness,

efficiency and satisfaction as defined by measures collected and

analyzed for each participant:

• Number of tasks successfully completed within the allotted time

without assistance

• Time to complete the tasks

• Number and types of errors

• Path deviations

• Participant’s verbalizations (comments)

• Participant’s satisfaction ratings of the system

Additional information about the various measures can be found in

Section 3.9 on Usability Metrics.

3. TASKS

A number of tasks were constructed that would be realistic and

representative of the kinds of activities a user might do with this EHR,

including:

• Select the patient

• View and reconcile problem list

• View and reconcile allergy list

• View and reconcile medication list

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

Tasks

should always be constructed in light of the study objectives.

4. PROCEDURES


Upon arrival, participants were greeted; their identity was verified and

matched with a name on the participant schedule. Participants were then assigned a participant ID. Each participant reviewed and signed an


informed consent and release form (See Appendix 3). A representative

from the test team witnessed the participant’s signature.

To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners.

The administrator moderated the session including administering

instructions and tasks. The administrator also monitored task times,

obtained post-task rating data, and took notes on participant comments.

A second person served as the data logger and took notes on task

success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific

instructions below):

• As quickly as possible making as few errors and deviations as

possible.

• Without assistance; administrators were allowed to give

immaterial guidance and clarification on tasks, but not

instructions on use.

• Without using a think aloud technique.

For each task, the participants were given a written copy of the task.

Task timing began once the administrator finished reading the question.

The task time was stopped once the participant indicated they had

successfully completed the task. Scoring is discussed below in Section

3.9.

Following the session, the administrator gave the participant the post-test

questionnaire (e.g., the System Usability Scale, see Appendix 5),


compensated them for their time, and thanked each individual for their

participation.

Participants' demographic information, task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaire responses were recorded into a spreadsheet.

Participants were thanked for their time.

5. TEST LOCATION

The test facility included a waiting area and a quiet testing room with a

table, computer for the participant, and recording computer for the

administrator. Only the participant and administrator were in the test

room. All observers and the data logger worked from a separate room

where they could see the participant’s screen and face shot, and listen to

the audio of the session. To ensure that the environment was

comfortable for users, noise levels were kept to a minimum with the

ambient temperature within a normal range. All of the safety instructions and evacuation procedures were valid, in place, and visible to the participants.

6. TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in the PracticeSuite Inc. office. For testing, a laptop running the Windows 7 operating system was used. The participants used a mouse and keyboard when interacting with the EHRUT. PracticeSuite EHR 17.0.0 was displayed on a 15" LCD screen at 1600x900 resolution with True Color (32-bit) color settings. The application was set up by the vendor


according to the vendor's documentation describing the system set-up and preparation. The application itself was running in the cloud using a test database. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

7. TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used,

including:

1. Informed Consent
2. Moderator's Guide
3. Post-test Questionnaire
4. Acknowledgment Form

The Moderator's Guide was devised to capture the required data.

8. PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each

participant (also see the full moderator’s guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 30 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in


case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR

and as their first task, were given time (10 minutes) to explore the

system and make comments. Once this task was complete, the

administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

9. USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving

the Usability of Electronic Health Records, EHRs should support a

process that provides a high level of usability for all users. The goal is for

users to interact with the system effectively, efficiently, and with an

acceptable level of satisfaction. To this end, metrics for effectiveness,

efficiency and user satisfaction were captured during the usability testing.

The goals of the test were to assess:

1. Effectiveness of PracticeSuite, version EHR 17.0.0 by measuring participant success rates and errors

2. Efficiency of PracticeSuite, version EHR 17.0.0 by measuring the average task time and path deviations


3. Satisfaction with PracticeSuite, version EHR 17.0.0 by measuring ease of use ratings


4. DATA SCORING

The following table (Table [x]) details how tasks were scored, errors

evaluated, and the time data analyzed.

Measures, Rationale, and Scoring

Effectiveness: Task Success

A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis.

The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by a factor of 2, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 30 seconds, then the allotted task time was 60 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failures

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were taken for errors.

The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant.

On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said "Begin" until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

The participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants' confidence in and likeability of PracticeSuite, version EHR 17.0.0 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly."

Table [x]. Details of how observed data were scored.
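The scoring rules described above can be sketched in code. The following Python snippet is an illustrative sketch only, not part of the certified product or the actual test tooling; the attempt-log structure and field names are assumptions made for the example.

```python
from statistics import mean, stdev

def allotted_time(expert_times_s):
    """Target task time: mean expert (optimal) time multiplied by
    a factor of 2, per the operational definition above."""
    return 2 * mean(expert_times_s)

def score_task(rows):
    """Compute success rate, path deviation ratio, task time stats,
    and observed/optimal time ratio for one task's attempts.
    `rows` is a hypothetical per-attempt log (field names assumed)."""
    n = len(rows)
    successes = [r for r in rows if r["success"]]
    # Path deviation: observed steps divided by optimal steps.
    path_ratio = mean(r["steps_observed"] / r["steps_optimal"] for r in rows)
    # Only successfully completed attempts enter the time analysis.
    times = [r["time_observed_s"] for r in successes]
    time_ratio = mean(r["time_observed_s"] / r["time_optimal_s"]
                      for r in successes)
    return {
        "N": n,
        "success_pct": 100 * len(successes) / n,
        "path_deviation": path_ratio,
        "mean_time_s": mean(times),
        "time_sd_s": stdev(times) if len(times) > 1 else 0.0,
        "time_ratio": time_ratio,
    }

# One hypothetical attempt, shaped like the "problem list" row:
attempts = [{"success": True, "steps_observed": 5, "steps_optimal": 5,
             "time_observed_s": 35.0, "time_optimal_s": 30.0}]
print(allotted_time([30.0]))                # 60.0
print(score_task(attempts)["success_pct"])  # 100.0
```

With a 30-second expert benchmark, the allotted time works out to 60 seconds, matching the worked example in the scoring description.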

5. RESULTS

1. DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods

specified in the Usability Metrics section above. Participants who failed to

follow session and task instructions had their data excluded from the

analyses.

The usability testing results for the EHRUT are detailed below (see Table [x]).


Task                                 N    Task Success    Path Deviation       Task Time    Task Time Deviations    Errors       Task Ratings
                                          Mean (SD)       (Observed/Optimal)   Mean (SD)    (Observed/Optimal)      Mean (SD)    Mean (SD), 5=Easy
Select the patient                   1    100%            1 (5/5)              20s          1.3 (20s/15s)           0%           5
View and reconcile problem list      1    100%            1 (5/5)              35s          1.15 (35s/30s)          0%           3
View and reconcile allergy list      1    100%            1 (5/5)              37s          1.15 (37s/32s)          0%           3
View and reconcile medication list   1    100%            1 (6/6)              60s          1.4 (60s/41s)           0%           5

The System Usability Scale (SUS) score for subjective satisfaction with the system, based on performance of these tasks, was 82.5.


6. MAJOR FINDINGS

• All tasks tested and measured using summative testing methods were completed by the participant within the allocated target task time.

• The participants were able to navigate the user interface to accomplish the listed

tasks without many erroneous detours or deviations from the optimal path.

7. AREAS OF IMPROVEMENT

• The number of steps needed to create or enter a medication order is still excessive,

even though the participants were able to navigate the user interface within the

acceptable number of steps.


Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

In 1996, Brooke published a "low-cost usability scale that can be used for global assessments of systems usability," known as the System Usability Scale or SUS. Lewis and Sauro (2009) and others have elaborated on the SUS over the years. Computation of the SUS score can be found in Brooke's paper, at http://www.usabilitynet.org/trump/documents/Suschapt.doc, or in Tullis and Albert (2008).

1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system

Each item is rated on a 5-point scale from 1 (Strongly disagree) to 5 (Strongly agree).
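Brooke's published scoring procedure for the ten items above can be illustrated with a short sketch. This is a generic implementation of the standard SUS formula (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5), assuming responses arrive as a simple list; it is not code from this study, and the example response set is hypothetical.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    responses on a 1-5 scale, per Brooke (1996): odd-numbered items
    contribute (response - 1), even-numbered items contribute
    (5 - response), and the sum is multiplied by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses, each between 1 and 5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical response set (not the study participant's actual answers):
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 2]))  # 97.5
```

A respondent who answers 3 (neutral) on every item scores exactly 50, which is a useful sanity check when wiring up a scoring spreadsheet.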


EHR Usability Test Risk Assessment Report of PracticeSuite, version EHR 17.0.0:

Risk Assessment:

Tasks in the Usability Test Scenarios were selected based on requirements to satisfy the Safety-Enhanced Design criterion (170.314(g)(3)) for EHR Meaningful Use (MU) certification and on good software development practices. Included in the MU requirements is that 'User tasks employed in the study are prioritized in accordance with the risk associated with user.'

The assessment of patient safety risk resulting from the software development, and then from end-user interaction with the EHR, begins during the design phase and continues through the development phase of the EHR functionality. The Product Owner and the development scrum team assess the requirements and workflow of the desired enhancement for potential patient safety and IT risks and actively plan to reduce these risks. PracticeSuite has defined the tasks to be performed in the summative usability tests based on their frequency of use, criticality of function, and those that may be most troublesome for users.

The following tables summarize the identification of potential risks and the development steps taken to mitigate risk in the clinical information reconciliation processes.

Product: PracticeSuite, version EHR 17.0.0

Date: 08/24/2014

Risk Assessment, Risk Level and Risk Mitigation.

(Columns: Potential Risk | Cause | Probability of Occurrence | Risk Level | Risk Mitigation)

Risk associated with General criteria

• Potential Risk: User selects the wrong patient. Cause: Two or more patients exist with similar data points. Probability: Rarely. Risk Level: High. Mitigation: Last Name, First Name, DOB, Phone, and Insurance, along with the patient picture, are available to identify the correct patient.

170.314(a)(1) Computerized provider order entry – medication orders

• Potential Risk: User enters the wrong Dosage, Frequency, or Route. Cause: Mistyping of information. Probability: Rarely. Risk Level: High. Mitigation: Dosage, Frequency, and Route are presented in pre-defined lists.

• Potential Risk: Wrong order processed. Cause: Order not verified by provider. Probability: Occasionally. Risk Level: High. Mitigation: Medication orders are taken to a review page before processing.

170.314(a)(1) Computerized provider order entry – laboratory orders/radiology orders

• Potential Risk: Users enter a wrong Laboratory or Radiology order. Cause: Users click an incorrect order from the list. Probability: Occasionally. Risk Level: High. Mitigation: Lab/Radiology orders are taken to a review page before processing.

170.314(a)(2) Drug-drug, drug-allergy interaction checks

• Potential Risk: Inaccurate allergy record entered. Cause: Historical data not considered. Probability: Occasionally. Risk Level: Moderate. Mitigation: Presentation of collected patient information and imported information allows providers to reconcile a single, complete allergy list.

• Potential Risk: Not able to see mild and moderate interactions. Cause: Users update the alert settings to show only severe interactions. Probability: Rarely. Risk Level: Moderate. Mitigation: Only administrative users are allowed to change interaction alert settings. Even though the screen does not show the mild and moderate interactions, it indicates the existence of these interactions.

170.314(a)(6) Medication list

No additional risks were identified other than those specified under "170.314(a)(1) Computerized provider order entry – medication orders."

170.314(a)(7) Medication allergy list

No critical user errors were identified or observed as part of this scenario.

170.314(a)(8) Clinical decision support

• Potential Risk: Unable to view CDSS alerts. Cause: User inactivated the CDSS alerts. Probability: Rarely. Risk Level: Moderate. Mitigation: Only administrator users are allowed to edit the CDSS settings.

• Potential Risk: CDSS rule shown to wrong user. Cause: Users configured the CDSS for a particular user type. Probability: Rarely. Risk Level: Moderate.

170.314(b)(3) Electronic prescribing

No critical user errors were identified or observed as part of this usability task.

170.314(b)(4) Clinical information reconciliation

• Potential Risk: Inaccurate problem list/allergy/medication record entered. Cause: Historical data not considered. Probability: Occasionally. Risk Level: Moderate. Mitigation: Presentation of collected patient information and imported information allows the clinician to reconcile a single, complete problem/allergy/medication list.


Appendix C: Quality Management System

©2016 InfoGard. May be reproduced only in its original entirety, without revision 11


Quality Management System Attestation Form-EHR-37-V02

InfoGard Laboratories, Inc. Page 1

For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information

Vendor Name PracticeSuite, Inc

Product Name PracticeSuite

Product Version EHR-17.0.0

Quality Management System

Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of EHR product.

Based on Industry Standard (for example ISO9001, IEC 62304, ISO 13485, etc.). Standard:

A modified or “home-grown” QMS.

No QMS was used.

Was one QMS used for all certification criteria or were multiple QMS applied?

One QMS used.

Multiple QMS used.

Description or documentation of QMS applied to each criteria:

Not Applicable.

Statement of Compliance

I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an Authorized Representative

Deepesh Damodaran

Date 10/7/2014


Appendix D: Privacy and Security


Privacy and Security Attestation Form-EHR-36-V03


Vendor and Product Information

Vendor Name PracticeSuite, Inc

Product Name PracticeSuite

Product Version EHR-17.0.0

Privacy and Security

170.314(d)(2) Auditable events and tamper-resistance

Not Applicable (did not test to this criterion)

Audit Log:

Cannot be disabled by any user.

Audit Log can be disabled.

The EHR enforces that the audit log is enabled by default when initially configured.

Audit Log Status Indicator:

Cannot be disabled by any user.

Audit Log Status can be disabled.

The EHR enforces a default audit log status. Identify the default setting (enabled or disabled): Enabled

There is no Audit Log Status Indicator because the Audit Log cannot be disabled.

Encryption Status Indicator (encryption of health information stored locally on end-user devices):

Cannot be disabled by any user.

Encryption Status Indicator can be disabled.

The EHR enforces a default encryption status. Identify the default setting (enabled or disabled):

There is no Encryption Status Indicator because the EHR does not allow health information to be stored locally on end user devices.

Identify the submitted documentation that describes the inability of the EHR to allow users to disable the audit logs, the audit log status, and/or the encryption status: Document submitted "AuditLog Process.docx"

Identify the submitted documentation that describes the method(s) by which the EHR protects 1) recording of actions related to electronic health information, 2) recording of audit log status, and 3) recording of encryption status from being changed, overwritten, or deleted by the EHR technology: Document submitted "AuditLog Process.docx"


Identify the submitted documentation that describes the method(s) by which the EHR technology detects whether the audit log has been altered: Document submitted "AuditLog Process.docx"
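The submitted documentation is not reproduced here. As a generic illustration only, and not a description of PracticeSuite's actual mechanism, one common way an EHR can detect that an audit log has been altered is hash chaining: each entry's hash covers both the entry and the previous hash, so modifying any record invalidates every subsequent hash.

```python
import hashlib

GENESIS = "0" * 40  # arbitrary starting value for the chain


def chain_entry(prev_hash: str, record: str) -> str:
    """Hash the previous entry's hash together with the new record."""
    return hashlib.sha1((prev_hash + record).encode("utf-8")).hexdigest()


# Build a small audit log as (record, hash) pairs.
log = []
prev = GENESIS
for rec in ["user login", "viewed chart 42", "user logout"]:
    prev = chain_entry(prev, rec)
    log.append((rec, prev))


def verify(entries) -> bool:
    """Recompute the chain; any edited record breaks every later hash."""
    prev = GENESIS
    for rec, h in entries:
        prev = chain_entry(prev, rec)
        if prev != h:
            return False
    return True


assert verify(log)
log[1] = ("viewed chart 99", log[1][1])  # tamper with one record
assert not verify(log)
```

The record names and the SHA-1 choice here are illustrative; a production scheme would typically also protect the chain head (for example, with a signature or write-once storage).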

170.314(d)(7) End-user device encryption

Storing electronic health information locally on end-user devices (e.g., temp files, cookies, or other caching approaches).

Not Applicable (did not test to this criterion)

The EHR does not allow health information to be stored locally on end-user devices.

Identify the submitted documentation that describes the functionality used to prevent health information from being stored locally: Document submitted "PracticeSuite_FIPS_140-2_Desc.docx"

The EHR does allow health information to be stored locally on end-user devices.

Identify the FIPS 140-2 approved algorithm used for encryption:

Identify the submitted documentation that describes how health information is encrypted when stored locally on end-user devices:

The EHR enforces default configuration settings that either enforce the encryption of locally stored health information or prevent health information from being stored locally.

Identify the default setting:

170.314(d)(8) Integrity

Not Applicable (did not test to this criterion)

Identify the hashing algorithm used for integrity (SHA-1 or higher): SHA-1
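The form records SHA-1 as the hashing algorithm used for integrity. As an illustration only (not PracticeSuite's implementation), an integrity check with this algorithm computes a digest of the payload and compares it against the digest received or stored alongside it; any change to the payload yields a different digest.

```python
import hashlib


def sha1_digest(payload: bytes) -> str:
    """Return the hex SHA-1 digest of a payload."""
    return hashlib.sha1(payload).hexdigest()


# Hypothetical payload for illustration.
message = b"CCDA document contents"
digest = sha1_digest(message)  # stored or transmitted alongside the message

# Unmodified payload: digests match.
assert sha1_digest(message) == digest

# Any alteration produces a different digest.
assert sha1_digest(b"CCDA document content") != digest
```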

170.314(e)(1) View, Download, and Transmit to 3rd Party

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption: Encrypted using the 128-bit symmetric key algorithm AES_128_CBC.

170.314(e)(3) Secure Messaging

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption: Encrypted using the 128-bit symmetric key algorithm AES_128_CBC.

Statement of Compliance

I, the undersigned, attest that the statements in this document are accurate.

Vendor Signature by an Authorized Representative: Deepesh Damodaran

Date 10/7/2014


Test Results Summary Document History

Version   Date         Description of Change
V1.0      12/18/2014   Initial release
V1.1      2/28/2016    Updated Safety‐Enhanced Design report

END OF DOCUMENT
