Research in Human-Computer Interaction for VA Clinical Decision Support
July 15th, 2008
Jason J. Saleem, PhD
VA HSR&D Center on Implementing Evidence-Based Practice; IU Center for Health Services & Outcomes Research, Regenstrief Institute
Presentation Overview
- Brief overview of human factors and HCI
- Human factors studies involving VA CPRS and computerized clinical reminders
- Current work: AHRQ grant (Doebbeling, PI), "Improving Integration of CDS into Workflow"
- Human-Computer Interaction (HCI) / IT Lab
- Questions / Discussion
Human Factors Engineering
- Study of human physical and cognitive capabilities and limitations, and the application of that knowledge to system design
- Design of interfaces between people and technology:
  - Human-machine interface technology
  - Human-environment interface technology
  - Human-job interface technology
  - Human-software interface technology (human-computer interaction)
  - Human-organization interface technology
Common Human Factors Methods
- Usability study: performance-based (time on task, error rates) or scenario-based (think-aloud technique)
- Simulation study
- Heuristic evaluation
- Cognitive task analysis
- Card sorting
- Ethnography / naturalistic observation
Kushniruk and Patel review these methods in the Journal of Biomedical Informatics (2004).
Human Factors Studies – VA Computerized Clinical Reminders
Observational Field Study: barriers and facilitators to clinical reminder use
Saleem JJ, Patterson ES, Militello L, Render ML, Orshansky G, Asch SM. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc. 2005;12(4):438-447.

Follow-up Simulation Study: A vs. B comparison of redesign recommendations
Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007;14(5):632-640.

This work was funded by VA HSR&D Merit Review grant TRX 02-216, "Human Factors and the Effectiveness of Computerized Clinical Reminders"; Principal Investigators: Emily S. Patterson, PhD, and Steven M. Asch, MD, MPH; 7/03 to 6/06.
Observational Field Study
- Ethnographic, "naturalistic" observation: non-intrusive shadowing of nurses and physicians
- 3 observers, 4 VA hospitals, 2 days per site
- Captured observable activities and verbalizations
- Self-report data about how artifacts (tools) support or hinder performance
- Qualitative field data
Field Study Participants
Field Study Results
- Barrier 1: Coordination between nurses and providers
- Barrier 2: Satisfying reminders while not with the patient
- Barrier 3: Workload
- Barrier 4: Lack of flexibility
- Barrier 5: Poor usability
Facilitators
- Limit the number of reminders
- Position computer workstations strategically
- Improve integration of reminders into clinical workflow
- Provide a feedback mechanism
Simulation Study
- A vs. B comparison study with 16 non-VA nurses
- Hypotheses: compared to the current design, the redesigned interface will:
  1. Have greater learnability
  2. Have increased efficiency
  3. Have a lower perceived workload
  4. Have greater perceived user satisfaction
Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007;14(5):632-640.
[Slide: screenshot of a fictitious patient record]

Design changes

[Slides: fictitious patient record and redesigned prototype]
Dependent Measures

Attribute          | Measure or Tool
-------------------|---------------------------
Learnability       | Time to reach proficiency
Efficiency         | Time to complete scenarios
Perceived workload | NASA Task Load Index (TLX)
User satisfaction  | Usability questionnaire
NASA Task Load Index (TLX)
Results: Learnability
- Time limit = 300 sec (5 min)
- The learnability difference is statistically significant: time to satisfy a clinical reminder with design B (redesign) was significantly less than with design A (current system)
- t-test: t = 4.365; t(.05, 14) = 1.761; p < 0.001

Time to satisfy a clinical reminder (sec):

Subj | A-sec | B-sec
-----|-------|------
1    | 300   |
2    |       | 124
3    | 300   |
4    |       | 174
5    | 300   |
6    |       | 47
7    | 186   |
8    |       | 70
9    | 300   |
10   |       | 68
11   | 300   |
12   |       | 206
13   | 300   |
14   |       | 140
15   | 300   |
16   |       | 300
Mean | 286   | 141

A = current clinical reminder design; B = redesigned prototype
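The reported statistic can be reproduced from the slide's numbers. A minimal standard-library Python sketch follows, assuming (as the alternating columns and df = 14 suggest) that each subject was timed on one design, with a pooled-variance two-sample t-test:

```python
from statistics import mean, variance

# Times (sec) transcribed from the slide; 300 s was the task time limit.
# Assumption: odd-numbered subjects used design A, even-numbered design B.
a = [300, 300, 300, 186, 300, 300, 300, 300]   # design A (current)
b = [124, 174, 47, 70, 68, 206, 140, 300]      # design B (redesign)

na, nb = len(a), len(b)
# Pooled sample variance, df = na + nb - 2 = 14
sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
t_stat = (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

print(round(mean(a)), round(mean(b)))  # 286 141, matching the slide's means
print(round(t_stat, 3))                # 4.365, matching the reported t
```

Since 4.365 exceeds the one-tailed critical value t(.05, 14) = 1.761, the difference is significant at p < 0.001.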
Results: Efficiency
A paired t-test showed that users completed 2 of the 5 patient scenarios in significantly less time with design B (p < 0.05).
Results: Workload, Mental Demand Subscale
- Mental demand with design B was rated significantly lower than with design A
- Means: A = 39.6, B = 34.2
- Paired t-test: t = 1.840; t(.05, 15) = 1.753; p = 0.043

Subj | A    | B
-----|------|-----
01   | 88.0 | 69.0
02   | 51.0 | 32.0
03   | 47.0 | 61.0
04   | 27.0 | 13.0
05   | 21.0 | 11.0
06   | 64.0 | 75.0
07   | 11.0 | 5.0
08   | 32.0 | 33.0
09   | 65.0 | 43.0
10   | 16.0 | 17.0
11   | 35.0 | 15.0
12   | 42.0 | 44.0
13   | 27.0 | 15.0
14   | 50.0 | 58.0
15   | 51.0 | 46.0
16   | 6.0  | 10.0
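The paired comparison can likewise be reproduced from the subject-level ratings; a minimal standard-library Python sketch (data transcribed from the slide):

```python
from statistics import mean, stdev

# NASA-TLX Mental Demand ratings from the slide
# (A = current clinical reminder design, B = redesigned prototype)
a = [88, 51, 47, 27, 21, 64, 11, 32, 65, 16, 35, 42, 27, 50, 51, 6]
b = [69, 32, 61, 13, 11, 75, 5, 33, 43, 17, 15, 44, 15, 58, 46, 10]

diffs = [x - y for x, y in zip(a, b)]  # paired differences, df = 15
# Paired t statistic: mean difference over its standard error
t_stat = mean(diffs) / (stdev(diffs) / len(diffs) ** 0.5)

print(round(mean(a), 1), round(mean(b), 1))  # 39.6 34.2
print(round(t_stat, 3))                      # 1.84, matching the reported t
```

Because 1.840 exceeds the one-tailed critical value t(.05, 15) = 1.753, the reduction in rated mental demand is significant; the same calculation on the Frustration subscale data below yields the reported t = 2.040.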
Results: Workload, Frustration Subscale
- Frustration with design B was rated significantly lower than with design A
- Means: A = 33.8, B = 27.7
- Paired t-test: t = 2.040; t(.05, 15) = 1.753; p = 0.030

Subj | A    | B
-----|------|-----
01   | 6.0  | 6.0
02   | 49.0 | 53.0
03   | 78.0 | 64.0
04   | 32.0 | 5.0
05   | 26.0 | 11.0
06   | 12.0 | 18.0
07   | 13.0 | 6.0
08   | 40.0 | 28.0
09   | 47.0 | 26.0
10   | 38.0 | 48.0
11   | 44.0 | 22.0
12   | 23.0 | 35.0
13   | 13.0 | 11.0
14   | 59.0 | 65.0
15   | 52.0 | 40.0
16   | 9.0  | 5.0
Usability Questionnaire
- Likert-type scale (1 = strongly disagree; 7 = strongly agree)
- Sample items:
  - "The organization of the information on the system's screens is clear."
  - "The display layouts simplify tasks."
  - "The sequence of displays is confusing."
See Saleem et al., J Am Med Inform Assoc. 2007;14(5):632-640, for complete results.
Implications for Design
- Modest design modifications to better integrate clinical reminders with CPRS decreased the time to reach proficiency in reminder use
- This could affect the willingness of new users to adopt and consistently use clinical reminders
Conclusions
- Human factors methods should be routinely used to rapidly collect data that:
  - support design decisions formatively (i.e., prior to implementation)
  - improve user performance and usability
  - reduce cost by addressing design issues pre-implementation
- This model is still not widely adopted in healthcare
AHRQ Grant: Improving Integration of CDS into Workflow (Doebbeling, PI)
- AHRQ ACTION Collaborative: Improving Quality Through Health IT (RFTO #8)
- CDS for colorectal cancer screening
- Team: Brad Doebbeling, MD, MSc; David Haggstrom, MD, MAS; Jason Saleem, PhD; Laura Militello, MA; Heather Hagg, MS; Jeff Linder, MD, MPH; Paul Dexter, MD; Brian Dixon, MPA; et al.
Study Objectives
1. Identify key approaches for effective integration of CDS for colorectal cancer screening into clinical workflow.
2. Test alternatives through controlled simulation analysis.
3. Evaluate the improved CDS design after subsequent implementation at a VA test site.
Figure. Project Overview
- Phase 1: Key informant interviews on site-specific best practices for integrating colorectal cancer screening CDS into workflow; direct observation of colorectal cancer screening CDS to identify barriers and facilitators to workflow integration
- Phase 2: Rapid prototyping of CDS design alternatives based on Phase 1 findings; simulation study to test the impact of CDS design alternatives on efficiency, usability, and workload
- Phase 3: Implementation in a primary care clinic after the simulation study; evaluation in the primary care clinic
Phase 1: Observational Sites
- VAMC, West Haven, CT: CRC screening computerized clinical reminder
- VAMC, Columbia, SC: OncWatch CDS (clinical reminder and management tool); divides patients (based on the data) into four cohorts defined by their risk/needs; creates a "fail-safe" system to identify patients and ensure follow-up recommendations are being fulfilled
- Regenstrief Institute, 2 Indianapolis clinics: encounter-form reminders for CRC screening
- Partners Healthcare, Brigham & Women's Hospital: previous failed attempts to implement CRC screening CDS ("no easy way to feed back that an adequate colonoscopy was done and was normal"); new CRC screening tool being piloted this year
Phases 2 & 3: 2009
Summary
- Clinical reminder studies and current work with AHRQ on CDS workflow integration use qualitative field observation followed by scenario-driven, comparative usability testing of experimental prototypes in a simulated setting
- Clinical software development may benefit from this paradigm if it is more widely followed

Acknowledgments: Emily S. Patterson, PhD; Steven M. Asch, MD, MPH; Marta L. Render, MD; Bradley N. Doebbeling, MD, MSc
Human-Computer Interaction (HCI) / IT Lab (Indianapolis VAMC)
- Operational May 2008
- Rapid prototyping of design alternatives
- Usability testing / simulation studies
- Specialized software and data collection equipment
- VA network & university network
- Graduate students specializing in usability and human-computer interaction
Perfect Timing?
Just published: Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, Campbell E, Bates DW. Grand challenges in clinical decision support. Journal of Biomedical Informatics. 2008;41(2):387-392.
- The #1-ranked challenge in clinical decision support: "Improve the human-computer interface"
- The paper was written to influence funders and policy makers
Human-Computer Interaction / Information Technology Laboratory (HCI/IT Lab), Room D-5014

[Figure: lab layout diagram. Equipment: VA laptop (Morae, Snag-it); VA desktops (Visual Basic); IU laptop (Morae, Snag-it); IU desktops (Visual Basic, Azure x 1 PC); 47-inch monitor; 17-inch monitor and player for the researcher's office; VA and IU LAN outlets; portable DigiCam with WA adapter.]
Current Project Portfolio for the HCI/IT Lab (affiliated projects)

Funded:
- VA STP: "Factors Influencing Effective Implementation of My HealtheVet" (Chumbler, PI)
- AHRQ RFTO #8 HIT: "Implementing and Improving the Integration of Decision Support into Outpatient Clinical Workflow" (Doebbeling, PI)
- VA CDA: "Colorectal Cancer Survivor Surveillance Care and Personal Health Records" (Haggstrom)
- Purdue-Lilly Seed Grant: "Integrating Pharmacogenomic-guided Dosing into Clinical Practice" (Overholser, PI)
- Doctoral dissertation: "Impact of Information Flow and Prioritization on the Use of Computerized Clinical Reminders" (Wu)

In process:
- VA IIR for June '08: "The Effects of Exam Room Computing on Patient-Centered Communication" (Frankel, PI)
- VA IIR for Dec '08: "Barriers and Facilitators to Providers' Adoption of My HealtheVet" (Chumbler, PI)
- VA CDA or IIR for Dec '08: "Circumventing Health IT with Paper: Identifying Patient Safety Risks" (Saleem)
Focus Areas
- Rapid prototyping of CDS and other health IT
- Applied usability studies
- Research experiments through simulation (as part of a broader research agenda, including field study)
- Knowledge management capabilities; prototype KM strategies
Goals
1. Become a warehouse for user-centered design: provide early input through rigorous human factors methods for health IT software and hardware design.
2. Become a resource for healthcare informatics research: design research studies to improve health IT prior to implementation.
3. Become a bridge for collaboration VHA-wide and external to VA:
   - Partnerships with RI, IU School of Informatics, Purdue, etc.
   - Partnership with Roudebush VA and VISN 11 CIO on local projects
   - Partnership with VA Office of Information & Technology on national-level projects
   - Partnership with non-VA research groups on broad health IT projects (e.g., CDS Consortium with Partners Healthcare, Regenstrief, etc.)
Actively seek strategic teaming opportunities for collaborative research to transfer knowledge out of the Lab to meet real-world challenges.
Opportunities / Partnerships
- Resource for Center investigators to conduct studies as part of an overall research proposal or CDA
- Pilot studies for rapid data collection on usability and piloting of new technology to support future proposals
- Demonstration of the Center's IT research capabilities to visitors and collaborators
- Graduate students and internships to support work in the Lab
- Potential partnership with the VA Office of Information & Technology to provide input during clinical application development, conduct usability testing, and assess impact on patients and providers
- Potential partnership with Roudebush VA and VISN 11 CIO on local projects
Contact
Jason J. Saleem, PhD
Human Factors Engineer
VA Center on Implementing Evidence-Based Practice
IU Center for Health Services & Outcomes Research, Regenstrief Institute