DETERMINING THE FACTORS CONTRIBUTING TO ELECTRONIC REFERRAL SYSTEM ADOPTION BY RADIATION ONCOLOGISTS THROUGH USER-CENTRED DESIGN
by
Arun Sashi Chandran
A thesis submitted in conformity with the requirements for the degree of MHSc in Clinical Engineering
Graduate Department of IBBME University of Toronto
© Copyright by Arun Sashi Chandran 2013
DETERMINING THE FACTORS CONTRIBUTING TO ELECTRONIC REFERRAL SYSTEM ADOPTION BY RADIATION ONCOLOGISTS THROUGH USER-CENTRED DESIGN
By
Arun Sashi Chandran
MHSc in Clinical Engineering 2013, Graduate Department of IBBME, University of Toronto
Abstract
This study used usability engineering methods to identify facilitators and barriers to
electronic referral system adoption by radiation oncologists at Princess Margaret Cancer Centre, and
provide recommendations for electronic referral system design. Analyses included workflow analysis of
radiation oncologists reviewing referrals, belief elicitation interviews with radiation oncologists, a
heuristic evaluation of an existing electronic referral system interface, and cognitive walkthrough of that
interface with radiation oncologists. Based on these findings, the system interface was redesigned using
mock-up software to address identified usability issues. The existing and redesigned interfaces were
compared using observational usability testing with radiation oncologists. The redesigned system
interface yielded reduced task times and enhanced user satisfaction as compared to the existing
interface. Thus, user-centred design was useful in determining facilitators and barriers to e-
referral adoption.
Acknowledgements
Firstly, I would like to sincerely thank my supervisor Dr. Sara Urowitz for her direction and support
throughout my thesis. Thank you for your constant guidance and keeping me on track every step of the
way. Also, thank you to my co-supervisor Dr. Joseph Cafazzo for your invaluable expertise and
knowledge. I would also like to thank my additional committee member Dr. David Wiljer for initially
agreeing to accept me as a student, and believing in my abilities early on. I learned so much from all of
you over the past two years and I am truly grateful. I would also like to thank Dr. Emily Seto, who
provided her expertise as my external reviewer.
My gratitude also goes to members of the ELLICSR team, particularly Yaser Alyounes, for facilitating
access to ARMS and providing his expertise as my primary contact with the application development
team. Thanks to him and Michael Crupi for providing their skills as usability heuristic evaluators. Thanks
also to Menaka Pulandiran for guiding me through the REB process – I would not have been able to
navigate this on my own. Thank you to Angela Dosis for acting as the Principal Investigator when
required.
I would like to thank the physicians from the Radiation Medicine Program who participated in my study,
as well as the administrative secretaries who helped me fit into their busy schedules. Thank you to
Roxana Sultan for assisting me with recruiting study participants.
I would also like to thank staff from the Centre for Global eHealth Innovation: Alvita Chan, whose
previous thesis work aided in formulating my methodology, and Stefano Gimli and Christopher
Flewwelling for helping me with usability testing software.
Thank you to the Princess Margaret Cancer Foundation for making this study possible.
Thanks to my classmates for your support and helping the last two years to fly by.
Finally, thank you to my family for your love and support.
Contents
Abstract ......................................................................................................................................................... ii
Acknowledgements ...................................................................................................................................... iii
List of Tables ............................................................................................................................................... viii
List of Figures ................................................................................................................................................ ix
1 Background and Rationale.....................................................................................................................1
1.1 Medical Referrals ...........................................................................................................................1
1.1.1 Referrals and Patient-Centred Care ......................................................................................2
1.2 Inefficiencies in the Referral Process ............................................................................................2
1.3 Electronic Referrals .......................................................................................................................3
1.3.1 Electronic Referrals and Patient-Centred Care .....................................................................4
1.4 Patient Referrals at Princess Margaret Hospital ...........................................................................4
1.4.1 Ambulatory Referral Management System (ARMS) ..............................................................5
1.5 Technology Adoption ....................................................................................................................5
1.5.1 Diffusion of Innovations and Technology Acceptance Model ...............................................6
1.6 Usability Engineering and User-Centred Design............................................................................9
2 Research Question and Objectives ..................................................................................................... 11
2.1 Objectives ................................................................................................................................... 11
2.2 Thesis Statement ........................................................................................................................ 11
3 Methodology ...................................................................................................................................... 12
3.1 Workflow Observations .............................................................................................................. 12
3.2 Heuristic Evaluation of ARMS User Interface ............................................................. 13
3.3 Interviews ................................................................................................................................... 15
3.4 Cognitive Walkthrough of ARMS User Interface ........................................................ 16
3.5 Interface Redesign ...................................................................................................................... 17
3.6 Observational Usability Test of ARMS and Prototype User Interfaces ....................... 18
4 Results ................................................................................................................................................ 21
4.1 Workflow Observations .............................................................................................................. 21
4.1.1 Workflow Mapping ............................................................................................................. 21
4.1.2 Notable Observations ......................................................................................................... 24
4.1.3 Identification of Issues........................................................................................................ 24
4.2 Heuristic Evaluation .................................................................................................................... 24
4.2.1 Classification of Usability Issues by Severity ...................................................................... 25
4.2.2 Classification of Usability Issues by Heuristics Violated ..................................................... 31
4.2.3 Classification of Usability Issues by Task ............................................................................ 32
4.3 Interviews ................................................................................................................................... 32
4.3.1 Communication .................................................................................................................. 33
4.3.2 Efficiency............................................................................................................................. 34
4.3.3 Integration .......................................................................................................................... 34
4.4 Cognitive Walkthrough ............................................................................................................... 35
4.4.1 Integration .......................................................................................................................... 36
4.4.2 Flexibility ............................................................................................................................. 36
4.4.3 Usability .............................................................................................................................. 37
4.5 System Redesign ......................................................................................................................... 38
4.5.1 Home Screen ...................................................................................................................... 39
4.5.2 Referral Details Screen ....................................................................................................... 40
4.5.3 Accept Referral Screen ....................................................................................................... 41
4.5.4 Confirmation Screen ........................................................................................................... 42
4.6 Usability Testing for Redesigned E-Referral System .................................................................. 43
4.6.1 Task Completion Times....................................................................................................... 43
4.6.2 User Satisfaction ................................................................................................................. 44
4.6.3 Other Observations ............................................................................................................ 46
5 Discussion ........................................................................................................................................... 48
5.1 Workflow Observations .............................................................................................................. 48
5.1.1 Limitations .......................................................................................................................... 50
5.2 Heuristic Evaluation .................................................................................................................... 51
5.2.1 Limitations .......................................................................................................................... 52
5.3 Interviews ................................................................................................................................... 52
5.3.1 Limitations .......................................................................................................................... 56
5.4 Cognitive Walkthrough ............................................................................................................... 57
5.4.1 Limitations .......................................................................................................................... 59
5.5 Usability Testing ......................................................................................................................... 59
5.5.1 Limitations .......................................................................................................................... 63
6 Conclusion & Recommendations ....................................................................................................... 66
6.1 Recommendations ...................................................................................................................... 67
6.2 Future Work ............................................................................................................................... 68
7 References .......................................................................................................................................... 69
8 Appendix A: Workflow Analysis .......................................................................................................... 76
8.1 Additional Process Maps ............................................................................................................ 76
9 Appendix B: Heuristic Evaluation ....................................................................................................... 78
9.1 Heuristic Evaluation Criteria ....................................................................................................... 78
9.2 ARMS Heuristic Violations .......................................................................................................... 82
10 Appendix C: Interviews ................................................................................................................. 111
10.1 Interview Instrument ................................................................................................................ 111
10.2 Interview Themes – Supporting Statements ............................................................................ 112
11 Appendix D: Cognitive Walkthrough ............................................................................................ 123
11.1 Walkthrough Themes – Supporting Statements ...................................................................... 123
12 Appendix E: Usability Testing Protocol ......................................................................................... 136
12.1 Research student role............................................................................................................... 136
12.2 Items to give to participant (prior to testing) ........................................................................... 136
12.3 Scenario set-up ......................................................................................................................... 136
12.4 Introduction to study ................................................................................................................ 136
12.5 Training ..................................................................................................................................... 137
12.6 Experiment ............................................................................................................................... 137
12.7 Cases ......................................................................................................................................... 138
12.8 Usability and Usefulness Questionnaire ................................................................................... 138
12.9 Usability Testing Preference and Performance Results ........................................................... 139
List of Tables
Table 1: Heuristic Evaluation Severity Rating Scale .................................................................................... 14
Table 2: Use Case Scenarios and Tasks....................................................................................................... 19
Table 3: Average Task Completion Times for Existing and Redesigned Interface Mockups ...................... 43
Table 4: Usability Survey Questions ........................................................................................................... 45
Table 5: ARMS Heuristic Violations ............................................................................................................ 82
Table 6: An e-referral system should effectively supplement or substitute the various modes of
communication utilized by physicians and administrators ...................................................................... 112
Table 7: Verbal communication between a referring and receiving physician is the most effective mode
of communication for referrals, and is absolutely necessary for urgent cases ....................................... 113
Table 8: Physicians do not want a system that will take more time than the current process, but may be
willing to if it is more useful ..................................................................................................................... 114
Table 9: There is currently no way to audit the multiple referral handoffs that occur ........................... 116
Table 10: Physicians desire an integrated experience when accessing clinical information from multiple
sources and systems ................................................................................................................................. 117
Table 11: Ubiquitous electronic health records would simplify the sharing of medical information,
documentation and imaging .................................................................................................................... 119
Table 12: An integrated scheduling system would simplify the appointment booking process ............. 120
Table 13: ARMS should better integrate with the other clinical information systems currently in use .. 123
Table 14: ARMS needs to be flexible in order to better support current practice in accommodating
potential referral pathways ...................................................................................................................... 127
Table 15: ARMS display of information and hyperlinks should be optimized to enhance the visibility of
important links and information, and ease system navigability .............................................................. 129
Table 16: The radiation oncologist ARMS interface should better match their current practice by
removing erroneous links and information .............................................................................................. 132
Table 17: The system language should reflect the language used by radiation oncologists ................... 133
Table 18: Raw user survey results ............................................................................................................ 139
Table 19: Raw usability testing task times ............................................................................................... 140
List of Figures
Figure 1: Generic Referral Process ................................................................................................................1
Figure 2: Technology acceptance model .......................................................................................................7
Figure 3: Combined framework .....................................................................................................................8
Figure 4: Study flow .................................................................................................................................... 12
Figure 5: Radiation Oncologist Referral Review Activity Diagram ............................................................. 23
Figure 6: ARMS Usability Issues by Severity ............................................................................................... 25
Figure 7: ARMS Home Screen ..................................................................................................................... 26
Figure 8: Attached Document Viewer ........................................................................................................ 28
Figure 9: Referral priority level in ARMS .................................................................................................... 29
Figure 10: ARMS Usability Issues by Heuristics Violated ............................................................................ 31
Figure 11: ARMS Usability Issues by Task ................................................................................................... 32
Figure 12: Mouse clicks and screens required to accept one referral and proceed to a second .............. 39
Figure 13: Redesigned home screen .......................................................................................................... 40
Figure 14: Redesigned referral details screen ............................................................................................ 41
Figure 15: Redesigned Accept Referral Screen .......................................................................................... 42
Figure 16: Confirmation Screen .................................................................................................................. 42
Figure 17: Task Completion Times with Significant Differences in Sample Mean (error bars indicate ±1
standard deviation) .................................................................................................................................... 44
Figure 18: Usability Survey Question Responses (error bars indicate ±1 standard deviation) .................. 46
Figure 19: Hierarchy of effectiveness in preventing errors ........................................................................ 49
Figure 20: "Next Referral" and "Return Home" buttons in redesigned interface ...................................... 60
Figure 21: Supporting medical information and attachments in redesigned interface ............................. 62
Figure 22: Possible improvement on referral home screen groupings for a future interface redesign .... 62
Figure 23: Department of Radiation Oncology referral process map ........................................................ 76
Figure 24: Ambulatory referral management system (ARMS) Flow Diagram ............................ 77
1 Background and Rationale
1.1 Medical Referrals
A medical referral is a request for the transition of a patient’s care from one physician to
another [1]. Medical practice consists of two main types of physician: general practitioners (GPs,
often referred to as primary care physicians or PCPs) and specialists (specialty care physicians,
SCPs) [1]. Typically, a PCP, or community-based healthcare professional, is trained to diagnose and
treat a broad range of medical problems [1]. Since patients lack the medical expertise to identify
when a specialist is required, and which type of specialist, it is also the GP’s responsibility to
refer a patient to the appropriate specialist when their primary assessment deems it necessary
[2] [1]. A generic referral process is outlined in Figure 1 [3]. However, many administrative tasks
are commonly required to facilitate the request from the primary care physician to the specialist [4].
Figure 1: Generic Referral Process
In Canada, an initial referral from a PCP is usually the only mechanism for a patient consultation
with a specialist, except in some emergency situations [1]. The quality of specialty care is
enhanced when sufficient communication occurs between the PCP and specialist throughout
the referral process [5]. Currently, the referral process is predominantly paper-based with
communication facilitated through fax transmissions and phone calls [2] [1] [5].
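The generic lifecycle above can be sketched as a small state machine. This is an illustrative model only: the state names and transitions are assumptions made for this sketch, not taken from Figure 1 or from any particular e-referral system.

```python
# Illustrative sketch of a generic referral lifecycle as a state machine.
# State names and transitions are assumptions for illustration only.
ALLOWED_TRANSITIONS = {
    "created": {"sent"},                    # GP completes the referral request
    "sent": {"received"},                   # request transmitted to the specialist
    "received": {"accepted", "declined"},   # specialist triages the request
    "accepted": {"scheduled"},              # consultation appointment is booked
    "scheduled": {"consult_done"},          # patient sees the specialist
    "consult_done": {"report_sent"},        # findings reported back to the GP
    "report_sent": set(),                   # referral loop is closed
    "declined": set(),
}

def advance(state: str, new_state: str) -> str:
    """Move a referral to new_state, rejecting invalid transitions."""
    if new_state not in ALLOWED_TRANSITIONS[state]:
        raise ValueError(f"cannot go from {state!r} to {new_state!r}")
    return new_state
```

Modelling the process this way makes explicit where handoffs occur (sent, received, report_sent), which is where the communication breakdowns discussed below tend to arise.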
1.1.1 Referrals and Patient-Centred Care
Patient-centred (or patient-centric) care is “the delivery of medical care to patients that
fundamentally respects and responds to individual patient preferences, wishes, and values,
while ensuring that patient values direct and pilot all clinical judgments and decisions,” [6] [7].
The Picker Institute (a global organization formerly dedicated to advocating for patient-centred
care) established 8 principles for patient-centred care, including access to care, and the
continuous and secure transition of a patient between healthcare settings [6] [8]. Effective
medical referrals between healthcare professionals facilitate and ensure a patient’s continuity
of care, and their secure transition of care, two core values of patient-centric care.
1.2 Inefficiencies in the Referral Process
Two main problem areas have been identified as contributing to inefficiencies in the patient
referral process: inefficient communication practices between primary and secondary care
providers, and inefficiencies resulting from the use of non-standard paper referral forms [9].
Referral inefficiencies can result in premature referrals from the referring physician,
dissatisfaction from both primary and secondary care providers, ambiguous expectations,
delayed diagnoses, fragmented patient care, and adverse patient outcomes [5].
Identified communication inefficiencies between care providers include: poor use of specialty
care services by referring physicians due to unclear guidelines around specialist roles,
inconsistent information sent by referring physicians indicating the motivation for a referral to
the specialist, and failure of the specialist to communicate their findings back to the referring
physician [9]. A study of the literature on patient referrals in the Netherlands found that, “30-
50% of referrals are ‘avoidable,’ while discharge letters [from the specialist to the GP] are
reported to be untimely, incomplete and often useless from the perspective of GPs,” [10]. Bal
provided the example, “that during 45% of consultations, GPs have no knowledge of changes in
medication that have occurred during the hospital stay of a patient,” [10]. These studies
highlight the challenges associated with the bi-directional communication necessary for an
efficient referral.
Identified inefficiencies due to the use of non-standard paper referrals include: an increased
likelihood of incomplete referrals, and difficulties in referral processing and tracking [9]. A study
by Ferrari et al. found that paper-based referrals are inefficient due to the lack of “consistent
and reliable transfer of essential information required to complete all facets of the consultation
process,” [11].
Ultimately, any information deficits can result in medical errors and adverse patient outcomes
[12]. Any hindrance to the referral process also undermines the principles of patient-centric
care, specifically access to care and the continuous and secure transition of a patient between
healthcare settings [6] [8].
1.3 Electronic Referrals
An electronic referral system (commonly referred to as an eReferral or e-referral system) is a
health information technology that aims to automate the processes involved in referral receipt
and tracking, as well as create a communication link between PCPs and specialists [9] [3].
E-referrals offer a standardized platform for referral handling, which makes for easier triaging of
patients and allows referrals to be tracked by both the PCP and specialist, thus reducing referral
inefficiencies [11] [3].
A time-cost comparison of emergency referrals in Alberta found that a standardized form and
label approach, which utilized a standard referral form and formatted label for patient
information, could save the average referral administrator two hours per day over the existing
process, reducing the workload from 9.8 hours to 7.7 hours [11]. Furthermore, the facilitation
of a standardized process through an electronic tool required only 3.1 hours [11].
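The comparison reduces to simple arithmetic (a back-of-envelope sketch; only the three daily workload figures come from the cited study):

```python
# Daily referral-administration workload figures reported in the cited
# Alberta time-cost comparison (hours per day).
existing_process = 9.8   # current paper-based process
form_and_label = 7.7     # standardized form-and-label approach
electronic_tool = 3.1    # same standardized process, delivered electronically

# Savings of the form-and-label approach (~2 hours/day, as reported).
form_savings = existing_process - form_and_label

# Further savings when the standardized process is made electronic.
electronic_savings = existing_process - electronic_tool

print(round(form_savings, 1))        # → 2.1
print(round(electronic_savings, 1))  # → 6.7
```

In other words, standardization alone recovers about a fifth of the administrator's day, while the electronic tool recovers roughly two thirds of it.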
In addition to time cost savings, a survey study at San Francisco General Hospital (SFGH) found
that medical specialty clinicians had difficulty in identifying the clinical question in 19.8% of new
patient visits that were referred by a paper based method, as compared to 11.0% for those
referred by e-referral (P = 0.03); for surgical specialty clinicians, the reduction was from 38.0%
to 9.5% (P < 0.001) [5]. Instances of inappropriate referrals also decreased with the introduction
of e-referrals, with inappropriate medical referrals decreasing from 6.4% to 2.6% (P = 0.21), and
inappropriate surgical referrals decreasing from 9.8% to 2.1% (P = 0.03) [5]. Avoidable follow-up
visits by medical specialty clinicians decreased from 32.4% to 27.5% for all medical follow-up
requests (P = 0.41), while avoidable surgical follow-up requests decreased from 44.7% to 13.5% (P
< 0.001) [5].
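Differences in proportions such as these are commonly assessed with a two-proportion z-test. The sketch below shows the standard pooled form of that test; the cited study's exact method and sample sizes are not reproduced here, so the example counts are hypothetical.

```python
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int):
    """Two-sided, pooled two-proportion z-test.

    x1/n1 and x2/n2 are successes/trials in each group.
    Returns (z, p_value).
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical counts giving roughly the 19.8% vs 11.0% proportions above.
z, p = two_proportion_z_test(20, 101, 11, 100)
```

With larger sample sizes, the same difference in proportions yields a smaller p-value, which is why sample size matters when interpreting the significance levels reported above.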
1.3.1 Electronic Referrals and Patient-Centred Care
From the perspective of the patient, an effective e-referral system improves access to specialty
care, reduces patient anxiety by informing them of their referral status and appointment
time(s), and aids in avoiding unnecessary referrals [9] [3]. The use of technology to
make these improvements and facilitate the referral process also reduces the frustration felt by
patients during the referral process [6]. Thus, an electronic referral system can both facilitate
access to care, and ensure a secure transition between healthcare settings [6] [8].
1.4 Patient Referrals at Princess Margaret Hospital
Princess Margaret Hospital (PMH) receives over 15,000 referrals per year (not including those
which are declined for incompleteness) from patients seeking consultations for cancer
assessment and treatment. Independent of this study, staff in the Princess Margaret Cancer
Program completed a significant amount of work to outline the referral processes throughout
PMH and standardize the referral forms for radiation medicine and medical oncology &
hematology. This work also identified much inefficiency and a high degree of variability in the
referral process amongst the different clinics and departments. For instance, most referrals to
the Radiation Medicine Program (RMP) bypass the New Patient Referral (NPR) office because
the department recognized its own administrators as better equipped to handle the nuances of
radiation medicine referrals.
Even greater variability was identified in the Department of Surgical Oncology, where each
individual surgical clinic utilizes its own process for handling its respective referral intake. These
unstructured and complex processes result in inefficiencies which prolong the wait for quality
patient care [4]. An additional consideration was the need to accurately report patient wait
times to the Ontario Ministry of Health and Long-Term Care. In order to examine a wide range
of data across multiple care settings and administrative processes, it is beneficial to adopt a
“centralized, longitudinal approach to patient data,” rather than utilizing an interfaced network
of specialized systems [6]. This spurred the need for a standardized, central platform for
referral submission, receipt, tracking and reporting.
1.4.1 Ambulatory Referral Management System (ARMS)
The Ambulatory Referral Management system (ARMS) at the Hospital for Sick Children in
Toronto is a web-based application which provides electronic routing for patient referral,
including submission, review, triage and management as well as wait time reporting [13]. ARMS
had been successfully deployed at 51 SickKids clinics, providing evidence of its ability to handle
a reasonably complex clinical workflow (See Figure 24). Through comprehensive consultations
with the PMH eReferral committee and representatives from SickKids, it was determined that
the system was best suited among competing software platforms for referral handling at PMH
because it could be adapted to the PMH workflow and could best support wait time reporting
to the Ontario Ministry of Health and Long Term Care.
1.5 Technology Adoption
Health information technology (HIT) systems often lead to improved safety, efficiency, and quality in
healthcare; however, HIT projects frequently suffer from limited adoption by the end user [5].
While there is limited research on the adoption of e-referrals, there are studies which have
analyzed the adoption of other HIT systems such as electronic health records (EHR). It is
estimated that 75% of all large health information technology projects result in failure [14].
No single factor determines the success or failure of a given information system, but a
previous study of 8,000 information system projects found that the three top reasons for
failure were lack of user input, incomplete requirements, and changing requirements [15] [16]
[17].
Lack of user input: A lack of support from physicians hinders the adoption of health information
technology [18]. Several studies have analyzed the barriers and facilitators to EHR
implementations. These potential barriers and facilitators motivate the need to understand
physicians’ (the system users’) perceptions of, and potential interactions with, an electronic
referral system prior to the system’s implementation.
Incomplete and changing requirements: According to Davis and Venkatesh, “feature creep is
the most common source of cost and schedule overruns,” [16]. While some degree of change is
deemed inevitable, the “identification, correction, and prevention of requirements errors,” is
easiest in the earliest stages of product design [16].
1.5.1 Diffusion of Innovations and Technology Acceptance Model
Diffusion of Innovations (DOI) is a theory that explains the innovation-decision process. This
process consists of gaining first knowledge of an innovation, forming an attitude towards it, and
then deciding whether to adopt or reject it [19] [20]. An individual will form their adoption decision
by weighing the advantages and disadvantages of utilizing an innovation. According to Rogers,
there are five perceived characteristics of innovations that will influence an individual’s decision
to accept an innovation [19] [20]:
1. Relative advantage – “the degree to which an innovation is perceived as better than the
idea it supersedes.” If an innovation has a clear advantage over a previous approach or
process, that innovation is more likely to be adopted. Conversely, it will not be adopted if
there is no perceived relative advantage.
2. Compatibility – “the degree to which an innovation fits with the existing values, past
experiences and needs of potential adopters.” An innovation is more likely to be adopted if
it accommodates the potential user.
3. Complexity – “the degree to which an innovation is perceived as difficult to understand and
use.” A less complex innovation is more likely to be adopted.
4. Trialability (divisibility) – “the degree to which an innovation may be experimented with on
a limited basis as it is being adopted.” An innovation that is divisible into a phased
implementation plan is more likely to be adopted.
5. Observability (communicability) – the degree to which the results of an innovation are
observed and communicated to others. An innovation is more likely to be adopted by users
if they can observe the benefits and results.
Rogers states that the rate of adoption will be determined, at least partially, by these
innovation characteristics [20] [21].
The technology acceptance model (TAM) is an information systems theory that predicts an
individual’s behavioural intention to use a computer technology [18]. It posits that the
perceived usefulness (PU) and perceived ease of use (PEU) of a given system will determine an
individual's intention to use that system [22] [23]. TAM further states that an individual’s
behavioural intention to use a system predicts actual system use [22] [23].
Figure 2: Technology acceptance model
A shortcoming of TAM is that it assumes when an individual has an intention to act, they will be
able to do so without limitation [23]. In reality, constraints such as limited ability, time,
environmental factors, organizational factors, or unconscious habits will limit their ability to act
on their intention [22] [23].
The TAM model was later extended (TAM2) to include additional constructs such as social
influence processes (subjective norm, voluntariness, and image) and cognitive instrumental
processes (job relevance, output quality, result demonstrability) [24]. The unified theory of
acceptance and use of technology (UTAUT) is the most recent extension of TAM and consists of
four main constructs: performance expectancy (perceived usefulness), effort expectancy
(perceived ease of use), social influence, and facilitating conditions; these four constructs are
moderated by four variables: sex, age, experience, and voluntariness of use [25] [26].
Combined, DOI and TAM provide a framework (Figure 3) for analyzing the adoption of clinical
information systems as both theories examine the “behavioural, social, and organizational
processes that both affect and are affected by clinical information systems,” [18]. Both DOI and
TAM argue that the adoption of a new technology is determined by its perceived attributes
[21]. The TAM constructs of perceived usefulness and perceived ease of use can be interpreted
as conceptually similar to two DOI innovation characteristics: perceived usefulness and relative
advantage both attempt to capture why a new technology may be superior to an existing
practice, while perceived ease of use and complexity are opposites [21]. Under this assumption,
the combined framework is based on that used by Peeters et al. to determine how
perceived attributes contributed to home telecare adoption [19].
Figure 3: Combined framework (relative advantage (PU), complexity (PEU), compatibility, trialability, and observability influence behavioural intention to use, which predicts actual system use)
Similar frameworks have been applied to numerous healthcare technology adoption studies. In
a study analyzing the introduction of health information technology to nursing homes, Breen
and Zhang reasoned that new technology would enhance the job performance of nursing home
staff in terms of care delivery and quality of care, providing a relative advantage over the
existing processes [27]. Breen and Zhang also contend that according to Davis, “technological
systems are more readily accepted when using such services are free from effort and
complexity,” and thus, healthcare systems generally prefer implementing technologies which
are easier to use [23] [27]. Another study, analyzing the adoption of home telecare by the
elderly or chronically ill, included the perceived attributes of relative advantage, compatibility,
complexity, and observability as factors influencing home telecare adoption [19]. This simple
combined framework was chosen because it captured the full range of user perceptions that
contribute to the innovation-decision process. However, it is important to note that other, more
complex frameworks account for correlations between these characteristics, as well as other
possible characteristics and influencing factors [26].
1.6 Usability Engineering and User-Centred Design
Usability engineering is an interdisciplinary1 field which is primarily concerned with human-
computer interaction and the cognitive processes of technology users [28]. Usability can be
defined as “the capacity of a system to allow users to carry out their tasks safely, effectively,
efficiently, and enjoyably,” [28]. The terms “perceived ease of use” and “perceived usefulness”
from the technology acceptance model are analogous to “usability” and “usefulness/utility,”
respectively, under the usability engineering paradigm as described by Nielsen [29] [30].
Nielsen further describes usability by 5 main attributes [29]:
1. Learnability – easy to learn
2. Efficiency – efficient to use
3. Memorability – easy to remember
4. Errors – low error rate
5. Satisfaction – pleasant to use
Traditionally, approaches to health information system testing have assessed the functionality,
safety, impact on costs, or efficiencies against a set of pre-defined goals; however, outcome-based
analytics are unable to assess effects of technology on human cognitive processes [28].
1 “input from the behavioural, cognitive, and social sciences is essential for not just critiquing completed systems,
but also to provide essential input into the design process itself,” [27].
A
user-centred design approach “attempts to ensure that the system will be optimized for users’
abilities, wants, and needs, rather than forcing the user to accommodate the technology, by
compromising and changing their own routine,” [31] [32]. User-centred design rests on three
core principles [33]:
1. An early focus on users and tasks.
2. Empirical measurement of product usage.
3. Iterative design whereby a product is designed, modified, and tested repeatedly.
Continued evaluation and user involvement throughout the development cycle ensures that
designer, user and organizational expectations are met, avoiding flawed or compromised
system implementation and adoption [16] [31]. User-centred design principles align with the
concepts of compatibility, trialability, and observability from Diffusion of Innovations theory by
emphasizing the existing user workflow, and involving potential users in the design process
such that they are able to experiment with the system as it is being developed and observe its
benefits [20] [33].
A user-centred approach that involved a participatory design strategy was used in the design
and implementation of the e-referral system at SFGH, and is credited with the successful
implementation of e-referrals at that hospital because it considered user needs and workflow early in
the development cycle and enabled continuous improvement through design changes [9]. Similar EHR
implementation studies have found a “positive correlation between physician involvement and
perceived ease of use,” which suggests that physician involvement through user-centred design
can contribute to a less complex system [14].
2 Research Question and Objectives
This study focuses on the adoption of an electronic referral system by radiation oncologists at
Princess Margaret Hospital, where the current paper-based referral process is characterized by
inefficiencies. The primary research question for this study is:
What are the factors that contribute to an increased likelihood of electronic referral
system adoption by radiation oncologists in the Radiation Medicine Program at
Princess Margaret?
2.1 Objectives
There are four main objectives being addressed in this study to answer the research question:
1. Determine radiation oncologists’ perceived barriers and facilitators to adopting an
electronic referral system.
2. Determine the usability factors that contribute to radiation oncologists’ referral workflow
on an electronic system.
3. Incorporate the findings into the redesign of an existing electronic referral interface.
4. Determine user acceptance of the redesigned electronic referral system interface relative to
the existing interface.
2.2 Thesis Statement
An electronic referral system is likely to be adopted by radiation oncologists at Princess
Margaret Hospital if a user-centred design approach is utilized in its design, which considers the
users’ perceived barriers and facilitators to implementation and streamlines their referral
workflow through the computer interface.
3 Methodology
A user-centred design methodology was used for this study [34]. The study flow is outlined
below in Figure 4.
Figure 4: Study flow (1: workflow observations of the referral process from the physician's perspective; 2: heuristic evaluation of ARMS; 3: interviews with radiation oncologists; 4: cognitive walkthrough of ARMS; 5: ARMS interface redesign; 6: observational usability testing of the existing and redesigned ARMS interfaces)
Informed consent was obtained from all study participants in accordance with the approved
Research Ethics Board (REB) application submitted to the University Health Network for this
study.
3.1 Workflow Observations
Direct field observations assist in understanding the workflow of a given process, and allow the
investigator to become familiar with the process in a natural manner as it “reveal[s]
information that cannot be acquired in any other way, such as detailed physical task data, social
interactions, and major environmental influences,” [34] [35]. Observations are typically
presented using a standard workflow modelling notation such as Unified Modelling Language
(UML) or Business Process Modelling Notation (BPMN). UML is a standard modelling notation
for software and systems development, as well as for other business processes which are not
software dependent [34]. UML activity diagrams intuitively present the workflow process using
commonly used flow chart notation [34]. The aim of a UML diagram is to show how a series
tasks or activities relate to one another in order to accomplish the goal(s) of the system [34].
For this study, workflow analysis focused on understanding the existing referral process from
the perspective of the physician who is receiving a medical referral. A radiation oncologist was
observed in the office processing two referrals. The researcher took notes on the observations
to inform the accurate documentation of the workflow. This included the goals of referral
review, the tasks involved to achieve these goals, and any facilitators or barriers to achieving
these goals. Some brief discussion aided in clarifying these observations.
Referral review by physicians is a variable process. For many physicians, referrals are reviewed
at random intervals between patients and consist of little more than a brief overview of a
referral, with the majority of the processing tasks assigned to an assistant or referral
coordinator. This resulted in difficulty recruiting physicians for this aspect of the study, and only
one radiation oncologist from Princess Margaret Hospital was observed. Although this hindered
the generalizability of the observations, the session still provided insight into the referral review process
and aided in the redesign of the e-referral system interface; the interviews provided
additional insight to corroborate these findings.
Administrative secretaries play a significant role in the referral process. Several unsuccessful
attempts were made to observe administrative secretaries processing referrals; ultimately, no
administrative secretaries were observed in this study.
The observed process was documented using standard Unified Modelling Language (UML) notation. This
was complemented by a more generalized UML diagram, previously prepared by the Princess
Margaret Cancer Program, which outlined the entire referral process. Individual task
times were not measured due to the short duration of the observations and the limited number of
participants, but the total time was approximated.
3.2 Heuristic Evaluation of the ARMS User Interface
Heuristic evaluation is a usability engineering method which is easily learned and requires little
time and resources to execute [36] [29]. It is a major method that falls under representational
analysis, which is “the process of identifying an appropriate information display format for a
given task performed by a specific type of user such that the interaction between the users and
the system is as direct and transparent as possible,” [37]. Heuristic evaluation is commonly
used for computer software and provides a clear basis for the identification of usability
problems in the design of any user interface [37]. Heuristic evaluation has been successful for
the evaluation of various medical devices and software such as infusion pumps, telemedicine
websites, and radiation therapy software [34]. For trained evaluators, it is a relatively easy and
efficient method which can be used to identify a great number of usability
issues in very little time [37]. For these reasons, it was an inexpensive and effective
method for the initial evaluation of an electronic referral system.
The heuristic method used for this study was Zhang et al.’s heuristics for evaluating the patient safety
of medical devices [37]. Zhang’s evaluation criteria combine the standard approaches from
Nielsen’s heuristic evaluation and Shneiderman’s eight golden rules for good user interface
design. The single protocol from Zhang (referred to as the Nielsen-Shneiderman, or Zhang,
heuristics) is suitable for assessing the interface usability of medical devices and applications [37].
The 14 Nielsen-Shneiderman heuristics are: Consistency and standards, visibility of system
state, match between system and world, minimalist, minimize memory load, informative
feedback, flexibility and efficiency, good error messages, prevents errors, clear closure,
reversible actions, use users’ language, users in control, and help and documentation; these
heuristics are further outlined in Section 8.1 [37]. Heuristic violations are rated on a severity
scale, typically from 0-4 (no violation to catastrophic), or another appropriate scale such as
“high, medium, low,” [34] [37]. The severity scale utilized for this study is defined in Table 1.
Table 1: Heuristic Evaluation Severity Rating Scale
Severity Score Definition
Critical 4 Usability catastrophe. Imperative to fix this before implementation.
Major 3 Major usability problem. Important to fix. Should be given high priority.
Minor 2 Minor usability problem. Fixing this should be given low priority.
Aesthetic 1 Aesthetic problem. Need not be fixed unless extra time is available.
Positive 0 A positive feature that meets usability criteria, and should be preserved.
This assessment was conducted by 3 researchers (the primary research student, a clinical
engineering student familiar with heuristic evaluation, and a member of the ELLICSR application
development team familiar with ARMS). The heuristic evaluation was conducted prior to
beginning the interviews and cognitive walkthrough session to allow the primary researcher to
become familiar with the ARMS system and more easily produce an accurate representation of
a mock user interface for the cognitive walkthrough. Each screen of the interface was
individually assessed by the evaluators based on the Nielsen-Shneiderman heuristics. These
three independent lists of usability issues were compiled into a master list by the primary
researcher. Each evaluator then independently assigned a severity to each of the identified
usability issues. An average of the three scores was then taken for each usability issue.
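The compilation and averaging step can be sketched in a few lines; the issue names and scores below are hypothetical illustrations, not data from this study:

```python
# Sketch of the severity-averaging step: three evaluators independently
# rate each usability issue on the 0-4 severity scale from Table 1, and
# the mean score is reported per issue. All values below are hypothetical.

ratings = {
    "No confirmation dialog before rejecting a referral": [4, 3, 4],
    "Inconsistent button labels across screens": [2, 2, 3],
    "Referral status not visible on the home screen": [3, 3, 2],
}

def average_severity(scores):
    """Mean of the evaluators' severity scores, rounded to one decimal."""
    return round(sum(scores) / len(scores), 1)

severity = {issue: average_severity(s) for issue, s in ratings.items()}

# Present the master list ordered from most to least severe.
for issue, score in sorted(severity.items(), key=lambda kv: -kv[1]):
    print(f"{score:.1f}  {issue}")
```

Ordering the master list by mean severity makes it straightforward to prioritize the "critical" and "major" violations for the redesign.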
3.3 Interviews
Interviews are used in order to elicit an individual’s thoughts and behaviours, or to explore an
issue in depth [38]. Interviews are also a commonly used method for “determining system
requirements upon which systems are developed and also for evaluating the effects of newly
introduced health information systems,” [28]. The main benefit of the interview method is
that it provides more detailed information than can be captured through a questionnaire, and in
a more relaxed, conversational environment [38]. While interviews tend to be
prone to bias, an assumption of belief elicitation interviews is that “subjective beliefs, although
they can be incongruent with reality, are important to assess because people’s behaviour is
based on their beliefs or perceptions of reality,” [38] [39].
For this study, interviews with physicians provided the opportunity to understand physicians’
perceptions of the referral process, and their perceived facilitators and barriers to e-referral
adoption. It was important to assess these assumptions given that people’s behaviour is based
on their perceptions of reality, and actual system use is inferred from their behavioural
intention [23] [39].
Five radiation oncologists from the Radiation Medicine Program at Princess Margaret Hospital
were interviewed. This sample size was deemed sufficient after preliminary analysis of the first
four interviews yielded some saturation in themes. Each interview consisted of two parts:
understanding the existing referral process, and obtaining a contextual understanding of the
perceived drivers for, and barriers to, e-referral adoption. Preliminary analysis yielded significant
overlap in the categories and themes between the two interview parts. Thus, interviews were
left intact for full analysis.
The primary researcher conducted the interviews based on a semi-structured script (see section
9.1), and secondary questions were asked when additional details or clarification were deemed
necessary by the researcher. The first part of the interview instrument (pertaining to the
existing process) was based on the 2011 report from the California Health Foundation on
electronic health record adoption techniques [40]. The second part of the interview instrument
was adapted from the methodology developed by Holden for assessing the facilitators and
barriers to physicians’ use of electronic health records [39].
Interviews were audio recorded. Analysis was based on Braun and Clarke’s guidelines for
thematic analysis [41]: transcribed interview passages were broken apart into individual
statements, each of which was assigned a specific code relating to the content of that
statement [41]. Coded statements were then grouped into overarching themes [41].
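As a minimal illustration of this coding workflow (the statements, codes, and themes below are hypothetical, not drawn from the study's transcripts), coded statements can be grouped into themes as follows:

```python
from collections import defaultdict

# Hypothetical (statement, code) pairs produced after breaking
# transcripts into individual statements and coding each one.
coded_statements = [
    ("Faxed referrals often arrive with pages missing", "incomplete_information"),
    ("I can't tell whether a referral has been triaged", "lack_of_tracking"),
    ("My secretary handles most of the paperwork", "delegation_to_staff"),
    ("Pathology reports are frequently absent", "incomplete_information"),
]

# Mapping of codes to overarching themes, assigned by the analyst.
code_to_theme = {
    "incomplete_information": "Referral quality",
    "lack_of_tracking": "Process transparency",
    "delegation_to_staff": "Workflow roles",
}

# Group coded statements under their overarching theme.
themes = defaultdict(list)
for statement, code in coded_statements:
    themes[code_to_theme[code]].append(statement)

for theme, statements in themes.items():
    print(theme, "->", len(statements), "statement(s)")
```

In practice this grouping is an interpretive step performed by the analyst; the sketch only shows the bookkeeping once codes and themes have been assigned.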
3.4 Cognitive Walkthrough of the ARMS User Interface
A cognitive walkthrough is a form of task analysis which looks at the sequence of actions carried
out by a user to complete a task based on task scenarios from specifications or earlier
prototypes [28] [42]. A walkthrough may be conducted solely by investigators, or with a
potential user [42]. The user verbally “walks through” how they would interact with the
interface, usually presented as a paper or computer mock-up, for the given task scenario [42].
For a given task set, the analysts explicitly identify, for each required step in that task, the goal
that is involved, the user action required to achieve that goal, the behaviour of the system in
response to said action, and potential problems the user may encounter [28].
A cognitive walkthrough is useful because it does not necessarily require the high fidelity
environment that is typically called for in observational usability testing (see section 3.6). Thus,
it was easily performed in conjunction with physician interviews. This arrangement provided
qualitative feedback on the existing e-referral system interface from radiation oncologists prior
to the interface redesign.
The walkthrough was performed on a mock-up of the existing ARMS system interface with the
radiation oncologists in conjunction with their interview session. Axure RP Pro 6.5 was used to
create a webpage which mimicked the visual and functional interface of ARMS but lacked the
functionality to process an actual patient referral. This mock-up was validated by a member of
the ELLICSR application engineering team who was involved in the e-referral project at PMH
and familiar with ARMS.
Physicians were asked to “walk through” the common referral intake tasks that would
typically be performed by physicians: reviewing a patient
referral, requesting more information from a referring physician, forwarding a referral to
another physician, accepting the referral, and recommending an alternate care plan. While
interacting with the system, participants were asked to speak aloud to indicate the tasks they
were performing and to provide feedback on features of the system that they liked, or disliked.
Participants were prompted for further comment to explain the usability issues they identified.
Results were audio recorded and transcribed. Braun and Clarke’s guidelines for thematic
analysis were again followed: transcribed walkthrough passages were broken apart into
individual statements, each of which was assigned a specific code relating to the content of that
statement [41]. Coded statements were then grouped into overarching themes [41]. Significant
themes aided in the redesign of the ARMS interface.
3.5 Interface Redesign
The results of the heuristic evaluation, workflow observations, interviews and cognitive
walkthroughs were used to redesign the ARMS interface in a mock-up environment using Axure
RP Pro.
Each of the identified heuristic violations was taken into consideration for the redesign.
However, those identified issues which were not a part of the receiving physician’s referral
review process were not included. Additionally, some of the violations could not be adequately
presented through a mock-up environment, and were excluded. Themes from the workflow
observations, interviews, and cognitive walkthroughs were taken into consideration when
redesigning the interface.
3.6 Observational Usability Test of the ARMS and Prototype User Interfaces
A usability test involves a sample of end users interacting with the system of interest, typically
in a setting that matches their actual work environment, and being observed by an investigator
[34]. Hypothetical scenarios, referred to as use case scenarios, are presented to the user, who
is asked to perform corresponding tasks on the system of interest [34] [28]. Subjects are
typically asked to think aloud as they carry out their tasks, which requires that the participant
vocalizes his or her thoughts, feelings, and opinions, while using the system [28] [43]. The
subject can be both audio and video recorded, while a video recording of the computer screen
the user is interacting with is recorded also [28]. Audio and video data is analyzed for
performance measures, such as: number of errors made, time required to complete tasks, and
count of negative comments or mannerisms [34]. Preference measures such as usefulness of
the product, or prototype preference can be determined through a questionnaire, and
complimented by qualitative observations made on user-system interactions which were
positive, or areas where the user experienced difficulty [34].
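These performance measures can be derived from annotated recordings in a straightforward way. The sketch below uses hypothetical timestamped events, not measurements from this study:

```python
# Hypothetical annotated events from one recorded session:
# (task, start_seconds, end_seconds, error_count)
events = [
    ("Accept 1st Referral", 0.0, 42.5, 1),
    ("Request More Info", 42.5, 118.0, 3),
    ("Accept 3rd Referral", 118.0, 150.2, 0),
]

def task_times(evts):
    """Completion time per task, in seconds."""
    return {task: end - start for task, start, end, _ in evts}

def total_errors(evts):
    """Total number of user errors across all tasks."""
    return sum(err for *_, err in evts)

print(task_times(events))
print("errors:", total_errors(events))
```

Comparing these per-task times and error counts between the two interfaces is what allows the redesigned interface to be evaluated against the existing one.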
For this study, observational usability testing was conducted with a sample of five radiation
oncologists (three of whom had also participated in the interviews and walkthrough) on mock-
ups of two different interfaces: the existing ARMS system, and a redesigned ARMS system.
Three use case scenarios were assessed for each interface: accepting a complete referral,
requesting more information for a new referral, and accepting a referral after receiving new
information. The tasks of forwarding a referral and rejecting a referral were not evaluated
because results from the interviews indicated that these tasks are not part of
the existing, paper-based receiving physician workflow.
Counterbalancing was attempted to mitigate the transfer of learning effects [33]: the order
in which the two systems were tested was alternated across participants. While it would have been
desirable to also counterbalance the use case scenarios for each interface in order to mitigate learning
effects, this would have required significant resources to build additional mock-ups to reflect
the different orders of the referrals (scenarios) in the progression of mock-up screens.
Therefore, short prerequisite training on each system was utilized to minimize learning effects
prior to testing.
The prototypes were loaded onto a laptop which was brought to the office of each participant. The
laptop was equipped with a microphone and webcam to record the participant, as well as
software to record the screen as they performed the testing. After the informed consent of the
participant had been obtained, the participant was provided with instructions on usability
testing. These included speaking their thoughts and actions aloud and being aware of the
limitations of a mock-up system, along with short training on each of the mock-up interfaces.
The protocol can be found in section 11.
The audio and video recordings were analyzed with different performance measures being
assessed for each use case:
Table 2: Use Case Scenarios and Tasks
Receive complete referral: Login and Navigate Home; Accept 1st Referral; Confirm 1st Accept
Receive incomplete referral: Navigate to 2nd Referral; Decide to Request Info; Request More Info
Receive updated referral: Navigate to 3rd Referral; Accept 3rd Referral; Confirm 2nd Accept; Return Home
In addition to evaluating the audio and video footage, participants were given a short
survey at the end of the usability testing session to determine which system they preferred
[30]. The questionnaire, based on one employed by Carayon et al. which assessed nurses’
acceptance of electronic health records (EHR), included items from established instruments for
measuring technology acceptance, EHR usability, and EHR usefulness. The survey utilizes a 10-
point Likert scale ranging from “dislike/don’t want to use” to “like very much/eager to use.”
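Responses on such a scale are typically summarized per system before comparison. A minimal sketch, using hypothetical ratings rather than the study's data:

```python
# Hypothetical 10-point Likert ratings (1 = dislike/don't want to use,
# 10 = like very much/eager to use) from five participants per interface.
ratings = {
    "existing ARMS": [4, 5, 3, 6, 4],
    "redesigned ARMS": [8, 9, 7, 8, 9],
}

# Mean rating per interface, and the preferred interface overall.
means = {system: sum(r) / len(r) for system, r in ratings.items()}
preferred = max(means, key=means.get)
print(means, "preferred:", preferred)
```

With only five participants per interface, such means are descriptive summaries rather than a basis for statistical inference.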
4 Results
4.1 Workflow Observations
Referral review by receiving physicians was directly observed in order to understand the
process and to optimally redesign the user interface of an electronic referral system to
better match their existing workflow. The workflow was mapped using a UML diagram (Figure
5) and described in the section below. Referral review is a sub-process within the overall
referral intake process (See 7.1). The total time to process the two observed referrals was under
five minutes.
4.1.1 Workflow Mapping
There are relatively few tasks that a receiving physician performs, particularly for non-urgent
and complete referrals. Referrals are also often reviewed in batches of two or more at a time.
Receive referral
Referrals are predominantly received via fax and are paper-based. Physicians receive paper
referrals from their administrative secretary or referral coordinator. As referrals arrive, the
secretary places them in a folder on the physician’s desk in their main work area (usually in
close proximity to their computer workstation), typically once a day with about two referrals.
Read referral and attachments
The physician will review the paper referral for several elements, which vary by site group, but
typically include imaging and imaging reports (MRI, CT, or Ultrasound), pathology reports,
previous tests, clinical notes from previous physicians, and patient demographics.
Determine urgency
The receiving physician determines if the case is urgent based on the available information. If it
appears to be urgent, s/he will contact the referring physician directly and then instruct their
administrative secretary to schedule an appointment within
approximately 24 hours.
Check for completeness
If the case is not urgent, the receiving physician checks that the referral is complete, with
adequate information for them to make an informed consult decision. There are no
standard criteria for completeness, but these could include the referral form, clinical notes, imaging,
imaging reports, and pathology reports.
Consult decision
Based on the submitted referral and supporting documentation, the receiving physician will
determine whether or not s/he will meet the patient for a consult, and if any additional test or
reviews are required.
Select preferred appointment date and time
The receiving physician may check the scheduling system (known as PHS at Princess Margaret)
for their next available appointment slot. S/he then returns the paper referral to their
administrative secretary with a written decision stating whether they will consult the patient,
possibly with a preferred appointment date and time written on that piece of paper. S/he may
also ask their secretary to request a pathology review and to ensure that imaging be sent to
their office, both of which may be written requests, or implied by unwritten policy. S/he will
proceed to the next referral, repeat the process, and hand the paper referrals back to their
administrative secretary to complete the remaining tasks.
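The review steps above can be summarized as a simple decision flow. The sketch below is illustrative only; the `Referral` fields and the returned action names are hypothetical and do not correspond to any system described in this study:

```python
from dataclasses import dataclass


@dataclass
class Referral:
    """Hypothetical referral summary; in practice these flags would be
    derived by the physician from the attached documentation."""
    is_urgent: bool
    is_complete: bool


def review_referral(referral: Referral) -> str:
    """Sketch of the receiving physician's review flow described above."""
    if referral.is_urgent:
        # Urgent cases: contact the referring physician directly and
        # schedule a consult within approximately 24 hours.
        return "schedule_within_24_hours"
    if not referral.is_complete:
        # Incomplete referrals: request the missing information.
        return "request_more_information"
    # Otherwise, decide on a consult and select a preferred slot.
    return "accept_and_select_appointment"
```
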
Figure 5: Radiation Oncologist Referral Review Activity Diagram
4.1.2 Notable Observations
Some notable observations of the receiving physician’s referral process include:
• Referrals to be reviewed by the receiving physician were left on their desk by their
administrative secretary without any other formal notification of their arrival.
• Physician instructions to their administrative secretary were handwritten on the paper
referral, and not formally documented in any other system.
• The physician referenced their scheduling system (PHS) when suggesting a consultation
date and time.
4.1.3 Identification of Issues
The main issue found during the workflow analysis relates to the reliance on paper to facilitate
the referral process and the observed inability to track or audit referrals that are being
processed. It is difficult to track a paper based process. For the receiving physician, there is no
way to track the referrals that they themselves have reviewed, accepted, requested additional
information, or rejected. Additionally, there is no record of their recommendations or
instructions, except on the paper referral. Until the administrative secretary acts on these
instructions, there is limited oversight regarding the status of a referral: whether it has been
received by the specialist, whether s/he has reviewed it, and whether they have reached a
decision regarding a consult. The difficulty in tracking paper referrals also raises privacy and
confidentiality concerns.
4.2 Heuristic Evaluation
Three researchers evaluated the ARMS e-referral system using the criteria in section 8.1. The
evaluation identified 41 critical or major usability issues, 110 minor or aesthetic usability issues,
and 29 positive usability features. The complete list of identified usability issues and positive
features can be found in section 8.2. Many of the same violations were identified on multiple
screens, but each occurrence was counted individually.
The initial heuristic assessments included screens which are used to submit a new referral.
While these screens may be utilized by administrative staff, they will never be used by receiving
physicians and were excluded from consideration in the system redesign.
4.2.1 Classification of Usability Issues by Severity
Figure 6: ARMS Usability Issues by Severity
Figure 6 summarizes the number of usability issues identified with the ARMS e-referral system
by severity. Excluding the positive features that were identified, the majority of the issues were
deemed minor or aesthetic. However, 36 major issues and 5 critical issues were also identified.
These high severity issues were categorized and grouped by theme. The subsections below
outline the dominant themes containing an issue with a severity of 2.5 or greater for tasks that
are part of the radiation oncologist workflow. Issue numbers, given in parentheses, refer to
Appendix B (Section 8.2). Suggestions for redesign were also included according to the criteria
outlined by Zhang and Nielsen.
4.2.1.1 No notifications for new referral submissions/assignments, or newly
submitted referral information (7, 12, 39, 42)
Four of the five critical heuristic violations pertained to the lack of adequate notifications for
newly submitted/assigned referrals or subsequently submitted referral information. The
specific violations included:
• No automated alert is generated when a new referral fax is received. The user must log in to
ARMS.
• While a new fax icon does appear in the system, it is small, poorly located, and thus easy to
overlook (see Figure 7).
• No notification is sent to a physician when a new referral has been assigned to them.
• No automatic notification is sent to the receiving physician when the referring physician
submits additional information. The receiving physician (or their administrator) must
manually check for newly submitted information.
Figure 7: ARMS Home Screen
The current ARMS system would require the receiving physician to manually log in to the
system and navigate to the home screen at regular intervals to check for new referrals, and to
navigate to a specific referral to check for newly submitted information.
Suggestion: Make the new/updated referral notification more prominent and intuitively linked
to the referral.
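The suggested behaviour could be realized by pushing an alert for each of the flagged events instead of relying on manual checks. The following is a minimal sketch; the event names, messages, and delivery callback are assumptions for illustration, not features of ARMS:

```python
# Illustrative event-to-message mapping for the three situations the
# heuristic evaluation flagged as lacking notifications. All names and
# wording are hypothetical.
MESSAGES = {
    "new_fax_received": "A new referral fax has arrived.",
    "referral_assigned": "A new referral has been assigned to you.",
    "info_submitted": "Additional referral information has been submitted.",
}


def notify(event_type, physician, send_alert):
    """Push an alert to the physician for a recognized event.

    `send_alert` is any delivery mechanism (e.g. e-mail or an in-system
    banner). Returns True if an alert was sent, False otherwise.
    """
    message = MESSAGES.get(event_type)
    if message is None:
        return False
    send_alert(physician, message)
    return True
```
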
4.2.1.2 The referral system is not integrated with the resource scheduling system to
display available appointment times (32)
The fifth critical heuristic violation is the inability of the receiving physician to view their next
available appointment when accepting a referral. The current ARMS system would require one
to separately log in to the scheduling system to check for their available appointment times, just
as they do when reviewing paper referrals.
Suggestion: Integrate ARMS with the scheduling system so that available appointment times
are provided.
4.2.1.3 The user must return to the home screen after accepting/rejecting a referral.
There is no intuitive link to proceed to the next referral (38)
A major violation was the inability to immediately proceed to the next referral in queue after
triaging the previous one. Currently, the system requires that the user navigate to the home
screen, and click on the next referral they intend to review.
Suggestion: Provide a “Next Referral” button on the confirmation screen so that the user can
skip the home screen.
4.2.1.4 None of the ARMS screens indicates which user is currently logged into the
system, or which clinic is being viewed (13, 167)
Another major issue that was identified is that the user who is logged into the system is not
clearly identified on any of the screens. This was deemed a major issue because a user could
mistakenly manage another user’s incoming referrals and could potentially violate patient
privacy. Additionally, the home screen did not indicate which clinic (site group) those referrals
belonged to.
Suggestion: Clearly indicate which user is logged onto the system on the home screen.
4.2.1.5 The embedded document viewer to view TIFF or PDF referral attachments is
extremely small (29, 126)
Another major identified issue was the size of the embedded document (PDF/TIFF) viewer. Due
to its very small size, only a small portion of the documents can actually be viewed. As can be
seen in Figure 8, while a link is provided for opening the document in the external default PDF
or TIFF viewer, this link is small and easily missed.
Suggestion: Enlarge the embedded PDF viewer, or open attachments in the computer’s default
PDF program.
Figure 8: Attached Document Viewer
4.2.1.6 New and existing referrals are grouped together in the “Under Review”
category (41)
It was identified that when more information is requested for a referral, that referral remains
under the “Under Review” category on the home page and is not distinguished from new
referrals that have not yet been reviewed. This forces the reviewing physician either to recall
whether s/he has already viewed that referral, or to access the referral’s “Audit History,” where
referral interactions are tracked, at the cost of additional mouse clicks.
Suggestion: Create a new category for referrals that are pending.
4.2.1.7 Priority level assignments (when accepting a referral) could be misread (79,
89)
It was identified that the priority level assignment when accepting a referral could be misread
for two reasons. First, the priority levels are ordered from right to left, then down (in rows, as opposed to
columns). Also, the priority level numbers (1, 2a, 2b, etc.) do not match the corresponding time
duration (24 hours, 1 week, 3 weeks, etc.) currently used for prioritizing referrals.
Figure 9: Referral priority level in ARMS
Suggestion: Eliminate the priority level input, possibly replacing it with an “Urgent” checkbox,
or match the ranking to the system already in use.
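Matching the ranking to the system already in use could be as simple as displaying each code together with its time window. The sketch below is illustrative; only the windows "24 hours", "1 week", and "3 weeks" appear in the evaluation, and the specific code-to-window pairings shown are assumptions:

```python
# Hypothetical pairing of priority codes with the consult time windows
# already used clinically. The assignments below are assumptions made
# for demonstration only.
PRIORITY_WINDOWS = {
    "1": "24 hours",
    "2a": "1 week",
    "2b": "3 weeks",
}


def priority_label(code):
    """Render a code alongside its time window so it cannot be misread."""
    window = PRIORITY_WINDOWS.get(code)
    if window is None:
        raise ValueError(f"Unknown priority code: {code}")
    return f"Priority {code} (consult within {window})"
```
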
4.2.1.8 The referral search function is not reliable (177)
A major identified issue was inconsistent search results. When searching for two sample
referrals under the same patient name, only one result was retrieved when searching by first
name, while two results were retrieved when searching by last name.
Suggestion: Revisit the indexing used to enhance the search capabilities.
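One way to make the search consistent is to index and match both name fields in the same, case-insensitive way, so that searching by either name retrieves the same referrals. This is a hedged sketch; the dictionary-based data model is hypothetical and not the ARMS schema:

```python
def search_referrals(referrals, query):
    """Match the query against both name fields, case-insensitively.

    `referrals` is a list of dicts with 'first_name' and 'last_name'
    keys (a hypothetical model used for illustration only). Searching
    by first or last name should return the same set of referrals for
    a given patient.
    """
    q = query.strip().lower()
    return [
        r for r in referrals
        if q in r["first_name"].lower() or q in r["last_name"].lower()
    ]
```
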
4.2.1.9 Referral views cannot be tracked (179)
The “View Audit Trail” link in ARMS allows the user to track any referral edits to demographic
data, medical data, or triage decisions and which user performed that action. However, there is
no way to audit who has viewed the referral, which would enhance referral confidentiality and
patient privacy.
Suggestion: Add referral views to the audit trail.
4.2.1.10 No visible “Back” or “Cancel” button on some screens (6, 61, 66, 169)
Some screens lacked a button to go back to the previous screen and cancel the current
operation. For instance, the “Select Clinic” screen did not have a method for cancelling a clinic
selection.
Suggestion: Add “Back” or “Cancel” buttons where appropriate.
4.2.1.11 Redundant or unnecessary data entry when accepting a referral (80, 90)
ARMS currently requires booking task information to be entered into the system. However,
under the existing process, the radiation oncologist’s administrative secretary is responsible for
booking tasks. Additionally, “Referral Source” and “Referral Type” must be re-entered on this
screen, even though they would have been entered previously by administrative staff.
Suggestion: Remove any redundant or unnecessary data entry.
4.2.1.12 Error messages were obstructed on some screens (84)
Some screens require specific data entry from the user. While incorrect data entry resulted in
an error message, the message was obstructed by the title bar near the top of the “Accept
Referral” screen.
Suggestion: Ensure that error messages are clearly visible.
4.2.1.13 Referral details are not clearly organized (14)
When viewing referral details, demographic information is most prominent, while the relevant
attachments such as pathology or clinical notes are at the bottom of the screen and require the
user to scroll down.
Suggestion: Reorganize the referral details to make medical information more prominent and
move the attachment links near the top of the screen with other medical information.
4.2.2 Classification of Usability Issues by Heuristics Violated
Figure 10: ARMS Usability Issues by Heuristics Violated
Figure 10 summarizes the frequency and average severity for each type of heuristic violation.
Many usability issues violated more than one heuristic and thus the cumulative total in Figure
10 exceeds the total number of identified issues. “Memory”, “minimalist” and “match” were
the most frequently violated heuristics and accounted for 84 violations. This indicates that the
system frequently demands a high memory load from the user, does not utilize a minimalist
design, and frequently mismatches real-world processes. “Control”, “undo” and “message” had
the highest average severities, although they were the least frequent violations. Though few,
these issues must be addressed: they pertain to a perceived lack of user control over the
system, an inability to reverse actions, and poor error messaging.
4.2.3 Classification of Usability Issues by Task
Figure 11: ARMS Usability Issues by Task
Figure 11 shows the frequency of heuristic violations grouped by the associated system task.
“Submit new referral” and “book appointment” are not part of the receiving physician
workflow, but were still evaluated since they are major tasks. Additionally, elements of “book
appointment” could potentially be owned by the receiving physician, such as specifying a
desired appointment time for urgent referrals.
For those tasks specific to the receiving physician workflow, the most frequent heuristic
violations occurred for “forward referral”, “check referral” and “general” violations which
occurred throughout the system. The receiving physician tasks with the highest associated
severities were “notify physician”, “accept referral” and “general” violations throughout the
system.
4.3 Interviews
Four radiation oncologists from the Princess Margaret Cancer Program’s Radiation Medicine
Program agreed to participate in belief elicitation interviews. All were experienced physicians
who regularly receive referrals. Although none had used an e-referral system in their practice,
all are experienced computer users. Saturation was approached after interviewing four
participants.
Three major themes were identified, each with one or two subthemes.
• Communication: An e-referral system should effectively supplement or substitute the
various modes of communication utilized by physicians and administrators.
o Verbal communication between a referring and receiving physician is the most
effective mode of communication for referrals, and is absolutely necessary for
urgent cases.
• Efficiency: Physicians do not want a system that takes more time than the current
process, but may accept one if it offers greater utility (e.g., reliability, accessibility,
auditing, and security).
o There is currently no way to audit the multiple referral handoffs that occur.
• Integration: Physicians desire an integrated experience when accessing clinical information
from multiple sources and systems.
o Ubiquitous electronic health records would simplify the sharing of medical
information, documentation and imaging.
o An integrated scheduling system would simplify the appointment booking
process.
Detailed tables with all supporting statements can be found in Appendix C (Section 9.2). The
three sections below present the dominant themes and subthemes identified from the
interview transcripts.
4.3.1 Communication
The interviews elicited several responses which indicated a wide variety of communication
methods utilized by receiving physicians. Participants indicated that they rely on explicit
communication modes including email, phone calls and faxes, primarily when interacting with
referring physicians. They also rely on implicit communication modes with their administrative
assistants. All participants indicated that an administrative
assistant would place a new referral on their desks – it was implied that this referral was
assigned to them for review. Table 6 in Appendix C (Section 9.2) outlines the passages which
support this theme.
Participants stressed that verbal communication between the referring and receiving physician
(i.e. a direct phone call) is the most effective mode of communication and is especially crucial
for urgent cases, which may be time sensitive. Table 7 in Appendix C (Section 9.2) summarizes
the passages which support this subtheme.
4.3.2 Efficiency
Some participants expressed hesitation about adopting an electronic referral system, although
all agreed that the current referral process is not ideal. Their hesitation stemmed from the
perception that a computer-based system could require more of their time to review a referral
than the existing paper-based process. However, all participants recognized the benefits of an
e-referral system and expressed willingness to adopt one if these advantages were realized.
One recognized advantage is the ability to track electronic referrals, which cannot be easily
achieved through the current paper-based process. Other relative advantages expressed by the
participants include: a reduced reliance on paper, ubiquitous access to referrals from any
computer, and an overall reduction in referral processing time.
Table 8 and Table 9 in Appendix C (Section 9.2) summarize the passages which support this
theme.
4.3.3 Integration
It was apparent that there are many information sources that receiving physicians may interact
with when handling a referral. In addition to receiving a referral by fax, they might speak to the
referring physician on the phone, or via email. If the referral is external, as most are, supporting
material might be faxed, or large image files stored on a CD might be mailed, or brought in with
the patient on the day of their consultation. If a patient is being referred from an associated
hospital within the network, the physician might access the patient’s electronic medical record
through the unified EHR for supporting information. Participants expressed a preference for
accessing fewer systems in order to adequately review a referral. Table 10 in Appendix C
(Section 9.2) contains the passages that support this theme.
Participants also conveyed that ubiquitous electronic health records, where all hospitals in the
region are seamlessly connected, would greatly simplify and enhance the sharing of patient
information between physicians. Table 11 in Appendix C (Section 9.2) contains the supporting
quotes for this subtheme.
In addition to patient information, receiving physicians must also be aware of scheduling to
ensure a balanced workload throughout the site group and to minimize the wait time for the
patient. This is currently achieved through the site group leader, or their administrator, who
evenly distributes the referrals while accounting for exceptions when a physician might be
away. Table 12 in Appendix C (Section 9.2) contains the supporting quotes for this subtheme.
4.4 Cognitive Walkthrough
Cognitive walkthroughs of the existing ARMS interface were performed with the interview
participants immediately following the interviews. Four radiation oncologists from the Princess
Margaret Cancer Program were asked to walk through some common tasks on the ARMS
electronic referral system, and their responses were audio recorded. Saturation was approached
after conducting walkthroughs with four participants. Three dominant themes and two
subthemes were identified from the cognitive walkthrough transcripts.
subthemes were identified from the cognitive walkthrough transcripts.
• Integration: ARMS should better integrate with the other clinical information systems
currently in use.
• Flexibility: ARMS needs to be flexible in order to better support current practice in
accommodating potential referral pathways.
• Usability: The ARMS display of information and hyperlinks should be optimized to enhance
the visibility of important links and information, and to ease system navigation.
o The radiation oncologist ARMS interface should better match their current
practice by removing extraneous links and information.
o The system language should reflect the language used by radiation oncologists.
Detailed tables with all supporting statements can be found in Appendix D (Section 10.1).
4.4.1 Integration
Participants indicated that they currently receive and transmit information through multiple
modes and systems. Medical records, including imaging, pathology reports, test results and
clinical notes, are often shared electronically when within the same institution. When a referral
is received from an external institution, information must be faxed and large imaging files must
be shared via CD. Upon reviewing the referral, the physician may instruct their administrator to
order a pathology review. Additionally, when receiving a referral, the receiving physician
typically consults their schedule in the hospital scheduling system, or their MS Outlook calendar
to determine the next available appointment time to consult with that patient. Table 12 in
Appendix D (Section 10.1) contains the supporting quotes for this theme.
4.4.2 Flexibility
Participants indicated that within the Radiation Medicine Program there is typically one
physician for each site group, or an administrator, who will triage the referrals and distribute
them amongst the physicians in that site group to sustain a balanced workload. In some
instances, the triaging physician who distributes the referrals to their colleagues may change
due to coverage requirements.
Participants confirmed that the receiving physician will usually accept the referral and notify
their respective administrative secretary to schedule the appointment, request any missing
information and send a confirmation to the referral source. If a referral is deemed unsuitable,
due to a scheduling conflict, it might be passed on to a colleague in RMP. If it is deemed
unsuitable for radiation oncology, it might be forwarded to another discipline, but is generally
returned to the referral source with the recommendation to send the referral to the
appropriate service. It was determined that in almost all cases, the radiation oncologist will ask
their administrator to perform the above tasks, rather than do it themselves.
It was also found that under the current process, if a radiation oncologist receives an
inappropriate referral, s/he is likely to advise the referring physician to send a new referral to
the appropriate specialist type. The radiation oncologist may suggest names of specialists to the
referring physician, but the radiation oncologist will not forward the referral themselves. One
participant even suggested that s/he is not allowed to forward a referral without the referring
physician’s consent, possibly due to liability concerns.
Table 14 in Appendix D (Section 10.1) contains the supporting passages for this theme.
4.4.3 Usability
It was evident at many points throughout the walkthrough that the participants had difficulty
identifying important information and navigating the system. For instance, one physician
indicated that there was no benefit to them seeing the patient’s detailed demographic
information at the top of the referral details page. Another physician also indicated that s/he
was most interested in seeing the supporting documents and key medical information, which
were not prominently located near the top of the referral details page. This forced that
physician to scroll to the bottom of the page to view the attached documents. Participants also
disliked small fonts and had some difficulty navigating the system as it was unclear how to go
“back” to a previous screen.
It was found that receiving physicians are not concerned with the referral processing tasks for
which they are not responsible. For instance, they had no interest in seeing the “booking tasks”
after accepting a referral. They expected that these tasks would be completed as required
without their involvement, ensuring that the patient shows up for their scheduled appointment
with the necessary information.
Additionally, participants had difficulty understanding the language used in ARMS, as it did not
directly correlate to the real world language they used when reviewing referrals. Participants
initially had difficulty differentiating between “New” referrals, and “Under Review” referrals on
the home screen. Participants also stated that severities which are assigned on the “Accept”
screen did not correspond to the RMP objective of consulting with all patients within two
weeks of a complete referral submission. There was also confusion regarding the difference
between assigning and forwarding a referral.
Table 15, Table 16, and Table 17 contain the supporting statements for this theme and its two
subthemes, respectively.
4.5 System Redesign
A mock-up ARMS interface was redesigned based on the findings from the preceding study
phases. A comprehensive list of changes can be found under the heuristic violations in section
8.2. Overall, the minimum number of screens required to accept a referral is reduced from 7 to
5, and the minimum number of clicks from 10 to 6. These changes are summarized in Figure 12.
Additional clicks would be required in order to view attachments and order tests, but this
number would remain constant or decrease from the existing to the redesigned interface. The
potential decrease would be due to the enhancement or elimination of the embedded PDF
viewer.
Figure 12: Mouse clicks and screens required to accept one referral and proceed to a second
4.5.1 Home Screen
Figure 13 shows the redesigned ARMS home screen. The referral groupings on the home screen
are rearranged to reflect the actual status of each referral: “To Be Reviewed”, “Waiting for
More Information from Referral Source”, and “Recently Accepted”.
New or updated referrals are clearly identified with bold red text, and the referrals assigned to
the physician are identified with that physician’s name in the heading. Undistributed referrals
to the user’s site group are also displayed. This allows individual radiation oncologists to review
additional referrals that have not yet been assigned by their site group leader, and allows any
radiation oncologist within a site group to act as site group leader and distribute referrals when
coverage is required or when the position rotates.
Figure 13: Redesigned home screen
4.5.2 Referral Details Screen
Figure 14 shows the redesigned referral details screen. Patient information is reorganized in
order to present pertinent medical information near the top, which reduces the scrolling
required by the physician. File attachments are moved near the top of the screen with the
medical information for the same reason. Numerical indicators, in this case age, PSA and
Gleason score (useful indicators for genitourinary cancer referrals) are displayed alongside the
attachments for visibility. Referring professional and patient demographic information is also
reorganized for consistency, such as keeping address information or phone numbers clustered.
Figure 14: Redesigned referral details screen
4.5.3 Accept Referral Screen
Figure 15 shows the redesigned accept referral screen. The available tasks have been divided
into two columns, with booking-related requests on the left and other comments or
instructions on the right, reducing the need for scrolling. The next available appointment times
are presented with radio buttons so that the receiving physician can specify a desired slot. The
“Priority Level” radio buttons have been removed.
Figure 15: Redesigned Accept Referral Screen
4.5.4 Confirmation Screen
Figure 16 shows the envisioned redesigned accept referral confirmation screen. This is the
screen viewed immediately after clicking the “Accept Referral” button in Figure 15, eliminating
the “Booking Task” screen altogether. Additionally, a “Next Referral” button has been added to
all confirmation screens so that the user does not need to return to the home screen to
proceed to the next referral.
Figure 16: Confirmation Screen
4.6 Usability Testing for Redesigned E-Referral System
4.6.1 Task Completion Times
Task completion times were defined such that the start and end times were clearly identifiable
by mouse clicks. Allowances were subtracted from these times for instances of extended
participant commentary.
Overall, the average time to complete all tasks in the redesigned interface was 239 seconds,
compared with 342 seconds for the existing interface, an improvement of 103 seconds (30%). As
shown in Table 3, task completion times were generally shorter for the redesigned interface,
than the existing one. One exception where the task time increased with the redesign was the
confirmation of accepting the final referral. This can be attributed to participants providing
additional instructions when accepting the referral in the redesigned interface, but not when
using the existing one.
Table 3: Average Task Completion Times for Existing and Redesigned Interface Mock-
ups
Task Existing (s) Redesign (s) Delta Δ (s) % Decrease
Login and Navigate Home 22 8 14 65%
Accept 1st Referral 82 63 19 23%
Confirm 1st Accept 53 32 21 40%
Navigate to 2nd Referral 33 4 29 88%
Decide to Request Info 32 25 7 21%
Request More Info 17 16 1 6%
Navigate to 3rd Referral 25 7 18 72%
Accept 3rd Referral 43 41 1 3%
Confirm 2nd Accept 27 37 (11) -40%
Return Home 9 6 4 39%
Total 342 239 103 30%
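As a check for the reader, the overall improvement follows directly from the Table 3 totals. This is a simple worked calculation, not part of the study software:

```python
def percent_decrease(existing_s, redesigned_s):
    """Percentage reduction in completion time relative to the existing UI."""
    return 100 * (existing_s - redesigned_s) / existing_s


# Overall scenario totals reported in Table 3.
seconds_saved = 342 - 239                      # 103 seconds
overall_saving = percent_decrease(342, 239)    # approximately 30%
```
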
The true distribution of the data cannot be determined due to the small sample size, limiting
the extent of the statistical analysis. However, three of the ten measured times have a
significant decrease in sample mean and are shown in Figure 17. All three of these tasks pertain
to navigating the system and are instances where a screen was removed. The result was a net
decrease of 61 seconds for the overall scenario of three referrals.
Figure 17: Task Completion Times with Significant Differences in Sample
Mean (error bars indicate ±1 standard deviation)
4.6.2 User Satisfaction
Usability testing participants were assessed with a survey immediately following the completion
of tasks on each interface. All participants indicated that they had moderate to “very much”
experience with computer based clinical information systems. For all survey questions, none of
the participants expressed a preference for the existing interface over the redesign. Table 4
shows the survey questions and possible responses, as well as the p-values corresponding to
the difference in responses for the existing and redesigned interfaces. Figure 18 shows the
average response scores. For instances where the difference between the existing interface and
redesigned interface sample means were statistically significant (95% confidence interval), the
p-values are printed above the bars.
Table 4: Usability Survey Questions
Question Existing Redesign % Increase
A Please circle the number that best reflects your acceptance of e-referrals: dislike very much and don’t want to use (1) – like very much and eager to use (5) 3.6 4.6 28%
B Learning to operate the system: difficult (1) – easy (5) 3.6 4.6 28%
C Exploring new features by trial and error: difficult (1) – easy (5) 3.6 4.4 22%
D Remembering names and use of commands: difficult (1) – easy (5) 3.0 4.6 53%
E Tasks can be performed in a straightforward manner: never (1) – always (5) 3.6 4.4 22%
F Help messages on screen: unhelpful (1) – helpful (5) 3.2 3.8 17%
G Experienced and inexperienced users’ needs are taken into consideration: never (1) – always (5) 3.6 4.6 28%
H Correcting your mistakes: difficult (1) – easy (5) 3.3 4.3 31%
I System is: difficult (1) – easy (5) 3.4 4.6 35%
J System is: frustrating (1) – satisfying (5) 3.6 4.4 22%
K Functions are as I expect: never (1) – always (5) 3.4 4.2 24%
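The percentage increases in Table 4 are computed from the mean scores, for example for Question A. Small discrepancies against the tabulated values can arise from unrounded means:

```python
def percent_increase(existing_mean, redesigned_mean):
    """Relative improvement in mean survey score, as a percentage."""
    return 100 * (redesigned_mean - existing_mean) / existing_mean


# Question A means from Table 4: 3.6 (existing), 4.6 (redesigned).
gain_a = percent_increase(3.6, 4.6)   # approximately 28%
```
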
Figure 18: Usability Survey Question Responses (error bars indicate ±1 standard
deviation)
The true distribution of the data cannot be determined due to the small sample size, limiting
the extent of the statistical analysis. However, it is clear that all five participants preferred the
redesigned interface over its predecessor with regard to the characteristics identified in Table
4.
4.6.3 Other Observations
Existing Interface
When using the existing interface, at least three of the participants expressed or displayed
some confusion when trying to navigate to the home screen after accepting a referral. It
was not immediately apparent that they had to click on the “Home” link in the left hand
menu in order to navigate to the next referral.
When using the existing interface, participants tended to want to click on the referrals listed
under “New,” rather than “Under Review.”
When using the existing interface, one participant mistakenly clicked on the second referral
after having already requested more information, due to its remaining in the same queue
47
position. At least two participants commented on this referral remaining in the “Under
Review” category.
Users rarely interacted with the left-hand menu in the existing interface, and did so only to
return to the home screen.
Redesigned Interface
When using the redesigned interface, some of the participants were confused by the
appearance of the third (updated) referral, and only recalled seeing two new referrals on
the home screen.
When using the redesigned interface, one of the participants expressed that s/he would
have liked to see more clinic information, in addition to the available appointment times, so
that s/he could determine how busy or full the clinic is. Another participant noted that
although s/he had selected an appointment time on accepting the first referral, the same
time slot was still available for selection for the second referral s/he accepted.
When using the redesigned interface, participants liked the presentation of the attachments
and numerical data. One participant indicated that the missing data was almost
immediately identifiable.
One participant indicated that s/he would have liked to see the triage option bar at the
bottom of the referral details to avoid scrolling back to the top of the page.
Users never interacted with the left-hand menu options in the redesigned interface.
5 Discussion
This study utilized a user-centred design methodology as a means to enhance the usability and
usefulness of an electronic referral system, in order to increase its acceptance by a specialist
physician user population. The workflow observations, heuristic evaluation, interviews and
cognitive walkthrough provided a rich source of design features which were incorporated into
the redesigned e-referral interface. These redesign considerations were then validated or
refuted through observational usability testing. In addition to specific e-referral interface design
considerations, this study generated discussion points around other potential barriers and
facilitators to electronic referral adoption.
5.1 Workflow Observations
The referral review process, as conducted by radiation oncologists who receive referrals at
Princess Margaret Cancer Centre, was observed to be informal and lacking in structure.
Aside from the clinical judgement of the radiation oncologist, the process relied heavily on
informal policy, which varies across different clinical practices within the Radiation Medicine
Program.
Some instructions, such as contacting the patients to ensure they bring all required
documentation, were implied based on unwritten established practice or policy. However,
there was limited ability to confirm that instructions had been communicated and would be
properly executed under the current process. This idea is explored and discussed under the
interview phase of this study (see section 5.3).
Ensuring the confidentiality of medical documents also relies on staff adherence to formal
policy and informal policy. While the Radiation Medicine Program offices are relatively secure,
there is no formal method for preventing an unauthorized individual from viewing a patient
referral from a physician’s desk. Additionally, paper is easily misplaced or lost and the trail of
handoffs is difficult to track. These factors suggest that an electronic system, which can make
use of computerized security features, should enhance the confidentiality of the referral
process. However, a literature review of EHR implementation studies found “privacy and
security was the second-most mentioned factor,” with regard to EHR implementation, primarily
as a barrier due to the potential compromise of “the security or confidentiality of patient
information,” [44]. It is still a common attitude among some physicians that it is their role to
individually protect patient data [45]. This is in contrast to the suggestion that “contemporary
health care requires a radical change in how confidentiality and privacy are defined (from a
property of the individual doctor-patient relationship, mediated by the human qualities of the
doctor, to a property of the system as a whole, mediated by technical and operations security
measures),” [44] [45]. The e-referral system would help in advancing this system-wide approach
to ensuring that patients’ privacy is protected.
According to the hierarchy of effectiveness, rules and policies are less effective in preventing errors
than forcing functions or automation, which can be provided by a technology-based system
[46]. This concept of intervention effectiveness is illustrated in Figure 19.
Figure 19: Hierarchy of effectiveness in preventing errors (from more effective to less effective)
1. Forcing functions and constraints
2. Automation / Computerization
3. Simplification / Standardization
4. Reminders, checklists, double checks
5. Rules and policies
6. Education and training

An automated system such as ARMS allows for referral tasks to be forced and automated.
ARMS forces the user to take action on a referral in order to remove it from their queue. It
automates the transfer of the referral (and attachments) amongst physicians and
administrative staff, automates the correspondence to the patient or referring physician, and
automatically provides a record of all the tasks performed on a particular referral. These
automations are based on existing workflows, which consist of formal and informal rules and
policies. An e-referral system reduces the likelihood of errors which could result in “stale”
untriaged referrals or privacy and confidentiality issues.
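The forcing-function behaviour described above can be illustrated with a simple queue model, in which a referral can only leave a physician's queue through an explicit, automatically logged triage action. This is an illustrative sketch only; the class and action names are hypothetical and not taken from ARMS.

```python
# Illustrative sketch of a forcing function in a referral queue: a referral
# can only leave the queue through an explicit triage action, which is
# automatically recorded. All names are hypothetical, not from ARMS.
from datetime import datetime

VALID_ACTIONS = {"accept", "decline", "request_info", "forward"}

class ReferralQueue:
    def __init__(self):
        self._queue = []        # pending referrals
        self._audit_log = []    # automatic record of every triage action

    def add(self, referral_id: str) -> None:
        self._queue.append(referral_id)

    def triage(self, referral_id: str, action: str) -> None:
        # Forcing function: the only way out of the queue is a valid action.
        if action not in VALID_ACTIONS:
            raise ValueError(f"unknown triage action: {action}")
        self._queue.remove(referral_id)
        self._audit_log.append((datetime.now(), referral_id, action))

    def pending(self) -> list:
        return list(self._queue)
```

Because there is no other removal path, a referral cannot silently go "stale": it stays visibly pending until an action is taken, and every action leaves an audit record.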
The early focus on user tasks formalizes the workflow into the design of the e-referral system,
and the technology is adapted to the user, rather than adapting the user to the technology [31]
[32] [46]. The result is a more usable system which matches the actual workflow of the user
[29] [37]. According to Diffusion of Innovations theory, fitting a system to “existing values, past
experiences and needs of potential adopters,” increases system compatibility, making the
innovation more easily assimilated by the user [19] [20]. According to the Technology Acceptance
Model, increased system compatibility positively influences its perceived ease of use, which
contributes to its acceptance [30] [23]. A workflow analysis is thus an integral part of a
comprehensive user-centred design.
5.1.1 Limitations
The referral review process for a receiving physician is very brief, informal, and unstructured,
and is conducted at irregular times between appointments. This led to significant challenges in
recruiting radiation oncologists to participate in workflow observations. Only a single
genitourinary radiation oncologist was observed, significantly limiting the generalizability of the
observed workflow for genitourinary radiation oncologists, as well as the Radiation Medicine
Program radiation oncologist population. Recruitment difficulties were likely due to the
demanding schedules of radiation oncologists. Recruitment concerns for this study phase were
partially alleviated by the results gathered from the interviews and walkthroughs, which
corroborated the results of the workflow observations.
Although this study’s focus was on the workflow of the specialist, it was evident that a
significant portion of the referral process is handled by the administrative secretary who is
assigned to that specialist. Additional roles in the referral process include the patient, the
referring physician and their administrator. As seen in Figure 23, the radiation oncology referral
process is complex, involving multiple individuals and systems for a single referral. In order to
fully understand the referral process, the workflows of all roles and systems must be analyzed
in order to ensure that a useful and easy to use electronic system is designed with all of the
end-users in mind. It is known that direct observations “reveal information that cannot be
acquired in any other way, such as detailed physical task data, social interactions, and major
environmental influences,” [34]. Therefore, it is beneficial to conduct workflow observations
with a representative sample of all e-referral system user roles.
5.2 Heuristic Evaluation
According to Nielsen, “the ideal is to present the information the user needs – and no more – at
exactly the time and place where it is needed,” [29]. While ARMS was generally able to present
the necessary information that the user needed at the right time, there was significantly more
information presented than was necessary, forcing the user to remember repetitive tasks such
as selecting the clinic to which they belong, or remembering to ignore the “booking task”
screen where tasks would likely be completed by their administrative secretaries. This
mismatch between the existing process and the ARMS based process contributed to memory
(n=46), minimalist (n=42), and match (n=40) as the three most frequently violated heuristics.
Many of the minimalist violations resulted from verbose and redundant instructions which
generally risk “confusing the novice user, but also slows down the expert user,” [29]. According
to Nielsen, an especially simple interface can prevent users from entering potential error
situations, and although this limits the available functionality, most users would be unlikely to
require advanced functions [29]. An experiment which introduced novice users to a word
processor found that they were able to learn basic word processor functions and type a letter in
116 minutes, versus 92 minutes when given a minimalist interface of the same system, which
did not allow for common errors to be made [29] [47].
Visibility violations (n=36) were the next most frequent violation type and resulted from
instances where the current system state, or available user interactions within the system were
not clear [37]. This included a lack of visual cues such as the current user, clinic and date,
missing page titles, unclear indicator for when a selection had been made (such as forwarding
clinic), and small or obscurely located information and hyperlinks. Consistency (n=31) was the
next most frequently violated heuristic, due to the inconsistent visual layout on most
screens such as input and display misalignment and inefficient use of screen space.
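Once each finding is coded against the heuristic it violates, tallies such as those above can be produced mechanically. A minimal sketch follows; the example findings are invented for illustration, and the thesis counts came from the actual evaluation.

```python
# Sketch of tallying heuristic-evaluation findings by violated heuristic.
# The example findings below are invented; only the tallying pattern is
# the point, not the content of any real ARMS finding.
from collections import Counter

findings = [
    ("memory", "user must re-select their clinic on every visit"),
    ("minimalist", "verbose, redundant instructions on accept screen"),
    ("memory", "booking task screen must be remembered and skipped"),
    ("visibility", "no page title on referral details screen"),
    ("consistency", "input fields misaligned across screens"),
]

violations = Counter(heuristic for heuristic, _ in findings)

# Most frequently violated heuristics first, as reported in the evaluation.
for heuristic, count in violations.most_common():
    print(f"{heuristic}: n={count}")
```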
Heuristic evaluation is an important tool for identifying usability issues in a timely and cost
effective manner and has proven to reduce instances of human error that occur due to poor
interface design [37]. As a method of representational analysis, a heuristic evaluation aims to
make the interaction between the user and their system direct and transparent, with minimal
barriers, as the user tries to complete their intended primary tasks [37]. Even for large-scale
information technology deployments of a mandatory nature, “users will find ways to get
around the system if it is difficult to use, or has negative consequences,” [9] [48]. Heuristic
evaluation contributed to reducing the effort required to operate an e-referral system, and
had a positive impact on its perceived ease of use [23]. As a construct of the Technology
Acceptance Model, perceived ease of use will translate into system adoption [23]. A heuristic
evaluation is a relatively cheap and easy method that can be applied to a system interface in
order to contribute towards its adoption.
5.2.1 Limitations
A limitation of the heuristic evaluation method used in this study is that no actual end users
(radiation oncologists) were involved in identifying usability issues. A case study by Nielsen had
the same interface evaluated by three groups of evaluators: usability novices, usability
specialists who were not specialized in the interface’s application domain, and “double experts”
who were usability specialists with the additional expertise of that application domain. On
average, the groups identified 22%, 41% and 60% of the usability problems, respectively [49]. The evaluators
were trained to identify usability issues and had some background in the patient referral
process, but none were double experts. The evaluators may have been unable to identify issues
that would be apparent to experienced radiation oncologists who frequently receive referrals.
While a double expert would be preferred for future study, it may prove difficult to recruit a
radiation oncologist, or any other medical specialist, who is also versed in usability heuristics.
5.3 Interviews
Respondents reported relying on multiple communication methods when interacting with
referring physicians and with administrative secretaries or other staff, and stated a desire to
maintain both flexibility and confidentiality. Most of these
existing communication modes, which have been characterized as inadequate in frequency and
quality, would be superseded by an e-referral system [5]. A study of e-referrals at San Francisco
General Hospital (SFGH) found that e-referrals facilitate iterative communication between
primary care physicians and specialists, thus allowing specialists to clarify the patient consult
prior to scheduling an appointment [5]. While radiation oncologists may be accustomed to a
wide array of communications modes, electronic referrals would in fact be a more effective
mode than those which are currently utilized. Additionally, rather than relying solely on an
administrative secretary to process referrals on the instruction of the specialist, some of these
tasks would be automated through physician input to the e-referral system. For instance,
administrative secretaries would not be required to physically place referrals on physician
desks, nor would physicians have to physically return them to their secretary. An e-referral
system can also automatically generate letters to the patient or referring physician. The SFGH
study affirmed that administrative staff reported less work due to electronic referrals, allowing
the saved time to be spent towards other clinic operations and administrative tasks [9].
Interview participants were adamant that for urgent cases, direct communication with the
referring physician was most desirable for clarity and to ensure a short wait time for the
patient. However, a study of the emergency referral process in Alberta Health Services (AHS)
found that an electronic referral system reduced administrative time to one-third (or less) of
that required by the existing paper-based process, which could potentially reduce patient wait times [11].
While the SFGH study found decreased wait times for non-urgent cases, the same claim was not
made for urgent cases [50]. Additionally, access through an online portal could prevent delays
that arise from “stale” paper referrals which are left on an office desk or mailbox, and not seen
until the specialist returns to their office. However, the receiving specialist would still be
required to log on to the e-referral system to check for new referrals unless automated
notifications (such as email) were provided. In terms of radiation oncologist access, an online
portal provides a clear relative advantage over receiving paper referrals at the office.
While all of these features are likely to contribute to the usefulness of an e-referral system,
they do not align with the communication methods currently employed by radiation
oncologists; they are accustomed to processing paper copy referrals. Resistance to change has
been cited as the primary source of demotivation towards using EHR systems [44]. System
implementations often fail due to support being given to management values instead of staff
(user) values, and recent studies have shown that physicians were dissatisfied with health IT
systems because they disrupted their workflow [14]. The interviewed radiation oncologists
expressed a desire for an electronic referral system that would conform to their existing
workflow, and not significantly increase the time required to review and triage referrals.
However, radiation oncologists also expressed their willingness to spend more time
reviewing referrals through an electronic system than they do currently, if the e-referral system
provided a relative advantage over the existing process. In particular, they recognized the
inefficiencies that result from a paper-based process such as difficulty in tracking paper
referrals, and the advantage provided by an electronic system [9]. They also recognized the
increased accessibility of an online portal (ubiquitous computer access), as opposed to paper
referrals, which are currently only available at their office. An EHR study by Morton and
Wiedenbeck suggested that addressing the immediate needs of the physicians is imperative in
order for system acceptance [14]. This coincides with Diffusion of Innovations theory which
states that innovations must “have a clear, unambiguous advantage over the previous
approach,” and that its benefits or impacts must be observable in order for it to be adopted [19]
[20]. Therefore, the relative advantage, or increased usefulness, must be clearly stated and
emphasis should be placed on the increases in productivity and job performance that can be
achieved through an e-referral system in order to increase adoption by physician system users
[19] [21].
Interview participants also expressed that they wanted an integrated experience when
reviewing referrals and wanted to mitigate the need to access multiple systems to locate
clinical documentation from multiple sources. Patient electronic health records tend to be
fragmented across multiple institutions due to a lack of integration into clinical practices [51].
Participants stated that they currently rely on their administrative staff to facilitate and ensure
that all necessary information is available in time for the consult, and to ensure that the patient
brings any required data (such as diagnostic imaging) when required. Participants also stated
that ubiquitous electronic health records would mitigate instances of missing patient
information, since physicians could have universal access to medical records. While a pan-
Canadian strategy is in place for integrated electronic health records, Canadian provinces and
territories are significantly lagging in their implementation [51]. Therefore, physicians will rely
on an alternative process to facilitate the sharing of patient health records amongst different
healthcare providers. This means continued reliance on administrative secretaries to facilitate
the transfer of clinical documentation. Conversely, an electronic referral system could facilitate
the sharing of health records and reduce the burden on administrative staff by providing a
means through which clinical documentation can be shared [9]. This has the potential to
increase the burden on the referring physician and their administrative staff to provide more
information and data upon referral submission, as was found at SFGH [1]. An electronic referral
system may reduce the burden on the patient to bring in medical records or imaging CDs to
their consult appointments, thus improving their experience. Although perceptions of the
referring physicians were not explored with respect to referrals made to PMH, Straus found
that even with the increased workload, referring physicians were enthusiastic about e-referrals
due to the professional satisfaction of improved specialty care access [9].
Participants also alluded to considerations for resource availability and integration with the
appointment scheduling system. Some radiation oncologists prefer to specify an appointment
time when accepting a referral, or check their schedule for availability prior to increasing their
patient load. Additionally, participants indicated that under current practices, administrative
secretaries may distribute referrals according to approximate patient load, and will ensure
appropriate coverage when a physician is away in order to ensure that all referrals are
reviewed. Therefore, an e-referral system should integrate with the appointment scheduling
system so that receiving physicians can easily check their availability. Straus et al. found that a
lack of system integration, including that with the specialty clinic’s scheduling system, resulted
in extra time and effort for e-referral system users [9]. A desire for interoperability is consistent
with other health IT implementations [44]. A review of EHR implementations found that
a lack of interoperability was predominantly cited as a barrier to EHR implementation due to
the inhibited sharing of health data between different health institutions or environments [44].
The system should also provide for coverage such that referrals do not go unattended. In order
to support current practice, the redesigned system interface allowed all physicians in a
particular site group to view the undistributed referrals for that site group, in case any one of
them was responsible for distributing referrals amongst their team members. This is not unlike
the existing interface; however, the groupings and group titles on the home screen were made
explicit in the redesign. This emphasized conformance to the existing referral distribution
workflow in the Radiation Medicine Program in order to increase system acceptance [14]. It
also adhered to usability heuristics which demand the visibility of the current system state [36]
[37]. As previously stated, this increased the perceived ease of use of the system, contributing
to actual system use [23].
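The coverage rule described here, in which undistributed referrals are visible to every physician in the target site group while a distributed referral appears only in the assigned physician's queue, can be expressed as a simple visibility predicate. This is an illustrative sketch with hypothetical names, not the ARMS implementation.

```python
# Sketch of the site-group visibility rule: undistributed referrals are
# visible to every physician in the target site group; once distributed,
# a referral is visible only to its assigned physician. Hypothetical model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Referral:
    site_group: str                    # e.g. "genitourinary"
    assigned_to: Optional[str] = None  # physician id once distributed

@dataclass
class Physician:
    physician_id: str
    site_group: str

def can_view(physician: Physician, referral: Referral) -> bool:
    if referral.assigned_to is None:
        # Undistributed: any physician in the site group may view it, so
        # any of them can assume the distributor role or accept the patient.
        return physician.site_group == referral.site_group
    return physician.physician_id == referral.assigned_to
```

Under this rule, no referral is ever invisible to the whole team, which provides the coverage the participants described when a colleague is away.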
5.3.1 Limitations
A limitation of the interviews in this study is the small sample size. As with the workflow
observations, recruitment of radiation oncologists posed a significant challenge. While
saturation was approached across four participants, additional interviewees from other site
groups may have elicited additional results.
Another limitation of belief elicitation interviews is that they tend to be prone to bias since
participants usually have a stake in the subject matter being discussed; however, attempts are
made to compensate for bias through the use of validated interview instruments [38]. While
user involvement is the core tenet of user centred design and positively correlates to perceived
ease of use, it is difficult for a user to predict how they will interact with a hypothesized future
system, although that system may yield superior performance results [29]. While some
evidence suggests that the opinions shared by users before having tried a system to be a poor
indicator of the users’ eventual opinion after using the system, these claims were made prior to
the established framework of the Technology Acceptance Model [52] [53]. This framework has
been repeatedly validated when applied to numerous information technologies both within and
outside of healthcare and has been useful in predicting technology acceptance [54]. The user-
centred design process also calls for continued evaluation and user involvement throughout the
development cycle and thus additional study phases were utilized to determine the factors
contributing to electronic referral system use.
5.4 Cognitive Walkthrough
The themes generated from the walkthrough of the existing ARMS interface overlapped slightly
with the interview themes, particularly in reiterating that physicians desire an integrated
experience when interacting with multiple clinical information systems such as scheduling. This
study phase shared findings with the heuristic evaluation, but provided greater insight into the
perspective of the radiation oncologist, allowing the redesign to better accommodate the
existing workflow of radiation oncologists into the e-referral system [31] [32] [46]. These
considerations for existing workflow yield a more usable interface [29] [37]. In turn, this
increased the system’s perceived ease of use, increasing the user’s behavioural intention to use
that system, and the acceptance of that technology [30] [23].
The walkthrough emphasized that an e-referral system needs to be flexible in order to
accommodate the different paths that a referral might take amongst physicians and
administrative staff. These different paths can be partially attributed to the division of the
Radiation Medicine Program into multiple site groups which focus on different cancer types
according to anatomical location. Most site groups rely on a site group leader, or their secretary
to distribute referrals to other specialists (through their respective secretaries) within their site
group, to maintain a balanced workload. This site group leader role can be fixed or rotating, and can
also vary when physicians are on vacation. The inconsistent nature of this referral distributor
role suggested that all physicians within a site group should be able to view undistributed
referrals, and thus accommodate the existing workflow within the Radiation Medicine Program
[14]. This would allow any of the radiation oncologists within a site group to assume the role of
distributor to their colleagues, or accept additional patients. This consideration is consistent
with the notion that “complex innovation is generally more successful if responsibility for
operation decision making is devolved to front line teams,” [45] [55]. Rather than forcing
radiation oncologists to adhere to a new process, the existing operational practice was
accommodated in the system redesign.
One major assumption in ARMS was the inclusion of the “Forward” triaging function, a task not
currently performed by radiation oncologists. It was thought that with this function, a specialist
who receives an inappropriate referral could forward it to another specialty or physician for
review. The walkthrough revealed that radiation oncologists generally do not forward referrals
themselves. Rather, they reply back to the referring physician indicating that the referral was
inappropriate, and provide a suitable specialist type with potential specialist names. One option
is to leave the forward function as is, with the expectation that users will eventually increase
their comfort level with forwarding referrals. This option assumes that there are no liability
issues for the forwarding physician or their respective institution. This option should also
incorporate automatic correspondence to the original referral source, informing them that the
referral has been forwarded, in order to maintain communication with the original referring
physician [1]. This solution is in conflict with literature that suggests greater acceptance and
success can be achieved through system design changes, rather than training and workflow
adaptations [9]. Therefore, to accommodate the existing process, the second option is that the
“Forward” button could instead generate correspondence to the referring physician, with a list
of possible specialists or institutions for them to refer to. However, this existing process is also
flawed since it likely increases the patient’s wait time for specialty consultation, hindering their
access and transition to specialty care, with a possible negative impact on the patient
experience [6] [8]. Therefore, the first option is preferable, even though it does not strictly
adhere to user centred design principles, as it should result in a better transition of care for the
patient and enhance the patient experience, which is paramount.
The walkthrough also identified that the displayed information and system links should
enhance the visibility of important links or information while also easing system navigability.
This is consistent with what Nielsen calls a “simple and natural dialogue,” where interfaces are
as simplified as possible, and no more information is presented than exactly what the user
needs [29]. Thus, radiation oncologists were not concerned with available system links or
options that did not pertain directly to their referral review workflow, and found these features
to be distracting. Also, the interface language inadequately reflected the language used in real
life by staff in the Princess Margaret Radiation Medicine Program. Nielsen states that “the
terminology in user interfaces should be based on the users’ language,” and should make use of
standard terminology from the user community [29]. This provides a system that is more
compatible with the target users [20].
The usability deficits that the participants identified in the existing interface aligned closely with
the heuristic violations identified in the earlier phase of the study (see section 5.2) [37]. Although the
issues identified in the walkthrough were not as numerous, the end user perspective yielded
issues that were not identified through the heuristic evaluation, specifically regarding language
and match to real life workflow, such as forwarding. Thus, the walkthrough, which relied on
input from system end users, was able to validate the findings from the heuristic evaluation and
correct any previously flawed assumptions [31].
5.4.1 Limitations
As with previous study phases, the recruitment of radiation oncologists posed a significant
challenge and resulted in a small sample size. While saturation was approached across four
participants, a larger sample may have generated additional results.
The results generated in this study phase relied on the opinions of the user population, rather
than strictly objective observations. Differing user opinions can make it impossible to strictly
adhere to user input, and users often “do not know what is good for them,” [14] [29]. The
walkthrough was a low fidelity testing option which yielded results that were later validated or
refuted through observational usability testing. Usability testing prior to the redesign may have
generated similar and even additional results. However, due to challenges in recruitment, the
protocol combined the usability testing of the existing and redesigned interface into a single
session. Given that a comprehensive user centred design calls for iterative user input and
feedback as design changes are made, additional walkthroughs may be required as further
design changes are made [29].
5.5 Usability Testing
While there is a reasonable chance of success in basing interface design decisions solely on user
preference, there are many cases where the user will prefer a system that is measurably worse,
based on their task performance [52]. However, in this study, the preference and performance
data both favoured the redesigned e-referral interface. Participants indicated that they felt the
redesigned interface was easier to learn and that tasks could be performed in a more
straightforward manner. This is due in part to the “Home” screen and “Accept Referral” screen, where
the biggest changes were made. In the existing interface, some users required prompting in
order to click on the correct referral. This was due to the grouping titles, where physicians were
required to click on the first referral “Under Review” and not “New”. The “nonstandard
meanings” of these titles were misleading and were corrected in the redesign [29]. Usability testing
participants also required some prompting in the existing interface when they reached the
“Assign booking task” screen. This is because radiation oncologists do not book their own
patients, and this screen did not match their existing workflow [29] [37]. The “Assign booking
task” screen’s incompatibility with existing values and past experiences would be a barrier to
adopting e-referrals through the existing ARMS interface, and this validated the removal of the
screen, which had been decided after the heuristic evaluation and cognitive walkthrough [19] [20].
While most task times decreased with the interface redesign, only three tasks were observed to
have a significant decrease. These were navigation tasks between referrals, independent of the
actual referral review. The significant decrease in the three task times was achieved through
the removal of the “Select Clinic” screen, and the introduction of the “Next Referral” button
(Figure 20), which eliminated one screen and at least two clicks for each referral view by
avoiding the need to return to the home screen between referrals. This improved the usability
of the system by increasing the efficiency of the referral review process [29].
Figure 20: "Next Referral" and "Return Home" buttons in redesigned interface
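As a rough illustration of how these navigation savings compound over a review session, the following sketch estimates the clicks eliminated by the "Next Referral" button. The per-view click counts are assumptions for illustration only, based on the "at least two clicks" figure above, not measured values from the study:

```python
# Illustrative estimate of navigation clicks saved by the "Next Referral"
# button. The per-referral click counts are assumptions for illustration.

def clicks_saved(referrals, extra_clicks_per_view=2):
    """In the existing flow, moving between referrals required returning
    to the home screen and re-selecting, costing at least
    `extra_clicks_per_view` more clicks per referral than the single
    "Next Referral" click in the redesign."""
    old_flow = referrals * (1 + extra_clicks_per_view)  # via home screen
    new_flow = referrals * 1                            # one click per referral
    return old_flow - new_flow

# A session reviewing 10 referrals saves at least 20 navigation clicks.
print(clicks_saved(10))  # -> 20
```

Even at this conservative estimate, the savings scale linearly with the number of referrals reviewed, which is consistent with the significant decrease observed in the navigation task times.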
Participants were observed making more mistakes with the existing interface when attempting
to navigate from one referral to the next. Literature suggests that user satisfaction may be
more dependent on instances of user error than task times, due to the salient nature of
performing an error [52]. Thus, in addition to reducing the task times, the “Next Referral”
button may have also contributed to increased user satisfaction by allowing tasks to be
performed in a more straightforward manner [29]. The "Next Referral" button also received
positive comments from one of the usability testing participants. This feature provided a "clear
unambiguous advantage" over the existing ARMS interface and contributed to an increased
likelihood of adoption [19] [20].
One task time that appeared to favour the existing system, due to an increased task time with the
redesign, was the confirmation process when accepting the third referral (Confirm 2nd
Accept). The increase in time was attributed to participants providing additional comments or
instructions when using the redesigned interface where they had not done so with the existing
interface. In this instance, the participants were more inclined to input comments and
instructions because of a clear layout that enhanced the visibility of the available input options.
While this step took longer, it is a clear example of the redesigned interface providing a
relative advantage over its predecessor, which should contribute to e-referral system adoption
[20]. This also validated interview statements that suggested radiation oncologists would be
willing to spend more time on electronic referrals if it provided an advantage over the existing
process.
Another well-received change was the redesign of the supporting medical information and
supporting attachments sections under the referral details (Figure 21). The existing
interface contained little to no supporting information, and attachments were located at the
bottom of the screen, which required additional scrolling. Since referral attachments are predominantly
medical information, they were moved to the supporting medical information section. Additionally,
site-group-specific numerical information was posted next to the attachments. These two features
received positive feedback during usability testing, likely because the redesign clearly presented
the most pertinent medical referral information, the information physicians currently look for
when reviewing a paper referral, in an identifiable and straightforward manner [29] [36].
Figure 21: Supporting medical information and attachments in redesigned interface
Not all interface changes were positively received. Notably on the home screen, the referral
groupings were reorganized to help the user more easily identify the referrals for review;
specifically, new, undistributed referrals to the site group were labelled as such, while new
referrals distributed to the specific specialist user were explicitly labelled. Referrals waiting on
more information were also explicitly labelled. In the redesign, updated referrals (where more
information had been received) remained under the “Waiting for More Information from
Referral Source” category, and were marked as “Updated”. This caused some confusion as
users proceeded through their referrals using the “Next Referral” button and reached a third
referral, but only recalled viewing two “New” referrals on the home screen. A simple solution
may be to group updated referrals with new ones, rather than those still waiting on more
information (Figure 22).
Figure 22: Possible improvement on referral home screen groupings for a future interface
redesign
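The suggested regrouping can be expressed as a simple categorization rule. The sketch below is hypothetical: the status values and group labels are invented for illustration and do not reflect the actual ARMS data model:

```python
# Hypothetical sketch of the proposed home-screen grouping: updated
# referrals are surfaced alongside new referrals rather than left under
# the "Waiting for More Information" list. Status names are invented.

def home_screen_group(referral):
    status = referral["status"]
    if status in ("new", "updated"):  # proposed fix: treat updates as actionable
        return "New / Updated Referrals"
    if status == "waiting_for_info":
        return "Waiting for More Information from Referral Source"
    return "Other"

referrals = [
    {"id": 1, "status": "new"},
    {"id": 2, "status": "new"},
    {"id": 3, "status": "updated"},  # previously hidden under "Waiting"
]

groups = {}
for r in referrals:
    groups.setdefault(home_screen_group(r), []).append(r["id"])

print(groups)  # {'New / Updated Referrals': [1, 2, 3]}
```

Under this grouping, a user who recalls three actionable referrals on the home screen would no longer be surprised when the "Next Referral" button presents a third one.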
Another usability testing participant indicated that they would have preferred the referral
triage button bar at the bottom of the screen, which would avoid additional scrolling back to
the top of the referral details and provide flexibility when making a selection [37]. Another
participant indicated that they wished to see more scheduling information on the redesigned
“Accept Referral” screen, requiring greater integration with the scheduling system; however,
this capability is restricted by limited interoperability, a common barrier across many
health IT systems [44]. Continued evaluation and testing would aid in identifying remaining
usability issues and allow for system design changes prior to implementation, thus increasing
acceptance by the radiation oncologist user population once implemented [9].
5.5.1 Limitations
Testing was performed using a mock-up of the existing and redesigned ARMS interface with
mock referrals. While this closely depicted the actual (and potential) systems and scenarios, the
limitations of the prototyping software, Axure RP Pro, hindered the inclusion of some existing
and potential e-referral functionalities, such as an embedded PDF viewer, or automatic
notifications for a new referral. The artificial nature of conducting the study could have also
affected the results [33]. The task of creating multiple referral scenarios in the prototyping
software also limited the researcher to a single site group (genitourinary). However,
only two of the five participants were genitourinary radiation oncologists. Thus, potential
usability issues specific to other site groups may have been missed. This also made it difficult to
plant errors, since the clinical knowledge of the participants with respect to genitourinary
cancers could have varied greatly. Therefore, error detection rates by the user could not be used
as a usability measure. For this study, the capabilities of the software and the capacity of a
single researcher limited the sophistication of the mock-ups, but still allowed the fundamental
user workflows to be tested. However, additional resources could allow for more sophisticated
mock-ups or even an actual e-referral system to be tested, depending on the stage of the
development cycle [33]. Less sophisticated mock-ups are typically used earlier, when the design
changes are anticipated to be most extensive, while fully working systems would be used later
on in order to provide the most realistic and comprehensive user experience [33]. This is to
ensure that the allocation of development resources is optimized throughout the design
process [33].
It was also observed during testing that, in either interface, none of the participants clearly
discerned that the third referral was older and had been updated, although older dates were clearly
visible in both scenarios. This is because the referral was listed with the first two
referrals in the existing system and, while under a different list in the redesigned system, went
unnoticed due to the "Next Referral" button. This concept was difficult to study in a relatively
short usability testing session. A pilot study would be better suited to evaluating the
effectiveness of the system for incomplete and updated referrals over a longer (multiple-day)
duration.
As mentioned in the previous study phases, recruitment of radiation oncologists also posed a
significant challenge. The five radiation oncologists who participated also fell short of
representing the multiple site groups covered by the Radiation Medicine Program, and were
not fully representative of the target population, which is a common issue with usability testing
[33]. Five participants is the suggested minimum required to identify “the vast majority of
usability problems,” however the possibility remains that a severe problem could be
overlooked [33]. Three of the usability testing participants also participated in the interviews
and walkthroughs. This means that the validated interface redesign elements may have been
biased towards their specific preferences, rather than being generalizable for the full radiation
oncologist population. A larger sample size may have provided adequate data for descriptive
statistical analysis of preference and performance data. Although not considered technology
“gatekeepers,” administrative secretaries would be frequent users of an e-referral system and
would provide a larger sample with which to work. While administrative workflows for
referrals differ from physicians’, there is some overlap with regards to general interface
usability and features for an electronic referral system.
A consideration for this study was the location where the usability testing was conducted.
While high-fidelity simulation labs may have been available, the experiments were conducted
at the respective offices of the test subjects in order to increase the number of
volunteer participants for testing. The only available simulation labs would have required
participants to travel to a neighbouring institution and would have inhibited participation.
Instead, subjects were observed at the location where they were most likely to review referrals,
although on a different computer workstation. The close proximity between the researcher and
the participant allowed the researcher to easily perceive the results; however, it is possible that
the researcher’s behaviour such as inadvertent speech or mannerisms could have affected the
test subject [33].
Only the primary workflow of referral review by radiation oncologists was evaluated, and
potential secondary workflows were not. For instance, it was observed that users never
interacted with the left hand menu, except when returning home in the existing interface. Left
menu use was eliminated altogether in the redesign through the “Next Referral” and “Return
Home” buttons. This suggests that the left hand menu could be further collapsed, or eliminated
altogether in order to achieve a more minimalist design [29] [37]. However, once e-referrals are
implemented, there may be exceptional workflows which require links or features from the left
hand menu to be easily visible and identifiable, so the user is aware of the system state [29]
[37]. This could include tasks like accessing the referral audit trail, or searching for a previously
booked referral. As suggested by the user-centred design principle of iterative design, further
evaluation is required later in the development cycle to better determine the ideal
arrangement of the left hand menu to accommodate potential secondary workflows; this will
likely require a fully functioning e-referral system due to prototyping limitations [33]. This
coincides with the Diffusion of Innovations concept of trialability: an innovation is more
readily adopted when it can be trialled and experimented with as it is being adopted [19] [20].
6 Conclusion & Recommendations
A user-centred design aims to adapt a system to the users’ abilities, wants and needs, rather
than forcing users to adapt to the system. Literature suggests that the application of a user-
centred design can be effective in avoiding flawed or compromised system implementation and
adoption. This study contributed to this premise by demonstrating that through a user-centred
design approach, an existing electronic referral system could be better adapted to the workflow
of radiation oncologists at Princess Margaret Cancer Centre such that they would be more likely to
accept the redesigned system. This methodology aided in identifying the facilitators and
barriers to electronic referral system adoption by radiation oncologists as defined by the
frameworks of the Technology Acceptance Model and Diffusion of Innovations theory. These
frameworks outline the perceived attributes of a technology that contribute towards the users’
likelihood of adopting that technology. Usability engineering methods supported these
perceived attributes by enhancing the system’s usability (reducing complexity), engaging the
users with a more useful system as compared to existing processes and system, ensuring that
the system is compatible with existing values, providing observable results, and providing the
opportunity to trial the system (on a limited basis).
This study abided by the three core principles of user-centred design: an early focus on users
and tasks, empirical measurement of product usage, and iterative design whereby a product is
designed, tested, and modified repeatedly. The iterative approach began with workflow
observations that evaluated the existing referral review process for radiation oncologists. This
provided the researcher with an understanding of the existing referral process and its
associated tasks that the existing ARMS e-referral system would need to be compatible with. A
heuristic evaluation of the existing ARMS interface was performed in order to identify usability
issues that would hinder the perceived ease of use of the system. Interviews with radiation
oncologists corroborated the workflow observations and elicited some of their perceived
barriers and facilitators to electronic referral implementation. A cognitive walkthrough with the
same physician sample conducted on the existing ARMS interface elicited further design
considerations specific to their workflow preferences and raised other potential technology
adoption factors. Both the interviews and cognitive walkthrough elicited responses which aided
the researcher in understanding radiation oncologists’ perceived usefulness and ease of use of
the ARMS system.
Where possible, all of the preceding study phase results were taken into consideration in the
redesigned system interface mock-up. This mock-up was empirically evaluated against an
existing interface mock-up through observational usability testing. This final study phase
validated or refuted many of the changes made to the electronic referral system interface and
raised potential considerations for the next design iteration. The usability testing found that the
redesigned system interface was more efficient in completing referral review tasks than the
existing one and showed that radiation oncologists were overall more satisfied with the
redesigned interface experience. This study demonstrated that usability engineering principles
can be effective in contributing towards the adoption of a new technology or innovation, as
described by the Technology Acceptance Model and Diffusion of Innovations theory.
Not all of the redesigned e-referral interface elements were a demonstrable improvement over
the existing interface, as some did not increase user efficiency or satisfaction. Additionally, many
of the identified usability issues, particularly those not specific to the referral review workflow
of radiation oncologists, were not addressed in this study. Additional unidentified interface and
system adoption issues for other potential user groups such as referring physicians or
administrative secretaries are likely.
6.1 Recommendations
Based on the results of this study, recommendations towards electronic referral system design
can be made that will contribute towards successful adoption of electronic referrals by
specialist physicians.
An electronic referral system should allow the specialist to review referrals as quickly and
efficiently as possible, as noted from the belief elicitation interviews and observed through
preference and performance data from usability testing.
An electronic referral system should integrate with other clinical information systems such
as scheduling in order to minimize the number of systems accessed by the specialist when
reviewing a referral, as noted in the interviews and cognitive walkthrough. This idea was
tested through the inclusion of appointment selection times when accepting a referral.
An electronic referral system should reflect the process used by the specialist at their
respective institution. It should only present the information and links necessary for a
specialist to reach a consult decision. Extraneous information, links, and screens should be
kept to a minimum or eliminated; for example, removing the "Booking" screen when
accepting a referral reduced the overall referral review time and increased user
satisfaction. Pertinent medical information needed to reach a consult decision should be
clearly visible and accessible as demonstrated through the reorganization of the referral
details to highlight medical information and attachments, which was positively received by
usability testing participants.
6.2 Future Work
Workflow observations and interviews should be conducted with other specialist types,
administrative secretaries, site group leaders and referring physicians. This study focused on
specialist physicians receiving referrals, in this case radiation oncologists. However, in order to
promote system adoption by all potential users, those user groups which have not yet been
included in the study should also be observed and interviewed to better understand their
workflow and perceived system adoption factors. This will call for additional interface redesigns
and evaluation to be conducted with the other physician specialties, referring physicians, and
administrative staff so that any design modifications can be empirically validated. Multiple
iterations of redesign and testing should be conducted throughout the remainder of the system
development cycle, while recognizing that the fidelity of the testing may be constrained by the
available project resources.
Further study should also evaluate the user-centred design approach in the design and
implementation of other healthcare technologies. Given that this study confirmed that a user-
centred design can contribute to the adoption of e-referrals by a particular user group, it is
likely that a similar methodology can contribute to the adoption of other healthcare
technologies being deployed in other healthcare institutions.
References
[1] I. Reinhart, K. Dawoud, O. Shafiq, R. Alhajj, J. Rokne and S. Edworthy, "Electronic medical referral
system: A forum-based approach," in 2011 13th IEEE International Conference on e-Health
Networking Applications and Services (Healthcom), 2011.
[2] W. Almansoori, A. Murshid, K. Xylogiannopoulos, R. Alhajj and J. Rokne, "Electronic Medical Referral
System: Decision Support and Recommendation Approach," in Proceedings of the 2012 IEEE 13th
International Conference on Information Reuse and Integration, IRI, 2012.
[3] G. Alexander, "eReferral Strategy White Paper: Clearing the Communications Fog," Champlain and
South East Local Health Integration Networks, Ottawa, 2011.
[4] G. Deckard et al., "Improving timeliness and efficiency in the referral process for safety net
providers: Application of the lean six sigma Methodology," Journal of Ambulatory Care
Management, vol. 33, no. 2, pp. 124-130, 2010.
[5] J. Kim-Hwang, A. Chan, D. Bell, D. Guzman, H. Yee and M. Kushel, "Evaluating Electronic Referrals
for Specialty Care at a Public Hospital," Journal of General Internal Medicine, no. 10, pp. 1123-8,
2010.
[6] G.-M. Breen, T. T. H. Wan, N. J. Zhang, S. S. Marathe, B. K. Seblega and S. C. Paek, "Improving
Doctor-Patient Communication: Examining Innovative Modalities Vis-à-vis Effective Patient-Centric
Care management Technology," Journal of Medical Systems, vol. 33, no. 2, pp. 155-162, 2009.
[7] Institute of Medicine, Crossing the quality chasm: A new health system for the 21st century,
Washington, D.C.: National Academy, 2001.
[8] Picker Institute, "Principles of Patient-Centered Care," January 2013. [Online]. Available:
http://pickerinstitute.org/about/picker-principles/. [Accessed 13 June 2013].
[9] S. G. Straus, A. H. Chen, H. F. Yee, M. B. Kushel and D. S. Bell, "Implementation of an electronic
Referral System for Outpatient Specialty Care," in American Medical Informatics Association Annual
Symposium Proceedings, 2011.
[10] R. Bal, F. Mastboom, H. Spiers and H. Rutten, "The product and process of referral: Optimizing
general practitioner-medical specialist interaction through information technology," International
Journal of Medical Informatics, no. 76s, pp. S28-S34, 2007.
[11] S. Ferrari, J. P. H. Wyse and Y. Hu, "Analysis of time cost for alternatives to enhance efficiency
within the medical emergency referral system in Alberta," in 2010 23rd Canadian Conference on
Electrical and Computer Engineering (CCECE), 2010.
[12] V. M. Remen and A. Grimsmo, "Closing information gaps with shared electronic patient summaries
– How much will it matter?," International Journal of Medical Informatics, vol. 80, no. 11, pp. 775-
781, 2011.
[13] T. Diep, J. Ponsonby and C. Robertson, Innovation in Managing the Paediatric Referral Process:
Ambulatory Referral Management system (ARMs), Sick Kids Hospital, 2011.
[14] M. E. Morton and S. Wiedenbeck, "A Framework for Predicting EHR Adoption Attitudes: A Physician
Survey," Perspectives in Health Information Management, vol. 6, Fall 2009.
[15] M. Van der Meijden, H. Tange, J. Troost and A. Hasman, "Determinants of Success of Inpatient
Clinical Information Systems: A Literature Review," Journal of the American Medical Informatics
Association, vol. 10, no. 3, pp. 235-243, 2003.
[16] F. D. Davis and V. Venkatesh, "Toward Preprototype User Acceptance Testing of New Information
Systems: Implications for Software Project Management," IEEE Transactions on Engineering
Management, vol. 51, no. 1, pp. 31-46, February 2004.
[17] Standish Group, "Charting the Seas of Information Technology," Dennis, MA, 1994.
[18] M. E. Morton and S. Wiedenbeck, "EHR Acceptance Factors in Ambulatory Care: A Survey of
Physician Perceptions," in Perspectives in Health Information Management, Winter 2010.
[19] J. M. Peeters, A. J. de Veer, L. van der Hoek and A. L. Francke, "Factors influencing the adoption of
home telecare by elderly chronically ill people: a national survey," Journal of Clinical Nursing, vol.
21, pp. 3183-3193, 2012.
[20] E. M. Rogers, Diffusion of Innovations, 4th ed., New York, NY: Free Press, 1995.
[21] M. Y. Yi, J. D. Jackson, J. S. Park and J. C. Probst, "Understanding information technology acceptance
by individual professionals: Toward an integrative view," Information & Management, vol. 43, pp.
350-363, 2006.
[22] S. Schneberger and M. Wade, "Technology Acceptance Model," 2011. [Online]. Available:
http://istheory.byu.edu/wiki/Technology_acceptance_model. [Accessed 1st April 2012].
[23] F. D. Davis, "Perceived usefulness, perceived ease of use, and user acceptance of information
technology," MIS Quarterly, vol. 13, no. 3, pp. 319-339, 1989.
[24] V. Venkatesh and F. D. Davis, "A Theoretical Extension of the Technology Acceptance Model: Four
Longitudinal Field Studies," Management Science, vol. 46, no. 2, pp. 186-204, 2000.
[25] K. Zheng, R. Padman, D. Krackhardt and e. al., "Social networks and physician adoption of electronic
health records: insights from an empirical study," Journal of the American Medical Informatics
Association, vol. 17, pp. 328-336, 2010.
[26] V. Venkatesh, M. G. Morris, G. B. Davis and F. D. Davis, "User Acceptance of Information
Technology: Toward a Unified View," MIS Quarterly, vol. 27, no. 3, pp. 425-478, 2003.
[27] G.-M. Breen and N. J. Zhang, "Introducing Ehealth to nursing homes: Theoretical analysis of
improving resident care," Journal of Medical Systems, vol. 32, no. 2, pp. 187-192, 2008.
[28] A. W. Kushniruk and V. L. Patel, "Cognitive and usability engineering methods for the evaluation of
clinical information systems," Journal of Biomedical Informatics, no. 37, pp. 56-76, 2004.
[29] J. Nielsen, Usability Engineering, Morristown, NJ: Harcourt Brace & Company, 1993.
[30] P. Carayon, R. Cartmill, M. Blosky, R. Brown, M. Hackenberg, P. Hoonakker, A. Hundt, E. Norfolk, T.
Wetterneck and J. Walker, "ICU nurses' acceptance of electronic health records," Journal of
American Medical Informatics Association, no. 18, pp. 812-818, 2011.
[31] J. A. Cafazzo, K. Leonard, A. C. Easty, P. G. Rossos and C. T. Chan, "The user-centered approach in
the development of a complex hospital-at-home intervention," Studies in Health Technology and
Informatics, no. 143, pp. 328-33, 2009.
[32] D. Norman, The Design of Everyday Things, New York, NY: Basic Books, 1988.
[33] J. Rubin and D. Chisnell, Handbook of Usability Testing: How to Plan, Design, and Conduct
Effective Tests, 2nd ed., Indianapolis, IN: John Wiley & Sons, 2008.
[34] A. J. Chan, Improving Patient Safety During Radiation Therapy Through Human Factors Methods,
Toronto: Graduate Department of IBBME, University of Toronto, 2009.
[35] Y. Reimer and S. Douglas, "Ethnography, Scenario-Based Observational Usability Study, and Other
Reviews Inform the Design of a Web-Based E-Notebook," International Journal of Human-Computer
Interaction, vol. 17, no. 3, pp. 403-426, 2010.
[36] J. Nielsen and R. L. Mack, Usability Inspection Methods, New York, NY: John Wiley & Sons, Inc.,
1994.
[37] J. Zhang, T. R. Johnson, V. L. Patel, D. L. Paige and T. Kubose, "Using usability heuristics to evaluate
patient safety of medical devices," Journal of Biomedical Informatics, no. 36, pp. 23-30, 2003.
[38] C. Boyce and P. Neale, "Conducting In-Depth Interviews: A Guide for Designing and Conducting In-
Depth Interviews for Evaluation Input," Pathfinder International, Watertown, MA, 2006.
[39] R. Holden, "What Stands in the Way of Technology-Mediated Patient Safety Improvements? A study
of Facilitators and Barriers to Physicians' Use of Electronic Health Records," Journal of Patient
Safety, vol. 7, no. 4, pp. 193-203, 2011.
[40] S. Kushinka, "Workflow Analysis: EHR Deployment Techniques," California Healthcare Foundation,
January 2011. [Online]. Available: http://www.chcf.org/publications/2010/03/ehr-deployment-
techniques. [Accessed July 2012].
[41] V. Braun and V. Clarke, "Using Thematic Analysis in Psychology," Qualitative Research in Psychology,
vol. 3, no. 2, pp. 77-101, 2006.
[42] J. Brender, Handbook of Evaluation Methods for Health Informatics, Burlington, MA: Elsevier
Academic Press, 2006.
[43] J. Horn, "The Usability Methods Toolbox Handbook," San Jose State University: Industrial and
Systems Engineering Department, San Jose, CA, 1998.
[44] C. McGinn, S. Grenier, J. Duplantie, N. Shaw, C. Sicotte, L. Mathieu, Y. Leduc, F. Légaré and M.
Gagnon, "Comparison of user groups’ perspectives of barriers and facilitators to implementing
electronic health records: a systematic review," BMC Medicine, vol. 9, no. 46, 2011.
[45] T. Greenhalgh, K. Stramer, T. Bratan, E. Byrne, Y. Mohammad and J. Russell, "Introduction of shared
electronic records: multi-site case study using diffusion of innovation theory," British Medical
Journal, vol. 337, p. a1786, 2008.
[46] O. St-Cyr and J. A. Cafazzo, "From Discovery to Design: The Evolution of Human Factors in
Healthcare," Healthcare Quarterly, vol. 15, no. Special Issue, pp. 24-29, 2012.
[47] J. Carroll and C. Carrithers, "Training wheels in a user interface," Communications of the ACM, vol.
27, pp. 800-806, 1984.
[48] J. Halbesleben, D. Wakefield and B. Wakefield, "Work-arounds in health care settings: literature
review and research agenda," Health Care Management Review, vol. 33, pp. 2-12, 2008.
[49] J. Nielsen, "Finding usability problems through heuristic evaluation," in Proc. ACM CHI'92 Conf.,
Monterey, CA, 1992.
[50] Y. Kim et al., "Not Perfect, but Better: Primary Care Providers’ Experiences with Electronic Referrals
in a Safety Net Health System," Journal of General Internal Medicine, vol. 24, no. 5, pp. 614-9, 2009.
[51] M.-P. Gagnon, N. Shaw, C. Sicotte, L. Mathieu, Y. Leduc, J. Duplantie, J. Maclean and F. Legare,
"Users' perspectives of barriers and facilitators to implementing EHR in Canada: A study protocol,"
BioMed Central, vol. 4, no. 20, 2009.
[52] J. Levy and J. Nielsen, "Measuring usability: preference vs. performance," Communications of the
ACM, vol. 37, no. 4, p. 66, 1994.
[53] R. Root and S. Draper, "Questionnaires as a software evaluation tool," in Proceedings of ACM
CHI'83, New York, NY, 1983.
[54] R. Holden and B. Karsh, "The Technology Acceptance Model: Its past and its future in health care,"
Journal of Biomedical Informatics, vol. 43, pp. 159-172, 2009.
[55] T. Greenhalgh, G. Robert, F. Macfarlane, P. Bate and O. Kyriakidou, "Diffusion of innovations in
service organisations: systematic literature review and recommendations for future research,"
Milbank Quarterly, vol. 82, pp. 581-629, 2004.
[56] D. McLaughlin and J. Hays, Healthcare Operations Management, Chicago, Il: Health Administration
Press, A division of the Foundation of the American College of Healthcare Executives, 2008.
[57] J. Seo, S. Kim and S. Kim, "Moderating Effect of Individual Differences of the Adoption of U-
Healthcare Service," in Picmet 2008 Proceedings, 2008.
[58] C. Friedman and J. Wyatt, Evaluation Methods in Biomedical Informatics, New York, NY: Springer
Science+Business Media Inc., 2006.
[59] M. Morris, V. Venkatesh and P. Ackerman, "Gender and Age Differences in Employee Decisions
About New Technology: An Extension to the Theory of Planned Behavior," IEEE Transactions on
Engineering Management, vol. 52, pp. 69-84, 2005.
[60] B. Kaplan and K. D. Harris-Salamone, "Health IT Success and Failure: Recommendations from
Literature and an AMIA Workshop," Journal of the American Medical Informatics Association, vol.
16, no. 3, pp. 291-299, May/June 2009.
[61] J. Cafazzo and O. St-Cyr, "From Discovery to Design: The Evolution of Human Factors in Healthcare,"
Healthcare Quarterly, vol. 15, pp. 24-29, 2012.
[62] S. Urowitz, D. Wiljer, K. Dupak, Z. Kuehner, K. Leonard, E. Lovrics, P. Picton, E. Seto and J. Cafazzo,
"Improving diabetes management with a patient portal: a qualitative study of diabetes self-
management portal," Journal of Medical Internet Research, vol. 14, no. 6, p. e158, 2012.
[63] M. Wilkins, "Factors influencing acceptance of electronic health records in hospitals," Perspectives
in health information management/AHIMA, American Health Information Management
Association, vol. 6, no. Fall, 2009.
7 Appendix A: Workflow Analysis
7.1 Additional Process Maps
Figure 23: Department of Radiation Oncology referral process map
Figure 24: Ambulatory referral management system (ARMs) Flow Diagram
8 Appendix B: Heuristic Evaluation
8.1 Heuristic Evaluation Criteria
The following list outlines the heuristic evaluation criteria used in this study [37].
1. Consistency – Consistency and standards. Users should not have to wonder whether
different words, situations, or actions mean the same thing. Standards and conventions in
product design should be followed.
• Sequences of actions (skill acquisition).
• Color (categorization).
• Layout and position (spatial consistency).
• Font, capitalization (levels of organization).
• Terminology (delete, del, remove, rm) and language (words, phrases).
• Standards (e.g., blue underlined text for unvisited hyperlinks).
2. Visibility – Visibility of system state. Users should be informed about what is going on with
the system through appropriate feedback and display of information.
• What is the current state of the system?
• What can be done at the current state?
• Where can users go?
• What change is made after an action?
3. Match – Match between system and world. The image of the system perceived by users
should match the model the users have about the system.
• User model matches system image.
• Actions provided by the system should match actions performed by users.
• Objects on the system should match objects of the task.
4. Minimalist – Any extraneous information is a distraction and a slow-down.
• Less is more.
• Simple is not equivalent to abstract and general.
• Simple is efficient.
• Progressive levels of detail.
5. Memory – Minimize memory load. Users should not be required to memorize a lot of
information to carry out tasks. Memory load reduces users’ capacity to carry out the main
tasks.
• Recognition vs. recall (e.g., menu vs. commands).
• Externalize information through visualization.
• Perceptual procedures.
• Hierarchical structure.
• Default values.
• Concrete examples (DD/MM/YY, e.g., 10/20/99).
• Generic rules and actions (e.g., drag objects).
6. Feedback – Informative feedback. Users should be given prompt and informative feedback
about their actions.
Information that can be directly perceived, interpreted, and evaluated.
Levels of feedback (novice and expert).
Concrete and specific, not abstract and general.
Response time.
o 0.1 s for instantaneously reacting;
o 1.0 s for uninterrupted flow of thought;
o 10 s for the limit of attention.
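The response-time thresholds above can be made concrete with a minimal sketch (illustrative only, not part of ARMS) mapping an operation's duration to the feedback an interface should provide:

```python
# Illustrative mapping of the response-time thresholds above (0.1 s, 1.0 s,
# 10 s) to the feedback an interface should give; not part of ARMS.
def feedback_level(elapsed_seconds):
    """Return the kind of feedback appropriate for a given response time."""
    if elapsed_seconds <= 0.1:
        return "instantaneous"          # no feedback needed
    if elapsed_seconds <= 1.0:
        return "uninterrupted flow"     # no special feedback needed
    if elapsed_seconds <= 10.0:
        return "busy indicator"         # show that the system is working
    return "progress bar"               # show progress and allow cancellation
```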
7. Flexibility – Flexibility and efficiency. Users always learn and users are always different. Give
users the flexibility of creating customization and shortcuts to accelerate their performance.
Shortcuts for experienced users.
Shortcuts or macros for frequently used operations.
Skill acquisition through chunking.
Examples: Abbreviations, function keys, hot keys, command keys, macros, aliases,
templates, type-ahead, bookmarks, hot links, history, default values, etc.
8. Message – Good error messages. The messages should be informative enough such that
users can understand the nature of errors, learn from errors, and recover from errors.
Phrased in clear language, avoid obscure codes.
o Example of obscure code: ‘‘system crashed, error code 147.’’
Precise, not vague or general. Example of general comment: ‘‘Cannot open document.’’
Constructive.
Polite. Examples of impolite message: ‘‘illegal user action,’’ ‘‘job aborted,’’ ‘‘system was
crashed,’’ ‘‘fatal error,’’ etc.
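As a hedged sketch (the function and wording are illustrative, not the ARMS implementation), the seven-digit MRN field evaluated later in this appendix could return a message that is clear, specific, constructive, and politely phrased, rather than an obscure code:

```python
import re

# Illustrative sketch only (not the ARMS implementation): an informative
# error message for the seven-digit MRN field evaluated in this appendix.
# It is specific, constructive, and politely phrased in the second person.
def validate_mrn(mrn):
    """Return None if the MRN is valid, otherwise an informative message."""
    if re.fullmatch(r"\d{7}", mrn):
        return None
    return ("The MRN you entered ('{0}') is not valid. Please enter exactly "
            "seven digits, e.g. 1234567.".format(mrn))
```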
9. Error – Prevent errors. It is always better to design interfaces that prevent errors from
happening in the first place.
Interfaces that make errors impossible.
Avoid modes (e.g., vi, text wrap). Or use informative feedback, e.g., different sounds.
Execution error vs. evaluation error.
Various types of slips and mistakes.
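A minimal sketch of this heuristic (the clinic names are assumed example values, not the actual ARMS clinic list): offering only valid options makes a misspelled clinic impossible, instead of detecting and reporting it afterwards.

```python
# Illustrative sketch of the "prevent errors" heuristic. The clinic names are
# assumed example values, not the actual ARMS clinic list: presenting a fixed
# menu makes a misspelled clinic impossible rather than merely detectable.
VALID_CLINICS = ("Breast", "Lung", "Prostate")

def select_clinic(choice_index):
    """Select a clinic by menu index; free-text entry is never accepted."""
    if 0 <= choice_index < len(VALID_CLINICS):
        return VALID_CLINICS[choice_index]
    raise IndexError("menu index out of range: {0}".format(choice_index))
```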
10. Closure – Clear closure. Every task has a beginning and an end. Users should be clearly
notified about the completion of a task.
Clear beginning, middle, and end.
Complete the seven stages of action.
Clear feedback to indicate goals are achieved and current stacks of goals can be
released. Examples of good closures include many dialogues.
11. Undo – Reversible actions. Users should be allowed to recover from errors. Reversible
actions also encourage exploratory learning.
At different levels: a single action, a subtask, or a complete task.
Multiple steps.
Encourage exploratory learning.
Prevent serious errors.
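The levels of reversibility above can be sketched with a simple undo stack (illustrative only, not part of ARMS): each completed action records its own inverse, so recovery is possible at the level of a single action or a whole sequence.

```python
# Illustrative sketch of the "reversible actions" heuristic (not part of
# ARMS): each completed action records how to reverse itself, so recovery
# is possible at the level of a single action or a whole sequence.
class UndoStack:
    def __init__(self):
        self._undos = []

    def do(self, action, undo):
        """Perform an action and remember its inverse."""
        action()
        self._undos.append(undo)

    def undo(self):
        """Reverse the most recent action, if any."""
        if self._undos:
            self._undos.pop()()
```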
12. Language – Use users’ language. The language should be always presented in a form
understandable by the intended users.
Use standard meanings of words.
Specialized language for specialized group.
User defined aliases.
Users’ perspective. Example: ‘‘we have bought four tickets for you’’ (bad) vs. ‘‘you
bought four tickets’’ (good).
13. Control – Users in control. Do not give users the impression that they are controlled by the
system.
Users are initiators of actions, not responders to actions.
Avoid surprising actions, unexpected outcomes, tedious sequences of actions, etc.
14. Document – Help and documentation. Always provide help when needed.
Context-sensitive help.
Four types of help.
o task-oriented;
o alphabetically ordered;
o semantically organized;
o search.
Help embedded in contents.
8.2 ARMS Heuristic Violations
Table 5: ARMS Heuristic Violations
No. Task Screen Issue Violated Heuristic Severity Solution
1 Login Login There will only be a single Windows login for RMP users.
Consistency
1.00 Lotus Notes option removed
2 Login Login
The account type can be selected as Lotus Notes, or Windows. The default selection should be Windows.
Match, Memory, Flexibility
1.00
Initially set Windows as default. Lotus Notes option then removed.
3 Login Select Clinic
The user is required to select the clinic to which they want to sign in. If the user is only assigned to one clinic, the system does not log in directly to that clinic.
Minimalist, Memory
1.67
User automatically logged into their clinic. Clinic selection screen removed.
4 Login Select Clinic
When the desired clinic is clicked, the user is immediately directed to the home page - no additional click is required.
Minimalist 0.00 Positive feature
5 Login Select Clinic
This is the first screen the user sees after logging on. However, on subsequent screens, selecting "Home" takes the user to the respective Clinic referral page, not the first viewed screen.
Match, Memory
2.33
User automatically logged into their clinic. Clinic selection screen removed.
6 Login Select Clinic
No visible back or cancel button once a clinic has been selected. User must know to navigate to the select clinic page from the left hand menu.
Undo 3.00
User automatically logged into their clinic. Clinic selection screen removed.
7 Check Referral
Home
A notification is displayed in a small yellow dialog box on the upper right hand side of the screen for "New Fax" and "Referrals ready for EPC". The notifications are extremely small and away from most of the key page information.
Visibility, Feedback
3.67
Dialog removed. "New" shown in bold red next to referrals in queue.
8 Check Referral
Home
Referrals are grouped into four groups (New, Under Review, To be Booked, and Referrals to Other Clinics). The group blocks poorly utilize the available screen space and could be larger to allow the referral data to be more easily viewed.
Visibility 2.33 Referrals regrouped.
9 Check Referral
Home
Referrals are grouped into four groups (New, Under Review, To be Booked, and Referrals to Other Clinics). The subtitles for each referral group are in a small font and could be made more prominent.
Visibility 2.00 Font size increased.
10 Check Referral
Home
Referrals are listed in a table with referral ID, patient name, problem, date, and urgency. Only the referral ID is a clickable link to view referral details. The rest of the row, including patient name, is not linked.
Flexibility, Match
2.00 Entire row hyperlinked.
11 Check Referral
Home
Referrals are listed in a table with referral ID, patient name, problem, date, and urgency. Referrals are not sortable by the various header columns, reducing navigability.
Flexibility 2.00
N/A - Difficult to mock-up and test.
12 Check Referral
Home
New referrals will be predominantly submitted by fax and viewed as PDFs in the system. No automated alert is generated when a new fax is received, which could result in an urgent referral being missed.
Memory, Feedback
4.00
N/A - Not part of rad onc workflow
13 Check Referral
Home
The currently viewed subclinic is not displayed on the home page. Although the user would have made this selection on a previous screen, confirmation of this selection could prevent potential error.
Visibility, Error
3.00 Clinic name visible on home page
14 Check Referral
View Referral Details
Overly detailed demographic and referral source information is viewed first on the screen. The medically relevant attachments are located at the bottom of the screen for the user to click and open.
Consistency
2.67
Medical information and attachments moved to top of screen.
15 Check Referral
View Referral Details
"View audit trail", "View referral comment list" and "return to home" links are small and obscurely located on the upper right hand side of the screen.
Consistency, Visibility
1.67 Upper right hand links removed
16 Check Referral
View Referral Details
If a referral has been forwarded, a message is included above the referral details stating who it has been forwarded to.
Visibility, Feedback, Match
0.00 Positive feature
17 Check Referral
View Referral Details
Instruction should be rephrased to second person.
Language, Control
1.00
Instructions reworded where appropriate.
18 Check Referral
View Referral Details
Links on the button bar appear to "Pop-up" when the cursor hovers.
Visibility, Memory, Feedback, Error
0.67 Positive feature
19 Check Referral
View Referral Details
Referral comment list and supporting documents are clearly outlined at the bottom of the page, separate from the referral information.
Consistency
1.00
Attachments moved to top of screen.
20 Check Referral
View Referral Details
Referral details are broken apart into sections: Clinic information, Medical information & reason for referral, Supporting medical information, Referring professional and Patient information.
Memory 0.00 Positive feature
21 Check Referral
View Referral Details
Supporting medical information is clearly listed down a single column.
Consistency, Feedback
0.00 Positive feature
22 Check Referral
View Referral Details
Text on the button bar is hyperlinked, but the icons are not.
Flexibility 1.67 Icons hyperlinked
23 Check Referral
View Referral Details
The additional home link on the top right of the page is redundant since there is always a home link at the top of the left hand navigation pane.
Minimalist 1.00 Upper right hand links removed
24 Check Referral
View Referral Details
The button bar has five options for handling the currently viewed referral: Assign reviewer, Request Information, Forward, Accept, or Alternate plan. It includes intuitive icons.
Match, Memory, Error
0.00 Positive feature
25 Check Referral
View Referral Details
The patient information section is large and could be broken down further to more easily identify patient information.
Memory 1.67
Patient information intuitively reorganized.
26 Check Referral
View Referral Details
The referral comment list link at the top of the page is redundant since the section has its own link to view all comments.
Minimalist 1.00 Upper right hand links removed
27 Check Referral
View Referral Details
The referral priority level (or a statement that a priority has not been assigned) is in bold above the referral details.
Feedback 0.67 Positive feature
28 Check Referral
View Referral Details
There is no page title when viewing referral details.
Consistency, Visibility
2.00 Title added to referral details page
29 Check Referral
View referral documents
Document viewer is extremely small and thus ineffective for viewing referral documents.
Consistency
3.33
N/A - Difficult to mock-up and test.
30 Check Referral
View referral documents
Return to referral (back) links on the top right of the screen are too small.
Visibility 2.00 Yes
31 Check Referral
View referral documents
The view PDF function, which launches the native PDF viewer (i.e. Adobe Reader, Foxit Reader, etc.), should be made more prominent.
Consistency, Visibility
2.00 Explain why not incl.
32 Notify Physician
Assign Reviewer
In order to minimize the patient's wait for an appointment, it is desirable to assign the patient to the next available physician. The system is not integrated with the scheduling system and there is no way for the user to determine the first available appointment slot.
Match 3.67
Available appointment slots added to accept screen.
33 Notify Physician
Assign Reviewer
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.67 Positive feature
34 Notify Physician
Assign Reviewer
Instructions are somewhat redundant and verbose
Minimalist, Language, Document
1.67
Instructions removed where appropriate.
35 Notify Physician
Assign Reviewer
The "Select Reviewer" dropdown list is alphabetical by last name, but names are displayed as [first] [last]. This increases the difficulty in navigating to the correct reviewer.
Match, Error
3.00
Names displayed by last name, then first.
36 Notify Physician
Confirmation of Reviewer Assignment
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00 Positive feature
37 Notify Physician
Confirmation of Reviewer Assignment
The confirmation could be reworded to second person.
Language 1.00 Minor issue
38 Notify Physician
Confirmation of Reviewer Assignment
There is no intuitive link to leave the confirmation screen and proceed to the next referral without utilizing one of the left hand menu links to view the full list of referrals.
Match, Memory
3.33
"Next Referral" button added.
39 Notify Physician
Under Review
There is no notification sent to a physician when a new referral has been assigned to them.
Match, Error
4.00
N/A - Difficult to mock-up and test.
40 Request More Information
Request More Information
Each cancer site group has typical clinical documentation they require with a referral prior to accepting the patient. This screen provides a standard list of documents by site group.
Match, Minimalist
0.00 Positive feature
41 Request More Information
Request More Information
When information has been requested, the referral still displays under the "Under Review" category, rather than a separate "Information Requested" category.
Match, Memory
3.33
Created "Waiting for Information" category.
42 Request More Information
Request More Information
No automatic alert or notification is generated when the additional information has been received for a particular referral. It is up to the physician or their administrator to manually check the referral for any new information.
Feedback, Error
3.67
N/A - Difficult to mock-up and test.
43 Request More Information
Request More Information
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.67 Positive feature
44 Request More Information
Request More Information
Instructions are somewhat redundant and verbose
Minimalist, Language, Document
1.33 Minor issue
45 Request More Information
Confirmation of request
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00 Positive feature
46 Request More Information
Confirmation of request
The confirmation could be reworded to second person.
Language 1.00 Minor issue
47 Request More Information
Fax - Request more info
"Associated with previous referral" instructions are provided to assist the user with assigning follow up documentation with an existing referral, but the instructions are not clear.
Error, Document
2.33
N/A - Not part of rad onc workflow
48 Request More Information
Fax - Request more info
The field to enter the fax name (description) is misaligned with the body of text.
Consistency
1.00
N/A - Not part of rad onc workflow
49 Request More Information
Fax - Request more info
Once the fax has been assigned to an existing referral, there is no direct link to continue working on that referral. The user must return to the Home screen and select that referral from the "Under Review" list.
Match, Minimalist
3.00
N/A - Not part of rad onc workflow
50 Forward Referral
Select internal or external
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.67 Positive feature
51 Forward Referral
Select internal or external
Instructions are somewhat redundant and verbose
Minimalist, Language, Document
1.33
Instructions removed where appropriate.
52 Forward Referral
Select internal or external
Only two links are on this page: Forward internally and Forward Externally. These two links are misaligned and make poor use of screen space.
Consistency
1.33 Links realigned
53 Forward Referral
Select internal or external
The "external" option is listed first, while "internal" is more likely to be selected.
Match, Error
2.00 Internal link moved to top
54 Forward Referral
Select internal or external
The internal and external forwarding links do not have associated pictures or icons to differentiate them.
Match, Memory
1.00 Large icons added to links.
55 Forward Referral
Select internal or external
There is not enough emphasis on "SickKids" (PMH) and "external provider" to differentiate between forwarding internally and externally.
Consistency, Error
2.00 Large icons added to links.
56 Forward Referral
Forward referral externally
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.00 Positive feature
57 Forward Referral
Forward referral externally
Canadian medical directory link appears ambiguous.
Consistency, Match, Memory
1.33 Minor issue
58 Forward Referral
Forward referral externally
Instructions are somewhat redundant and verbose
Minimalist, Language, Document
1.33
N/A - Not part of rad onc workflow
59 Forward Referral
Forward referral externally
Poor layout and use of screen space.
Consistency
1.67 Content realigned
60 Forward Referral
Forward referral externally
Provider could potentially utilize a default value.
Memory, Flexibility
1.33
N/A - Not part of rad onc workflow
61 Forward Referral
Forward referral externally
The cancel button only goes back one screen and does not cancel the forward process (or go back to the referral details screen).
Match, Memory, Undo
2.67
N/A - Not part of rad onc workflow
62 Forward Referral
Forward referral externally
When forwarding a referral to an external physician, correspondence is automatically faxed to that physician. This correspondence cannot be viewed by the user.
Consistency
1.33
N/A - Not part of rad onc workflow
63 Forward Referral
Forward Internally
"Problem" is one of the field entries when forwarding a referral internally. The problem was already selected when the referral was submitted. Although this may have changed after the specialist's review, the referring professional's diagnosis should also be displayed.
Memory 2.33 N/A - not tested
64 Forward Referral
Forward referral internally
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.67 Positive feature
65 Forward Referral
Forward referral internally
Instructions are somewhat redundant and verbose
Minimalist, Language, Document
1.33
Instructions removed where appropriate.
66 Forward Referral
Forward referral internally
No cancel button is visible until after a clinic and condition have been selected.
Error, Closure, Undo
2.67 "Back" button
67 Forward Referral
Forward referral internally
Poor layout and use of screen space.
Consistency
1.33 Content realigned
68 Forward Referral
Forward referral internally
The "can't forward to" clinic dropdown list is redundant.
Minimalist 2.00 Dropdown removed
69 Forward Referral
Forward referral internally
The clinic information is displayed when it has been selected.
Visibility, Match, Feedback
0.00 Positive feature
70 Forward Referral
Forward referral internally
The displayed clinic info is removed when a patient "problem" is selected.
Visibility, Match, Feedback
2.00
N/A - Difficult to mock-up and test.
71 Forward Referral
Forward referral internally
The instructions for forwarding a referral are obstructed.
Consistency, Visibility
2.00 Content realigned
72 Forward Referral
Forward referral internally
The list of clinics to forward to should be at the top of the screen and not below the left hand menu.
Consistency, Visibility
1.67 Content realigned
73 Forward Referral
Forward referral internally
The referral guidelines are clearly stated when a problem is selected. This includes highlighting the exclusion criteria in red.
Visibility, Match, Feedback
0.00 Positive feature
74 Forward Referral
Forward referral internally
The selected clinic should be more clearly highlighted.
Visibility, Closure
1.67
Clinic selection made prominent
75 Forward Referral
Confirmation of forwarded referral
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00 Positive feature
76 Forward Referral
Confirmation of forwarded referral
Poor screen layout - the confirmation message is misaligned with the box.
Consistency
1.67 Content realigned
77 Forward Referral
Confirmation of forwarded referral
The confirmation could be reworded to second person.
Language 0.67 Positive feature
78 Forward Referral
Confirmation of forwarded referral
The confirmation screen was good and provided closure.
Closure 0.00 Positive feature
79 Accept Referral
Accept Referral
The numbers for the priority levels could be misleading since they do not match the associated time intervals for when a patient must be seen.
Match, Error, Language
3.00 Priorities removed
80 Accept Referral
Accept Referral
"Referral Source" and "Referral Type" were already selected when the referral was submitted. These fields are redundant and the information should not need to be re-entered.
Minimalist 2.67 Fields removed
81 Accept Referral
Accept Referral
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.67 Positive feature
82 Accept Referral
Accept Referral
Assign booking task boxes are misaligned.
Consistency
1.33
N/A - Not part of rad onc workflow
83 Accept Referral
Accept Referral
Book before date is misaligned.
Consistency
1.33 Content realigned
84 Accept Referral
Accept Referral
Error messages are obstructed by the sub header bar.
Visibility, Message
2.67
N/A - Difficult to mock-up and test.
85 Accept Referral
Accept Referral
Instructions are somewhat redundant and verbose.
Minimalist, Language, Document
1.33 Minor issue
86 Accept Referral
Accept Referral
Referral source and type drop down lists only have two options. Consider using radio buttons.
Visibility, Minimalist
1.33 N/A - Not part of rad onc workflow
87 Accept Referral
Accept Referral
The "book after date" (earliest date at which patient should be seen) instructions are verbose and unclear.
Minimalist, Feedback, Language
2.00 Instructions reworded
88 Accept Referral
Accept Referral
The default "book after" date is set to the current date.
Memory, Flexibility
0.67 Positive feature
89 Accept Referral
Accept Referral
The priority levels are laid out across and down in pairs (1x2 in a single column). This makes it easy to misread the levels (e.g. 2a vs. 2b). An inaccurate priority could result in an added delay to scheduling a high priority patient.
Error 3.33 Priorities removed
90 Accept Referral
Accept Referral
"Assign Booking Task" section should be removed. Once a referral has been accepted by a physician, the referral should automatically return to the administrator who assigned the referral to that physician. That administrator would then be responsible for booking the appointment.
Minimalist 3.00 Section removed
91 Accept Referral
Accept Referral
The error message is obscured by header text.
Feedback, Message
2.33
N/A - Difficult to mock-up and test.
92 Accept Referral
Confirmation of accepting a referral
The confirmation message is obstructed by the sub header.
Visibility 2.33 Content realigned
93 Reject Referral
Alternate Plan
A "View Referral Details" link at the bottom of the page expands or collapses a view of the referral details.
Minimalist, Memory
0.00 Positive feature
94 Reject Referral
Alternate Plan
Instructions are somewhat redundant and verbose.
Minimalist, Language, Document
1.33 Minor issue
95 Reject Referral
Alternate Plan
Documents can be preloaded for each referral rejection reason.
Minimalist, Flexibility
0.00 Positive feature
96 Reject Referral
Fax confirmation of alternate plan
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00 Positive feature
97 Reject Referral
Fax confirmation of alternate plan
The confirmation could be reworded to second person.
Language 0.67 Positive feature
98 Submit New Referral
Create Referral
The system attempted to validate the referral against existing referrals by searching first and last name. However, it failed to search on DOB.
Match 3.00
N/A - Not part of rad onc workflow
99 Submit New Referral
Refer to clinic
Screen instructions are obstructed.
Visibility 2.33
N/A - Not part of rad onc workflow
100
Submit New Referral
Display referral guidelines
Guidelines for referrals for a specific problem to a particular clinic are viewable as the user fills in the referral form.
Visibility 0.00
N/A - Not part of rad onc workflow
101
Submit New Referral
Fill medical data
As the required information is filled, there is no indication as to which portions have been filled, and which sections are left (non-medical data, medical data, attachments). This is partly resolved by the "Review & Submit" page.
Visibility, Match, Feedback, Error
1.67
N/A - Not part of rad onc workflow
102
Submit New Referral
Fill medical data
Instructions are somewhat redundant and verbose.
Minimalist, Language, Document
1.33
N/A - Not part of rad onc workflow
103
Submit New Referral
Fill medical data
The add attachments button link is not clearly identified in the bottom row of buttons. Since almost every referral will require attached documentation, the link should be easily identifiable or it could accidentally be missed.
Visibility, Match, Closure
2.67
N/A - Not part of rad onc workflow
104
Submit New Referral
Fill medical data
The user has the ability to lock the medical data with a check box to prevent future users from altering the patient's medical data for the given referral
Error 0.00
N/A - Not part of rad onc workflow
105
Submit New Referral
Fill non-medical data
A red error message appears at the top of the page when the user attempts to submit the form with an incorrect format for MRN which must be seven digits. This message is unclear and states that it must be typed as "ddddddd" where d is a number. There was no instruction to indicate that the number should have been 7 digits long.
Visibility, Feedback, Message
3.00
N/A - Not part of rad onc workflow
106
Submit New Referral
Fill non-medical data
Error message could be stated in second person.
Language 0.67
N/A - Not part of rad onc workflow
107
Submit New Referral
Fill non-medical data
Instructions are somewhat redundant and verbose.
Minimalist, Language, Document
1.33
N/A - Not part of rad onc workflow
108
Submit New Referral
Fill non-medical data
Patient information can be automatically filled by typing in the MRN# and clicking the corresponding link to get patient information from the EMR.
Flexibility, Error
0.00
N/A - Not part of rad onc workflow
109
Submit New Referral
Fill non-medical data
The "Referral Guidelines" button shows/hides the guidelines for submitting a referral.
Minimalist, Memory, Feedback
1.00
N/A - Not part of rad onc workflow
110
Submit New Referral
Fill non-medical data
The date example provided was in numbers (i.e. "2012-12-26"). "YYYY-MM-DD" would be clearer.
Consistency, Memory
1.67
N/A - Not part of rad onc workflow
111
Submit New Referral
Fill non-medical data
The lack of digit grouping with dashes or spaces in the phone number can make it difficult to read.
Memory, Error
1.67
N/A - Not part of rad onc workflow
112
Submit New Referral
Fill non-medical data
The patient information form input is broken apart into sections.
Consistency, Error
0.00
N/A - Not part of rad onc workflow
113
Submit New Referral
Fill non-medical data
The required input for phone number input is explicit (no dashes or spaces)
Memory, Error
0.33
N/A - Not part of rad onc workflow
114
Submit New Referral
Fill non-medical data
The required input for postal code input (case, spacing) is not explicit.
Memory, Error
1.33
N/A - Not part of rad onc workflow
115
Submit New Referral
Attach files
"Cancel" button is ambiguous. From the attachments page, it takes the user back to the "medical data" page; however, it could be interpreted as cancelling the entire referral.
Match, Memory, Undo
2.33
N/A - Not part of rad onc workflow
116
Submit New Referral
Attach files
Files must be uploaded one at a time. The user cannot concurrently upload multiple files.
Flexibility, Control
2.33
N/A - Not part of rad onc workflow
117
Submit New Referral
Attach files
Only TIF or PDF files are allowed to be uploaded, but the file selection dialog shows all file types.
Memory, Error
3.00
N/A - Not part of rad onc workflow
118
Submit New Referral
Review & Submit
Instructions are somewhat redundant and verbose.
Minimalist, Language, Document
1.33
N/A - Not part of rad onc workflow
119
Submit New Referral
Review & Submit
The inputted text appears in greyed out input fields rather than plain text. This makes the review page difficult to read, and may also mislead users into thinking that they should be able to directly edit data on this page.
Consistency, Visibility, Control
2.00
N/A - Not part of rad onc workflow
120
Submit New Referral
Confirmation of submitted referral
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00
N/A - Not part of rad onc workflow
121
Submit New Referral
Confirmation of submitted referral
The confirmation could be reworded to second person.
Language 0.67
N/A - Not part of rad onc workflow
122
Submit New Referral
General
It feels like the user is continuously being told what to do, or what they have done incorrectly. There is a perceived lack of control over the system.
Closure, Language, Control
2.00
N/A - Not part of rad onc workflow
123
Submit New Referral
General The date input format is explicitly identified.
Memory, Error
0.00
N/A - Not part of rad onc workflow
124
Submit New Referral
Submit new fax referral
Referring professional information cannot be automatically recalled and assigned to the referral through a unique identifier such as a fax number.
Minimalist, Memory
2.67
N/A - Not part of rad onc workflow
125
Submit New Referral
Submit new fax referral
The MRN must be manually entered by the user, but the system could automatically fill or search for the MRN based on first and last name.
Minimalist, Memory
1.67
N/A - Not part of rad onc workflow
126
Submit New Referral
Process Fax 1
TIFF/PDF viewer is extremely small and not practical for viewing documents. The user must click on the "PDF" link to open the PDF in Acrobat Reader.
Consistency
3.33
N/A - Not part of rad onc workflow
127
Submit New Referral
Process Fax 1
A significant number of faxes sent to UHN are spam. The "Non-Referral" button allows the user to easily identify spam faxes which are discarded from the list.
Flexibility 0.00
N/A - Not part of rad onc workflow
128
Submit New Referral
Process Fax 1
When viewing the fax, the user must click on the "Referral Fax" button AND an additional "Referral Fax" link. This is an additional redundant click.
Minimalist 2.00
N/A - Not part of rad onc workflow
129
Submit New Referral
View Fax
PDF viewer is small and the complementary details are displayed below.
Consistency
3.00
N/A - Not part of rad onc workflow
130
Submit New Referral
View Faxes
When viewing faxes, they are listed in chronological order. There is no categorization for the different faxes sent (new referral, or follow-up to an existing referral). No filtering options are available.
Visibility, Memory
3.00
N/A - Not part of rad onc workflow
131
Submit New Referral
View Faxes
Follow up faxes display the referral ID, but not the patient name. Administrators will have to search for the appropriate referral by MRN rather than recalling it by name.
Match, Feedback
2.67
N/A - Not part of rad onc workflow
132
Book Appointment
Book Appointment
The booking staff instructions - Book before date, and Assign booking task - are separated and should be placed together.
Consistency
1.67
N/A - Not part of rad onc workflow
133
Book Appointment
Book Appointment
"Pre-clinic tests" checkboxes are defined by the referral site group.
Minimalist 0.00
N/A - Not part of rad onc workflow
134
Book Appointment
Book Appointment
Although the time format is stated as "HH:MM", it is still unclear whether it should be input in 12 hour or 24 hour format.
Memory, Error
2.33
N/A - Not part of rad onc workflow
135
Book Appointment
Book Appointment
Book/defer/assign links appear to open separate pages, but they actually show/hide the respective panel. When one of the three options is selected, the other two are not visible and a cancel button must be used to go back.
Match, Undo, Control
2.33
N/A - Not part of rad onc workflow
136
Book Appointment
Book Appointment
MRN can be entered on this screen if it was previously blank, but it does not recall what was previously entered.
Memory, Flexibility
2.67
N/A - Not part of rad onc workflow
137
Book Appointment
Book Appointment
The "Assign sub-clinic" link should be designated as a reassignment.
Match, Language
2.00
N/A - Not part of rad onc workflow
138
Book Appointment
Book Appointment
The "Edit" book before link takes the user to the "referral accepted" screen. Canceling from this screen takes the user all the way back to the "view details" screen. Saving changes takes the user to a confirmation screen. There is no way to return to the booking screen.
Undo, Control
3.00
N/A - Not part of rad onc workflow
139
Book Appointment
Book Appointment
The screen space is poorly utilized with a large blank space next to the left hand menu, and all of the page elements below.
Consistency
1.67
N/A - Not part of rad onc workflow
140
Book Appointment
Book Appointment
There is currently a link to edit the "Book before" date (i.e. priority). Since booking will be handled by administrative staff, they should not have the ability to change the priority which has been assigned by the physician.
Match, Error
2.67
N/A - Not part of rad onc workflow
141
Book Appointment
Book Appointment
Booking tasks will not be done in ARMS. The current booking system (PHS) cannot integrate with ARMS, so appointments must be manually entered in PHS. There is no method to define physician schedules in ARMS.
Match, Memory
2.67
N/A - Not part of rad onc workflow
142
Book Appointment
Book Appointment
The "Edit Information" and "Select Subclinic" links both take the user to the same "Assign sub-clinic" page.
Minimalist 1.67
N/A - Not part of rad onc workflow
143
Book Appointment
Book Appointment
The patient MRN must be entered manually, even though it has already been entered when the referral was first submitted.
Minimalist, Memory, Flexibility
2.33
N/A - Not part of rad onc workflow
144
Book Appointment
Book Appointment
An incorrectly entered MRN generates an error message which is unclear and not displayed next to the respective field.
Feedback 3.00
N/A - Not part of rad onc workflow
145
Book Appointment
Assign Sub-Clinic
This page requires that the referral be assigned to a sub-clinic. This does not reflect the process at PMH: a physician from a specific site group (clinic) has already accepted the patient, and the patient does not need to be assigned to a sub-clinic. (No such subgroup exists at PMH.)
Match 3.00
N/A - Not part of rad onc workflow
146
Book Appointment
Confirmation of booked appointment
Confirmation of reviewer assignment is explicit and concise.
Feedback, Closure
0.00
N/A - Not part of rad onc workflow
147
Book Appointment
Confirmation of booked appointment
The confirmation could be reworded to second person.
Language 0.67
N/A - Not part of rad onc workflow
148
Book Appointment
Confirmation of booked appointment
The system delay reasons are unclear and should be included prior to confirmation of booking.
Match, Memory
2.00
N/A - Not part of rad onc workflow
149
Book Appointment
Confirmation of booked appointment
Once the appointment has been booked in the system, there is no intuitive way to go back to the home screen, or automatically proceed to the next referral.
Memory, Control
3.00
N/A - Not part of rad onc workflow
150
Book Appointment
Confirmation of booked appointment
System delay reasons are unclear. This section is intended to explain unavoidable appointment delays, yet it appears after every appointment booking confirmation; it is only necessary when an appointment is being rescheduled.
Match, Minimalist
2.33
N/A - Not part of rad onc workflow
151
Book Appointment
Book out of Window
This screen is displayed when the patient is booked outside of the recommended window (based on severity). A link is displayed to rebook the patient at an appropriate date. A date picker on this page would eliminate a click.
Minimalist 2.00
N/A - Not part of rad onc workflow
152
Book Appointment
Book out of Window
The "Return to Booking Screen" and "Reschedule Appointment" links both take the user to the same screen.
Minimalist 1.67
N/A - Not part of rad onc workflow
153
Book Appointment
Booked Referrals
Once the appointment is booked, the user should not be able to assign a sub-clinic, or assign a reviewer.
Match, Error
1.67
N/A - Not part of rad onc workflow
154
Book Appointment
Wait Time Reasons
The wait time reasons link is hidden in the upper right hand corner of the screen. Missing this link could result in inaccurate wait time calculations.
Consistency
2.00
N/A - Not part of rad onc workflow
155
Book Appointment
Wait Time Reasons
The standard text captions only display for five seconds, which is not enough time for the user to read the wait time instructions.
Visibility 2.33
N/A - Not part of rad onc workflow
156
Book Appointment
Wait Time Reasons
"Dates affecting readiness to consult:" is immediately followed by the reason and then the date range.
Language 0.33
N/A - Not part of rad onc workflow
157
Book Appointment
Wait Time Reasons
The "Return to Accept" link in the upper right of the screen takes the user back to the initial "Accept Referral" screen. This link should not be there - the appointment has already been booked.
Match, Error
3.00
N/A - Not part of rad onc workflow
158
View Help Help Task oriented and embedded in contents.
Document 0.00 Positive feature
159
View Help Help The help section is a PDF which makes it difficult to search.
Flexibility, Document
2.33
N/A - Difficult to mock-up and test.
160
General General
Administrative links should be less prominent than the functional links in the left hand menu.
Visibility, Match
1.67 Menu reorganized
161
General General
Date input on all forms can be completed using a pop-up calendar link next to the input field.
Memory, Flexibility, Error
0.67 Positive feature
162
General General
Although hovering over a left menu option causes that menu row to go dark, only the text is hyperlinked; the rest of the darkened row is not.
Flexibility, Control
1.00 Minor issue
163
General General
Fonts on most screens are too small. The font size should be bigger to be more legible and make better use of the available screen space.
Consistency
1.67 Font size increased.
164
General General
Hovering over a left menu option causes that menu row to darken, which assists in menu selection prior to committing to a link.
Visibility, Memory, Feedback
0.00 Positive feature
165
General General
Information/instructions should be made more minimalist. An "info" icon could be used where appropriate.
Minimalist, Memory, Feedback, Flexibility
2.00
Instructions reworded and removed where appropriate
166
General General
It appears that there are "back" and "home" navigation buttons on the upper right hand side of the screen on most pages. However, they are misaligned (too high) and thus obscured by the referral search tool.
Consistency, Visibility
2.33 Links removed
167
General General
None of the screens indicate who is logged in to the system. This could potentially result in referrals being sent/received by the wrong system user.
Visibility, Error, Control
3.33
Physician name on home screen
168
General General
On all screens which require the user to submit form entry, error messages appear in red whenever an incomplete submission is attempted by the user.
Feedback, Message, Error, Language
0.00 Positive feature
169
General General
On most screens, there is no clear "back" button. The "cancel" button directs the user to the previous screen and should be labelled accordingly.
Match, Memory, Undo
3.00 "Cancel" changed to "Back"
170
General General
The "ARMS utility" link should be better labelled since it is functionally useful to the user.
Language 1.67
N/A - Not part of rad onc workflow
171
General General
The "Refer to clinic" link in the left hand menu should be placed near the top of the list.
Visibility, Match
1.67 Menu reorganized
172
General General
The "Select clinic" link in the left hand menu should be placed near the top of the list.
Visibility, Match
1.67 Menu reorganized
173
General General
The left hand menu appears consistently on every page and is easily identifiable.
Visibility, Minimalist, Flexibility
0.00 Positive feature
174
General General
The left hand menu lacks a hierarchical structure which makes some of the options seem ambiguous and the menu harder to navigate.
Match, Minimalist, Memory, Error
1.67 Menu reorganized
175
General General
The left hand menu links should be broken apart into groups.
Consistency, Visibility
1.67 Menu reorganized
176
General General
The page header title is small relative to the rest of the screen. The font size could be increased to make it more prominent.
Visibility 1.33 Minor issue
177
General Search
Only 1 of 2 referrals appeared when searching for referrals by the first name. Both referrals were displayed when searching by last name.
Error 3.33
N/A - Difficult to mock-up and test.
178
General Select Clinic
The "Select Clinic" dropdown only displays the clinics to which the user belongs.
Language, Error
1.00
User automatically logged into their clinic. Clinic selection screen removed.
179
General View Audit Trail
Referral edits can be tracked, but not referral views, resulting in an incomplete audit trail.
Match 3.33
N/A - Difficult to mock-up and test.
180
General View Audit Trail
The date format in the audit trail is not explicitly stated as yyyy-mm-dd, or yyyy-dd-mm.
Memory 2.33 N/A - not tested
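Several of the issues above (items 134 and 180) stem from ambiguous date and time formats. As an illustrative sketch only, not part of ARMS (the function names are hypothetical), strict parsing against an explicit ISO 8601 pattern rejects ambiguous input rather than guessing:

```python
from datetime import datetime

def parse_time_24h(text: str) -> str:
    """Accept only 24-hour "HH:MM" input, removing 12/24-hour ambiguity."""
    return datetime.strptime(text, "%H:%M").strftime("%H:%M")

def parse_date_iso(text: str) -> str:
    """Accept only ISO 8601 "YYYY-MM-DD" dates, avoiding yyyy-dd-mm confusion."""
    return datetime.strptime(text, "%Y-%m-%d").strftime("%Y-%m-%d")

parse_time_24h("14:30")        # accepted as-is
parse_date_iso("2013-05-21")   # accepted as-is
# parse_date_iso("21-05-2013") would raise ValueError instead of guessing.
```

Surfacing the resulting error message next to the offending field would also address the unclear error placement noted in item 144.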
9 Appendix C: Interviews
9.1 Interview Instrument
Interviews will last approximately 1 hour. The first half is adapted from Kushinka, S.A. [40].
What is the referral process?
What are the tasks or steps involved?
How long does each task take?
What are the variations to these processes? What are the acceptable reasons for process variations?
Who completes the processes? Do several types of staff perform the same tasks? Is this a good example of cross-training, or is it a duplication of effort?
Where are the bottlenecks where the process gets interrupted or slowed?
Has some staff member already found a way around such points?
Do some tasks need to be done more than once in a given process? (For example, must the same data be entered at different points?)
Are there places where the process regularly stalls? (For example, in getting information from one staff member to another?)
The second half of the interview is adapted from Holden [39]. Participants will be asked the following questions to elicit their perceptions of facilitators and barriers to e-referral system adoption.
What factors or circumstances would enable you to use an e-referral system?
What factors or circumstances would make it difficult or impossible for you to use an e-referral system?
Are there any other issues that come to mind when you think about being able to or not being able to use an e-referral system?
Interview transcriptions will be analyzed for references to factors or conditions which facilitate
or inhibit e-referral system adoption. Transcribed interviews will be broken apart into individual
statements and coded based on the identifiable themes or subthemes. Coded statements will
then be grouped into their major respective themes.
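The coding procedure described above can be sketched in a few lines. The statements and theme labels below are illustrative placeholders, not the study's actual coded data:

```python
from collections import defaultdict

# Hypothetical coded statements: (participant, statement, assigned theme).
coded_statements = [
    ("Participant 1", "Urgent referrals must be accompanied by a phone call.", "verbal communication"),
    ("Participant 2", "Once the fax is sent, you don't know who is dealing with it.", "auditability"),
    ("Participant 4", "There are vastly more handoffs than are required.", "auditability"),
    ("Participant 3", "It is probably more time consuming than a paper form.", "time demands"),
]

# Group individual coded statements under their major themes.
themes = defaultdict(list)
for participant, statement, theme in coded_statements:
    themes[theme].append((participant, statement))

for theme in sorted(themes):
    print(f"{theme}: {len(themes[theme])} supporting statement(s)")
```

Each theme's grouped statements correspond to the supporting-statement tables that follow.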
9.2 Interview Themes – Supporting Statements
Table 6: An e-referral system should effectively supplement or substitute the various modes
of communication utilized by physicians and administrators
Speaker Statement
Participant 1: “…many different ways people get me. You can be paged, you can be phoned,
you can be asked through many different ways.”
Participant 2: “So it needs to be flexible, it needs to be easy to use, and I mean honestly,
ideally, the more options there are. You know that some doctor's offices don’t
have computers? I mean, ironically some doctor's offices don't have fax
machines, which is… Now I know that in a number of years fax machines will
be obsolete, right? Nobody's going to use them, but at the moment, there are
different levels of sophistication… [It] will have to be very user friendly and
you know... if you can indicate where the tests are from rather than having to
append a lot.”
Participant 2: “The other issue is, obviously confidentiality issues, because if we're loading
things and sending them beyond the institution walls… how am I going to be
able to send it electronically without worrying about patient confidentiality
issues?”
Participant 3: “So when she [the physician’s administrator] sees that it's a very straight
forward diagnosis, she will just assign an appointment, she'll have to put it
under the desk… on the desk of the individual that would be seeing the
patient, so I don't see them all, but my three radiation colleagues. So if it's
assigned to one of them, they will have to sign off on it. In other words, agree
to see the patient before the appointment goes back.”
Participant 4: “...it requires a clear line of communication and an efficient line of
communication and an accurate transmittal of transmission of information
both ways. So, if all that is in place, then you have... that is my idea of what
referral process should be.”
Participant 4: “I mean there's a couple of problems with e-systems. One is the system goes
down, everything goes down, though I suppose the paper back up would
always be available. Then you've got issues of confidentiality, but neither of
these problems is particularly insurmountable.”
Table 7: Verbal communication between a referring and receiving physician is the most
effective mode of communication for referrals, and is absolutely necessary for urgent cases
Speaker Statement
Participant 1: “The only thing I would say about e-referrals is that there always must be a
statement that if it's an urgent or emergency, that it must be accompanied by
a phone call or direct physician to physician talk.”
Participant 1: “…otherwise, in any form of 'e', paper, fax, anything, there's a possibility of it
going wrong, and if there is a direct physician to physician communication,
that we've said, if it's something urgent, call me and I'll go and see it.”
Participant 1: “You know, doesn't have to be you, can be one of your staff, you know, but
I'll go. Sometimes they forget that and send it through a paper format, and
you just cringe because it's been there for two days.”
Participant 3: “Yeah, so there's occasionally situations… urgent situations where the
surgeon will call me directly. Either because, you know they're very reticent
to get a biopsy, it's a very critical location of the brain, so they'll speak to me
personally, in which case we'll make a decision together and then they'll get
the secretary to fax the appropriate information to my secretary if no surgery
is to be done. Then all we need is the notes and the imaging, or they'll go
ahead and do the surgery and then send us the thing in the routine we have
afterwards.”
Participant 3: “Or if we've done surgery, but it's urgent and they need to be seen this week,
then we expedite it. So we speak, and then they get their secretary to fax
over the necessary documents, or else the patient will come by ambulance
from their hospital with everything that day, or the next day.”
Participant 4: “…if it's ambiguous what the question is, or who should be seeing the patient,
then I will have to personally call back the physician and get clarification.”
Participant 4: “…for urgent cases it's actually fairly typical, particularly within the
organization, maybe not so much without the organization, that physicians to
physician contact will take place and all the paper work will get done
afterwards. So yes, for truly urgent stuff, there is physician to physician
contact.”
Table 8: Physicians do not want a system that will take more time than the current process,
but may be willing to if it is more useful
Speaker Statement
Participant 1: “What I would like is, that instead of churning through all the paper and stuff,
that it was formatted in a way electronically for the different site groups, that
I could actually access the information quickly and easily, online, and approve
it online. And instead of waiting on a Tuesday, because if it comes in on a
Tuesday to my desk, it probably won't get signed off until Wednesday.
Whereas if it's e-referral, and it's on an email, or in some type of a central
server type system, I would go in and approve it, um every, all the time. I'd go
in and do it.”
Participant 2: (The e-referral process) “It's dealing in the ideal world that we have lots of
time and we can assemble all sorts of information and then send it all
together. And in actual fact people are busy, information is fragmented, the
doctor is busy and is going to ask the secretary to do that, or a nurse maybe.
The nurse doesn't have all the information. So, if you're expecting the doctor
to do it, it may not happen for a few days until they have a quiet afternoon or
something. So, there has to be flexibility.”
Participant 2: “…right now, most, many referrals come by fax. So they get faxed to a certain
place, for example referral registration office. Then they get picked up from
there and brought to the department I think twice a day, or three times a day
in our case. Then they sit on my secretary's desk until she deals with the mail.
Then they come to me. So, you know, you're losing, you know, probably six
hour increments at every step.”
Participant 2: “As long as it's very quick, and I do not have to log on. If you want me to log
on and enter a lot of information, I'm not going to be for it.”
Participant 2: “Familiarity, you know, it would need to have good support for use, and also
that you see the benefit. So you know if it's clear that there are some
features that are really beneficial, then I would be a little bit more willing to
spend the little bit extra time. So if I can track it and actually know whose desk it's at, literally.”
Participant 2: “I am worried that it's going to demand more time and that it won't be flexible.”
Participant 3: “You know, it's better than dealing with paper if you have a reliable web
based… thing, then obviously you could also… the advantage of that is that
you've got a lot of data electronically available to you. It'd be nice, however
it's set up, that we could data mine it down the road.”
Participant 3: “But I would think that, you know, knowing internally how a lot of stuff that
used to be paper based has gone electronic internally, to be quite honest, it's
probably more time consuming than quickly filling out a paper form and
getting your secretary to fax off the referral…because you know, it's time
consuming to enter data on a computer where you can just whip it down and
obviously.”
Participant 3: “…paper sometimes gets lost or you know gets misplaced or this that and the
other so that's a bit, it's a good thing, but I suspect that it won't get adopted
very quickly on the outside, because it's probably easier just to stick with the
paper.”
Participant 3: “…the big advantage of e-referral is that it's online, you can just look it up and
you don't have to have the paper. Because right now you need the paper and
it has to be delivered, and a lot of extra man stuff. But in fact, if you know, if
it's on the web and you just go there, no matter where you are you can look
it up, I think that's advantageous. It's just paper gets misplaced or this that
and the other.”
Participant 4: “Well it would have to be, it would have to be reliable and user friendly,
basically. I think that would be the only two features that I would be
interested in.”
Table 9: There is currently no way to audit the multiple referral handoffs that occur
Speaker Statement
Participant 1: “While I'm actually very good, genuinely good at signing off and getting stuff
seen, other people can sit on their desk for a day or two. And being able then
for me as an administrator to track and to, being able to tracking, it involves,
it's really like barcoding every step of the way when you have an e-referral
system, because you know where all the stuff is landing, at any particular
moment in time.”
Participant 2: “…once the fax is sent, you don't know who is it sent to, who's dealing with it,
kind of… it's in a vacuum for a little while.”
Participant 4: “The sort of mysterious route that the paper takes through the department…
well, I mean there are all sorts of potential bottlenecks. I mean, basically,
paper is a really bad way of doing this, as I'm sure you would understand.”
Participant 4: “The system allows for these pieces of paper to come on my desk as they
arrive, so typically it shouldn't take more than a few hours. But that's
between the piece of paper lying on my desk and me making some sort of
judgment. How long it takes for the paper to get on my desk, I haven't a
clue…”
Participant 4: “There are vastly more handoffs than are required to do this efficiently.
Because ultimately, the most efficient system is simply the referring doctor
speaking to me and me speaking to the referring doctor. I mean, that's... you
only need two people that ultimately need to make these decisions. Everyone
else is just sort of facilitating the flow of paper and information.”
Table 10: Physicians desire an integrated experience when accessing clinical information from
multiple sources and systems
Speaker Statement
Participant 1: “Occasionally, because of, certainly in an urgent referral, I may actually want
to look at the x-rays, and stuff before I look at, look at the urgent, you know
the urgency, particularly if it's an internal referral. For example, if it's an
urgent emergency type situation, for somebody with like spinal cord
compression or something like that, I'll go on, look at the images, and try to
decide, you know, yeah, I'll it today, tomorrow, it can wait till tomorrow, it
can be okay on Friday. So, for a small proportion of them, they would need
me to actually review the material in much greater depth before I ascertain
what I'm actually going to do with it.”
Participant 2: “So there are frequently cases that… their care is fragmented. So they maybe
went to a one specialist, and the scan was at another hospital. Then they saw
a third specialist at a third hospital, something else happened at a fourth
hospital. And then somebody's referring them.”
Participant 2: “I also request consultations, right, So I'm busy. All that I can do is fill in one
form and that give a summary of what is requested, and then ideally, if the
person is in the same institution as me, all of that information is going to be
on the computer. But if they're not in the same institution, or if the
information I have is from somewhere else, then I have to think - what do I
have to send to them? I have to stop and think, and make sure that all of that
is available. So I'm busy and people who are part of my team and I delegate
these things to are often not aware of all the details, and they don't often
stop and think what will the doctor need. All that they're focused on is there
is a request to see, you know, some specialist, who do I fax it to. That's all
that they're focused on. Everybody kind of sees one part of the process and
people [don't] stop to think about the full picture.”
Participant 3: So, the way it's set up - you know, my main site is brain tumors, and so I deal
mainly with a very limited number referrers, and that's basically neurosurgery
offices. Mainly at the Toronto Western, St. Michael's Hospital, as well as
Trillium. So three major adult neurosurgery groups who know to... the
current regime is to fax in the request - there's a set sheet that they can
download from the website - fax it in to my secretary with supporting
documents. Basically we need the operative notes, the clinical notes, and we
need the pathology report. And somehow getting the CD with the patient's
imaging to us and sometimes they mail it, or courier it, or they give it to the
patient to bring on the same day of the appointment.
Participant 3: “So the e-booking I don't think will, e-referral's not going to solve that issue,
but I think there are other mechanisms in place. Because that's the biggest
bug bearer. If I had to choose one thing I would want the imaging over
anything else.”
Participant 4: “We need to ensure that all the information that already exists on that
particular problem accompanies the patient in some fashion so that we don't
have to delay searching for the information, or worse still duplicate things
which have already been done.”
Participant 4: If there is insufficient information on the document, then that will precipitate
a request for more information which will then possibly lead to an
appointment taking place. Or, it may, from the information, indicate that this
has been sent to the wrong individual, in which case I will, if it's obvious, I
will, what the correct individual is, I will send it back, send it to the
appropriate individual.
Participant 4: You need some description of the problem. You need pathology reports, if
they're available. You need imaging reports, if they're available… It's also
helpful, if those things are not available, that it's stated as such so that we're
not left in the dark as to whether it's been neglected to be sent or whether it
doesn't exist. So having a checklist of what's required or not, what's not
required is obviously a very sensible thing to have, which we don't have so
much at the moment.”
Table 11: Ubiquitous electronic health records would simplify the sharing of medical
information, documentation and imaging
Speaker Statement
Participant 2: One thing that would solve that, obviously, is if we're seamlessly connected
with the other healthcare providers and institutions.
Participant 3: So obviously at the Toronto Western's kind of special, because all of the
imaging, all the documents are in house, and so occasionally stuff's done by
email, occasionally for our colleagues at the Western.
Participant 3: “The big thing for us is not so much the paper. Paper or electronic, I don't
have a problem one way or the other it works. Fax machines really solved
that problem many years ago. The big issue is the imaging. And that
generally, there's no good reliable method other than sending a CD. And that
of course is no different than having films. CDs can get waylaid, they cost
money to send them overnight, you know. Often patients are more reliable,
but then occasionally they forget the CD at home when we see them. You
may not be aware, but there's as part of the eHealth in Ontario, we're on the
verge of most of the Toronto Hospitals being linked up to one server so
imaging in any one hospital will be available to other in the other sort of
Toronto Hospitals. So that's within 6-12 months of coming, and that'll simplify
things. Often it's the imaging that's the sticking point because it's a CD, you
can't fax a CD.
Participant 4: What will make life very very much easier I would think is when the various
electronic records in the region are integrated and accessible. Then almost all
the information that is required, or available will become instantly available.
So I think the integration of the electronic medical records is part and parcel
with this whole issue.
Table 12: An integrated scheduling system would simplify the appointment booking process
Speaker Statement
Participant 1: Not in, not in our GU referrals, no. There's, um, it's a very collegial working
environment in that if I'm very busy, I will just ask one of my colleagues to see
the case instead, and I don't have enough clinic time, or I just, or I'm going to
be away and I know I'm going to be away, and he doesn't know I'm going to
be away and is allocated only three cases, but there's never a problem about
making, getting the patients in.
Participant 2: So, the… It used to be that if a referral came asking for a specific doctor, it
would just sit and wait for that doctor. Now, most groups are organized by
groups. So if they're asking such and such doctor, and the doctor is away,
either the doctor covering for them will look at it and take it to a colleague or
take it themselves, or there is actual triage on the base of the group. So
ideally, if a request comes to the lung radiation group, it's easier to deal with
it than if it comes specifically such and such doctor.
Participant 2: But yes, most of the time, if I'm away, somebody looks at it. Or if, for most
people, if they're away, somebody looks away. But how frequent is that
depends on how busy we are.
Participant 3: The secretary here happens to be my secretary as site group leader, but she
will then... We have four rad oncs who sees patients, so she will then allot the
allocations to available appointments, but if there's not available
appointments within 7-10 days, she'll put it under my nose and I'll either ask
a colleague to see an extra patient, or I'll see the extra patient. So that's how
we deal with it.
Participant 3: Yeah so, my secretary - because it's all… and I’m usually here, if not then
somebody else would be covering - my secretary is very good about… as soon
as she gets it, she will put it on someone's desk.
Participant 3: I'm you know, depending on what day, and depending if I'm in the clinic or
not or whatever, so some days it may sit for a day and I won't see it until the
end of the day. Most days, I'm in and out of the office, so that she knows to
put the referral right there (points to spot next to computer), not in the in
tray where it may die for weeks. So anything like that, the faxes go on the
desk and if... usually I triage them if there's any sort of an issue, but usually
within, I would say it's probably most of the time it's same day that the
referring doctor will get an appointment and on occasion the next day.
Participant 3: Yeah, so there's somebody assigned to cover, so whoever's covering me for
CNS, they would go to them to triage.
Participant 4: Well there's always someone sort of designated to deal with these, but they
may or may not be properly trained in what they're doing. They may not
understand things, that certain things are more urgent than other things and
you know. There's always a triaging and prioritization that takes place and
things and if the person doesn't understand that, then...
10 Appendix D: Cognitive Walkthrough
10.1 Walkthrough Themes – Supporting Statements
Table 13: ARMS should better integrate with the other clinical information systems currently
in use
Speaker Statement
Participant 1: “In order for me to dispose of [referrals] in a quick fashion, it's very useful -
one of the things I find irritating is that I have to go in and look and how many
patients do I have in clinic that day, how many other new's do I have? So you
could have a list for the last... Everybody that I've, once I accept Joe Blow, and
he's in next Wednesday, next Tuesday's clinic, leave him on the list as he's
not under review anymore, he's booked. But I see him in there, because then
I'd say is, when Joe Edge comes in, I know I've already got two in there, you
know. So it's giving me some information of my, of where I am, of what I've
booked.”
Participant 1: “Also, to be booked, I want to know where they are. I mean, what's their…
where are they booked. Okay. Actually, I’m much more directive than most. I
actually put in the date and the time on my forms when I accept them.
Because I actually look at the clinic. I hate patients waiting for hours, alright.
So, I actually look and say, oh Jesus I'm really busy, I'll do it late morning, put
it in at eleven. Or I know this probably needs solving, so let's get him in at
eight-thirty, nine. So I know he waits, but life's tough. We may have the
morning to solve his problems. Because it's not just me, he may need the
other services of... radiology, we may need to send him for x-rays. I probably
take a more hands on approach to my practice than many of my colleagues
would. So you get different opinions on how well, you know, how well you
want to micro-manage it. I just find it easier. I find it makes my quality of life
better.”
Participant 1: “…put, what do you call it, when they're booked, I want to see the dates.
Because then I can say... I don't ever want to just accept someone.”
Participant 1: “…somewhere here, the things that I would do should be on here that are
very certain things such as pathology review, and… because that's something
that can be got underway before we see them.”
Participant 1: “Alright. So that for example, every testes cancer patient, because it's an
unusual cancer, it affects 800 men in Canada a year, I never even issue an
opinion on treatment until our experts have reviewed it, because it's
commonly misdiagnosed… not misdiagnosed, there's some types that have a
huge impact and a regular pathologist in say St. Mike's might only see one or
two a year, these guys are world experts. So, you know what I mean? So that
pathology, central pathology review, that's what in my practice, and a more
general level, there's stuff here that we should make into tick boxes that for,
that's customizable for site groups that allow that there's certain things that
they need essentially done to instruct and make it easy for them. Pathology
review though I would say would be for a lot of cases. Okay, and it could - I
don't know who would do it, but someone would have to input... In order to
order a pathology review you have to go into the EPR and input information,
it's like an order, but my secretary can do it. I know how to do it, it's just five
minutes.”
Participant 1: “Okay, so let me… that's wrong [priority descriptions]. The policy in RMP is
everything must be seen within two weeks for a referral, where medically
indicated, okay. And, I want something in here as to specific date and time,
because I may allocate it.”
Participant 1: “I like to do that all the time, ‘9 o'clock, Tuesday’.”
Participant 1: “No, but I'd want to be able to say is, you know, putting in a little, the little
thing you use when you're booking in Air Canada? Book, time, date, click.
Okay, and what would be really, really nice, is that is when I'm putting it in
there, that somehow it pops up at me what other new ones I have in there at
that time.”
Participant 1: “I'll see it Tuesday at nine, and then it says, it flashes, it comes into you and
says, you see all the new ones for that clinic, and you say, ‘Oh shit, I've
already got three in there,’ and then I'll go back and say I'll put it in the
afternoon.”
Participant 1: “The way I do it now is I go to my secretary and she has them all on my
Microsoft Outlook calendar, and that's what I use as my guide.”
Participant 1: “And the other thing is here, we should also give some thought to
administrative data that we may want to pick up. If it's more than two weeks,
that we've said, okay, why? Could be a free text or a drop down box. And it
could be medically indicated. Alright. You know, "Patient not available."
Because you know, I'll get referrals, ‘Patient not here for a month.’”
Participant 1: “We're counted at the Cancer Care Ontario as we… our aim is that 85% of our
referrals are seen within two weeks. And we're allowed to, well I don't know
we're allowed, but we take off the people who have just said that they're
going away. We can't be dinged for that, like it's silly, right. So you would
want to know on an administrative level, so and a more general point, you
would want to align some... when you're doing mockup, or you or someone
else, that they should be giving thought to what administrative data can be
easily picked up with this and aligning with our reporting both internally and
externally.”
Participant 2: “…let's say I have enough information, I need to know when is my next
available appointment, so they want radiation medicine head and neck, why
did it come to me, when is my next available appointment. Is it next Tuesday,
or the Tuesday after. And who else has appointments before then. So what is
my next available, and what is the group next available, and who is it with.”
Participant 2: “Then I can say this, you know, I'm available next Tuesday, my colleague is
available next Monday. One day doesn't make a difference; I'll see the patient
next Tuesday. Or, I'm away next Tuesday, my clinic this Thursday is really fully
booked, my colleague has an appointment this Friday, it goes to the
colleague.”
Participant 3: “So right now, typically our secretaries bring this to us and we see or else the
CD may be attached, and I'll see. More often, the newly diagnosed it's not a
problem, because they'll send whatever they have. Not uncommonly though,
sometimes they'll come with an MRI done in northern GTA to St. Mike's let's
say, and there they'll just do their own CT, because they've got the MRI, so
they'll do CT, just to help guide their navigation system. And then they'll have
the pathology, everything, and they'll send everything to us, and they'll just
say to the nursing staff, "send all the imaging." But of course the St. Mike's
imaging does not include the outside MRI, so nine times out of ten, we don't
get that outside MRI. And that's a recurring issue. Because the MRI... the CT
shows you something, but the MRI shows you way more, and so we need
that original MRI, and there's no good system right now. Because that MRI is
probably no longer in the physician's office, it's probably in the OR. And it sits.
Once they've done their case, they move on and that CD is lying around
somewhere, and nobody knows where the hell it is.”
Participant 3: “So invariably what we do is, we see the patient that game day, and what we
do is we just find out where was that MR done, and I get my secretary to get
it down here within a week. And that's exactly what I do now. But I don't
think we can do… We can't tell people how to run their own hospitals, right?”
Participant 3: “That's the commonest thing, is they just send the current image. And then
we can't really know what the hell is going on when there's been prior
imaging, and often they're sending us because there's been a change, but
they don't send us the prior imaging. They, you know, we, it's... The report
may tell you that, but still we need to see it with our own eyes to figure it out
as to what - which met has to be dealt with, that sort of thing. So the
commonest thing would be, ‘please send me the prior two MRIs, in addition
to the current one that's going to come,’ that sort of thing.”
Table 14: ARMS needs to be flexible in order to better support current practice in
accommodating potential referral pathways
Speaker Statement
Participant 1: “An auto fax… The other thing is you should [think] of it within the onemail system,
the secure email system that we could start using emails for this. It doesn't
have to be faxed out. It has to be faxed if it goes beyond the secure, what we
call the onemail system. We could also think of doing it within the email
system. The other thing is, is there back here a phone number of the referring
professional.” (In reference to the automatic fax confirmation)
Participant 1: “Okay, so that's forward there and forward to another physician, yeah. Okay,
I would do that. But I also might want to forward it to my secretary to do
some stuff with it. Eleni, please call them. And, what do you call it, because, I
might not trust the auto fax if it's urgent, okay. Please call, tell them I want it
today. You know, there's some urgency to it, okay.”
Participant 1: “I might forward it to medicine, to the department of medical oncology,
occasionally and say look, I think, see this, you should take a look at it.”
Participant 1: “…it's a simple place to pick up the data, ‘why?’ [was the patient not seen on
time]. It could be patient's just post-operative. But for me to get that out as
an administrator, when I look and say, you know, 40% of the patients weren't
seen for more than two weeks, and the... you can say ‘well, there's actually
well, because’.” (points at screen)
Participant 2: “I might forward to a surgeon and would say you know, this patient, sounds
like they need a surgeon, I don't see that a surgeon has been requested, you
know. I sometimes like… say, what you know, if I knew this Dr. MacTaggart, I
would email Dr. MacTaggart directly. Got this referral, ‘Hey Mike, how are
things, got this referral from your office, what's up?’”
Participant 2: “I may forward it to another coordinator and say, you know, or to new
patient referrals, and somebody. But this is better right, I mean, you wouldn't
want it for everybody to see it, right, so I appreciate that this system is
better.”
Participant 2: “If an outside doctor refers the case that I don't know, I don't know if I myself
can pass it on. I would… I could decline, with an alternate plan suggested
would be "On reviewing the case, I think medical oncological consultation
would be more appropriate. Please let us know if you still need us.”
Participant 2: (Interviewer: “So then you would expect the primary care, or referring
physician to refer them to the other specialist?”) “Yes, yes.”
Participant 2: “As I say, what's available currently, and who is around, and who can… this
sounds like one person is booking for the whole group, which is not ideal
because we sometimes do it, but we always pass it on to that person to agree
and accept.”
Participant 3: “Rarely, extremely rare. It would mainly be internal.” (In reference to
forwarding a referral externally)
Participant 3: “So occasionally, rarely, but this has got to be less than one percent of the
time, we'll get a GP will send a case in. And where in fact the patient needs to
see a neurosurgeon first. So occasionally we'll defer it. Something like that.
But what I do then is I don't make the referral, I get back to the person and
say you need to refer to a neurosurgeon before it comes to me, and then I'll
give them names.”
Participant 4: “Well if for whatever reason I can't deal with it and I couldn't deal with it
because I'm going to be away next week or… but again I think that really also
speaks to how the referral process is organized within the department. I
mean, the way that it's organized here, and that's just the way it's done and
it's not necessarily the correct way, is that I look at all the referrals that come
in, and then I divvy it up as the site leader. And then I divvy it up among the
individual physicians.”
Participant 4: “…in fact happens is that my assistant fills in a schedule to the next available
person which you know, usually keeps the workload balanced and make sure
that the earliest possible appointment is given to each person. So in that
respect, I would be forwarding everything to my assistant who would then fill
in the boxes.”
Participant 4: “Some of my surgical colleagues consider a referral to them to be literally
that and they will keep that patient even if they can't see them for three
months, they probably wouldn't be forwarding anything. So I mean a forward
feature is important, and how it's used will depend on a process is set up for
an individual group.”
Participant 4: “I suspect that this will not replace physician to physician contact, for like the
truly urgent stuff. And sometimes a face to face discussion is required to
decide what is the best course of action anyways so this would supplement it,
but it wouldn't replace it.”
Participant 4: “Never happened to me, [rejecting a patient]. I mean if there is some doubt
about that it usually result in a phone call back to the referring physician for
clarification.”
Table 15: ARMS display of information and hyperlinks should be optimized to enhance the
visibility of important links and information, and ease system navigability
Speaker Statement
Participant 1: “What I need to see – Charlie, who's dealing with all of the referrals… he may
need a broader list. I'm not interested in looking at his broader list… So if I
think of it as these are the ones that are referred to me, right. And I mean,
the only thing I'd like to see on top of that is, once it's received, urgent and...
under review is good.”
Participant 1: “An awful lot of the health card number and expiry and all of those sort of
things I have zero interest in seeing. I don't need to know it.”
Participant 1: “As long as they're here [supporting documents]… and I can see them.”
Participant 1: “For prostate I need to see their PSA, their Gleason score and stuff. And that
could be entered here terribly simply.”
Participant 1: “It also puts urgency in my mind. If you see a high score, high PSA, it means
that it's more urgent.”
Participant 1: “I think the more that you can prevent scrolling up and down, the more that
you can… constrict the information to one page and not have to scroll up and
down - scrolling up and down on laptops or computers - we're klutzes, okay,
so that would be one thing. The other thing about e-referral, for some sites,
they need to give some thought as to: it's not just about - I've always said
that we should be able, technologically, to be able to upload x-rays and
images, alright. And you should have a place to review images, and then
that'll take you into somewhere.”
Participant 1: “…if it's a brain tumor, click, and then you're looking at an MRI or a CT of the
brain. They should be able to upload them… there should be a link to that.”
Participant 2: “I'm kind of wanting to see the referring professional sooner, but it's okay.
Because at this point, who is referring is a very critical part, because there are
contact information and they're the source of information… ‘Is telehealth
available to you’, it's confusing who ‘you’ is. Is ‘you’ the referring
professional, or is you me. So that's a little funny wording. It's a good
question, because... yes and no actually. If I'm going to do telehealth with
that patient, I don't care whether the... I don't know what the relevance is
that whether the doctor can do telehealth. Because I would do telehealth
with the patient. So I... telehealth is an interesting thing to include, but I am
confused with the wording.”
Participant 2: “What about up here, supporting medical information? What is the
difference between this and this?” (points at text information, and file
attachments)
Participant 2: “I guess important things that are not immediately seen: one is is it an
inpatient or an outpatient, and if inpatient, where and you know. And
secondly, I don't know when this was actually submitted. So presumably, I
have to look for it here, received on... that's kind of important.”
Participant 2: “So what would also be helpful is to indicate, because I'll have to search for it,
has the patient had tests, and at what hospitals. So I don't necessarily need to
see all the information, but if they could provide it, if they could say cat scan
at Mount Sinai, you know, then I can easily look at it.”
Participant 2: “Imaging. Imaging, biopsy, and sometimes blood work.” (With respect to
what must be attached to a referral)
Participant 2: “I expected more… like where would I see that information, the information
that I would actually need to see. Is there another case that has more
information?”
Participant 3: “The two, the two key things are date of birth, and not only date of birth, but
it'd be nice if it automatically spit out a number, gave you an age.”
Participant 3: “…most of the stuff I deal with, first of all, when they're sent in, often they've
not been fully worked up. First stage, that sort of thing, so that's typically we
undertake that when they come in here, or occasionally third stage. But you
know newly diagnosed or recurrent, or that sort of thing, or brain mets, or
something like that, that immediately clues you in into what you're dealing
with. So I think that's okay. I mean, you can only build in so much, and then it
just complicates things. You want to keep it simple and lean up here, and
then I get into more nitty gritty stuff here.”
Participant 3: “I don't know why it's there.” [Attachments at the bottom of the screen].
Participant 3: “…and then the clinics you can't forward to. Why – I don't understand
why that's there.”
Participant 3: “So all this small print. ‘Click to cancel and return’, click and… So this should
be here, and that should be there,” (motioned that the bulk of the page
interactions should be at the top of the screen, rather than having to scroll
down when accepting a patient)
Participant 4: “What I don't like about this is all these pages that you've got to open. It
would be you know. It would be really nice if all this was just on a PDF and
then you just opened it and then you could just scroll down the pdf rather
than having to look at the referral form, look at the pathology, ok look at this.
Separate pages are a real pain in the ass. And I mean the one nice thing about
paper is, you've just got it in your hand and you can just riff through it. So if
you can figure out some way of riffing through an electronic thing that would
be good.”
Participant 4: “It needs to be a complete referral, and I need to make a decision on who
needs to deal with it. Then once that's done, once we've decided yes it's you
know, GU, next available, send it off. If it turns out to be a truly urgent
problem, then it will be sent to the emergent, duty doctor. The guy who deals
with the emergencies and urgent problems.”
Participant 4: “You really like tiny fonts.”
Participant 4: “How do I go back?”
Table 16: The radiation oncologist ARMS interface should better match current practice
by removing erroneous links and information
Speaker Statement
Participant 3: “So I'm happy to forward it, that… I'm happy to do that. But right now, that's
not how we do. Because often, we just get paper, and we don't have
anything else. At that point I can see quite clearly it's coming from a family
doctor, and there's been no assessment by a neurosurgeon. Job one is the
patient needs to be assessed by a neurosurgeon because you know. So I will
just immediately fax back a quick thing, or call them and I'll say we can't see
this case without going through a neurosurgeon first. We want tissue
diagnosis, they probably need a resection. So that's the only time that we
need to turn it back. It needs to be sent somewhere else. So that's a
possibility where I could forward it to a surgeon, to see if... could you see this
patient.”
Participant 3: “So this is way more detail. They're trying to accommodate every possible
scenario, when in fact something much simpler could deal with 95% of the
request. And then I would rather than have all this incredible detail, I'd prefer
to have something very simple to handle 90% of the requests. Because
basically it's very straight forward, put into lung clinic next Thursday or
whatever. And then just have a, something more simple than another one
section where, in other words, where other scenarios. Now, it's a little more
straight forward in CNS, and it is more complicated in lung, or breast, or these
other tumor sites where you know, we don't have to stage 98% of our
patients. It's just, it's a tumor in the brain, and we don't have to do the rest of
the body. We don't have to do any of that stuff. So we're probably not the
best site to set this up, but I would like the default that you just, that you
could keep it simple, and you wouldn't have to go through all of this.
Certainly from the CNS site group is much more straight forward. We're very
fortunate that the game day we see them that day, we're making treatment
decisions, we don't have to wait for further investigations.”
Participant 4: “Once I've accepted the patient, I would assume all the stuff happens behind
the scenes necessary to get the patient into the clinic is, according to
whatever guidelines we have for clinic booking, happens without me having
to check on it basically.”
Table 17: The system language should reflect the language used by radiation oncologists
Speaker Statement
Participant 2: “‘New’, ‘under review’, ‘to be booked’, ‘referrals to other clinics’. I don't
know what you mean by clinic. What's a clinic?”
Participant 2: “So I would prefer, just for simplicity… I don't know, I’m sure you're going to
have a lot of feedback about terminology etc., you know, you could say this
one's… "New, comma, in the process…" Who triages for example. Does the
secretary triage? Or does the site group leader triage? Or like, who triages,
you know? So I would actually think that the new ones are the ones to be
looked at, and the under review are already assigned, and already somebody
claimed them. So, first of all, which hat am I wearing? Am I wearing the hat
of a site group leader, or the triage doctor for the week? Or am I wearing a
hat of only my own personal practice, so that's one question. And then, you
know, I would actually think that these ones are the ones to be looked at
(New), not these ones (Under Review), but... so the terminology.”
Participant 2: “’Patient doesn't meet criteria.’ What criteria?” (In reference to “Alternate
Plan” radio buttons)
Participant 2: “But I don't like… this sounds very, um… rigid.” (In reference to “Alternate
Plan” radio buttons)
Participant 2: “So this is meaningless (hovering over the priority), because what I need to
know is first of all, what I need to know is am I accepting for me, or am I
accepting for my group. And I need to know when is the availability.”
Participant 2: “So I don't like this. This doesn't sound appropriate for cancer. I can say this
patient needs to be seen in less than 24 hours, but the real question is who
will see them. Where, and when… I can say it should be done soon, but then
the only person who can see within 24 hours is the person on call. They
review it and say no, this is not urgent, this can wait until tomorrow.”
Participant 3: “So there's all this stuff here, request information, so assign reviewer. In
other words, so I've been assigned, but I might reassign someone?”
Participant 4: “‘New’, ‘under review’, ‘to be booked’. So that doesn't make… What ‘new’ is,
I don't know what these other things mean. But it's pretty clear.”
Participant 4: “So booked means booked into [what]?”
Participant 4: “No I think once you understand what the titles mean. I mean, that's, it's
fine.”
Participant 4: “Okay (skims over page). I wonder what that will mean (urgent), I mean
urgent means different things to people, right. Does that mean the patient
thinks it's urgent, or the physician thinks it's urgent?”
11 Appendix E: Usability Testing Protocol
11.1 Research student role
Coordinator
Set up laptop with Axure and CamStudio
Meet and greet participant at their office
Timestamping and note taking
Provide hints
11.2 Items to give to participant (prior to testing)
Time
Consent form
Reminder that the experiment will last for roughly 45 minutes
Explanation of study and participant requirements
11.3 Scenario set-up
Laptop will be loaded with e-referral mock-up pages
CamStudio will be loaded to capture the screen and participant (audio and video)
11.4 Introduction to study
“The study you will be helping us with today will aim to investigate how an electronic referral
system can be improved for a radiation oncologist. We will be focusing on the referral review
process, which is currently done with paper based referrals. Based on my past data collection, I
have redesigned components of the ARMS e-referral system.
Today, you will be helping us to determine if the redesigned interface has any advantages over
the existing system. You will be introduced and trained on each system, and then asked to
review three patient referrals. As both e-referral systems are mock-ups (not real systems), a
few elements may not be entirely realistic.
To standardize the study, you are a genitourinary radiation oncologist (Dr. John Dorian)
Many of the hyperlinks in the system will be inactive, since they do not pertain to the
referral review process, or the study being conducted today
Clinical information is generic, and high level; full imaging, pathology reports and clinical
notes will not be available, but it will be indicated if they are included, with relevant
findings.
While you are performing the tasks, I will be observing and taking notes. The session will be
audio and video taped to capture anything I miss. It is important that you “speak aloud” your
thoughts and actions as you work through the system. For example, if you click on a button
labeled “home,” make sure you say “I am clicking on the home button.” The audio and video
recordings will be kept strictly confidential. No identifying information will be used in any
reports, publications, or presentations.
11.5 Training
Login to the system
o Login: jdorian
o Password: pmh
(select clinic)
Home screen
Review referral
Triage toolbar
Accept
Forward
Request More Info
Alternative Plan (Decline)
11.6 Experiment
“Your name is John Dorian. You are a radiation oncologist who focuses on the genitourinary site
group. It is Tuesday, August 6, 2013, and you have just returned to your desk after lunch. Your
administrative secretary mentions that she has assigned you two new referrals, so you decide
to login to ARMS to review them. You generally try to accept all consults, but will refuse to do
so if no documentation is provided. You also like to suggest a convenient appointment time
when possible, but trust your administrative secretary to schedule your appointments.”
Reminder:
Low-risk: PSA < 10, Gleason score ≤ 6, AND clinical stage ≤ T2a
Intermediate-risk: PSA 10-20, Gleason score 7, OR clinical stage T2b/c
High-risk: PSA > 20, Gleason score ≥ 8, OR clinical stage ≥ T3
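The risk criteria above can be sketched as a small classifier. This is an illustrative helper, not part of the study protocol; the list of clinical T-stage labels and their ordering is an assumption for illustration.

```python
def risk_group(psa: float, gleason: int, stage: str) -> str:
    """Classify a prostate cancer referral into a risk group using the
    reminder criteria given to participants.

    `stage` is a clinical T-stage label such as 'T1c' or 'T2b'
    (the label set and ordering below are assumed for this sketch).
    """
    order = ["T1a", "T1b", "T1c", "T2a", "T2b", "T2c", "T3a", "T3b", "T4"]
    s = order.index(stage)

    # High-risk: any single high-risk feature suffices (OR logic).
    if psa > 20 or gleason >= 8 or s >= order.index("T3a"):
        return "high"
    # Intermediate-risk: any single intermediate feature (OR logic).
    if 10 <= psa <= 20 or gleason == 7 or stage in ("T2b", "T2c"):
        return "intermediate"
    # Low-risk: by elimination, all three low-risk criteria now hold
    # (PSA < 10 AND Gleason <= 6 AND stage <= T2a).
    return "low"

print(risk_group(25, 8, "T2a"))  # high
```

Note that the low-risk branch needs no explicit test: once the OR-based high and intermediate checks fail, the conjunction of all three low-risk criteria follows automatically.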
In fact, a third, updated referral will also be included, to determine whether participants
notice that it must be reviewed.
11.7 Cases
Patient 1: Scott Summers (Complete)
High PSA and Gleason score, with MRI presenting a tumor on more than half of the prostate.
Patient 2: Erik Lensherr (More information required)
Elevated PSA, with no supporting documentation attached.
Patient 3: Hank McCoy (More information has been provided)
Elevated PSA and Gleason score, with imaging only just provided.
11.8 Usability and Usefulness Questionnaire
Adapted from Carayon et al. [30].
Respondents were asked the following questions immediately following each of the
observations (existing interface and redesigned interface). Responses were given on a
five-point Likert scale ranging from strongly disagree (1) to strongly agree (5), unless
otherwise stated.
1. Experience with computer based clinical information systems: very little (1) – very much (5)
2. Please circle the number that best reflects your acceptance of e-referrals: dislike very much
and don’t want to use (1) – like very much and eager to use (5)
3. Learning to operate the system: difficult (1) – easy (5)
4. Exploring new features by trial and error: difficult (1) – easy (5)
5. Remembering names and use of commands: difficult (1) – easy (5)
6. Tasks can be performed in a straightforward manner: never (1) – always (5)
7. Help messages on screen: unhelpful (1) – helpful (5)
8. Experienced and inexperienced users’ needs are taken into consideration: never (1) – always
(5)
9. Correcting your mistakes: difficult (1) – easy (5)
10. System is: difficult (1) – easy (5)
11. System is: frustrating (1) – satisfying (5)
12. Functions are as I expect: never (1) – always (5)
11.9 Usability Testing Preference and Performance Results
Table 18: Raw user survey results
Participant 1 2 3 4 5 Avg (Ex./Re.) St.Dev (Ex./Re.) %inc.
Date 25/6/13 25/6/13 27/6/13 9/7/13 15/7/13
Age 62 56
Interface Ex. Re. Ex. Re. Ex. Re. Ex. Re. Ex. Re.
Test A B B A A B B A A B
Comp. Exp. 4 5 3 5 4 4.2 0.8367
A 4 5 2 5 4 4 5 5 3 4 3.6 4.6 1.1 0.5 28%
B 4 4 3 5 4 5 4 5 3 4 3.6 4.6 0.5 0.4 28%
C 4 4 3 5 4 5 4 4 3 4 3.6 4.4 0.5 0.5 22%
D 3 4 3 5 3 5 4 5 2 4 3.0 4.6 0.7 0.4 53%
E 4 4 3 5 4 5 4 4 3 4 3.6 4.4 0.5 0.5 22%
F 3 4 3 - 4 5 3 3 3 3 3.2 3.8 0.4 0.9 17%
G 3 5 3 5 4 5 4 4 4 4 3.6 4.6 0.4 0.5 28%
H 4 5 3 5 - - 3 3 3 4 3.3 4.3 0.1 0.8 31%
I 4 5 3 5 4 5 3 4 3 4 3.4 4.6 0.4 0.5 35%
J 4 4 3 5 4 5 4 4 3 4 3.6 4.4 0.5 0.5 22%
K 4 4 3 5 4 5 3 4 3 3 3.4 4.2 0.4 0.8 24%
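As a worked example of how the summary columns in Table 18 are derived, the sketch below recomputes item A from the per-participant scores transcribed above. The percentage-increase formula (relative change in mean score) is an assumption, but it reproduces the tabulated 28% for this item.

```python
from statistics import mean, stdev

# Questionnaire item A scores (5-point Likert), in participant order P1-P5.
existing = [4, 2, 4, 5, 3]    # existing interface
redesigned = [5, 5, 4, 5, 4]  # redesigned interface

avg_ex, avg_re = mean(existing), mean(redesigned)

# Percentage increase in mean score (assumed formula).
pct_increase = round(100 * (avg_re - avg_ex) / avg_ex)

print(avg_ex, avg_re)               # 3.6 4.6
print(round(stdev(existing), 1))    # 1.1 (sample standard deviation)
print(round(stdev(redesigned), 1))  # 0.5
print(pct_increase)                 # 28
```

The standard deviations match the table only when computed as sample (n-1) standard deviations, which is what `statistics.stdev` provides.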
Usability Testing Tasks
1 Login and Navigate Home
2 Accept 1st Referral
3 Confirm 1st Accept
4 Navigate to 2nd Referral
5 Decide to Request Info
6 Request More Info
7 Navigate to 3rd Referral
8 Accept 3rd Referral
9 Confirm 2nd Accept
10 Return Home
Table 19: Raw usability testing task times
Participant
P1 P2 P3 P4 P5 Average St. Dev.
Existing 1 20 12 34 27 18 22.2 8.5
2 135 57 124 40 54 82 44.0
3 69 60 42 13 81 53 26.5
4 63 41 40 7 14 33 22.6
5 33 37 27 29 33 31.8 3.9
6 53 7 9 9 7 17 20.1
7 23 17 27 31 25 24.6 5.2
8 62 26 52 28 45 42.6 15.5
9 35 19 24 7 48 26.6 15.6
10 7 7 16 10 6 9.2 4.1
Total 500 283 395 201 331 342 113.2
Redesign 1 8 4 16 8 3 7.8 5.1
2 114 31 66 75 31 63.4 34.6
3 55 30 33 29 12 31.8 15.4
4 4 3 5 6 2 4 1.6
5 21 42 19 30 13 25 11.3
6 38 12 6 17 7 16 13.1
7 7 11 4 10 2 6.8 3.8
8 35 55 27 49 41 41.4 11.1
9 90 29 28 12 27 37.2 30.3
10 7 5 4 6 6 5.6 1.1
Total 379 222 208 242 144 239 86.4
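The totals and averages in Table 19 can be reproduced from the raw per-task times. The minimal sketch below does so for the existing interface, using the values transcribed column-wise from the table (tasks 1-10 per participant).

```python
from statistics import mean

# Per-task completion times (seconds), existing interface, tasks 1-10.
existing = {
    "P1": [20, 135, 69, 63, 33, 53, 23, 62, 35, 7],
    "P2": [12, 57, 60, 41, 37, 7, 17, 26, 19, 7],
    "P3": [34, 124, 42, 40, 27, 9, 27, 52, 24, 16],
    "P4": [27, 40, 13, 7, 29, 9, 31, 28, 7, 10],
    "P5": [18, 54, 81, 14, 33, 7, 25, 45, 48, 6],
}

# Per-participant totals (the "Total" row of the table).
totals = {p: sum(times) for p, times in existing.items()}

# Average time for a given task across participants, e.g. task 1.
task1_avg = mean(times[0] for times in existing.values())

print(totals["P1"], totals["P2"])  # 500 283
print(task1_avg)                   # 22.2
print(mean(totals.values()))       # 342 (mean total time, existing interface)
```

The same computation on the redesign rows yields the reported mean total of 239 seconds, i.e. the redesigned interface reduced mean total task time by roughly 30%.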