
Journal of Biomedical Informatics 38 (2005) 34–50

Designing the design phase of critical care devices: a cognitive approach

Sameer Malhotra a,*, Archana Laxmisan a, Alla Keselman a, Jiajie Zhang b, Vimla L. Patel a

a Laboratory of Decision Making and Cognition, Department of Biomedical Informatics, Columbia University, NY 10032, USA
b School of Health Information Sciences, University of Texas Health Sciences Center at Houston, USA

Received 13 October 2004; available online 2 December 2004

doi:10.1016/j.jbi.2004.11.001

* Corresponding author. Fax: +1 212 305 3302. E-mail address: [email protected] (S. Malhotra).

Abstract

In this study, we show how medical devices used for patient care can be made safer if various cognitive factors involved in patient management are taken into consideration during the design phase. The objective of this paper is to describe a methodology for obtaining insights into patient safety features—derived from investigations of institutional decision making—that could be incorporated into medical devices by their designers. The design cycle of a product, be it a medical device, software, or any kind of equipment, is similar in concept and course. Through a series of steps we obtained information related to medical errors and patient safety. These were then utilized to customize the generic device design cycle in ways that would improve the production of critical care devices. First, we provided individuals with different levels of expertise in the clinical, administrative, and engineering domains of a large hospital setting with hypothetical clinical scenarios, each of which described a medical error event involving health professionals and medical devices. Then, we asked our subjects to "think-aloud" as they read through each scenario. Using a set of questions as probes, we then asked our subjects to identify key errors and attribute them to various players. We recorded and transcribed the responses and conducted a cognitive task analysis of each scenario to identify different entities as "constant," "partially modifiable," or "modifiable." We compared our subjects' responses to the results of the task analysis and then mapped them to the modifiable entities. Lastly, we coded the relationships of these entities to the errors in medical devices. We propose that the incorporation of these modifiable entities into the device design cycle could improve the device end product for better patient safety management.

© 2005 Elsevier Inc. All rights reserved.

Keywords: Decision making; Devices; Device engineering; Critical care; Patient safety; Task analysis; Institutional decision making

1. Introduction

"Don't blame me for the article; blame the typewriter that printed it!"—Anonymous.

Even if the above statement were true, when the question of errors in clinical settings arises, assigning the blame does not help solve the problem. The individual with closest proximity to the device (the operator) most often bears the brunt of blame [1]. The critical care setting is a high-tension environment with a large number of users interacting with an even larger number of devices. Errors related to devices or users in the healthcare setting are drawing increased attention as their recognition and reporting have improved [2,3]. We need to analyze these errors to devise measures that will help prevent them in the future.

The use of devices in medical care was introduced for many reasons, the primary ones being related to patient monitoring and the automation of procedures in order to save time and increase accuracy. The devices were not intended to replace human caregivers but to supplement their tasks. The effectiveness of these devices relied largely on how well the user operated them. The concept of including patient safety measures in medical devices slowly evolved as the impact of errors due to the improper design, implementation, and use of medical devices started being recognized. In addition to introducing medical errors, the newly acquired devices raised other issues, including the disruption of organizational culture and concern among physicians regarding the changes in their professional relationships and established workflow routines [4].

1.1. Evolution of the medical device safety net

From a general standpoint, when a new device is invented, the primary concern at the time is to achieve the desired functionality. With constant use, shortcomings or possible improvements for the device become evident; with modifications, subsequent generations of the device evolve into much better contraptions. Similarly, medical devices and instruments have evolved in functionality by the incorporation of more and more features and automaticity. With the development of more programmable and independently operating devices, it became imperative that they not compromise patient safety in any way. Fig. 1 illustrates the "Evolution of the Patient Safety Net," delineating how different generations of medical devices evolved to provide safety along with their intended functionality.

Fig. 1. Evolution of the device safety net. (A) Stage one: medical devices with no patient safety features. (B) Stage two: medical devices with patient safety features limited to the device. Features such as inbuilt alerts and alarms were included. (C) Stage three: medical devices with extended patient safety features. Features that take into account the setting of operation and the involved workflow as well as the boundaries of human errors. Patient safety administered by the device extended to the interactions between the various role players and their usage of medical devices.

The first generation of medical devices was patient safety naïve because their primary aim was to achieve a certain functionality. The need for safety features was unrecognized until a medical error or an error in the making was observed. The earliest safety features included alarms, constraints, and input confirmations and reconfirmations, but their scope was limited to the immediate domain of device interface and operation.

Considering the fact that medical devices do not work in isolation, but interact with various other entities and personnel working in the same setting, the next evolutionary stage in terms of patient safety measures should account for these factors as well. From the time a clinician decides on a plan of action to the actual execution of this plan, a number of cognitive processes and sequential events occur. The communication cascade triggered by this situation is mostly concentrated around nurses and physicians [5]. Performing a cognitive task analysis of real clinical settings and understanding the peculiarities of the health care system can provide cues for new design principles in device development.

1.2. Collaborative decision making, devices, and errors

The Intensive Care Unit (ICU) is a unique and dynamic setting where multiple individuals are involved in the common process of providing both physical as well as emotional care and support to critically ill patients. From afar (or to an outsider), the workflow situation is difficult to keep track of and, to some extent, may seem very disorganized, especially with all the buzzing alarms and the flashing display screens of medical equipment adding to the confusion. The high-risk patients admitted to the ICU have a number of medical complications that require rigorous monitoring, interventions, and an array of medications to stabilize them. Consulting physicians, attending physicians, residents, and nurses are all involved in patient care decision making, each of them possessing specialized and sometimes overlapping knowledge [6]. In the face of the apparent chaos, the team works together in a coordinated way and relies on various sources of information to carry out its tasks [7]. In addition, sophisticated patient care technology is omnipresent in contemporary health care, assisting health care providers with monitoring and treating the patient. The use of such technology, however, also places an additional cognitive load on its users in terms of device operation, and this at times can disturb the delicate equilibrium of the collaborative decision-making process. This gives us many reasons for refuting the traditional approach of blaming the clinician or the nurse alone for medical errors and for viewing medical errors as a complex interplay of many individual, organizational, situational, and technological factors [8].

2. Background and theoretical framework

The last 100 years have seen the most dramatic successes in the field of medicine. From the discovery of Ehrlich's magic bullets for treating infections to the genesis of highly techno-centric equipment for diagnosis and treatment, the improved longevity of human life could possibly be the single most important outcome. But when the same equipment becomes responsible for major or even minor compromises in patient health, it becomes a matter of grave concern. The body of medical error research emerged in the early 1990s, with landmark studies conducted by Lucian Leape and David Bates [9,10] and supported by the Agency for Health Care Policy and Research, now the Agency for Healthcare Research and Quality (AHRQ) [11]. Their work involved the identification and evaluation of system failures that underlie adverse drug events and potentially adverse drug events. The most common defects were found in systems meant for disseminating drug knowledge and for timely access to patient records. They concluded that changes made to the systems to improve the dissemination and display of drug and patient data would make drug-related errors less likely [12,13].

But the problem of medical errors still did not get the amount of attention that it deserved. Then, in November 1999, the Institute of Medicine (IOM) report titled To Err Is Human: Building a Safer Health System was released, focusing a great deal of attention on the issue of medical errors and patient safety [14]. Two large studies, one conducted in Colorado and Utah and the other in New York, found that adverse events occurred in 2.9 and 3.7% of hospitalizations, respectively. When extrapolated to the approximately 33.6 million admissions to US hospitals in 1997, the results of the study in Colorado and Utah implied that at least 44,000 Americans die each year as a result of medical errors. The combined goal of the report's recommendations was to create sufficient pressure in the external environment to make errors costly to health care organizations and providers, so that they would be compelled to take action to improve patient safety. It also emphasized the need to enhance knowledge and tools that improve patient safety and to break down legal and cultural barriers that would impede improvements in safety. The impact of the report was extraordinary, and researchers began to study the nature of errors related to human, device, environmental, and socio-cultural factors in a detailed manner.
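
As a rough arithmetic illustration of that extrapolation (a sketch only: the share of adverse events that prove fatal is not stated above, so the figure used here is back-derived to match the reported 44,000 floor):

    # Back-of-the-envelope reconstruction of the IOM extrapolation.
    # The adverse-event rate and admission count come from the text above;
    # the fatal fraction is an assumed illustrative value chosen so the
    # result lands near the reported lower bound.
    admissions = 33_600_000          # approximate US hospital admissions, 1997
    adverse_event_rate = 0.029       # Colorado/Utah study (2.9%)
    fatal_fraction = 0.045           # assumption: ~4.5% of adverse events fatal

    adverse_events = admissions * adverse_event_rate   # ~974,400 events
    deaths = adverse_events * fatal_fraction           # ~43,800 deaths
    print(f"estimated deaths per year: {deaths:,.0f}")  # close to the 44,000 floor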

In this study, we show how medical devices used for patient care can be made safer if various cognitive factors involved in patient management are taken into consideration during the design phase. Medical device design is similar to the manufacture of any other product in that understanding the requirements and expectations of the stakeholders is the first step. This knowledge is then utilized for building a prototype, which is then subjected to a series of evaluations and testing cycles. The feedback from these evaluations is used for modifications and improvement, giving rise to the final product [15,16]. But "final" is misleading because the product continues to change as different versions of it are released over time, each one being an improvement over its predecessor.

A concept proposed by Patrick D. Fleck, president of Cooper Interaction Design [17], illustrates an interesting aspect of decision making in the product development cycle (not specifically for medical devices). A decision-making model for innovative product design called the OODA loop was described (see Fig. 2), its four main steps being observation, orientation, decision, and action. The gist of the model is that if one cannot make clear observations and form viable options, then the ability to make sound decisions and carry them out effectively crumbles.

Fig. 2. Decision-making model for innovative product design: the OODA loop.

Applying the model to the domain of medical device engineering, we can appreciate that in a medical setting the variables are far too many. The operation of a device depends on a complex network of communication among many role players, each with differing cognitive and executive capabilities. Also, other factors of the health care setting which are apparently unrelated to the focus of the product (e.g., administration, policies) affect its functioning in multiple ways.

The current status of medical device engineering is that during the design phase it follows almost the same rules as those for other manufactured products. The difference lies in the rigorous testing and evaluation that are performed before a medical device is made available in the market. This is necessary because these devices are directly or indirectly related to the health and life of a patient. To satisfy regulatory requirements, most biomedical systems must have documentation to show that they were managed, designed, built, tested, delivered, and used according to a planned, approved process [18]. In the US, the Food and Drug Administration (FDA) adopts an adversarial position: it actively regulates individual devices and drugs, assuming that new therapies and products are unsafe and do not work until proven otherwise. This process is not considered completely fail-proof in preventing the release of unsafe therapies and products, but it can easily create a bottleneck in the development process.

The health effects of this bottleneck have been quantified by comparing approval times in the US to approval times in Europe. The approval of biotech drugs in Europe outpaces that in the US primarily because of different hierarchical and policy issues, although it is difficult to measure the effect on quality control [19]. Despite these stringent measures, we still find faulty and potentially hazardous medical devices installed in health care institutions in the US. To curb such occurrences, initiative needs to be taken to focus on the device design phase and to bring human and other situational factors that are unique to health care into consideration from the very beginning.

FDA data collected between 1985 and 1989 demonstrated that 45–50% of all device recalls stemmed from poor product design. Furthermore, the FDA recognizes that a poorly designed user interface can induce errors and operating inefficiencies even when the device is operated by a well-trained, competent user [20,21]. Poor device design also leads to great economic waste: studies have shown that making changes to device design after shipping is about 40 times more expensive than making them at the prototype development stage [22]. This research discusses the methodology used to acquire insight into these factors and their subsequent link to device design.

Cognitive science has played an increasingly important role in researching the above-mentioned factors, since its methods and theories illuminate different facets of the design and implementation of information systems. It provides important insight into the nature of the cognitive processes involved in human–computer interaction and thereby improves the application of medical information systems by addressing the knowledge, memory, and strategies used in a variety of cognitive activities [23]. It also plays a role in characterizing and enhancing human performance through the formalized study of human–computer (or device) interaction using such methods as cognitive walkthroughs, task analyses, and heuristic evaluations [24,26]. The purpose of a cognitive walkthrough is to evaluate the cognitive processes of users performing a task. The method involves identifying the goals and the sequences of action required to accomplish a given task, and is intended to identify potential usability problems that may impede the successful completion of a task [27]. A cognitive task analysis is concerned with characterizing the decision-making and reasoning skills and the information-processing needs of subjects as they perform activities and tasks that require processing complex information [28,29]. Such analyses have also been applied to the design of systems to create a better understanding of human information needs in their development [23,29–33].

Heuristic evaluation is a usability inspection method that has been found to be a useful tool for medical device evaluation [25]. It refers to a class of techniques in which the evaluators examine the interface of a device for usability issues. They walk through the interface and identify elements that violate usability heuristics. It can be applied to paper or electronic mock-ups or prototypes as well as to completely implemented designs. There are a few limitations to this technique: one is that it focuses on a single device or application and therefore may not identify problems that arise because of the environment in which the device is to be used; another is that it does not indicate elements of the interface that correctly follow usability guidelines; nor does it reveal any major missing functionality.
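
To make the inspection procedure concrete, the following sketch shows how an evaluator's findings might be recorded as violations of named heuristics; the record fields and severity scale are illustrative assumptions, not a published instrument:

    from dataclasses import dataclass

    # Illustrative subset of usability heuristics an evaluator might apply.
    HEURISTICS = [
        "visibility of system status",
        "match between system and real world",
        "error prevention",
        "help users recognize and recover from errors",
    ]

    @dataclass
    class Violation:
        interface_element: str   # e.g., a dial, button, or display screen
        heuristic: str           # which guideline the element violates
        severity: int            # 1 (cosmetic) .. 4 (catastrophic)
        note: str

    # A finding suggested by Scenario 2: the flow knob rotates smoothly but
    # only discrete settings deliver oxygen, so the interface misleads users.
    findings = [
        Violation("oxygen flow knob", "visibility of system status", 4,
                  "no feedback that intermediate positions deliver no flow"),
    ]
    for v in findings:
        print(f"[sev {v.severity}] {v.interface_element}: {v.heuristic} - {v.note}")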

The immediate environment of the device is the individual interacting with it, and this human–computer interaction (HCI) is a special and important area of study. Human factors engineering is a discipline that seeks to design devices, software, and systems to meet the needs, capabilities, and limitations of the users, rather than expecting the users to adapt to the design. A complete human factors engineering analysis for medical devices or software systems includes four major components: user, function, task, and representational analyses [34].

In conclusion, we need to consider the importance of cognitive science methods in patient safety research, especially in examining how cognitive factors affect performance and how informatics methods can identify sources of errors and provide interventions to reduce them [35]. It is time to recognize that cognitive factors are especially important in understanding and promoting safe practices.

3. Methods

3.1. Participants

We selected a total of nine subjects for the study, including an anesthesiology and critical care specialist, two nurses, a physician's assistant, an anesthesiology resident, two biomedical engineers, and two administrators. We wanted the subjects to be representative of the range of professional groups that deal with device-related issues during device selection and purchase, use, and maintenance. We identified these groups on the basis of a previous study of the decision-making process in infusion pump selection, conducted by our research team (Keselman et al. [36]), which also included administrators, biomedical engineers, nurses, and physicians. The nine subjects chosen for this study had clinical, engineering, or administrative responsibilities and were representative of the people responsible for making decisions about device use, maintenance, and selection in a hospital. The subjects had varying levels of expertise and came from different educational backgrounds, as is generally found in a typical large-scale health institution.

3.2. Materials

3.2.1. Scenarios

The materials used in the study included three scenarios, each describing medical errors that involved medical devices. Our goal in designing the scenarios was to create hypothetical situations based on realistic events and frequently used devices with relatively complex user interfaces. We also wanted the scenarios to represent a variety of hospital settings and a representative range of medical professionals interacting with devices (as possible sources of human error). The scenarios were developed on the basis of examples from the FDA's medical device report files [18]. A clinical consultant assisted in the evaluation of the appropriateness of the scenarios with respect to the project's objectives.

A brief synopsis of each of the three scenarios is given below. The complete scenarios are provided in Appendix A. The devices and the people potentially constituting the human component of error are listed in parentheses.

1. Scenario 1 (nitroglycerine, infusion pump, and nurse). A nurse is receiving multiple orders for multiple patients at the same time in a busy and stressful emergency room setting. For one particular patient she receives four drug orders in one measuring unit, and a fifth order, for a nitroglycerine infusion, in another unit of dosage. She ends up programming the nitroglycerine infusion pump with a dosage in units similar to the other four orders. The patient is overdosed and experiences a dangerous fall in blood pressure, but is rescued when the critical care team discovers this.

2. Scenario 2 (oxygen, ventilator, and physician). In a pediatric ICU setting, a patient is receiving oxygen through a ventilator. A resident comes and changes the flow rate from one to one point five. He is unaware of the fact that the device can only be put on discrete settings of one, two, three, etc., and does not operate when the dial is between these numbers. Consequently, the child receives no oxygen at all. The error is discovered and the child is given higher flow rates and rescued.

3. Scenario 3 (heparin, infusion pump, and physician). A patient is receiving intravenous heparin via an infusion pump. A resident changes the dose delivered to a higher rate and then is required to change it back to the maintenance dose. In doing so he overlooks the "confirm" button for the dose change to maintenance, and the patient continues to receive the high dosage. The pump does keep beeping as an indicator, but the patient's family members are the only ones to notice it and do not take any action. The nurse only discovers the error the next morning.

3.2.2. Semi-structured interview

A set of five questions was designed to elicit a think-aloud protocol in response to each scenario. The questions were designed with the purpose of eliciting subjects' assessment of the source and seriousness of the error, without prompting them to place blame on a specific individual. Response-specific probes and clarifications followed each of the five key questions. The key questions and the rationale for each are given below.

Q1. Please provide a summary of the scenario (without looking at it). [To assess the accuracy of the subject's problem representation.]

Q2. What were the causes for the error in the scenario? [To assess the subject's perception of the source of errors, without prompting identification of the human component.]

Q3. Please rank these causes on a scale from most serious to least serious. [To assess the subject's perception of the relative seriousness of various causes.]

Q4. Who do you think was responsible for these errors? [To assess the subject's perception of the human component (and attribution) of the errors.]

Q5. What steps could be taken to prevent these errors? [To assess perception of potential safeguards and their locations.]

4. Procedure

We provided the subjects with the scenarios (one by one) and asked each to answer a set of five questions. The questions were aimed at eliciting a think-aloud protocol in a semi-structured fashion. The interviews were recorded and later transcribed. The transcripts were then parsed into idea segments. Each segment consisted of one stated cause of the error and the explanation for that cause. A segment varied in length and could include a sentence clause, a sentence, or in some cases the main sentence and one or two supporting sentences.

The interview data were then analyzed using a hierarchical scheme of thematic codes (e.g., [37]). Some of the codes were top-level categories; others were lower-level codes subsumed by the top-level codes. The development of the coding scheme involved a combination of top-down and bottom-up approaches. Thematic coding that is partly based on categories that are not predefined, but emerge in the course of data analysis, falls into the theoretical framework of the grounded theory approach to data analysis (e.g., [38]).

Grounded theory is "theory that was derived from data, systematically gathered and analyzed through the research process." This approach assumes a close relationship among the data, the analysis, and the theory that emerges from the data. The researcher sets out to conduct an investigation with a minimal number of preconceived hypotheses, allowing the data to tell the story from the participant's perspective. The focus is on collecting rich descriptive data that cannot be easily subjected to statistical analysis. This research approach is appropriate for investigating complex topics in their naturalistic contexts. It is concerned with "understanding behavior from the subject's own frame of reference" (Bogdan and Biklen [39]). Given that medical care is a complex collaborative process, with medical errors closely tied to the settings in which they occur, we felt that situating our work in the grounded theory approach would help us do justice to the nature of the process we set out to study. The combination of theory-driven and data-driven approaches offered the advantage of being attuned to the richness of naturalistic qualitative data, while still relying on a preliminary framework for maintaining objectivity.

We developed several coding categories based on the study objectives and the responses to the interview questions (e.g., error involved, individual responsible, and severity of the error) and using the methodological approach of grounded theory (Corbin and Strauss). Other categories emerged from the empirical review of the data. A group of three investigators reviewed a subset of protocols, creating coding categories until reaching saturation (i.e., the point at which no new categories emerged). At that point, coding categories were grouped and organized in a hierarchical fashion. These categories included error involved, individual responsible, severity of the error, modifiability of the error, suggested solution to the error, and the error's relationship to the device. Finally, three coders jointly conducted two iterations of data coding, assigning codes to segments and resolving disagreements through discussion. The final top- and lower-level categories are discussed in the results section, and the related tables (Tables 1–9) can be found in Appendix A.
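
A minimal sketch of how such a hierarchical coding scheme might be represented; the scheme fragment below borrows a few of the lower-level codes reported in the results, but the structure, not the content, is the point:

    # Hypothetical fragment of the hierarchical coding scheme: top-level
    # categories subsume lower-level codes that emerged during analysis.
    CODING_SCHEME = {
        "device": ["poor user interface", "lack of inbuilt logic", "faulty design"],
        "policy": ["order format: units", "order format: mode", "inservicing"],
        "user error": ["carelessness", "oversight", "lack of knowledge"],
    }

    def code_segment(segment_text: str, top_level: str, lower_level: str) -> dict:
        """Attach a (top-level, lower-level) code pair to one idea segment."""
        assert lower_level in CODING_SCHEME[top_level], "code must exist in scheme"
        return {"segment": segment_text, "top": top_level, "low": lower_level}

    seg = code_segment("pump gave no feedback on the non-standard unit",
                       "device", "lack of inbuilt logic")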


5. Results and analysis

Studies in the HCI field are essentially composed of two elements: individuals and machines. We chose to divide our study into two sections and analyze the composite HCI entity from the Individuals and Devices perspectives. The Individuals perspective focuses on health care professionals in the critical care setting and how their perceptions and interpretations of an error differ as a function of expertise. This aspect of the study is described in detail elsewhere (Laxmisan et al., 2004), but in summary the results show differences in error perception as a function of expertise as well as of the nature of the task performed by the individual [40]. Errors pertaining to the critical events of the scenarios were identified such that they fell under or were related to the area of expertise of the individual (e.g., a biomedical engineer identified device-related errors that were not picked up by the administrators). In research along similar lines, Chung et al. [41] developed a methodology for predicting human error during operation of a medical device by using techniques to evaluate the interface and identify potential error-inducing features and steps.

This paper addresses the Devices perspective. The error categories that were derived from the transcripts of each subject for each of the scenarios were further qualified based on modifiability and relation to the medical device (discussed later in Sections 5.2 and 5.3). The error categories were also ascribed a set of the following variables: broad error category, individual implicated, modifiability of the error, and relationship of the error to the medical device. Those categories in which the values for all four of these variables were unique were pooled together and treated as the final set. This set was then used for deriving guidelines for the device design phase.
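
A minimal sketch of this pooling step, assuming a record with the four variables named above (field names are ours):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ErrorCategory:
        broad_category: str      # e.g., "Device", "Policy", "User error"
        individual: str          # individual implicated
        modifiability: str       # "Yes", "Partial", or "No"
        device_relation: str     # "Yes", "Partial", or "No"

    coded_segments = [
        ErrorCategory("Device", "Device designers", "Yes", "Yes"),
        ErrorCategory("Device", "Device designers", "Yes", "Yes"),  # duplicate
        ErrorCategory("Policy", "Physician", "Yes", "Partial"),
    ]

    # Categories whose values on all four variables are unique are pooled
    # into the final set; a frozen dataclass makes set-based dedup trivial.
    final_set = set(coded_segments)
    print(len(final_set))  # 2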

5.1. Coding scheme development

At the end of multiple rounds of coding, a total of 54 unique error categories was identified (see Appendix A). These were then further sorted into six broad categories. In cases of overlap, where a single error category seemed to belong to more than one broad category, the assignment was made by reviewing the context of the error in the related transcript segment. The six categories are given below.

• Administration-related categories. Factors responsible for the error that deal with the training and education of staff, device in-servicing policies, device purchasing, and device retirement (five subcategories came under this category).

• Device-related categories. Those errors in which the medical device used in the scenario was held directly responsible for the mishap were considered to be device related. Poor user interfaces, faulty design, and lack of inbuilt logic are examples of categories that fall under this heading (15 subcategories came under this category).

• Situation-related categories. Situational factors and surroundings that at the given time were hostile or indifferent to the workflow and users involved (only one subcategory—environmental stress/task overload—was present).

• Policy-related categories. Errors occurring due to bad protocol, lack of standards, and vague workflow policies are included in this category. One example of this category is order format errors, which are due to a lack of standards for units, mode (verbal vs. written), or execution steps (15 subcategories were policy related).

• User error. When the individual identified in the error setting is responsible for the error. User errors have specific categories denoting carelessness, oversight, lack of knowledge, etc. (15 subcategories were individual/user related).

• Setup-related categories. This category includes errors caused by the floor setting or geographical organization of devices, patients, and interfaces/alarms (three subcategories were related to the setup of the critical care environment).

5.2. Category division based on modifiability

When the error in question was amenable to modification and potential avoidance, we added an attribute of modifiability, based on the following definitions:

• Modifiability: Yes (complete)—Indicates that the error category/task in question has a narrower domain, has no or only partial human-dependent interaction, and is amenable to modification. The definition entails discreteness in terms of modifiability and does not imply perfection: device-related errors mainly fall in this category because a modification made to a device is a discrete change, one that may not necessarily counter the error.

Example: the error subcategory "device design flaw: poor visual interface" is a totally machine-dependent entity and can be modified by the designers.

• Modifiability: No—Indicates that the error category/task in question is of a broader domain or is beyond human control and modifiability.

Example: the error subcategory "environmental stress/task overload" is not modifiable. Service cannot be denied in emergency health care settings even if things are stressful for the health care providers. Such situations are not modifiable and are of a very unpredictable nature.


• Modifiability: Partial—Indicates that the error category/task in question has significant human dependence, which makes the potential modification for improvement dependent on the user in a certain way. This attribute also covers factors that have a partial situational or environmental component which is beyond modification.

Example: the error subcategory "user error—order interpretation" is partially modifiable. Implementing standardized formats for order dispensing smooths out the process, but there will always be a human element in the situation. Despite the modified standard format, the person receiving the order may still interpret it wrongly.

Fig. 3 shows the distribution of the error categories with respect to modifiability. Of the 54 error categories, 31 were modifiable (57.4%). The largest group of completely modifiable error categories pertained to the medical device-related errors (14). This was followed by policy and protocol. The largest group of partially modifiable error categories was the user-related error category (14). This was again expected based on our definition of partially modifiable errors; it also shows that a significant portion of errors is attributed to humans (27.7%).

Fig. 3. The six error categories subdivided into modifiable (Yes), partially modifiable (Partial), and non-modifiable (No) errors. The graph shows the frequency of error categories according to their modifiability characteristics.

5.3. Category division based on relation of the error to the medical device

All three scenarios involved complex interactions between the healthcare providers and the medical device in the setting. Once the transcripts were coded, the extent of each error's relation to the medical device was determined based on the following definitions:

• Direct relationship: Yes—If the error takes place at the level of, or directly upon, the device during its operation.

Example: device design—bad visual display.

• Direct relationship: No—If the error in question was totally unrelated to the presence of the medical device.

Example: user error—lack of knowledge. A discrepancy in the user's clinical knowledge that contributes to the medical error has no bearing on the medical device.

• Direct relationship: Partial—An error that is located outside of the device domain but directly affects operation of the device. This definition encompasses all error categories that fall between the directly device-related and device-unrelated categories.

Example: user error—mislabeling of the infusion pump. The nurse committed the error, but because it was done on the device it is classified as partially related to it. Such errors give possible insights into how improvements can be made to the device to prevent the user from making a similar error in the future. In this case, a future improvement could be conceived in terms of the device giving labeling options or prompts for rechecking the label (see Fig. 4).

The purpose of adding the device relationship attribute was to demonstrate how the medical device is the nucleus surrounded by all other entities and their related errors. The errors of these entities may have complete, little, or no linkage/relationship to the medical device. This view gives perspective on what potential error domains and cognitive factors outside of the device can be influenced and therefore taken into account by its design; viz., an extension to its safety net.


Fig. 4. The six error categories and the mapping of their relation to the medical device in the scenario. Directly related (Yes), partially related (Partial), and unrelated (No) were used for mapping. The graph shows the frequency of the error categories based on their relation to the device.


5.4. Error categories of individual scenarios

The error categories identified by the subjects in the three scenarios, which are based on (1) the nitroglycerine infusion pump, (2) the oxygen ventilator machine, and (3) the heparin infusion pump, are as follows:

Scenario 1: Nurse programming an infusion pump for nitroglycerine delivery in a high-tension critical care setting.

Fig. 5 shows the workflow and events occurring in Scenario 1, where the nitroglycerine infusion pump was the medical device involved. Verbal orders for five medications were given to the nurse. One of the five orders differed in the medication units. The overworked nurse, who was attending to other patients at the same time, programmed the pump for the patient with the medication doses mentioned by the doctor but with all five doses in the same units. Nitroglycerine was the medication that was consequently transfused to the patient in the wrong amounts. The pump did not have the internal logic necessary to recognize the overdose, nor did it provide any feedback regarding the non-standard unit (assuming there are such standards). The overdose of nitroglycerine caused a dangerous fall in the patient's blood pressure. Luckily the error was recognized and the patient was saved from permanent harm.

Fig. 5. Diagrammatic representation of the task and communication flow of Scenario 1, which involved programming of a nitroglycerine infusion pump in a high-tension critical care setting.

Of the total 54 error categories identified, 26 camefrom this scenario.

• Modifiable error categories. Out of the total 26 error categories, 11 (42.3%) were identified as modifiable. Out of these 11 modifiable error categories, eight were considered to be in direct relation to the device and three were partially/indirectly related.

• Partially modifiable error categories. Out of the total 26 error categories, 14 (53.8%) were identified as partially modifiable. Out of these 14 partially modifiable error categories, two were directly related to the device, eight were considered to be partially/indirectly related, and four were considered to have no relation to the medical device.

• Non-modifiable error categories. Environmental stress and task overload was considered to be a non-modifiable entity with no direct relation to the medical device, and it appeared once (3.9%).

Scenario 2: A physician alters the oxygen flow through the ventilator in a pediatric critical care setting.

• Modifiable error categories. Out of the total 16 error categories, 11 (68.75%) were identified as modifiable. Out of these 11 modifiable error categories, five were in direct relation to the medical device and the other six were considered to be partially/indirectly related.

• Partially modifiable error categories. Out of the total 16 error categories, five (31.25%) were identified as partially modifiable. Out of these, only two were partially/indirectly related to the medical device (the ventilator) and the other three were not related at all.

Scenario 3: A physician alters the dose of heparin delivered through an infusion pump and fails to hit the confirm button.

• Modifiable error categories. Out of the total 31 error categories, 15 (48.4%) were identified as modifiable. Out of these 15 modifiable error categories, nine were in direct relation to the medical device and six were considered to be partially/indirectly related.

• Partially modifiable error categories. Out of the total 31 error categories, 14 (45.1%) were identified as partially modifiable. Out of these 14 partially modifiable error categories, six were considered to be partially/indirectly related to the device and eight were considered to have no relation.

• Non-modifiable error categories. Environmental stress from task overload and errors arising because of changing shifts were considered to be non-modifiable entities with no direct relation to the medical device; each appeared once in the responses (6.4%).

6. Redesigning the design phase

So far we have captured specific errors occurring in critical care scenarios and have qualified them according to their modifiability and their relationship to the medical device present. Based on the knowledge we gained about these errors, we developed the insight necessary for making modifications to the medical device design cycle, such as the inclusion of pointers to the designers on how to consider the prospective setting for the device, the human factors involved, and probable error situations. The ultimate objective is to use the cycle for the production of a well-conceived device prototype.

Before we suggest modifications to the device design cycle, we have to understand the concept of design and development planning as stipulated by the FDA. "Design Control Guidance for Medical Device Manufacturers," published by the FDA in 1997, sets out quality assurance practices to be used for the design of medical devices. By design controls it refers to an interrelated set of practices and procedures that should be incorporated into the design and development process, i.e., a system of checks and balances. The design controls are meant to assist manufacturers in understanding quality system requirements and to ensure that they address the intended use of the device based on the needs of the user and the patient. Given that the cost to correct design errors is lower when errors are detected early in the design and development process [42], the significance of design control and the importance of a solid design foundation cannot be overstated.

Fig. 6 shows a generic medical device design cycle that is based on industrial product development cycles and the waterfall design process constructed by the Medical Devices Bureau of Health Canada [18]. Review and validation occur at almost every step of the process. "Design input" (or requirements) is an important starting point for the device designers; it essentially implies that requirements are given to the designers by stakeholders, along with information obtained from the evaluation of existing devices (for which improvements are forthcoming). The device designers translate the requirements and eventually generate a prototype. The prototype has to undergo a number of evaluations, including field-testing, before it is accepted as a final product. Human–computer engineering factors and other usability issues are all part of the design process and are continuously included and improved upon during the cyclic evaluations.

Fig. 6. Traditional medical device development cycle. Financial factors are not considered part of the cycle for this study.


In the cycle shown above, the emphasis on and discovery of usability issues in the actual setting mainly lie in the field-testing portion of the cycle. In our study, we observed errors in the critical care setting, their modifiability, and their relationship to the medical device involved. Drawing from those errors, we realize how important it is to consider various factors in the health care setting to achieve optimum patient safety. To bring this next level of safety into the device design, we need to supplement the requirements presented by the stakeholders by adding the new health care setting inputs to the design cycle. These additional inputs, termed Situational or System research, can be divided into the following categories.

6.1. Users of the system

Humans have a limited functional spectrum beyond which the caution, vigilance, attention, and memory of a person fail. No one can be a perfect worker at all times or in all situations. Doctors and nurses are the primary users of medical devices in healthcare. Looking at the results, we see that human factors contribute to a large number of errors and most of these are modifiable to a limited extent. Bringing user-related errors under the extended device safety net would primarily include the use of cognitive artifacts, affordances, and external representations [43]. Cognitive artifacts are human-made materials, devices, and systems that extend people's abilities in perceiving objects, encoding and retrieving information from memory, and problem solving (Gillan and Schvaneveldt [44]).

Based on the error categories, the situational researchpertaining to users of the system can be split into the fol-lowing subcategories:

• Number of users. Healthcare is an excellent example of collaborative cognition [5]. Execution of a single task may involve complex interactions and communication among many users. As the number of individuals and the intermediary interactions increase, the likelihood of loss or transmutation of information increases [23]. Apart from numbers, variation in the expertise hierarchy can further this information loss. Prompts, reconfirmations, and alerts are built into devices to safeguard against this information loss, but having a uniform set of such features is not practical. Norman [45] argues that well-designed artifacts could reduce the need for users to remember large amounts of information, whereas poorly designed artifacts increase the knowledge demands on users and the burden on their working memory. A balance needs to be struck so that these features match the setting. Research on the type and range of users possible for the prospective setting of the device needs to be completed and then translated into appropriate affordances and external representations.

• User authorization. Error categories that point at inappropriate users of the device were found in the scenario concerned with the ventilator machine (Scenario 2) as well as the one with the heparin infusion pump (Scenario 3). Authorization to make changes in device settings is an important issue in the health care setting, where multiple users as well as non-users (visitors, family members) exist. Unqualified, untrained users attempting to change device settings can lead to catastrophic events. Controlled authorization and secure operation of a device can be achieved with the use of locks and constraints, which need to be built in at critical locations (a sketch of such a lock follows below). Due consideration should be given to workflow issues, as emphasized in the previous paragraph.
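
A minimal sketch of such a lock at a critical operation, assuming a role-based check; the roles and gated operations are illustrative, and a real device would tie authorization to institutional credentials:

    # Illustrative role table; a real device would consult institutional
    # credentials rather than a hard-coded mapping.
    AUTHORIZED_ROLES = {
        "change_flow_rate": {"nurse", "respiratory_therapist", "physician"},
        "silence_alarm": {"nurse", "physician"},
    }

    def attempt(operation: str, role: str) -> bool:
        """Gate a critical device operation on the operator's role."""
        allowed = AUTHORIZED_ROLES.get(operation, set())
        if role not in allowed:
            print(f"LOCKED: '{operation}' requires one of {sorted(allowed)}")
            return False
        return True

    attempt("change_flow_rate", "visitor")  # a family member, as in Scenario 3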

6.2. Policies and protocols of the system

A standard set of policies and protocols, although desirable, is not always possible. The tricky thing about policies is that they are frequently updated and changed. Our modifications to the design cycle started with the users and now extend to their specific interactions and the policies that govern them. Based on the error categories, the following subcategories can be used:


• Order format. Variations in the units or mode (written vs. verbal) of patient care orders were held responsible for errors in the provided scenarios. Standardization of practices at the institutional level could curtail incidents related to these variations. Building on the partial relationship of these errors to the medical device, designers could include features that allow the inclusion of logic, conversion, and alerts pertaining to orders into the device. Again, this requires research on the range of policies that are followed in health care settings.

• Inservicing, labeling, and installation protocols. Knowledge about these factors is necessary to provide affordances or built-in reminders (regarding device maintenance) that will ensure smooth maintenance and operability of the device.

6.3. Administration and setup

Judging from the analysis of the scenarios, the importance of appropriate accessibility of a device and its display and sounding alerts is appreciated. Geographic and conceptual research of the workflow in different scales of health institutions needs to be carried out. Lack of education and training about the device was another error category that presented quite often. Apart from the initial in-servicing and training provided about the device during its installation, there needs to be a mechanism to maintain the continuity of the training, since its users represent a dynamic population that will change all the time. Multiple approaches can be carried out through the provision of intuitive displays, external references, inbuilt training sessions, etc. Again, these can only be conceived after adequate research about the users, the administration, and the policies of the system where the device is to be placed.

The modified version of the device design cycle is depicted in Fig. 7. Situational research generates additional and more specific requirements for the device that are not as explicit or obvious as the initial input required for designing the device. Device engineers follow most of these situational research-generated requirements at one stage of the design cycle or another. Many of these requirements are only appreciated after field testing or after the initial evaluation of prototypes. The idea behind formalizing the approach is to work on this situational information at the beginning of the device design and come up with a better prototype earlier on. Acquiring information about the users, the administration, and the policies that permeate a system is but one task, and the translation of these into useful features of the device is another; the ultimate goal being a safety net that goes beyond the physical margins of the device.

Fig. 7. Medical device development cycle with consideration of additional factors and inputs in order to extend safety features beyond the device niche. The additional segment of Situational Research has been included, which generates knowledge pertaining to the users, policies, protocols, administration, and setup of the system. Apart from these, specific device requirements which were not part of the initial requirements can also be found and included in the design. Contents in the dashed boxes are the subdivisions of the entities touched upon by the situational research.

7. Conclusion

In the traditional sense, "creation" is meant to be an unbounded, unrestrained, and unguided process. Medical device design is a creative process, but the end product has such a potentially grave impact on human life that it cannot be allowed to evolve in an unguided fashion. Market competition, the rapid phasing out of devices, and changing technology all contribute to the race towards creating newer and better devices. Poorly conceived prototypes are worked upon only to meet compliance standards, and the essence of making the perfect device gets lost in the imposed urgency of moving the device from the drawing board to the shelf. The purpose of understanding the prospective settings of device operation and the potential errors faced is to utilize this information in building a better and safer prototype. The designers play an active role in preventing, or at least curtailing, the effects of a medical error by building patient-safe equipment. A device design cycle emphasizing the above-mentioned situational, administrative, and human factors would act as a guide for the designer in achieving these desired results.

Acknowledgments

Support for this research was provided in part by Grants G08 LM07676 from NLM and AAAA 2523 from AHRQ. We express our gratitude to the subjects for taking the time to be available for our research.

Appendix A.

A.1. Scenario one: nitroglycerine

In the ICU, a patient's condition was deteriorating and multiple therapeutic interventions were being made at the same time. As part of this situation, five medication changes needed to be made immediately in order to reverse the patient's condition. The nurse, receiving voice orders, was programming the infusion pump to administer one dose of nitroglycerine at 10 cm3/h, and mcg/kg/min doses of four other medications. The patient experienced a serious decrease in blood pressure a short while later as a result. Biomedical engineering stated that the pump was operating adequately, yet noted that the dose of nitroglycerine programmed in was 10 mg/kg/min; they also noted that the positioning of the pump in the patient's room was awkward and not easily accessible from the front due to other critical care equipment being in the way. The attending physician stated that the intended dosage was clearly written in the record as being 10 cm3/h.
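
The pump in this scenario accepted a dose in unexpected units without complaint. The sketch below illustrates the kind of inbuilt unit and range logic the subjects called for; the drug limits and units are invented for illustration, not clinical guidance:

    # Illustrative per-drug dose limits; real limits would come from a
    # vetted drug library, not hard-coded values.
    DOSE_LIMITS = {
        ("nitroglycerine", "mcg/kg/min"): (0.1, 5.0),
        ("nitroglycerine", "cm3/h"): (1.0, 20.0),
    }

    def check_programmed_dose(drug: str, value: float, unit: str) -> list[str]:
        """Return warnings the pump should raise before starting infusion."""
        warnings = []
        key = (drug, unit)
        if key not in DOSE_LIMITS:
            warnings.append(f"unit '{unit}' not recognized for {drug}; confirm order")
        else:
            low, high = DOSE_LIMITS[key]
            if not (low <= value <= high):
                warnings.append(f"{value} {unit} outside expected range "
                                f"{low}-{high} {unit}; confirm with prescriber")
        return warnings

    # The scenario's error: nitroglycerine programmed in the same units as
    # the other four drugs instead of the ordered cm3/h.
    print(check_programmed_dose("nitroglycerine", 10, "mg/kg/min"))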

A.2. Scenario two: oxygen

A pediatric ICU physician was treating a six-month-old patient with oxygen and ordered that the infant receive 1.5 L/min. Within 3 min, the patient became hypoxic. At this point, the critical care team increased the oxygen flow to 3 L/min for 10 min to compensate, and the flow was then ordered by the physician to be set to 2 L/min. Biomedical engineering told the critical care team that they had set the flow control knob between 1 and 2 L/min, not realizing that the scale numbers represented discrete settings (0 or 1 or 2 or 3, etc.) rather than continuous settings (1.1, 1.2, 1.3, 1.4, etc.). Hence, even though the knob rotated smoothly—suggesting that intermediate settings were possible—there was no oxygen flow between the settings.
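
A sketch of a firmware-side guard against this discrete-settings trap, under the assumption that a safer design would alarm (or snap to a detent) whenever the knob rests between valid settings:

    VALID_FLOW_SETTINGS = [0, 1, 2, 3, 4]  # L/min; discrete detents only

    def effective_flow(knob_position: float) -> float:
        """Return delivered flow, alarming when the knob sits between detents."""
        if knob_position in VALID_FLOW_SETTINGS:
            return float(knob_position)
        # Between detents no oxygen flows; a safer design alarms immediately
        # (or snaps to the nearest valid setting) instead of failing silently.
        nearest = min(VALID_FLOW_SETTINGS, key=lambda s: abs(s - knob_position))
        print(f"ALARM: {knob_position} L/min is not a valid setting; "
              f"no flow is being delivered (nearest detent: {nearest} L/min)")
        return 0.0

    effective_flow(1.5)  # the resident's setting in Scenario 2 -> alarm, 0.0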

A.3. Scenario three: heparin

Around 8 pm, a patient was receiving a usual infusion of heparin per the heparin protocol, around 12 cm3/min (1200 U/h). Due to a change in circumstances, this patient was to receive a bolus dose of heparin IV through the infusion pump. This bolus was ordered to be changed back to the continuous maintenance dose at the end of one hour. At the end of the hour, a physician changed the drip rate but did not press the confirm button or notice the small "confirm" warning printed on the panel of the pump. The next morning, a nurse entered the room upon hearing the pump alarm beeping. She noted an empty bag, and that the rate set on the pump was 200 cm3/min. The patient had received a bolus of approximately 18,000 U of heparin. When the nurse manager investigated the event, both nurses who were caring for this patient overnight denied changing the pump infusion at all. Of note, a member of the patient's family was in the room the entire night, though this person was never asked directly about what might have happened to the pump. Furthermore, the patient did not recall anyone changing the pump. Biomedical engineering now has the case and is investigating the pump. A preliminary report suggests that there was no problem with the pump but that it was misprogrammed.
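
Several modifiable entities in Table 8 (programming/instant alarm, standardized input, interface) bear on the missed confirm step. One hypothetical treatment, sketched below with an invented class name and timeout, is to hold any unconfirmed rate change as pending and alarm within seconds if it is never committed:

    import time

    # Hypothetical sketch: unconfirmed rate changes are never silently dropped.
    class PumpProgrammer:
        CONFIRM_WINDOW_S = 30  # illustrative timeout, not a vendor specification

        def __init__(self, rate_cm3_per_h):
            self.rate = rate_cm3_per_h
            self.pending = None
            self.pending_since = None

        def request_rate(self, new_rate):
            """Stage a rate change; nothing changes until it is confirmed."""
            self.pending = new_rate
            self.pending_since = time.monotonic()

        def confirm(self):
            """Commit the staged change -- the step missed in the scenario."""
            if self.pending is not None:
                self.rate = self.pending
                self.pending = None

        def tick(self):
            """Called periodically by the firmware loop: alarm if a staged
            change was never confirmed, instead of discarding it quietly."""
            if (self.pending is not None and
                    time.monotonic() - self.pending_since > self.CONFIRM_WINDOW_S):
                self.pending = None
                print("ALARM: rate change not confirmed; prior rate still running")

    pump = PumpProgrammer(rate_cm3_per_h=200)  # bolus rate running
    pump.request_rate(12)                      # physician's change, never confirmed
    # ...firmware calls pump.tick() every cycle; after 30 s it alarms.

Under such a design the pump would still have kept the old rate, but the missed confirmation would have announced itself the same evening instead of being discovered by an empty-bag alarm the next morning.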

Table 1. Final error categories, specific errors, involved individuals with respect to their modifiability, and relation to the medical devices in the scenarios

Error category | Error identified | Individual identified | Modifiability | Relation to device

Administration | Education and training | System/administrators | Partial | No
Administration | Device choice | ICU administrator | Yes | Partial
Administration | Education and training on device | System/administrators/clinical care team | Partial | Partial
Administration | Device removal | ICU administrator | Yes | Partial
Administration | Inservicing | Vendor | Yes | Partial

Device | Device design: inbuilt constraints/checks | Device designers | Yes | Yes
Device | Device design: lack of logic for unit conversion | Device designers | Yes | Yes
Device | Device design: standardized input | Device designers/administrators | Yes | Yes
Device | Device design: generic | Device designers | Yes | Yes
Device | Device design: inbuilt alarms, indicators for no flow | Device designers | Yes | Yes
Device | Device design: availability of continuous settings | Device designers | Yes | Yes
Device | Device design: interface | Device designers | Yes | Yes
Policy | Device access by appropriate user | Respiratory therapist | Yes | Partial
Device | Device design: visual display/interface | Device designers | Yes | Yes
Device | Device design: louder alarm | Device designers | Yes | Yes
Device | Device design: inbuilt alarm trigger | Device designers | Yes | Yes
Device | Device design: programming/instant alarm | Device designers | Yes | Yes
Device | Device design: alarm location/wrong setting | Device designers | Yes | Yes
Device | Device design: poor interface | Device designers | Yes | Yes
Device | Device design: poor visibility/flow display | Device designers | Yes | Yes

Situation | Environmental stress/task overload | Medical team | No | No

Policy | Double checking | Medical team and physician | Partial | No
Policy | Double checking and documentation | System/administrators | Partial | No
Policy | Lack of standards/protocols | System/administrators | Yes | Partial
Policy | Order format: implementation protocol | System/administrators | Yes | Partial
Policy | Order format: mode | Physician | Yes | Partial
Policy | Order format: standardization | System/protocol | Yes | Partial
Policy | Order format: units | Physician | Yes | Partial
Policy | Streamline work flow, clear orders | System protocols | Partial | No
Policy | Order format: delivery | Physician | Yes | Partial
Policy | Inservicing | Biomedical engineer | Yes | Partial
Policy | Miscommunication | Biomedical engineer and critical care team | Partial | No
Policy | Labeling and inservicing | System/administrators | Yes | Partial
Policy | Protocol for actions | System/administrators | Yes | No
Policy | Changing shifts: new users | System/administrators | No | No
Policy | Inservicing and device evaluation | Biomedical engineers | Yes | Partial

User | User error: generic | Medical team | Partial | No
User | User error: carelessness | Nurse | Partial | No
User | User error: mislabeling pump | Nurse | Partial | Partial
User | User error: order interpretation | Nurse | Partial | No
User | User error: inattention | Device operator | Partial | Partial
User | User error: bad use of device | Device operator | Partial | Partial
User | User error: deviation from protocol | Medical team | Partial | No
User | User error: inappropriate user | Device operator | Yes | Partial
User | User error: action | Nurse | Partial | Partial
User | User error: lack of knowledge | Nurse | Partial | No
User | User error: documentation of changes | Physician | Partial | No
User | User error: automatic habit, previous experience on pump | Physician | Partial | Partial
User | User error: alarm beeping not checked | Nurse | Partial | Partial
User | User error: failure to monitor patient | Nurse | Partial | No
User | User error: action (confirm button) | Physician | Partial | Partial

Setup | Environment/floor setup/design improvement | ICU administrator | Partial | Partial
Setup | Environment/ICU setup: device positioning | ICU administrator | Yes | Partial
Setup | Proximity to beeping device | Nurse | Yes | Partial


Table 2. Nitroglycerine scenario: responses in which partially modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Education and training | System/administrators | No
User error: generic | Medical team | No
User error: carelessness | Nurse | No
User error: mislabeling pump | Nurse | Partial
User error: order interpretation | Nurse | No
User error: bad use of device | Device operator | Partial
User error: lack of knowledge | Medical team | No
Double checking | Medical team and physician | No
Double checking and documentation | System/administrators | No
Streamline work flow, clear orders | System protocols | No
Miscommunication | Biomedical engineer and critical care team | No
Environment/floor setup/design improvement | ICU administrator | Partial

Table 5. Oxygen scenario: responses in which partially modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Education and training on device | System/administrators/clinical care team | Partial
User error: generic | Medical team | No
User error: lack of knowledge | Medical team | No
Double checking | Medical team and physician | No
Miscommunication | Biomedical engineer and critical care team | No

Table 4. Nitroglycerine scenario: responses in which no modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Environmental stress/task overload | Medical team | No

Table 3. Nitroglycerine scenario: responses in which modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Device choice | ICU administrator | Partial
Device design: inbuilt constraints/checks | Device designers | Yes
Device design: lack of logic for unit conversion | Device designers | Yes
Device design: programming standardized input | Device designers/administrators | Yes
Device design: standardized input | Device designers | Yes
Device design: generic | Device designers | Yes
Lack of standards/protocols | System/administrators | Partial
Order format: implementation protocol | System/administrators | Partial
Order format: mode | Physician | Partial
Order format: standardization | System/protocol | Partial
Order format: units | Physician | Partial
Order format: delivery | Physician | Partial
Environment/ICU setup: device positioning | ICU administrator | Partial

Table 6. Oxygen scenario: responses in which modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Device choice | ICU administrator | Partial
Device removal | ICU administrator | Partial
Device design: generic | Device designers | Yes
Device design: inbuilt alarms, indicators for no flow | Device designers | Yes
Device design: availability of continuous settings | Device designers | Yes
Device design: interface | Device designers | Yes
Device design: appropriate user access | Device designers | Yes
User error: inappropriate user | Device operator | Partial
Order format: delivery | Physician | Partial
Inservicing | Biomedical engineer | Partial
Labeling and inservicing | System/administrators | Partial


Table 8. Heparin scenario: responses in which modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Device design: standardized input | Device designers | Yes
Device design: interface | Device designers | Yes
Device access by appropriate user | Respiratory therapist | Partial
Device design: visual display/interface | Device designers | Yes
Device design: louder alarm | Device designers | Yes
Device design: inbuilt alarm trigger | Device designers | Yes
Device design: programming/instant alarm | Device designers | Yes
Device design: appropriate user access | Device designers | Yes
Device design: alarm location/wrong setting | Device designers | Yes
Device design: poor visibility/flow display | Device designers | Yes
User error: inappropriate user | Device operator | Partial
Lack of standards/protocols | System/administrators | Partial
Order format: delivery | Physician | Partial
Inservicing and device evaluation | Biomedical engineers | Partial
Proximity to beeping device | Nurse | Partial

Table 7. Heparin scenario: responses in which partially modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Relation to device

Education and training on device | System/administrators/clinical care team | Partial
User error: generic | Medical team | No
User error: carelessness | Nurse | No
User error: inattention | Device operator | Partial
User error: bad use of device | Device operator | Partial
User error: deviation from protocol | Medical team | No
User error: action | Nurse | Partial
User error: lack of knowledge | Medical team | No
User error: documentation of changes | Physician | No
User error: automatic habit, previous experience on pump | Physician | Partial
User error: alarm beeping not checked | Nurse | Partial
User error: failure to monitor patient | Nurse | No
Double checking | Medical team and physician | No
Miscommunication | Biomedical engineer and critical care team | No

Table 9. Heparin scenario: responses in which no modifiable error categories/factors were identified

Error identified (broad category) | Player identified | Direct relation to device

Changing shifts: new users | System/administrators | No
Environmental stress/task overload | Medical team | No


References

[1] Cook RI, Woods DD. Operating at the sharp end: the complexity of medical error. In: Human error in medicine, chapter 13. Hillsdale, NJ: Lawrence Erlbaum; 1994.

[2] Bates DW, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc 2001;8:299–308.

[3] Bria WF, Shabot MM. The electronic medical record, safety, and critical care. Crit Care Clin 2005 [Article in press].

[4] Ash JS, Lyman JA, Carpenter J, Fournier L. A diffusion of innovations model of physician order entry. In: Proceedings of the AMIA annual symposium; 2001. p. 22–6.

[5] Patel VL, Cytryn KN, Shortliffe EH, Safran C. The collaborative health care team: the role of individual and group expertise. Teach Learn Med 2000;12:117–32.

[6] Orasanu J, Salas E. Team decision making in complex environments. In: Klein GA, Orasanu J, Calderwood R, Zsambok CE, editors. Decision making in action: models and methods. Norwood, NJ: Ablex; 1993. p. 327–45.

[7] Patel VL, Arocha JF. The nature of constraints on collaborative decision making in health care settings. In: Salas E, Klein G, editors. Linking expertise and naturalistic decision making. Mahwah, NJ: Lawrence Erlbaum; 2000. p. 383–485.

[8] Leape LL. A systems analysis approach to medical error. J Eval Clin Pract 1997;3:213–22.

[9] Leape LL, Bates DW, Cullen DJ, et al. Systems analysis of adverse drug events. JAMA 1995;274(1):35–43.

[10] Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med 1991;324:370–6.

[11] Bates DW, Spell N, Cullen DJ, et al. The costs of adverse drug events in hospitalized patients. JAMA 1997;277(4):307–11.


[12] Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med 1991;324:377–84.

[13] Bates DW, Cullen DJ, Laird N, et al. Incidence of adverse drug events and potential adverse drug events. JAMA 1995;274(1):29–34.

[14] Institute of Medicine. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.

[15] Kaganov AL. Medical device development: innovation versus regulation. Ann Thorac Surg 1980;29(4):331–5.

[16] Cross-reference to ISO 9001:1994 and ISO/DIS 13485, section 4.4.4: Design input.

[17] Fleck PD. Five insights for improving product development cycle success. Humanizing Technology; Cooper Interaction Design.

[18] Design control guidance for medical device manufacturers. This guidance relates to FDA 21 CFR 820.30 and sub-clause 4.4 of ISO 9001; March 11, 1997.

[19] Tufts Center for the Study of Drug Development. Impact report: analysis and insight into critical drug development issues, vol. 2; March 2000.

[20] Food and Drug Administration. Human factors implications of the new GMP rule. Overall requirements of the new quality system regulations. Available from: http://www.fda.gov/cdrh/humfac/hufacimp.html [accessed June 27, 2000]; 1998.

[21] Sawyer D, Aziz KJ, Backinger CL, Beers ET, Lowery A, Sykes SM, et al. Do it by Design: an Introduction to Human Factors in Medical Devices. US Department of Health and Human Services, Public Health Service, Food and Drug Administration, Center for Devices and Radiological Health; 1996.

[22] Dion R. Process improvement and the corporate balance sheet. IEEE Software July 1993. p. 28–35.

[23] Patel VL, Arocha JF, Kaufman DR. A primer on aspects of cognition for medical informatics. J Am Med Inform Assoc 2001;8:324–43.

[24] Patel VL, Kushniruk AW, Yang S, Yale JF. Impact of a computer-based patient record system on data collection, knowledge organization, and reasoning. J Am Med Inform Assoc 2000;7(6):569–85.

[25] Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform 2003;36(1–2):23–30.

[26] Patel VL, Kaufman DR. Cognitive science and biomedical informatics. In: Shortliffe EH, Cimino JJ, editors. Biomedical informatics: computer applications in health care and biomedicine. New York: Springer-Verlag; 2004.

[27] Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004;37(1):56–76.

[28] Preece J, Rogers Y, Sharp H, Benyon D, Holland S, Carey T. Human–computer interaction. New York: Addison-Wesley; 1994.

[29] Vicente KJ. Cognitive work analysis: toward safe, productive, and healthy computer-based work. Mahwah, NJ: Lawrence Erlbaum; 1999.

[30] Rasmussen J. On information processing and human–machine interaction: an approach to cognitive engineering. Amsterdam: Elsevier; 1986.

[31] Patel VL, Kaufman DR, Arocha JF. Emerging paradigms of cognition in medical decision making. J Biomed Inform 2002;35:52–75.

[32] Beuscart-Zéphir MC, Bender J, Beuscart R, Ménager-Depriester I. Cognitive evaluation: how to assess the usability of information technology in healthcare. Comput Methods Programs Biomed 1997;54(1–2):19–28.

[33] Zhang J, Patel VL, Johnson KA, Smith JW, Malin J. Designing human-centered distributed information systems. IEEE Intell Syst 2002:42–7.

[34] Sawyer D, et al. Do it by Design: an Introduction to Human Factors in Medical Devices. p. 3.

[35] Patel VL, Bates DW. Cognition and measurement in patient safety research. J Biomed Inform 2003;36(1–2):1–3.

[36] Keselman A, Patel VL, Johnson TR, Zhang J. Institutional decision-making to select patient care devices: identifying venues to promote patient safety. J Biomed Inform 2003;36(1–2):31–44.

[37] Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Thousand Oaks, CA: Sage; 1994.

[38] Strauss A, Corbin J. Grounded theory methodology: an overview. In: Denzin NK, Lincoln YS, editors. Handbook of qualitative research. Thousand Oaks, CA: Sage Publications; 1994. p. 273–85.

[39] Bogdan RC, Biklen SK. Qualitative research for education. Needham Heights, MA: Allyn & Bacon; 1992.

[40] Laxmisan A, Malhotra S, Keselman A, Johnson TR, Patel VL. Decisions about critical events in device-related scenarios as a function of expertise. Prepublication manuscript for J Biomed Inform; July 2004.

[41] Chung PH, Zhang J, Johnson TR, Patel VL. An extended hierarchical task analysis for error prediction in medical devices. Proc AMIA Symp 2003:165–9.

[42] Wholey MH, Haller JD. An introduction to the Food and Drug Administration and how it evaluates new devices: establishing safety and efficacy. Jt Comm J Qual Saf 2003;29(11):598–609.

[43] Horsky J, Kaufman DR, Oppenheim MI, Patel VL. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering. J Biomed Inform 2003;36:4–22.

[44] Gillan DJ, Schvaneveldt RW. Applying cognitive psychology: bridging the gulf between basic research and cognitive artifacts. In: Durso FT, editor. Handbook of applied cognition; 1999. p. 3–31.

[45] Norman DA. Things that make us smart: defending human attributes in the age of the machine. Reading, MA: Addison-Wesley; 1993.