
DOT/FAA/AM-00/7

U.S. Department of Transportation

Federal Aviation Administration

Scott A. Shappell
FAA Civil Aeromedical Institute
Oklahoma City, OK 73125

Douglas A. Wiegmann
University of Illinois at Urbana-Champaign
Institute of Aviation
Savoy, IL 61874

February 2000

Final Report

This document is available to the public through the National Technical Information Service, Springfield, Virginia 22161.

Office of Aviation Medicine
Washington, DC 20591

The Human Factors Analysis and Classification System–HFACS


N O T I C E

This document is disseminated under the sponsorship of the U.S. Department of Transportation in the interest of information exchange. The United States Government assumes no liability for the contents thereof.


Technical Report Documentation Page

1. Report No.: DOT/FAA/AM-00/7

4. Title and Subtitle: The Human Factors Analysis and Classification System—HFACS

5. Report Date: February 2000

7. Author(s): Shappell, S.A. (1) and Wiegmann, D.A. (2)

9. Performing Organization Name and Address:
(1) FAA Civil Aeromedical Institute, Oklahoma City, OK 73125
(2) University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874

11. Contract or Grant No.: 99-G-006

12. Sponsoring Agency Name and Address: Office of Aviation Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591

15. Supplemental Notes: This work was performed under task # AAM-A-00-HRR-520.

16. Abstract: Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conducive to a traditional human error analysis, making the identification of intervention strategies onerous. What is required is a general human error framework around which new investigative methods can be designed and existing accident databases restructured. Indeed, a comprehensive human factors analysis and classification system (HFACS) has recently been developed to meet those needs. Specifically, the HFACS framework has been used within the military, commercial, and general aviation sectors to systematically examine underlying human causal factors and to improve aviation accident investigations. This paper describes the development and theoretical underpinnings of HFACS in the hope that it will help safety professionals reduce the aviation accident rate through systematic, data-driven investment strategies and objective evaluation of intervention programs.

17. Key Words: Aviation, Human Error, Accident Investigation, Database Analysis

18. Distribution Statement: Document is available to the public through the National Technical Information Service, Springfield, Virginia 22161.

19. Security Classif. (of this report): Unclassified

20. Security Classif. (of this page): Unclassified

21. No. of Pages: 18

Form DOT F 1700.7 (8-72). Reproduction of completed page authorized.


THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM–HFACS

INTRODUCTION

Sadly, the annals of aviation history are littered with accidents and tragic losses. Since the late 1950s, however, the drive to reduce the accident rate has yielded unprecedented levels of safety, to a point where it is now safer to fly in a commercial airliner than to drive a car or even walk across a busy New York City street. Still, while the aviation accident rate has declined tremendously since the first flights nearly a century ago, the cost of aviation accidents in both lives and dollars has steadily risen. As a result, the effort to reduce the accident rate still further has taken on new meaning within both military and civilian aviation.

Even with all the innovations and improvements realized in the last several decades, one fundamental question remains generally unanswered: “Why do aircraft crash?” The answer may not be as straightforward as one might think. In the early years of aviation, it could reasonably be said that, more often than not, the aircraft killed the pilot. That is, the aircraft were intrinsically unforgiving and, relative to their modern counterparts, mechanically unsafe. However, the modern era of aviation has witnessed an ironic reversal of sorts. It now appears to some that the aircrew themselves are more deadly than the aircraft they fly (Mason, 1993; cited in Murray, 1997). In fact, estimates in the literature indicate that between 70 and 80 percent of aviation accidents can be attributed, at least in part, to human error (Shappell & Wiegmann, 1996). Still, to off-handedly attribute accidents solely to aircrew error is like telling patients they are simply “sick” without examining the underlying causes or further defining the illness.

So what really constitutes that 70-80% of human error repeatedly referred to in the literature? Some would have us believe that human error and “pilot” error are synonymous. Yet, simply writing off aviation accidents merely to pilot error is an overly simplistic, if not naive, approach to accident causation. After all, it is well established that accidents cannot be attributed to a single cause, or in most instances, even a single individual (Heinrich, Petersen, & Roos, 1980). In fact, even the identification of a “primary” cause is fraught with problems. Rather, aviation accidents are the end result of a number of causes, only the last of which are the unsafe acts of the aircrew (Reason, 1990; Shappell & Wiegmann, 1997a; Heinrich, Petersen, & Roos, 1980; Bird, 1974).

The challenge for accident investigators and analysts alike is how best to identify and mitigate the causal sequence of events, in particular that 70-80% associated with human error. Faced with this challenge, those interested in accident causation are left with a growing list of investigative schemes to choose from. In fact, there are nearly as many approaches to accident causation as there are those involved in the process (Senders & Moray, 1991). Nevertheless, a comprehensive framework for identifying and analyzing human error continues to elude safety professionals and theorists alike. Consequently, interventions cannot be accurately targeted at specific human causal factors, nor can their effectiveness be objectively measured and assessed. Instead, safety professionals are left with the status quo. That is, they are left with interest- or fad-driven research resulting in intervention strategies that peck around the edges of accident causation but do little to reduce the overall accident rate. What is needed is a framework around which a needs-based, data-driven safety program can be developed (Wiegmann & Shappell, 1997).

Reason’s “Swiss Cheese” Model of Human Error

One particularly appealing approach to the genesis of human error is the one proposed by James Reason (1990). Generally referred to as the “Swiss cheese” model of human error, it describes four levels of human failure, each influencing the next (Figure 1). Working backwards in time from the accident, the first level depicts those Unsafe Acts of Operators that ultimately led to the accident.¹ More commonly referred to in aviation as aircrew/pilot error, this level is where most accident investigations have focused their efforts and, consequently, where most causal factors are uncovered.

¹ Reason’s original work involved operators of a nuclear power plant. However, for the purposes of this manuscript, the operators here refer to aircrew, maintainers, supervisors, and other humans involved in aviation.


After all, it is typically the actions or inactions of aircrew that are directly linked to the accident. For instance, failing to properly scan the aircraft’s instruments while in instrument meteorological conditions (IMC), or penetrating IMC when authorized only for visual meteorological conditions (VMC), may yield relatively immediate, and potentially grave, consequences. Represented as “holes” in the cheese, these active failures are typically the last unsafe acts committed by aircrew.

However, what makes the “Swiss cheese” model particularly useful in accident investigation is that it forces investigators to address latent failures within the causal sequence of events as well. As their name suggests, latent failures, unlike their active counterparts, may lie dormant or undetected for hours, days, weeks, or even longer, until one day they adversely affect the unsuspecting aircrew. Consequently, they may be overlooked by investigators with even the best intentions.

Within this concept of latent failures, Reason described three more levels of human failure. The first involves the condition of the aircrew as it affects performance. Referred to as Preconditions for Unsafe Acts, this level involves conditions such as mental fatigue and poor communication and coordination practices, often referred to as crew resource management (CRM). Not surprisingly, if fatigued aircrew fail to communicate and coordinate their activities with others in the cockpit or with individuals external to the aircraft (e.g., air traffic control, maintenance), poor decisions are made and errors often result.

But exactly why did communication and coordination break down in the first place? This is perhaps where Reason’s work departed from more traditional approaches to human error. In many instances, the breakdown in good CRM practices can be traced back to instances of Unsafe Supervision, the third level of human failure. If, for example, two inexperienced (and perhaps even below-average) pilots are paired with each other and sent on a flight into known adverse weather at night, is anyone really surprised by a tragic outcome? To make matters worse, if this questionable manning practice is coupled with a lack of quality CRM training, the potential for miscommunication, and ultimately aircrew errors, is magnified. In a sense, then, the crew was “set up” for failure, as crew coordination and ultimately performance would be compromised. This is not to lessen the role played by the aircrew, only to note that intervention and mitigation strategies might lie higher within the system.

Reason’s model didn’t stop at the supervisory level either; the organization itself can impact performance at all levels. For instance, in times of fiscal austerity, funding is often cut, and as a result, training and flight time are curtailed. Consequently, supervisors are often left with no alternative but to task “non-proficient” aviators with complex tasks. Not surprisingly then, in the absence of good CRM training, communication and coordination failures will begin to appear, as will a myriad of other preconditions, all of which will affect performance and elicit aircrew errors. Therefore, it makes sense that, if the accident rate is going to be reduced beyond current levels, investigators and analysts alike must examine the accident sequence in its entirety and expand it beyond the cockpit. Ultimately, causal factors at all levels within the organization must be addressed if any accident investigation and prevention system is going to succeed.

In many ways, Reason’s “Swiss cheese” model has revolutionized common views of accident causation. Unfortunately, however, it is simply a theory with few details on how to apply it in a real-world setting. In other words, the theory never defines what the “holes in the cheese” really are, at least within the context of everyday operations. Ultimately, one needs to know what these system failures or “holes” are, so that they can be identified during accident investigations or, better yet, detected and corrected before an accident occurs.

Figure 1. The “Swiss cheese” model of human error causation (adapted from Reason, 1990). [The figure shows four successive layers of defense (Organizational Influences, Unsafe Supervision, Preconditions for Unsafe Acts, and Unsafe Acts), with latent failures at the first three levels and active failures at the last; when defenses fail or are absent, a mishap results.]


The balance of this paper will attempt to describe the “holes in the cheese.” However, rather than attempt to define the holes using esoteric theories with little or no practical applicability, the original framework (called the Taxonomy of Unsafe Operations) was developed using over 300 Naval aviation accidents obtained from the U.S. Naval Safety Center (Shappell & Wiegmann, 1997a). The original taxonomy has since been refined using input and data from other military (U.S. Army Safety Center and the U.S. Air Force Safety Center) and civilian organizations (National Transportation Safety Board and the Federal Aviation Administration). The result was the development of the Human Factors Analysis and Classification System (HFACS).

THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM

Drawing upon Reason’s (1990) concept of latent and active failures, HFACS describes four levels of failure: 1) Unsafe Acts, 2) Preconditions for Unsafe Acts, 3) Unsafe Supervision, and 4) Organizational Influences. A brief description of the major components and causal categories follows, beginning with the level most closely tied to the accident, i.e., unsafe acts.
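For readers who maintain accident data in software, the four HFACS levels map naturally onto a simple data structure. The sketch below is purely illustrative and is not part of the HFACS framework itself; the names HfacsLevel and CausalFactor are hypothetical, and the example narrative is drawn from the instrument-scan scenario discussed earlier.

```python
from dataclasses import dataclass
from enum import Enum


class HfacsLevel(Enum):
    """The four HFACS levels of failure (hypothetical encoding for illustration)."""
    UNSAFE_ACTS = 1
    PRECONDITIONS_FOR_UNSAFE_ACTS = 2
    UNSAFE_SUPERVISION = 3
    ORGANIZATIONAL_INFLUENCES = 4


@dataclass
class CausalFactor:
    """One coded causal factor from an accident report (illustrative only)."""
    level: HfacsLevel
    category: str   # e.g., "Skill-based Errors"; the causal categories are described below
    narrative: str  # investigator's free-text description


# Example: an active failure at the level most closely tied to the accident.
factor = CausalFactor(
    level=HfacsLevel.UNSAFE_ACTS,
    category="Skill-based Errors",
    narrative="Failed to properly scan the flight instruments while in IMC",
)
print(factor.level.name, "-", factor.category)
```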

Unsafe Acts

The unsafe acts of aircrew can be loosely classified into two categories: errors and violations (Reason, 1990). In general, errors represent the mental or physical activities of individuals that fail to achieve their intended outcome. Not surprisingly, given the fact that human beings by their very nature make errors, these unsafe acts dominate most accident databases. Violations, on the other hand, refer to the willful disregard for the rules and regulations that govern the safety of flight. The bane of many organizations, the prediction and prevention of these appalling and purely “preventable” unsafe acts continue to elude managers and researchers alike.

Still, distinguishing between errors and violations does not provide the level of granularity required of most accident investigations. Therefore, the categories of errors and violations were expanded here (Figure 2), as elsewhere (Reason, 1990; Rasmussen, 1982), to include three basic error types (skill-based, decision, and perceptual) and two forms of violations (routine and exceptional).

Errors

Skill-based errors. Skill-based behavior within the context of aviation is best described as “stick-and-rudder” and other basic flight skills that occur without significant conscious thought. As a result, these skill-based actions are particularly vulnerable to failures of attention and/or memory. In fact, attention failures have been linked to many skill-based errors, such as the breakdown in visual scan patterns, task fixation, the inadvertent activation of controls, and the misordering of steps in a procedure, among others (Table 1).

Figure 2. Categories of unsafe acts committed by aircrews. [The figure divides unsafe acts into errors (skill-based, decision, and perceptual) and violations (routine and exceptional).]


TABLE 1. Selected examples of Unsafe Acts of Pilot Operators (Note: This is not a complete listing)

ERRORS

Skill-based Errors
  Breakdown in visual scan
  Failed to prioritize attention
  Inadvertent use of flight controls
  Omitted step in procedure
  Omitted checklist item
  Poor technique
  Over-controlled the aircraft

Decision Errors
  Improper procedure
  Misdiagnosed emergency
  Wrong response to emergency
  Exceeded ability
  Inappropriate maneuver
  Poor decision

Perceptual Errors (due to)
  Misjudged distance/altitude/airspeed
  Spatial disorientation
  Visual illusion

VIOLATIONS
  Failed to adhere to brief
  Failed to use the radar altimeter
  Flew an unauthorized approach
  Violated training rules
  Flew an overaggressive maneuver
  Failed to properly prepare for the flight
  Briefed unauthorized flight
  Not current/qualified for the mission
  Intentionally exceeded the limits of the aircraft
  Continued low-altitude flight in VMC
  Unauthorized low-altitude canyon running

A classic example is the aircrew that becomes so fixated on troubleshooting a burned-out warning light that they do not notice their fatal descent into the terrain. Perhaps a bit closer to home, consider the hapless soul who locks himself out of the car or misses his exit because he was either distracted, in a hurry, or daydreaming. These are both examples of attention failures that commonly occur during highly automatized behavior. Unfortunately, while at home or driving around town these attention/memory failures may be merely frustrating, in the air they can become catastrophic.

In contrast to attention failures, memory failures often appear as omitted items in a checklist, losing one’s place, or forgotten intentions. For example, most of us have experienced going to the refrigerator only to forget what we went for. Likewise, it is not difficult to imagine that, when under stress during in-flight emergencies, critical steps in emergency procedures can be missed. However, even when not particularly stressed, individuals have forgotten to set the flaps on approach or lower the landing gear – at a minimum, an embarrassing gaffe.

The third, and final, type of skill-based error identified in many accident investigations involves technique errors. Regardless of one’s training, experience, and educational background, the manner in which one carries out a specific sequence of events may vary greatly. That is, two pilots with identical training, flight grades, and experience may differ significantly in the manner in which they maneuver their aircraft. While one pilot may fly smoothly with the grace of a soaring eagle, others may fly with the darting, rough transitions of a sparrow. Nevertheless, while both may be safe and equally adept at flying, the techniques they employ could set them up for specific failure modes. In fact, such techniques are as much a factor of innate ability and aptitude as they are an overt expression of one’s own personality, making efforts at the prevention and mitigation of technique errors difficult, at best.

Decision errors. The second error form, decision errors, represents intentional behavior that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation. Often referred to as “honest mistakes,” these unsafe acts represent the actions or inactions of individuals whose “hearts are in the right place,” but who either did not have the appropriate knowledge or simply chose poorly.


Perhaps the most heavily investigated of all error forms, decision errors can be grouped into three general categories: procedural errors, poor choices, and problem-solving errors (Table 1). Procedural decision errors (Orasanu, 1993), or rule-based mistakes as described by Rasmussen (1982), occur during highly structured tasks of the sort: if X, then do Y. Aviation, particularly within the military and commercial sectors, is by its very nature highly structured, and consequently, much of pilot decision making is procedural. There are very explicit procedures to be performed at virtually all phases of flight. Still, errors can, and often do, occur when a situation is either not recognized or is misdiagnosed and the wrong procedure is applied. This is particularly true when pilots are placed in highly time-critical emergencies, like an engine malfunction on takeoff.

However, even in aviation, not all situations have corresponding procedures to deal with them. Therefore, many situations require a choice to be made among multiple response options. Consider the pilot flying home after a long week away from the family who unexpectedly confronts a line of thunderstorms directly in his path. He can choose to fly around the weather, divert to another field until the weather passes, or penetrate the weather hoping to quickly transition through it. Confronted with situations such as this, choice decision errors (Orasanu, 1993), or knowledge-based mistakes as they are otherwise known (Rasmussen, 1986), may occur. This is particularly true when there is insufficient experience or time, or when other outside pressures preclude correct decisions. Put simply, sometimes we choose well, and sometimes we don’t.

Finally, there are occasions when a problem is not well understood and formal procedures and response options are not available. It is during these ill-defined situations that the invention of a novel solution is required. In a sense, individuals find themselves where no one has been before and, in many ways, must literally fly by the seats of their pants. Individuals placed in this situation must resort to slow and effortful reasoning processes, where time is a luxury rarely afforded. Not surprisingly, while this type of decision making is more infrequent than other forms, the relative proportion of problem-solving errors committed is markedly higher.

Perceptual errors. Not unexpectedly, when one’s perception of the world differs from reality, errors can, and often do, occur. Typically, perceptual errors occur when sensory input is degraded or “unusual,” as is the case with visual illusions and spatial disorientation, or when aircrew simply misjudge the aircraft’s altitude, attitude, or airspeed (Table 1). Visual illusions, for example, occur when the brain tries to “fill in the gaps” with what it feels belongs in a visually impoverished environment, like that seen at night or when flying in adverse weather. Likewise, spatial disorientation occurs when the vestibular system cannot resolve one’s orientation in space and therefore makes a “best guess,” typically when visual (horizon) cues are absent at night or when flying in adverse weather. In either event, the unsuspecting individual is often left to make a decision based on faulty information, and the potential for committing an error is elevated.

It is important to note, however, that it is not the illusion or disorientation that is classified as a perceptual error. Rather, it is the pilot’s erroneous response to the illusion or disorientation. For example, many unsuspecting pilots have experienced “black-hole” approaches, only to fly a perfectly good aircraft into the terrain or water. This continues to occur, even though it is well known that flying at night over dark, featureless terrain (e.g., a lake or a field devoid of trees) will produce the illusion that the aircraft is actually higher than it is. As a result, pilots are taught to rely on their primary instruments, rather than the outside world, particularly during the approach phase of flight. Even so, some pilots fail to monitor their instruments when flying at night. Tragically, these aircrew and others who have been fooled by illusions and other disorienting flight regimes may end up involved in a fatal aircraft accident.

Violations

By definition, errors occur within the rules and regulations espoused by an organization and typically dominate most accident databases. In contrast, violations represent a willful disregard for the rules and regulations that govern safe flight and, fortunately, occur much less frequently, since they often involve fatalities (Shappell et al., 1999b).


While there are many ways to distinguish between types of violations, two distinct forms have been identified, based on their etiology, that will help the safety professional when identifying accident causal factors. The first, routine violations, tend to be habitual by nature and are often tolerated by governing authority (Reason, 1990). Consider, for example, the individual who consistently drives 5-10 mph faster than allowed by law, or someone who routinely flies in marginal weather when authorized for visual meteorological conditions only. While both are certainly against the governing regulations, many others do the same thing. Furthermore, individuals who drive 64 mph in a 55 mph zone almost always drive 64 in a 55 mph zone. That is, they “routinely” violate the speed limit. The same can typically be said of the pilot who routinely flies into marginal weather.

What makes matters worse, these violations (commonly referred to as “bending” the rules) are often tolerated and, in effect, sanctioned by supervisory authority (i.e., you’re not likely to get a traffic citation until you exceed the posted speed limit by more than 10 mph). If, however, the local authorities started handing out traffic citations for exceeding the speed limit on the highway by 9 mph or less (as is often done on military installations), then it is less likely that individuals would violate the rules. Therefore, by definition, if a routine violation is identified, one must look further up the supervisory chain to identify those individuals in authority who are not enforcing the rules.

On the other hand, unlike routine violations, exceptional violations appear as isolated departures from authority, not necessarily indicative of an individual’s typical behavior pattern nor condoned by management (Reason, 1990). For example, an isolated instance of driving 105 mph in a 55 mph zone is considered an exceptional violation. Likewise, flying under a bridge or engaging in other prohibited maneuvers, like low-level canyon running, would constitute an exceptional violation. However, it is important to note that, while most exceptional violations are appalling, they are not considered “exceptional” because of their extreme nature. Rather, they are considered exceptional because they are neither typical of the individual nor condoned by authority. Still, what makes exceptional violations particularly difficult for any organization to deal with is that they are not indicative of an individual’s behavioral repertoire and, as such, are particularly difficult to predict. In fact, when individuals are confronted with evidence of their dreadful behavior and asked to explain it, they are often left with little explanation. Indeed, those individuals who survived such excursions from the norm clearly knew that, if caught, dire consequences would follow. Still, defying all logic, many otherwise model citizens have been down this potentially tragic road.

Preconditions for Unsafe Acts

Arguably, the unsafe acts of pilots can be directly linked to nearly 80% of all aviation accidents. However, simply focusing on unsafe acts is like focusing on a fever without understanding the underlying disease causing it. Thus, investigators must dig deeper into why the unsafe acts took place. As a first step, two major subdivisions of unsafe aircrew conditions were developed: substandard conditions of operators and the substandard practices they commit (Figure 3).

Figure 3. Categories of preconditions of unsafe acts. [The figure divides preconditions into substandard conditions of operators (adverse mental states, adverse physiological states, and physical/mental limitations) and substandard practices of operators (crew resource mismanagement and personal readiness).]


Substandard Conditions of Operators

Adverse mental states. Being prepared mentally is critical in nearly every endeavor, but perhaps even more so in aviation. As such, the category of Adverse Mental States was created to account for those mental conditions that affect performance (Table 2). Principal among these are the loss of situational awareness, task fixation, distraction, and mental fatigue due to sleep loss or other stressors. Also included in this category are personality traits and pernicious attitudes such as overconfidence, complacency, and misplaced motivation.

Predictably, if an individual is mentally tired for whatever reason, the likelihood increases that an error will occur. In a similar fashion, overconfidence and other pernicious attitudes, such as arrogance and impulsivity, will influence the likelihood that a violation will be committed. Clearly then, any framework of human error must account for preexisting adverse mental states in the causal chain of events.

Adverse physiological states. The second category, adverse physiological states, refers to those medical or physiological conditions that preclude safe operations (Table 2). Particularly important to aviation are such conditions as visual illusions and spatial disorientation, as described earlier, as well as physical fatigue and the myriad of pharmacological and medical abnormalities known to affect performance.

The effects of visual illusions and spatial disorientation are well known to most aviators. However, less well known to aviators, and often overlooked, are the effects on cockpit performance of simply being ill. Nearly all of us have gone to work ill, dosed with over-the-counter medications, and have generally performed well. Consider, however, the pilot suffering from the common head cold. Unfortunately, most aviators view a head cold as only a minor inconvenience that can be easily remedied using over-the-counter antihistamines, acetaminophen, and other non-prescription pharmaceuticals. In fact, when confronted with a stuffy nose, aviators typically are only concerned with the effects of a painful sinus block as cabin altitude changes. Then again, it is not the overt symptoms that local flight surgeons are concerned with. Rather, it is the accompanying inner ear infection and the increased likelihood of spatial disorientation when entering instrument meteorological conditions that is alarming, not to mention the side effects of antihistamines, fatigue, and sleep loss on pilot decision making. Therefore, it is incumbent upon any safety professional to account for these sometimes subtle medical conditions within the causal chain of events.

Physical/Mental Limitations. The third, and final, substandard condition involves individual physical/mental limitations (Table 2). Specifically, this category refers to those instances when mission requirements exceed the capabilities of the individual at the controls. For example, the human visual system is severely limited at night; yet, much as when driving a car at night, people do not necessarily slow down or take additional precautions. In aviation, while slowing down isn’t always an option, paying additional attention to basic flight instruments and increasing one’s vigilance will often increase the safety margin. Unfortunately, when precautions are not taken, the result can be catastrophic, as pilots will often fail to see other aircraft, obstacles, or power lines due to the size or contrast of the object in the visual field.

Similarly, there are occasions when the time required to complete a task or maneuver exceeds an individual’s capacity. Individuals vary widely in their ability to process and respond to information. Nevertheless, good pilots are typically noted for their ability to respond quickly and accurately. It is well documented, however, that if individuals are required to respond quickly (i.e., less time is available to consider all the possibilities or choices thoroughly), the probability of making an error goes up markedly. Consequently, it should be no surprise that when faced with the need for rapid processing and reaction times, as is the case in most aviation emergencies, all forms of error would be exacerbated.

In addition to the basic sensory and information processing limitations described above, there are at least two additional instances of physical/mental limitations that need to be addressed, albeit ones often overlooked by most safety professionals. These limitations involve individuals who simply are not compatible with aviation, because they are either unsuited physically or do not possess the aptitude to fly. For example, some individuals simply don’t have the physical strength to operate in the potentially high-G environment of aviation, or, for anthropometric reasons, simply have difficulty reaching the controls. In other words, cockpits have traditionally not been designed with all shapes, sizes, and physical abilities in mind. Likewise, not everyone has the mental ability or aptitude for flying aircraft. Just as not all of us can be concert pianists or NFL linebackers, not everyone has the innate ability to pilot an aircraft – a vocation that requires the unique ability to make decisions quickly and respond accurately in life-threatening situations. The difficult task for the safety professional is identifying whether aptitude might have contributed to the accident causal sequence.

Substandard Practices of Operators

Clearly then, numerous substandard conditions of operators can, and do, lead to the commission of unsafe acts. Nevertheless, there are a number of things that we do to ourselves that set up these substandard conditions. Generally speaking, the substandard practices of operators can be summed up in two categories: crew resource mismanagement and personal readiness.

Crew Resource Mismanagement. Good communication skills and team coordination have been the mantra of industrial/organizational and personnel psychology for decades. Not surprisingly then, crew resource management has been a cornerstone of aviation for the last few decades (Helmreich & Foushee, 1993). As a result, the category of crew resource mismanagement was created to account for occurrences of poor coordination among personnel. Within the context of aviation, this includes coordination both within and between aircraft, with air traffic control facilities and maintenance control, as well as with facility and other support personnel as necessary. But aircrew coordination does not stop with the aircrew in flight. It also includes coordination before and after the flight, with the brief and debrief of the aircrew.

It is not difficult to envision a scenario where the lack of crew coordination has led to confusion and poor decision making in the cockpit, resulting in an accident. In fact, aviation accident databases are replete with instances of poor coordination among aircrew. One of the more tragic examples was the crash of a civilian airliner at night in the Florida Everglades in 1972, as the crew was busily trying to troubleshoot what amounted to a burnt-out indicator light. Unfortunately, no one in the cockpit was monitoring the aircraft’s altitude as the altitude hold was inadvertently disconnected. Ideally, the crew would have coordinated the troubleshooting task, ensuring that at least one crewmember was monitoring basic flight instruments and “flying” the aircraft. Tragically, this was not the case, as the aircraft entered a slow, unrecognized descent into the Everglades, resulting in numerous fatalities.

Personal Readiness. In aviation, or for that matter in any occupational setting, individuals are expected to show up for work ready to perform at optimal levels. Nevertheless, in aviation as in other professions, personal readiness failures occur when individuals fail to prepare physically or mentally for duty. For instance, violations of crew rest requirements, bottle-to-brief rules, and self-medicating will all affect performance on the job and are particularly detrimental in the aircraft. It is not hard to imagine that, when individuals violate crew rest requirements, they run the risk of mental fatigue and other adverse mental states, which ultimately lead to errors and accidents. Note, however, that violations that affect personal readiness are not considered unsafe acts (violations), since they typically do not happen in the cockpit, nor are they necessarily active failures with direct and immediate consequences.

Still, not all personal readiness failures occur as a result of violations of governing rules or regulations. For example, running 10 miles before piloting an aircraft may not be against any existing regulations, yet it may impair the physical and mental capabilities of the individual enough to degrade performance and elicit unsafe acts. Likewise, the traditional “candy bar and coke” lunch of the modern businessman may sound good but may not be sufficient to sustain performance in the rigorous environment of aviation. While there may be no rules governing such behavior, pilots must use good judgment when deciding whether they are “fit” to fly an aircraft.

Unsafe Supervision

Recall that, in addition to those causal factors associated with the pilot/operator, Reason (1990) traced the causal chain of events back up the supervisory chain of command. As such, we have identified four categories of unsafe supervision: inadequate supervision, planned inappropriate operations, failure to correct a known problem, and supervisory violations (Figure 4). Each is described briefly below.

Inadequate Supervision. The role of any supervisor is to provide the opportunity to succeed. To do this, the supervisor, no matter at what level of operation, must provide guidance, training opportunities, leadership, and motivation, as well as the proper role model to be emulated. Unfortunately, this is not always the case. For example, it is not difficult to conceive of a situation where adequate crew resource management training was either not provided, or the opportunity to attend such training was not afforded to a particular aircrew member. Conceivably, aircrew coordination skills would be compromised, and if the aircraft were put into an adverse situation (an emergency, for instance), the risk of an error being committed would be exacerbated and the potential for an accident would increase markedly.

In a similar vein, sound professional guidance and oversight is an essential ingredient of any successful organization. While empowering individuals to make decisions and function independently is certainly essential, this does not divorce the supervisor from accountability. The lack of guidance and oversight has proven to be the breeding ground for many of the violations that have crept into the cockpit. As such, any thorough investigation of accident causal factors must consider the role supervision plays (i.e., whether the supervision was inappropriate or did not occur at all) in the genesis of human error (Table 3).

Planned Inappropriate Operations. Occasionally, the operational tempo and/or the scheduling of aircrew is such that individuals are put at unacceptable risk, crew rest is jeopardized, and ultimately performance is adversely affected. Such operations, though arguably unavoidable during emergencies, are unacceptable during normal operations. Therefore, the second category of unsafe supervision, planned inappropriate operations, was created to account for these failures (Table 3).

Take, for example, the issue of improper crew pairing. It is well known that when very senior, dictatorial captains are paired with very junior, weak co-pilots, communication and coordination problems are likely to occur. Commonly referred to as the trans-cockpit authority gradient, such conditions likely contributed to the tragic crash of a commercial airliner into the Potomac River outside of Washington, DC, in January of 1982 (NTSB, 1982). In that accident, the captain of the aircraft repeatedly rebuffed the first officer when the latter indicated that the engine instruments did not appear normal. Undaunted, the captain continued a fatal takeoff in icing conditions with less than adequate takeoff thrust. The aircraft stalled and plummeted into the icy river, killing the crew and many of the passengers.

Clearly, the captain and crew were held accountable. They died in the accident and cannot shed light on causation; but what was the role of the supervisory chain? Perhaps crew pairing was equally responsible. Although not specifically addressed in the report, such issues are clearly worth exploring in many accidents. In fact, in that particular accident, several other training and manning issues were identified.

Failure to Correct a Known Problem. The third category of unsafe supervision, failure to correct a known problem, refers to those instances when deficiencies among individuals, equipment, training, or other related safety areas are “known” to the supervisor, yet are allowed to continue unabated (Table 3). For example, it is not uncommon for accident investigators to interview the pilot’s friends, colleagues, and supervisors after a fatal crash, only to find out that they “knew it would happen to him some day.” If the supervisor knew that a pilot was incapable of flying safely, and allowed the flight anyway, he clearly did the pilot no favors. The failure to correct the behavior, either through remedial training or, if necessary, removal from flight status, essentially signed the pilot’s death warrant, not to mention that of others who may have been on board.

Page 15: humanfactors_classAnly

11

Likewise, the failure to consistently correct or discipline inappropriate behavior certainly fosters an unsafe atmosphere and promotes the violation of rules. Aviation history is rich with reports of aviators who tell hair-raising stories of their exploits and barnstorming low-level flights (the infamous “been there, done that”). While entertaining to some, they often serve to promulgate a perception of tolerance and “one-upmanship” until one day someone ties the low-altitude flight record of ground level! Indeed, the failure to report these unsafe tendencies and initiate corrective actions is yet another example of the failure to correct known problems.

Supervisory Violations. Supervisory violations, on the other hand, are reserved for those instances when existing rules and regulations are willfully disregarded by supervisors (Table 3). Although arguably rare, supervisors have occasionally been known to violate the rules and doctrine when managing their assets. For instance, there have been occasions when individuals were permitted to operate an aircraft without current qualifications or a license. Likewise, it can be argued that failing to enforce existing rules and regulations, or flouting authority, are also violations at the supervisory level. While rare and possibly difficult to cull out, such practices are a flagrant violation of the rules and invariably set the stage for the tragic sequence of events that predictably follows.

Organizational Influences

As noted previously, fallible decisions of upper-level management directly affect supervisory practices, as well as the conditions and actions of operators. Unfortunately, these organizational errors often go unnoticed by safety professionals, due in large part to the lack of a clear framework from which to investigate them. Generally speaking, the most elusive of latent failures revolve around issues related to resource management, organizational climate, and operational processes, as detailed below in Figure 5.

Resource Management. This category encompasses the realm of corporate-level decision making regarding the allocation and maintenance of organizational assets such as human resources (personnel), monetary assets, and equipment/facilities (Table 4). Generally, corporate decisions about how such resources should be managed center around two distinct objectives – the goal of safety and the goal of on-time, cost-effective operations. In times of prosperity, both objectives can be easily balanced and satisfied in full. However, as we mentioned earlier, there may also be times of fiscal austerity that demand some give and take between the two. Unfortunately, history tells us that safety is often the loser in such battles and, as some can attest to very well, safety and training are often the first to be cut in organizations having financial difficulties. If cutbacks in such areas are too severe, flight proficiency may suffer, and the best pilots may leave the organization for greener pastures.

Excessive cost-cutting could also result in reduced funding for new equipment or may lead to the purchase of equipment that is suboptimal and inadequately designed for the type of operations flown by the company. Other trickle-down effects include poorly maintained equipment and workspaces and the failure to correct known design flaws in existing equipment. The result is a scenario involving unseasoned, less-skilled pilots flying old and poorly maintained aircraft under the least desirable conditions and schedules. The ramifications for aviation safety are not hard to imagine.

Climate. Organizational climate refers to a broad class of organizational variables that influence worker performance. Formally, it has been defined as the “situationally based consistencies in the organization’s treatment of individuals” (Jones, 1988). In general, however, organizational climate can be viewed as the working atmosphere within the organization. One telltale sign of an organization’s climate is its structure, as reflected in the chain of command, delegation of authority and responsibility, communication channels, and formal accountability for actions (Table 4). Just as in the cockpit, communication and coordination are vital within an organization. If management and staff within an organization are not communicating, or if no one knows who is in charge, organizational safety clearly suffers and accidents do happen (Muchinsky, 1997).

An organization’s policies and culture are also good indicators of its climate. Policies are official guidelines that direct management’s decisions about such things as hiring and firing, promotion, retention, raises, sick leave, drugs and alcohol, overtime, accident investigations, and the use of safety equipment. Culture, on the other hand, refers to the unofficial or unspoken rules, values, attitudes, beliefs, and customs of an organization. Culture is “the way things really get done around here.”

When policies are ill-defined, adversarial, or conflicting, or when they are supplanted by unofficial rules and values, confusion abounds within the organization. Indeed, there are some corporate managers who are quick to give “lip service” to official safety policies while in a public forum, but then overlook such policies when operating behind the scenes. However, as the second law of thermodynamics reminds us, order and harmony cannot be produced by such chaos and disharmony. Safety is bound to suffer under such conditions.

Operational Process. This category refers to corporate decisions and rules that govern the everyday activities within an organization, including the establishment and use of standardized operating procedures and formal methods for maintaining checks and balances (oversight) between the workforce and management. For example, operational tempo, time pressures, incentive systems, and work schedules are all factors that can adversely affect safety (Table 4). As stated earlier, there may be instances when those within the upper echelon of an organization determine that it is necessary to increase the operational tempo to a point that overextends a supervisor’s staffing capabilities. Therefore, a supervisor may resort to the use of inadequate scheduling procedures that jeopardize crew rest and produce suboptimal crew pairings, putting aircrew at an increased risk of a mishap. However, organizations should have official procedures in place to address such contingencies, as well as oversight programs to monitor such risks.

Regrettably, not all organizations have these procedures, nor do they engage in an active process of monitoring aircrew errors and human factors problems via anonymous reporting systems and safety audits. As such, supervisors and managers are often unaware of the problems before an accident occurs. Indeed, it has been said that “an accident is one incident too many” (Reinhart, 1996). It is incumbent upon any organization to fervently seek out the “holes in the cheese” and plug them up before they create a window of opportunity for catastrophe to strike.
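With all four levels and their causal categories now described, an analyst coding accidents against HFACS could represent the complete framework as a simple lookup table. The sketch below is a hypothetical illustration rather than an official encoding; the dictionary HFACS_TAXONOMY and the helper is_valid_code are invented for this example, while the category labels themselves are taken from the framework described above.

```python
# Hypothetical nested mapping of the HFACS framework: level -> causal categories.
# Category labels follow the text of this report; the structure itself is illustrative.
HFACS_TAXONOMY = {
    "Unsafe Acts": [
        "Skill-based Errors", "Decision Errors", "Perceptual Errors",
        "Routine Violations", "Exceptional Violations",
    ],
    "Preconditions for Unsafe Acts": [
        "Adverse Mental States", "Adverse Physiological States",
        "Physical/Mental Limitations",
        "Crew Resource Mismanagement", "Personal Readiness",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision", "Planned Inappropriate Operations",
        "Failed to Correct a Known Problem", "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management", "Organizational Climate", "Operational Process",
    ],
}


def is_valid_code(level: str, category: str) -> bool:
    """Return True if (level, category) is a recognized HFACS pairing."""
    return category in HFACS_TAXONOMY.get(level, [])


# Example: reject a factor coded against the wrong level.
assert is_valid_code("Unsafe Supervision", "Inadequate Supervision")
assert not is_valid_code("Unsafe Acts", "Resource Management")
```

A lookup of this kind is one simple way a database could enforce consistent coding before causal factors are tallied or compared across accidents.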

CONCLUSION

It is our belief that the Human Factors Analysis and Classification System (HFACS) framework bridges the gap between theory and practice by providing investigators with a comprehensive, user-friendly tool for identifying and classifying the human causes of aviation accidents. The system, which is based upon Reason’s (1990) model of latent and active failures (Shappell & Wiegmann, 1997a), encompasses all aspects of human error, including the conditions of operators and organizational failure. Still, HFACS, like any other framework, would merely add to an already burgeoning list of human error taxonomies if it did not prove useful in the operational setting. In these regards, HFACS has recently been employed by the U.S. Navy, Marine Corps, Army, Air Force, and Coast Guard for use in aviation accident investigation and analysis. To date, HFACS has been applied to the analysis of human factors data from approximately 1,000 military aviation accidents. Throughout this process, the reliability and content validity of HFACS have been repeatedly tested and demonstrated (Shappell & Wiegmann, 1997c).

Given that accident databases can be reliably analyzed using HFACS, the next logical question is whether anything unique will be identified. Early indications within the military suggest that the HFACS framework has been instrumental in the identification and analysis of global human factors safety issues (e.g., trends in aircrew proficiency; Shappell et al., 1999b), specific accident types (e.g., controlled flight into terrain, CFIT; Shappell & Wiegmann, 1997b), and human factors problems such as CRM failures (Wiegmann & Shappell, 1999). Consequently, the systematic application of HFACS to the analysis of human factors accident data has afforded the U.S. Navy/Marine Corps (for which the original taxonomy was developed) the ability to develop objective, data-driven intervention strategies. In a sense, HFACS has illuminated those areas ripe for intervention, rather than relying on individual research interests not necessarily tied to saving lives or preventing aircraft losses.
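As a rough illustration of the kind of data-driven analysis described above, the following sketch tallies how often each causal category is cited across a set of HFACS-coded accident records. The records, field names, and counts are entirely hypothetical; the sketch shows only the mechanics of turning coded factors into the category frequencies that can reveal trends or gauge the effect of an intervention.

```python
from collections import Counter

# Hypothetical accident records, each coded with (level, category) causal factors.
# Real analyses in this report drew on roughly 1,000 military aviation accidents.
accidents = [
    {"id": "A-001", "factors": [("Unsafe Acts", "Skill-based Errors"),
                                ("Preconditions for Unsafe Acts", "Adverse Mental States")]},
    {"id": "A-002", "factors": [("Unsafe Acts", "Decision Errors"),
                                ("Unsafe Supervision", "Planned Inappropriate Operations")]},
    {"id": "A-003", "factors": [("Unsafe Acts", "Skill-based Errors"),
                                ("Organizational Influences", "Operational Process")]},
]

# Count, for each category, the number of accidents in which it was cited at least once.
category_counts = Counter()
for accident in accidents:
    for level_category in set(accident["factors"]):
        category_counts[level_category] += 1

total = len(accidents)
for (level, category), count in category_counts.most_common():
    print(f"{level} / {category}: {count} of {total} accidents ({100 * count / total:.0f}%)")
```

Repeating such a tally for successive time periods, fleets, or accident types is one straightforward way to watch a specific "hole in the cheese" grow or shrink after an intervention.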

Additionally, the HFACS framework and the insights gleaned from database analyses have been used to develop innovative accident investigation methods that have enhanced both the quantity and quality of the human factors information gathered during accident investigations. However, not only are safety professionals better suited to examine human error in the field but, using HFACS, they can now track those areas (the holes in the cheese) responsible for the accidents as well. Only now is it possible to track the success or failure of specific intervention programs designed to reduce specific types of human error and subsequent aviation accidents. In so doing, research investments and safety programs can be either readjusted or reinforced to meet the changing needs of aviation safety.

Recently, these accident analysis and investigative techniques, developed and proven in the military, have been applied to the analysis and investigation of U.S. civil aviation accidents (Shappell & Wiegmann, 1999a). Specifically, the HFACS framework is currently being used to systematically analyze both commercial and general aviation accident data to explore the underlying human factors problems associated with these events. The framework is also being employed to develop improved methods and techniques for investigating human factors issues during actual civil aviation accident investigations by Federal Aviation Administration and National Transportation Safety Board officials. Initial results of this project have begun to highlight human factors areas in need of further safety research. In addition, it is anticipated that, as in the military, HFACS will provide the fundamental information and tools needed to develop a more effective and accessible human factors accident database for civil aviation.

In summary, the development of the HFACS framework has proven to be a valuable first step in the establishment of a larger military and civil aviation safety program. The ultimate goal of this, and any other, safety program is to reduce the aviation accident rate through systematic, data-driven investment.

REFERENCES

Bird, F. (1974). Management guide to loss control. Atlanta, GA: Institute Press.

Heinrich, H.W., Petersen, D., & Roos, N. (1980). Industrial accident prevention: A safety management approach (5th ed.). New York: McGraw-Hill.

Helmreich, R.L., & Foushee, H.C. (1993). Why crew resource management? Empirical and theoretical bases of human factors training in aviation. In E.L. Wiener, B.G. Kanki, & R.L. Helmreich (Eds.), Cockpit resource management (pp. 3-45). San Diego, CA: Academic Press.

Jones, A.P. (1988). Climate and measurement of consensus: A discussion of “organizational climate.” In S.G. Cole, R.G. Demaree, & W. Curtis (Eds.), Applications of Interactionist Psychology: Essays in Honor of Saul B. Sells (pp. 283-90). Hillsdale, NJ: Erlbaum.

Muchinsky, P.M. (1997). Psychology applied to work (5th ed.). Pacific Grove, CA: Brooks/Cole Publishing Co.

Murray, S.R. (1997). Deliberate decision making by aircraft pilots: A simple reminder to avoid decision making under panic. The International Journal of Aviation Psychology, 7, 83-100.

National Transportation Safety Board. (1982). Air Florida, Inc., Boeing 737-222, N62AF, collision with 14th Street Bridge, near Washington National Airport, Washington, D.C., January 13, 1982 (Tech. Report NTSB-AAR-82-8). Washington, DC: National Transportation Safety Board.

Orasanu, J.M. (1993). Decision-making in the cockpit. In E.L. Wiener, B.G. Kanki, & R.L. Helmreich (Eds.), Cockpit resource management (pp. 137-72). San Diego, CA: Academic Press.

Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-33.

Reason, J. (1990). Human error. New York: Cambridge University Press.

Reinhart, R.O. (1996). Basic flight physiology (2nd ed.). New York: McGraw-Hill.


Senders, J.W., & Moray, N.P. (1991). Human error: Cause, prediction and reduction. Hillsdale, NJ: Erlbaum.

Shappell, S.A., & Wiegmann, D.A. (1996). U.S. naval aviation mishaps 1977-92: Differences between single- and dual-piloted aircraft. Aviation, Space, and Environmental Medicine, 67, 65-9.

Shappell, S.A., & Wiegmann, D.A. (1997a). A human error approach to accident investigation: The taxonomy of unsafe operations. The International Journal of Aviation Psychology, 7, 269-91.

Shappell, S.A., & Wiegmann, D.A. (1997b). Why would an experienced aviator fly a perfectly good aircraft into the ground? In Proceedings of the Ninth International Symposium on Aviation Psychology (pp. 26-32). Columbus, OH: The Ohio State University.

Shappell, S.A., & Wiegmann, D.A. (1997c). A reliability analysis of the Taxonomy of Unsafe Operations. Aviation, Space, and Environmental Medicine, 68, 620.

Shappell, S.A., & Wiegmann, D.A. (1999a). Human error in commercial and corporate aviation: An analysis of FAR Part 121 and 135 mishaps using HFACS. Aviation, Space, and Environmental Medicine, 70, 407.

Shappell, S., Wiegmann, D., Fraser, J., Gregory, G., Kinsey, P., & Squier, H. (1999b). Beyond mishap rates: A human factors analysis of U.S. Navy/Marine Corps TACAIR and rotary wing mishaps using HFACS. Aviation, Space, and Environmental Medicine, 70, 416-17.

Wiegmann, D.A., & Shappell, S.A. (1997). Human factors analysis of post-accident data: Applying theoretical taxonomies of human error. The International Journal of Aviation Psychology, 7, 67-81.

Wiegmann, D.A., & Shappell, S.A. (1999). Human error and crew resource management failures in Naval aviation mishaps: A review of U.S. Naval Safety Center data, 1990-96. Aviation, Space, and Environmental Medicine, 70, 1147-51.