Investigating Pilot Error in Plane Crashes: A Human Factors Analysis
Although horrific plane crashes continue to make headlines and media coverage may suggest otherwise, traveling by air remains the safest form of transportation. Period. According to data compiled by the Bureau of Aircraft Accidents Archive, there were 265 fatalities during air travel in 2013 and 794 in 2012. Contrast this with over 33,000 deaths from motor vehicle accidents in the United States last year alone. The International Civil Aviation Organization has estimated that almost 3 billion people traveled by air in 2012. Doing the math, one quickly sees that your chance of being one of the unlucky victims of an airplane crash (794/3,000,000,000 ≈ 0.00003%) is vanishingly small. This means you have a greater than 99.999% chance of surviving any commercial flight you step onto. I’ll take those odds!
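That arithmetic is easy to verify yourself. A minimal check, using only the figures quoted above (the rounding to five decimal places is mine):

```python
# Back-of-the-envelope risk check using the figures quoted above:
# 794 deaths in 2012 against roughly 3 billion air passengers.
deaths_2012 = 794
passengers_2012 = 3_000_000_000

p_fatality = deaths_2012 / passengers_2012          # per-passenger risk
p_survival = 1 - p_fatality

print(f"Fatality risk:  {p_fatality * 100:.5f}%")   # about 0.00003%
print(f"Survival odds: {p_survival * 100:.5f}%")    # about 99.99997%
```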
[Table: air-travel deaths and number of accidents by year]
How has aviation become so safe? And when a plane does crash, what is done to ensure that similar accidents are prevented? Aviation has obviously not always been a safe form of travel, and frequent fliers today owe a great deal to aviation pioneers who sacrificed their lives to allow the industry to progress. Advances in safety have come from a variety of factors – a better understanding of aerodynamics, progress in engineering and manufacturing techniques, strong regulatory bodies that set and maintain safety standards, and the ability to investigate and accurately identify the causes of accidents. The ability to identify the contributing and causal factors of past accidents has allowed aviation to make the changes necessary to keep history from repeating itself.
CAUSES IN AVIATION ACCIDENTS
The first fatal aviation accident killed a young Army lieutenant riding in a plane piloted by one of the Wright brothers. This first mishap was purely a mechanical failure, as was common for early crashes. Even today, it remains relatively easy to identify the causes of maintenance and engineering failures through empirical observation. As technology and manufacturing techniques progressed, the analysis of strain on metals, shear on bolts, and fatigue on other aerospace equipment became highly objective and reproducible. And as the machines became more reliable mechanically and their operation became more complex, the human flying the machine became increasingly responsible for accidents. It is now thought that about 70-80% of all aviation accidents are caused by human (or pilot) error alone. In fact, ‘human error’ is so inevitable and ubiquitous that the term has been displaced in favor of ‘human factors’. Unfortunately, given the complexity of human psychology and behavior, identifying and agreeing on definitions for human factors is not nearly as easy as describing metal stress or bolt shearing. A common definition of human factors has remained elusive for decades.
HUMAN FACTOR MODELS
Although covered in a separate post discussing the 1977 disaster at Tenerife, which remains the largest loss of life in any aviation accident to date, it is relevant to this post to revisit the history of Human Factors Frameworks and Models.
There have been a variety of frameworks for describing the underlying process of human error. Highly reliable industries that face constant threats to the safety of customers, employees, or the public have embraced these models in an effort to better understand and prevent serious accidents. One of the best known of these frameworks is the Domino Theory of accident causation, created by early American industrial safety advocate Herbert Heinrich. This model suggested that injuries occurred due to one’s social environment, which he likened to the first domino in a series. Once this domino fell, it directly caused a series of other dominoes to fall, ultimately leading to an accident and subsequent injury. This model from the 1930s was further developed by Frank Bird in the 1970s, who simply changed the names of some of the dominoes. Bird felt that the initial cause of most accidents was a lack of management controls or poor management decisions. Therefore, the first domino became a metaphor for ‘Absence of Safety Controls.’
In 1990, a paper published by James Reason argued for an updated framework to better understand accident causation. This approach has come to be known as the “Swiss Cheese Model” and, although originally described for the field of nuclear energy, it has been embraced by aviation, space, healthcare, and other high-reliability organizations. Rather than focusing on a preceding event as the necessary cause of a given effect, this model assumes that complex organizations always contain many threats of potential error, which at times arise independently of one another. Sometimes these threats are triggered by preceding events, but not universally. An accident and subsequent injury occur only if all of the ‘holes’ in the Swiss cheese align. If any barrier can disrupt this chain of events, the accident will be avoided.
The four overarching categories in the Swiss Cheese Model are:
- Organizational Influences
- Unsafe Supervision
- Preconditions for Unsafe Acts
- Unsafe Acts
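A toy simulation can make the “aligned holes” intuition concrete: with several independent defensive layers, the chance that every layer fails at once is the product of the individual failure chances, which is far smaller than any one of them. The four layers below are the model’s categories; the per-layer probabilities are purely illustrative assumptions, not real accident statistics.

```python
import random

# Illustrative (made-up) probability that a latent "hole" exists in
# each defensive layer at a given moment.
LAYERS = {
    "Organizational Influences": 0.05,
    "Unsafe Supervision": 0.05,
    "Preconditions for Unsafe Acts": 0.10,
    "Unsafe Acts": 0.10,
}

def holes_align(rng: random.Random) -> bool:
    """An accident occurs only if a hole opens in every layer at once."""
    return all(rng.random() < p for p in LAYERS.values())

rng = random.Random(42)
trials = 1_000_000
accidents = sum(holes_align(rng) for _ in range(trials))

# Independent layers multiply: 0.05 * 0.05 * 0.10 * 0.10 = 0.000025
print(f"Expected accident rate:  {0.05 * 0.05 * 0.10 * 0.10:.6f}")
print(f"Simulated accident rate: {accidents / trials:.6f}")
```

Note how even four fairly leaky layers (5–10% each) combine into an accident rate of about 1 in 40,000 – which is exactly why disrupting any single layer prevents the mishap.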
These four broad categories provide a very useful framework, but they proved limited in application during accident investigations. To truly identify, and hopefully prevent, human factor causes in aviation mishaps, each of the possible types of holes (threats) in the system needs to be identified. This recognized limitation led to the development of the Human Factors Analysis and Classification System (HFACS) by U.S. Navy aviation psychologists Douglas A. Wiegmann and Scott A. Shappell in the late 1990s and early 2000s.
HFACS builds upon the foundation of Reason’s Swiss Cheese Model by retaining the four main categories stated above, but it then elaborates a standardized glossary of very specific types of human factors that may result in injury or accident. Another huge advantage of the HFACS system is that it is dynamic and responsive to change. Its creators have attempted to continually validate the framework through experimentation and empirical observation. The HFACS taxonomy has undergone a variety of modifications and iterations in an effort to produce a better product; consequently, the original HFACS database differs significantly from its current state. Although the framework was initially developed and used by the U.S. Navy, its broad application today is proof of its perceived value. All branches of the U.S. Department of Defense (USN, USAF, U.S. Army), NASA, the NTSB, international aviation organizations, and other industries now use the framework in accident investigations. As the framework is used more frequently, it becomes an even better tool by tracking and storing data that can later be collected and analyzed.
Wiegmann & Shappell explain the usefulness of their creation best: “By using the HFACS framework for accident investigation, organizations are able to identify the breakdowns within the entire system that allowed an accident to occur. HFACS can also be used proactively by analyzing historical events to identify reoccurring trends in human performance and system deficiencies. Both of these methods will allow organizations to identify weak areas and implement targeted, data-driven interventions that will ultimately reduce accident and injury rates.”
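As a rough sketch of that proactive use, imagine coding historical incidents against the taxonomy and counting recurring factors. The four tiers come from the model above; the subcategory labels and incident records here are hypothetical examples for illustration, not entries from the official HFACS glossary:

```python
from collections import Counter

# Hypothetical incident records, each coded with an HFACS tier and a
# made-up factor label (not the official glossary).
incidents = [
    {"id": "A-01", "tier": "Unsafe Acts", "factor": "Skill-based error"},
    {"id": "A-02", "tier": "Preconditions for Unsafe Acts", "factor": "Fatigue"},
    {"id": "A-03", "tier": "Unsafe Acts", "factor": "Skill-based error"},
    {"id": "A-04", "tier": "Unsafe Supervision", "factor": "Inadequate oversight"},
]

# Proactive use: count recurring (tier, factor) pairs across historical
# events to surface trends worth a targeted intervention.
trend = Counter((rec["tier"], rec["factor"]) for rec in incidents)
for (tier, factor), n in trend.most_common():
    print(f"{n}x  {tier} -> {factor}")
```

Even this trivial tally shows the idea: once incidents share a standardized vocabulary, recurring weak spots (here, repeated skill-based errors) fall out of the data automatically.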
At this point, the HFACS system may sound like a nebulous piece of academic drivel. Part II of this post, however, will recount a famous aviation accident and then apply the HFACS framework just as a human factors expert might during an accident investigation!
More to Come…