
This paper was presented at the International Society of Air Safety Investigators 18th Annual Seminar, Washington, DC 1981 and published in the Proceedings of that seminar.


Methodological Biases Which
Undermine Accident Investigations

Ludwig Benner, Jr. M02202
National Transportation Safety Board
University of Southern California

ABSTRACT

Strongly differing opinions about accident causation among conscientious, well-intentioned accident investigators frequently arise in accident investigations. These differences can complicate investigations, frustrate investigators, undermine the credibility of investigators' work in the eyes of the public and others, and delay or misdirect safety improvements. This paper explores reasons why these differences occur. It is a status report of ongoing research into accident investigation theory, principles and practices in support of advanced accident investigation courses conducted for the University of Southern California.

The research findings reported here indicate ways to reduce controversy about accident investigations, and improve their contribution to our nation's well-being. The purpose of this paper is to share my findings in the hope they will lead to an improved accident investigation methodology that encourages meticulously disciplined, harmonious "win/win" accident investigations, regardless of the subsequent interests of the parties.

The research findings and conclusions reported are solely the author's, and do not necessarily reflect the views of the University, the National Transportation Safety Board or any other organization with which the author is or has been affiliated. The author accepts all responsibility for the contents and conclusions reported.

Is There Really An Investigative Problem?

It seems wherever one looks these days, one sees evidence of controversy about accident investigations. The Summer 1981 issue of ISASI forum reports on such a controversy between two investigative bodies in New Zealand.[1] An international journal of insurance and risk management reports that the UK Department of Trade has issued a strongly worded report disagreeing with the Spanish investigators' report of the Dan Air crash last year.[2] In the USA, ALPA continues to take issue with a National Transportation Safety Board report of a 1978 accident.[3] During research efforts, differences of opinion could not be conclusively resolved with one of the more highly respected aircraft accident investigation data bases available.[4]

Controversy is not confined to aviation. At least four reports of the Three Mile Island nuclear plant accident were published, each presenting differing views.[5] Jurors relate to me their personal uncertainties in arriving at accident case decisions. Litigation abounds. Investigators quarrel. Students bring frequent examples of controversial conclusions from accident investigations to my classes. I regularly encounter views about how accidents should be investigated and reported that are very different from my own, even among ISASI members. Jerry Bruggink and I, for example, were unable to reconcile our different professional views before he retired. One author describes 20 different analytical approaches in a new accident investigation book.[6]

Every experienced investigator recognizes that differences exist. In my view, their consequences can be significant in terms of investigations, administration of justice to individuals and organizations, money that changes hands, and even public confidence in the results of investigations. Investigations can stretch out. Investigative costs can escalate. Recommendations for corrective action can be delayed, or misdirected. Blame or fault can be laid on the wrong persons. Licenses or reputations can be unfairly jeopardized. One party may inappropriately have to bear the accident costs. These are no small matters to the individuals directly involved!

Why can't investigative differences be reconciled more easily? Is it solely a matter of the money or reputations at stake? Or is there some technical problem that ISASI members could attack to overcome these differences?

My research suggests that ISASI members can do something constructive, if we will recognize why the differences exist, and act in concert to overcome them.

Summary of Findings

My research findings lead me to conclude that most differences arise because:

  1. Investigators unconsciously base their investigative methods on methodologies adapted from their academic disciplines or previous work experience; this leads to highly individualized, personalized investigative methodologies;
  2. Adaptations of an individual investigator's methodologies lead to differences in "tests" for technical truth used by each investigator in accident investigations;
  3. Differences among tests for investigative "truth" make it hard for investigators to work together, and lead to differing conclusions by each investigator;
  4. The lack of commonly accepted investigative truth tests allows each investigator to incorporate untested descriptive and judgmental materials into an accident report, increasing the potential for subsequent disagreements.

These findings highlight the need to develop a generally acceptable investigative methodology with methods for testing technical truth during investigations.

Today no mechanism exists to delineate, report, and evaluate the differing methodologies used in accident investigations. Thus investigators have no basis for picking a "best" methodology, and few incentives to improve their own until they personally get caught up in a controversy. By that time, it is often too late - the battle is raging, often with us in the middle.

Let's look at these points one at a time. But before we do, let's make sure we are working with a common perception of at least one term:

What Is A Methodology?

Until I differentiated methodology from method during the research, I didn't really appreciate the significance of some of the things that were happening. Recognition of the issues we will explore in this paper depends in part on awareness of the differences between "methodology" and "method."

Let's consider the term "methods" first. As you consider the findings, think of a method as being a regular, disciplined and systematic procedure for accomplishing a task. A method is a technique. During an investigation, investigators use different "methods" or techniques to interview witnesses, calculate flight paths, examine debris, read out data from records, and even to structure the participation of other investigators. Method emphasizes procedures according to an underlying, detailed, logically-ordered plan.

Methodology, on the other hand, has a broader context. A methodology is a system of principles, practices and body of procedures (methods) applied to a specific branch of knowledge determining in large measure how that branch of knowledge is practiced. A methodology is an overall approach to a field. The term "methodology" was selected for this paper because the subject of my research is the broader systemic approach to the accident investigation field, rather than individual procedures or methods.

The Origins of Modern Investigative Methodologies

As I became conscious of the methodological differences in accident investigations, I began looking for the reasons they existed. I observed that most investigators got into the accident investigation field through other fields. My own experience encompasses engineering, management and consulting. Some accident investigators have been or still are pilots. Some, engineers. Some, lawyers. Some, psychologists. Some, policemen. Some have safety experience. However, I have yet to meet the first investigator who decided to become an accident investigator and then engaged in a course of study with accident investigation as its major academic discipline.

Each of us has brought to the accident investigation field our previous academic or work disciplines. That background is unique to each of us. The different methods we developed through our academic pursuits and work or other experiences were the methods we felt comfortable adapting to investigations. As we continued our investigative work, we developed a body of investigative methods that, taken together, constitute our personal investigative methodology. When one examines the methodologies at work in investigations, at least six distinctive general methodological approaches can be observed.

Six General Accident Investigation Methodologies

The six methodologies or bodies of methods are listed below. Each methodology has characteristics and uses truth tests that are distinctive from all the others. Although the classification does not result in completely exclusive classes, the categories help in understanding investigative disagreements. The methodologies, in the order I recognized them, are:

  1. "common sense,"
  2. safety,
  3. engineering,
  4. statistical,
  5. adversary, and
  6. symbolic modeling.

Let's look at each, in terms of what it is and the truth tests it imposes.

1. The catchall "common sense" has been used to describe the unstructured methodology that has been observed among some investigators. My first experience with highway accidents typified this approach. If the explanation of the accident "made sense," it was acceptable. It incorporated Kipling's six faithful servants (who, what, when, where, how, and why), and the apparently natural human tendency to order events sequentially when we try to remember something.

Technical truth is judged in the investigator's mind as s/he "reconstructs" the accident sequence.

2. The observed "safety" methodology was difficult to delineate. Much of the safety field has been dominated by H. W. Heinrich's philosophy since the 1930s. That philosophy is based on the "unsafe act" and "unsafe condition" approach to "prevention of accidents." My observations of investigations conducted by safety personnel suggested this dichotomous view on several occasions, especially in industrial accident investigations. The search for unsafe acts and conditions, and their elimination to prevent future accidents, explicitly drove the investigative efforts. It is a cause-oriented, "single fix," retrospective approach. This methodological approach seems less prevalent, but still present, in general aviation investigative work today (pilot error, equipment failure), and it dominates much of the industrial safety field, especially among smaller concerns.

Technical truth is tested either against some coding standards, or ex post facto by how the investigator judges what happened against what the investigator considers "normal."

3. In a high technology environment, I next noted that engineering methodologies drove accident investigators. Engineers were interested in the application of empirically-derived principles to the design, construction and operation of a working, productive facility or system. Observations of engineers' efforts in accident investigations suggest they are more interested in understanding how the accident occurred than in whose fault it was or the "cause." Their methods reflect engineering approaches to studying the behavior of the related components involved, and learning from these relationships how to produce better products.

The engineers' test for technical truth is "did it work reliably the next time?" (the familiar fly-fix-fly approach). Ergonomics, crash testing, and operational factors efforts seem to be examples of investigative work based on engineering methodologies.

4. The statistical methodological approach to investigation has been observed, too, most extensively in the highway field. It seeks data from accidents that can be used to hypothesize causes and causal factors. It can be detected in the forms used during investigations and in the secondary statistical analysis performed on accident data, both intended to confirm investigators' hypotheses. The goals of statistical inquiry are the identification of determinant variables and, from their isolation, the prediction and control of phenomena. The statistical methodology influences in large measure the data sought during light aircraft investigations, for example, and looks for "technical truth" after the investigation is concluded.

The statistical methodological approaches deal primarily with technical truth in terms of statistical tests of probabilities of factors present as "experimental" results. Human factors investigations, psychological autopsies, and biorhythm investigations seem to be examples of this methodological approach.

5. My observations in the aviation accident investigation field suggest that most major accident investigations - whether for accident prevention or other purposes - are driven by "adversary" methodologies. This methodological approach can be observed most clearly in two processes: the US "party system" of investigations, and the commission-type inquiries used in some countries. The influence of legal concepts, principles, rules of procedure, and the form of the final work products is dominant. In practice, the two processes seem to rely heavily on the adverse interests of parties to the investigation to bring hypotheses to light, rebut adverse views, and present the strongest technical evidence for a favorable determination of "cause," or blame. During the research, it was also noted that the principal effort during these processes was directed at the determination of "cause(s)" and their elimination. During discussions with investigators, the terms fault and blame were often used. The many common points of investigative proceedings and legal proceedings are readily recognized by anyone who has studied or practiced both. In view of government's role in the development of these processes, the similarity should not be a surprise.

Technical truth is reached through the adversarial development of relevant evidence from which reasoned conclusions are logically drawn.

6. My observations also suggest another methodological framework that is quite advanced in some fields. My term for that methodology is symbolic modeling. Mathematics and music probably are the most advanced examples of this methodological approach. Symbolic representations that permit recording, study, analysis, understanding, replication and manipulation of phenomena are their goals. Fault trees are another, more frequently encountered example of this methodological approach in the safety field. Symbolic modeling has been observed in accident investigations but was not generally considered a separate OVERALL methodology. For reasons that have been detailed in an earlier paper,[7] my view is that it should be treated distinctively as an overall methodology. It demands a finite beginning and end to the investigation and an ordered display of the accident mechanism that facilitates information exchanges and problem resolution.

Technical truth is developed by logically linking the flow of events among concretely defined accident elements (actors) in a timed matrix format. In investigations, the logical event flows are temporally and spatially tested in the matrix as soon as data is acquired.
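The temporal part of such a truth test lends itself to a simple mechanical check. The sketch below is purely illustrative - the event structure, names, and times are hypothetical inventions, not Benner's actual matrix notation: each event records an actor, an action, and a begin/end time, and a proposed cause-effect link fails the test if the claimed effect begins before its claimed cause.

```python
from dataclasses import dataclass

@dataclass
class Event:
    actor: str      # who or what acted (a row of the matrix)
    action: str     # what the actor did
    begin: float    # elapsed time at which the action began
    end: float      # elapsed time at which the action ended

def link_is_consistent(cause: Event, effect: Event) -> bool:
    """A cause->effect link passes the temporal test only if the
    claimed effect does not begin before the claimed cause begins."""
    return effect.begin >= cause.begin

# Two events from a hypothetical sequence (times in elapsed seconds):
valve_opens = Event("relief valve", "sticks open", begin=0.0, end=10.0)
coolant_loss = Event("coolant", "escapes through valve", begin=1.0, end=10.0)

assert link_is_consistent(valve_opens, coolant_loss)      # plausible link
assert not link_is_consistent(coolant_loss, valve_opens)  # reversed link fails
```

In an actual investigation the same kind of check would be applied as each new data item is entered into the matrix, so that inconsistent event linkages become visible immediately rather than after the report is written.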

There, briefly, are the six general methodologies that seem to be driving accident investigations. I have seen them all exerting influence on accident investigations and accident reports. I have seen as many as four present in a single investigation. Ted Ferry, in his new book, has a list of 20 different methods for analyzing accidents.

The Impact on Truth Tests

Each methodological approach involves differing sets of assumptions, concepts, principles, "laws," and procedures (methods) to arrive at the scope and technical truth about an accident. When more than one methodology is present in the investigation of the same phenomenon, we begin to encounter trouble because each methodology calls for differing

  • accident scope, to which technical truth will be applied;
  • investigative methods, used to arrive at the technical truth;
  • accident data, sought to establish the technical truth;
  • truth tests, applied to the investigative data to establish the actual scenario for the accident; and
  • the likelihood that investigators' conclusions or assertions are reported as facts.

Are these differences important? My answer is an unequivocal YES. My research indicates that their importance lies in the different investigative demands they impose on each investigator. If you are working within a symbolic logic framework which provides a "win/win" investigative environment leading to understanding of the accident, imagine your frustration in trying to get cooperation from another investigator who is working within

  • an engineering framework, looking for engineering or debris-testing proofs, or
  • an adversary framework, looking for fault or culpability, or
  • a safety framework, looking for a cause or unsafe condition to correct to prevent similar accidents, or
  • a statistical framework, looking for "all the facts" for later analysis and use.

Forty-four reasons for investigating accidents were previously reported.[8] Since February 1980, five more have been identified by some of my students. Many reasons are incompatible, as just shown. Disagreements arising from these cross purposes, however, are merely symptoms of the methodological differences which unintentionally bias investigators.

The Impact on Investigative Reports

As my research continues, and I understand more clearly these differing methodological frameworks and truth tests, and the differing demands they impose on investigators, another - and possibly more important - question has emerged. Why do we need so many investigations of the same accident? After all, the accident only occurred one time, in one way in which everything involved had a probability of 1 - it happened. Why are so many different purposes involved, and why are so many investigators and investigations needed? I have seen one accident in which 7 independent investigations were conducted. Why couldn't one investigation serve the needs of everyone involved?

One breakthrough in my research occurred as I worked with the truth tests to try to distinguish between "observed data" and "interpreted" data (investigators' conclusions) reported on accident investigation forms. One day, I suddenly realized that a single "time" reported on an accident form was an investigator's conclusion, since a finite time period had actually elapsed during the accident. This led to the realization that much of the time investigators record their interpretation of the actual data, rather than the VALUE-FREE observations from the evidence. Much of the recorded data were investigators' conclusions - subjective personal judgments representing interpretations by the investigator. As the enormous significance of this distinction began to take shape, a lot of problems began to be understandable. If each investigator used different test criteria for making these judgments, reproducibility suffered. Discussions with investigators disclosed that - except in large accident investigations - the TECHNICAL truth of these CONCLUSIONS was UNTESTED!

Four grave problems began to dawn on me:

  1. In the absence of rigorous truth tests, we were dealing with a lot of subjective investigators' opinions on the forms, which meant that we were looking at and arguing opinions rather than "fact";
  2. "Observed data" about the accident were indiscriminately blended with the investigator's subjective opinions about the nature of the accident being reported;
  3. Uncertainties were almost never reported, even in some major accidents, allowing speculative assertions to pass unchallenged;
  4. Users of these kinds of reports were basing their work on UNTESTED opinions that, when tested, were fatally flawed in at least one respect - the absence of time relationships among events.

To sum up these problems simply, investigators were reporting arbitrarily selected data, blending it with untested subjective opinions or assertions, not mentioning the uncertainties in their reports, and then sending this information on its way to unwary secondary users! Any wonder arguments ensued?

From there, the search became more direct. The reasons for this state of affairs could be traced directly to the underlying investigative methodologies, their associated individualized "truth testing" techniques, and the general acceptance of these uncritical "truth tests" by secondary users. Greatly simplified, Common Sense accepts "sensible" explanations as true. Engineers regard something that can be tested and made to work as representing adequate truth. Statisticians rely on validation with probabilistic truth tests. The pure adversary methodology tends to recognize the "winner's" arguments as the most likely to represent the truth. Symbolic modelers regard logical, tested and displayed sequences as "truthful."

These realizations suggested that if the methodological differences can be recognized and resolved, a single truthful accident investigation report might not be a hopeless IDEAL. If methodological differences can be reconciled, and "truth tests" agreed upon, an "ideal" single accident report to serve all subsequent purposes might become a reality.

What Next?

This leads to my last point. My research suggests that investigators should begin considering the development of TWO types of accident investigation reports.

The first type of report would be a DESCRIPTIVE accident report, in which the accident mechanism or scenario is described to the level of detail permitted by the surviving evidence. The mechanisms would best be described by a graphic flowchart display. Narrative reports would be optional.

My experience with this methodology shows me that an entire set of accident event sequences could be developed cooperatively by any parties willing to make all their data available, capable of and willing to support the investigation effort to its conclusion, and willing to accept a common INVESTIGATIVE METHODOLOGY. The investigative effort would allow investigators to use data acquisition methods currently in use. However, these efforts would be disciplined by an overall investigative framework with technical truth testing methods that would - at a minimum -

  1. Force the organization and relevance testing of data as quickly as it is acquired, so known data and data gaps or uncertainties become visible quickly to all participating investigators;
  2. Provide accident events-flow truth testing methods during investigations;
  3. Compel use of agreed upon events-flow truth testing methods by all participants during the investigation;
  4. Require systematic structuring of anyone's speculations to avoid wild goose chases during the investigation;
  5. Restrict the proposing of hypotheses to the evidence actually developed, and structure "what if" reasoning to conform to facts at hand;
  6. Display the logic flow and uncertainties of the accident mechanism in a way that all investigators can agree to its "truth" before the investigation is terminated.

Such a method, originally derived from NTSB's aircraft accident FDR/CVR analytical techniques dating back to 1960, exists and has been used very successfully in non-aviation accident investigations. [9] [10]
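Point 1 above - making known data and data gaps visible as soon as data is organized - can also be illustrated mechanically. The following sketch is hypothetical (the function name, events, and times are invented for illustration): given the time intervals covered by the events established so far, it reports the intervals that no event yet covers, i.e., the uncertainties every participating investigator should be able to see.

```python
def find_gaps(events, t_start, t_end):
    """Return the (begin, end) intervals within [t_start, t_end] that
    are not covered by any (begin, end) event tuple in `events`."""
    gaps, cursor = [], t_start
    for begin, end in sorted(events):
        if begin > cursor:
            gaps.append((cursor, begin))   # nothing known for this span
        cursor = max(cursor, end)
    if cursor < t_end:
        gaps.append((cursor, t_end))       # trailing uncovered span
    return gaps

# Events established so far (elapsed seconds); the reported gap is a
# visible uncertainty to be resolved or acknowledged in the report:
known = [(0.0, 4.0), (9.0, 12.0)]
print(find_gaps(known, 0.0, 12.0))   # -> [(4.0, 9.0)]
```

Run each time data is added, such a check would keep data gaps continuously visible to all participants, rather than leaving them to surface as disagreements after the report is published.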

The second type of report would be INTERPRETIVE reports based on the descriptive reports. My observations of disagreements suggest that disputes often arise over the interpretations placed on the accident data or mechanism after it has been reported. True, often the mechanism is uncertain or not fully developed. However, if the technical truth tests do not permit validation of the mechanism, the descriptive report would acknowledge the uncertainties. Then the interpretive report could acknowledge these gaps and could freely speculate about these uncertainties and gaps to the extent desired by its authors.

It will be alleged that reports which present "facts," "analysis" and "conclusions" today provide this descriptive and interpretive distinction. However, when these distinctions were tested in the classroom for technical truth with symbolic events modeling techniques now available, every report analyzed failed. We did not analyze any major catastrophic aviation accident reports. However, without exception, each report we did analyze contained one or more gaps or investigator's conclusions in the factual section, which affected the safety action taken. It isn't important whose reports we analyzed: every one had gaps in the description of the accident when rigorously tested, and fewer than 10% even mentioned any uncertainties.

This research sends me two loud, clear messages. As accident investigators, we all should strive for a generally acceptable INVESTIGATIVE methodological framework with technical truth tests that every participating investigator will use. Concurrently we should strive for the complete separation of descriptive and interpretive accident investigation reports. As Ben Franklin once said, we'd better hang together on these matters, or we'll all hang separately.

References

[1] "The Editor's Corner," ISASI forum, Vol. 14:2, Summer 1981.

[2] "Disagreement about Tenerife Crash," Foresight, August 1981

[3] Petitions from the Air Line Pilots Association to the National Transportation Safety Board, June 9, 1980 and June 15, 1981.

[4] The President's Task Force on Aircraft Crew Complement, Report of the Mishap Analysis Panel Working Group Concerning the Boeing Study "Jet Transport Safety Record," May 5, 1981.

[5] (A) Report of the President's Commission on THE ACCIDENT AT THREE MILE ISLAND, The Need For Change: The Legacy of TMI, October, 1979.

(B) "Three Mile Island, A Report to the Commissioners and to the Public" by the Special Inquiry Group of the Nuclear Regulatory Commission, Mitchell Rogovin, Di-rector, May 1980.

(C) U.S. Nuclear Regulatory Commission, "Investigation into the March 28, 1979 Three Mile Island Accident by the Office of Inspection and Enforcement" (NUREG-0600 Draft), October 10, 1979.

(D) Electric Power Research Institute, "Analysis of Three Mile Island-Unit 2 Accident," NSAC- 1, July 1979.

(E) See list of relevant references in Miller, C.O., "Safety Management Factors Germane to the Nuclear Reactor Accident at Three Mile Island, March 28, 1979", Hazard Prevention, Vol.16, Special Issue, Summer, 1980.

[6] Ferry, T.S., "Modern Accident Investigation and Analysis: An Executive Guide," John Wiley and Sons, New York, 1981

[7] Benner, L., "Accident Theory and Accident Investigation," Proceedings of the ISASI Annual Seminar, Ottawa, Canada, 7-9 October, 1975.

[8] Benner, L., "Accident Perceptions: A Case for New Perceptions and Methodologies", SAE Transactions, Vol. 89, 1980.

[9] Ferry, op. cit., pp. 166-172.

[10] (A) NTSB, "Survival in Hazardous Materials Accidents," Report HZM-80-4, 1980.

(B) NTSB, "Phosphorus Trichloride Release in Boston and Maine Yard 8 During Switching Operations, Somerville, Massachusetts, April 3, 1980," Report HZM-81-1.