Why Smart Pilots Crash

Stress and the demands of multitasking are predictable distractions. Knowing solutions in advance helps simplify decision-making.


I’ve noticed there is a bias, sometimes spoken aloud, that a pilot who made some sort of mistake and had an accident either wasn’t terribly bright, lacked basic skills or just plain didn’t have the magical “right stuff.” As an instrument instructor, I’ve certainly seen pilots with poor skills, or who weren’t terribly bright, or who had lousy judgment, and some of them crashed an airplane. I’ve also seen extraordinarily good pilots, possessed of all the right stuff imaginable, who made mistakes and crashed.

A few years ago a team at NASA did intense research into why smart pilots crash. The team, led by NASA’s Chief Scientist for Aerospace Human Factors, Key Dismukes, himself an experienced pilot, looked at a series of airline accidents—although they could just as well have been general aviation airplanes on IFR flight plans. The team published its results in a book, The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. (A related presentation is online at: human-factors.arc.nasa.gov/flightcognition/article2.htm.) It’s not only eye-opening—it takes a refreshingly common-sense approach to understanding why smart, competent pilots crash and what can be done about it.

Accident Analysis

Reading the results, it’s clear that Dismukes and his team did their best to keep open minds, despite the fact that the NTSB had already published a probable cause for each accident. I was struck by the emphasis they placed on avoiding “hindsight bias” during the examination of each accident. The team was careful to look at each stage of each accident flight using only the information available to the pilot or flight crew at that point in the flight, not with the benefit of what was learned later in the flight or after the accident.

Looking across the spectrum of accidents in which a pilot’s mistake played a part, the pilots who erred were not from the less-qualified or less-intelligent side of the bell curve. They were, as a whole, no better and no worse than anyone else. An error on the part of a pilot, by itself, is not de facto evidence of a lack of competence, skill or judgment.

What is interesting is that even though pilots make mistakes, they make them differently than laypeople do. Pilots have undergone enough training and education to be considered experts. Experts, when faced with a task for which they have been trained, will perform it with almost 100-percent reliability in routine conditions—far more accurately than a layperson. But pilots, being human, make mistakes.

What Dismukes’ team found was that, as experts, pilots make mistakes in ways that can be predicted. Because of that, pilots can be educated as to the conditions in which they are more likely to make mistakes, how to recognize the onset of such conditions and how to be better able to either avoid making mistakes or to catch and correct them before they combine with other factors—or other mistakes—to create an accident chain.

The scientists also found that once a situation becomes challenging and a mistake is made, it’s not unusual for the situation to “snowball”—to go downhill faster and get bigger—and for the risk of further mistakes to increase.

Plan Continuation Bias

Dismukes’ team identified a factor that figured into a significant number of accidents and must be taken into account even though it is generally a good thing: the desire to complete the flight to the planned destination. Sometimes that desire is precisely the wrong thing to follow—when things are going south and the pilot’s cognitive ability is being degraded by massive sensory overload.

The problem is that something called the “mission orientation/plan-continuation bias” means we tend to press on in spite of changing conditions. The bias gets stronger as we near completion of the activity, and it actively prevents us from noticing subtle clues that the conditions in place when we made our original plan have changed. A good example is the reluctance to carry out a missed approach when the weather has gone below minimums, and the willingness to descend well below minimums in the hope of finding a runway.

Determination to carry out a landing, combined with the way errors snowball and induce sensory overload, adversely affects our ability to think strategically and recognize that things have gone down the tubes. Things go bad so fast—creating such a high workload on top of the plan-continuation bias and sensory overload—that Dismukes concluded even a well-trained pilot often cannot get beyond reactive thinking to the more difficult, higher level of cognitive activity—proactive, strategic thinking—and realize the only solution is to go around.

Inducing Mistakes by Pilots

So what factors did Dismukes’ team find likely to cause expert pilots to make the mistakes that can then combine to create an accident? None of them will surprise you. The mistake inducers can almost be boiled down to one: stress.

A little stress is a good thing. Stress wakes us up, gets the blood flowing, causes us to be alert and ready to deal with what’s coming at us. But at some point, no matter its source, stress does very bad things to a pilot’s ability to function. It goes from good, to not so good, to just plain awful, and it “hampers skilled performance by narrowing attention and reducing working memory capacity.” To paraphrase Dismukes, we effectively get dumber through no fault of our own.

Just when we pilots need all of our faculties most—in extremely stressful, high-workload situations—those faculties progressively abandon us, and our level of functioning slides down the cognitive scale, from strategic to tactical to reactive. That’s how we evolved. Our ancestors got into it with a woolly mammoth and survived because stress caused their brains to function in a reactive mode, concentrating completely on surviving that one encounter and not worrying about whether they might have made a better plan, or whether they should be hunting this particular mammoth at all.

Most of the time the pilot’s heightened senses and reactions get him or her through the immediate crisis. The problem lies in the higher-level decisions—for example, deciding, while still outside the outer marker and on the edge of one boomer, to abandon the approach to an airport sitting on the edge of another thunderstorm. When the pilot is under that much stress, the thinking process often gets reduced to just getting the airplane on the ground, because there is so much input that no cognitive capacity is left to step up to a higher level of reasoning and decide that this plan stinks.

Under severe stress, we lose the ability to put the immediate situation aside, to look at it objectively and make overall command decisions.

We also progressively lose the ability to process the information we receive. We may get a wind shear alert, with the controller reading off wind direction and velocity at locations around the runway. Even though every single report exceeds the published limitations of the airplane, it’s very common for even the best pilot to be unable to process that information and realize it will be physically impossible to keep the airplane on the runway, even if the pilot manages to touch down on it.

Combine that with plan-continuation bias, the stress of severe weather and some fatigue tossed in, and very good pilots have been known to keep going in the midst of thunderstorms toward a runway with a direct crosswind far beyond the capability of the airplane—when turning 90 degrees would take them out of the weather within two minutes and give them time to settle down and decide whether to try the approach again or go elsewhere.
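The arithmetic the overloaded brain fails to do is simple on the ground (the numbers here are illustrative, not drawn from the study): the crosswind component is the wind speed multiplied by the sine of the angle between the wind and the runway. A 30-knot wind 60 degrees off the runway heading works out to 30 × sin 60°, or about 26 knots of direct crosswind—well beyond the demonstrated crosswind component of most light airplanes. Under enough stress, even that one-line calculation can be out of reach.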

More Factors

As you should’ve guessed by now, weather is a factor. As weather gets worse, the propensity for pilots to err increases.

Another factor is time pressure, as is pilot workload. What Dismukes referred to as “concurrent task management” and workload issues appeared in the vast majority of accidents reviewed in the study. A pilot can absorb only so much information in a given period of time. As stress increases, as manual control of the airplane gets difficult in questionable weather, and as the workload on the pilot grows, the ability to “integrate a high volume of diverse information to evaluate the situation” drops off.

Pilots are more likely to “fail to note subtle cues and are less able to integrate and interpret information from multiple sources. They may also revert to a reactive mode; rather than strategically managing the situation, they may respond to each demand as it arises, without prioritizing, because they lack sufficient free mental resources to take a strategic approach.”

Workload means “multitasking.” Pilots multitask all the time and, if you watch any television commercials, those who multitask are considered cool and “with it.” Yet we aren’t told the ugly truths about handling a lot of different things at once: There is an inherent difficulty in reliably switching attention back and forth between tasks, a finding borne out in repeated studies. When we multitask, not only are we more likely to make a mistake on any one of the tasks, it also takes us longer to accomplish all of them than if we had done each one to completion.

Workload demands on pilots are such that we often do not have the luxury of performing one task to completion before moving on to the next. So when we face the reality of multitasking, we must be aware our risk of screwing up on something is higher. And because multitasking, by definition, means that one task is interrupted, it is more likely that we will forget to complete the interrupted task.

Surprise

The scientists found: “The combination of stress and surprise with requirements to respond rapidly and to manage several tasks concurrently, as occurred in several of these accidents, can be lethal.”

Being surprised means the risk of mistakes goes up. This is why a false alarm in a warning system or an incorrect cockpit display can trigger the mistake that starts the snowball that leads to an accident.

Interestingly enough, a pilot who is formally trained to do a procedure in a particular fashion and then learns that pilots in the field do it another way—or is told by a subsequent instructor not to do it the way he or she was trained—is more likely to make a mistake performing that procedure under stress.

Trying to do a procedure for the first time under stress (i.e., never having practiced it) is a good predictor of a mistake. But repetitious operations—things we do the same way every flight, such as checklists and briefings—have their own dangers: Pilots do them so often there is a risk, particularly if the task is interrupted, of “looking without seeing” and forgetting items. Pilots forget the landing gear, even when using the checklist. Further, repetitious operations can tend to make pilots reactive rather than proactive.

Is There Any Hope?

So what is the poor slob in the left seat supposed to do? We get the impression that high workload, crummy weather, distractions, system failures, stress, fatigue—all the fun stuff a pilot deals with every day—are out there combining to reduce the level of intellectual functioning of pilots to the point they become blithering idiots who are lucky to ever place an airplane on a runway without creating modern art.

Not so. The reality is we humans handle all of this very well virtually all the time. What we need are tools to allow us to do it even better. Dismukes’ team came up with a number of ideas that can be applied by any pilot flying any sort of aircraft.

Having hard and fast bottom lines established well before a flight simplifies a pilot’s decision-making and makes it easier to think strategically throughout the flight—and easier, especially when confronted by deteriorating weather and its associated demands, to abandon a plan rather than trying to force it to work. For example, deciding before takeoff that any approach not stabilized by 500 feet means an immediate miss takes that decision out of the moment of maximum stress.

When a pilot is consciously aware that his or her ability to think critically diminishes as workload and stress go up, and that plan-continuation bias is very real, that pilot is more likely to be proactive—to step back and evaluate whether the original plan is still viable or whether it’s time for one of the fallback plans.

Putting It Together

I like that someone has been willing to say pilots are experts and are human, and therefore they make mistakes—no matter how well-trained, motivated, competent and capable they are. Once that reality is recognized, objective analysis is possible and workable ideas for improving our level of safety can be found.
