There’s an old adage in aviation that reads, “There are two types of pilots: those who have landed gear up, and those who will.” I’m not a big believer in these silly phrases – their purpose is to bring solace to the group of pilots most likely to make the mistake. That doesn’t mean any of us are immune from error, but I do think there’s a fraternity of pilots who are consistently far less likely to make the mistake than others. This post examines a few reasons why we make obvious errors; most notably, gear-up landings.
First, look at this video of a group of basketball players and count exactly how many passes the team in white makes. It directly relates to the other two videos below.
This is one of a number of videos that are commonly used in human factors and Crew Resource Management courses to illustrate how fixation can distract us from more obvious (and more important) issues.
Now, watch these two videos. The first is an unintentional gear-up landing in a Cessna 172RG; the second is another unintentional gear-up landing in a Trinidad.
Note the similarities? In both cases, the aircraft was high on an unstabilised approach. In both cases there were two pilots occupying the front seats and, more disturbingly, neither ‘crew’ acknowledged the ‘configuration’ warning horn.
In most light aircraft with a retractable undercarriage, the aircraft’s logic will identify that the machine isn’t correctly configured for the current phase of flight (in this case, flaps down with low power and no gear extended) and it will emit a warning horn to alert the pilot. In both of the above cases it’s possible that the crew didn’t even register the warning horn because of a fixation (or tunnel vision) on correcting what they believed to be the more immediate problem: the high approach. It’s also possible that they heard the horn but falsely determined that it was alerting them to another issue (such as simply low power). Either way, they developed a blinkered view of the world around them and harnessed virtually every square inch of brain capacity to deal with a problem they weren’t familiar with. The result was a complete loss of situational awareness.
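The warning logic described above can be sketched in a few lines. To be clear, the thresholds below are invented for illustration – real trigger values (throttle position, flap setting) vary by type and are set by the airframe manufacturer – but the shape of the logic is the same: when the aircraft looks configured to land and the gear isn’t down, the horn sounds.

```python
def gear_warning_horn(throttle_pct: float, flaps_deg: float, gear_down: bool) -> bool:
    """Return True when the configuration warning horn should sound.

    Thresholds are illustrative only, not taken from any real aircraft.
    """
    low_power = throttle_pct < 20    # throttle near idle, as on approach
    landing_flap = flaps_deg >= 20   # flaps extended to a landing setting
    # The horn sounds when the aircraft appears configured to land
    # (low power or landing flap) but the gear is still up.
    return (low_power or landing_flap) and not gear_down


# A high, rushed approach: power pulled to idle, flaps out, gear forgotten.
print(gear_warning_horn(throttle_pct=10, flaps_deg=30, gear_down=False))  # True
# Same approach with the gear extended: no warning.
print(gear_warning_horn(throttle_pct=10, flaps_deg=30, gear_down=True))   # False
```

Note that the horn is a shared output for several conditions – which is exactly why a task-saturated pilot can hear it and misattribute it to “just low power”.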
Confirmation bias (a type of cognitive bias) is an inherent and stubborn flaw in the human condition. We like to validate our decisions and actions without considering anything beyond the diagnosis we form early on – we’re inherently biased and emotionally motivated to build a hypothesis that supports our original belief. We sometimes become completely invested in a decision and consumed with whatever action is required to confirm our original opinion (aggravated by the fact that we don’t have spare brain capacity to question what we’re doing). The two cases above tend to support the psycho-babble that suggests confidence systematically exceeds accuracy; that is, people are more sure they are correct than they deserve to be, or they become so task-saturated that they don’t have spare cognitive processing power to question their actions. This overconfidence leads to an irrational escalation of commitment, or commitment bias, which in turn produces the anchoring (or tunnel vision) described above. The longer we abandon dynamic lateral thinking and tolerate the cognitive dissonance, the more we commit to a bad decision and try to “make the wrong decision the right decision”.
The Illusion of Control
There’s some interesting discussion relating to low-time pilots (or any pilot who lacks currency) and the illusion of control. The principle is based around the belief that some pilots form a type of confidence that they have some measure of control when in fact they have very little. Their illusory superiority (and optimism bias) is supported by hangar talk, validation from others, paper qualifications, previous competency and other external influences that serve to inflate an individual’s ego. Basically, just because we’re qualified at something doesn’t necessarily mean we’re any good at it (despite what our mother keeps telling us). This means we need checklists and SOPs, and we need visualisation, briefings and procedures to mitigate the potential for a screw-up! Also, just because we have currency doesn’t necessarily mean we have competency… there are countless sequences none of us get to physically practice in the course of normal flying. It’s only when we screw the pooch that we’re alerted to our expired skill set… just ask any Airbus pilot.
The interpretation or construct behind “ability” is hard to measure when the individuals concerned don’t have constant oversight or evaluation (this especially applies to private pilots). An appreciation (or illusion) of individual ability does not necessarily translate into the aircraft (we can only assume that the pilots above had confidence in their ability to fly the aircraft accurately). Pilots need to create a real disconnect that separates what they would like to be true from what is actually true (egocentrism is an overtly self-serving bias that places importance on a pilot’s ability rather than his or her weaknesses). There are interesting cases of heavy jet pilots suffering from the same condition… usually identified as a threat in the preflight briefing.
In a multi-crew aircraft, we’re taught to disengage from what our ego is telling us and connect more with SOPs, procedures and modal awareness. Although it’s possible for both pilots to form a confirmation bias around the same error of judgement, we have more procedures and training in place to mitigate those types of obvious errors… although it does happen. Configuration-related occurrences happen almost daily worldwide – and the true number is estimated to be higher than reported, owing to weak reporting cultures in some operations. A few Aussie examples of situationally unaware pilots are described below.
ATSB Investigation Number: AO-2009-066
On 26 October 2009, a Boeing 767 operated by Qantas Airways conducted a go-around due to an “incorrectly configured” aircraft (read: the gear wasn’t down). The ATSB report states that “Approaching 500 ft on approach into Sydney, the crew commenced a missed approach due to the aircraft being incorrectly configured for landing. During the commencement of the missed approach the ‘too low gear’ GPWS warning activated”. The ATSB concluded, not surprisingly, that “the incorrect aircraft configuration was the result of several interruptions and distractions during the approach. These interruptions and distractions resulted in a breakdown in the pilots’ situational awareness.”
ATSB Investigation Number: AO-2011-089
On the 28th of July, 2011, a Jetstar A320 was passing 245 feet on arrival into Melbourne when the crew received a ‘TOO LOW FLAP’ aural and visual warning from the aircraft’s enhanced ground proximity warning system (EGPWS). On this occasion, an unstabilised approach led to a (tunnel-vision type) situation where the flaps weren’t configured for landing and the landing checklist wasn’t completed. This case highlights countless other issues of competency arising from a 300-hour co-pilot who was essentially “under training”, paired with an inexperienced captain (lending itself to the argument for a minimum of 1,500 hours of ‘real’ experience for high-capacity RPT operations… but that’s another story!). The co-pilot was so overwhelmed and disconnected from what was going on that he failed to retract the flaps after the go-around. The ATSB report reads like a slapstick comedy.
ATSB Investigation Number: AO-2010-035
Only a few days ago, the ATSB published its final findings into a Jetstar A320 bound for Changi International Airport on the 27th of May, 2010. The investigation identified several events on the flight deck during the approach that distracted the crew to the point where their situational awareness was lost, decision making was affected and inter-crew communication degraded. The aircraft descended below 500 feet with the gear still in transit. Again, a GPWS alert was issued. One of the more disturbing aspects of this incident is that the primary ‘distraction’ on the flight deck consisted of the captain reading text messages on his mobile phone. The statement on the Jetstar website serves as an indictment of their culture. It reads, in part, “The ATSB report made no findings against Jetstar, nor did it find any fault with Jetstar’s policies or procedures. The safety of the aircraft was never compromised.” What rubbish…
In all the cases described above, the approaches were reported as unstabilised, leading to a loss of situational awareness. In each case the aircraft – the last line of automated defence – alerted the crew to their oversight, whether or not they had already recognised the error themselves. We generally don’t have this level of redundancy in light aircraft. Although the gear-up landing was the ultimate result in both of the above videos, the opportunity to correct the approach should have occurred far earlier. Landing checklists, flows, flying school SOPs and stabilised criteria through certain “gates” (that specify certain performance criteria on, say, downwind, base, final and “over the fence”) are taught at most schools to mitigate the risk of such an occurrence. The last line of defence against such errors relies on the pilot simply having a good understanding of the aircraft they’re flying. That is, an appreciation of what is “normal” for most conditions of flight.
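The “gate” idea above is mechanical enough to sketch. The criteria below (airspeed, sink rate, configuration, checklist) and their limits are hypothetical examples – every school and operator publishes its own numbers – but the discipline is the same: at each gate you either meet every criterion, or you go around.

```python
def stabilised_at_gate(airspeed_kt: float, sink_rate_fpm: float,
                       gear_down: bool, checklist_complete: bool):
    """Check a hypothetical 500 ft final-approach gate.

    Returns (stable, reasons); an empty reasons list means the gate is met.
    All limits are illustrative, not from any real operator's SOPs.
    """
    reasons = []
    if airspeed_kt > 75:
        reasons.append("airspeed high")
    if sink_rate_fpm > 700:
        reasons.append("sink rate excessive")
    if not gear_down:
        reasons.append("gear not down")
    if not checklist_complete:
        reasons.append("landing checklist incomplete")
    return (not reasons, reasons)


# A stable approach passes cleanly...
print(stabilised_at_gate(70, 600, True, True))    # (True, [])
# ...while the rushed, high approaches described above fail on several counts,
# and any single failure is a go-around, not a judgement call.
print(stabilised_at_gate(85, 900, False, False))
```

The value of expressing it this way is that the decision is binary and pre-briefed: the pilot never has to invent a judgement under task saturation, which is precisely when judgement is worst.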
In my former instructional days, nothing was more important than seeing the pilot under training have an understanding of what the aircraft was doing and why it was doing it (otherwise, we’re nothing more than passengers at the controls). If they ever found themselves in a position where they didn’t meet certain conditions at various gates throughout any phase of flight, I wanted them to ask one question – “why?“. Acknowledge the problem and identify the cause… then implement a solution. An appropriate solution or response won’t be reached until we know exactly what it is we’re meant to fix. If all else fails and the aircraft doesn’t respond as it should… go around.
Another question every pilot should ask themselves is “what should I expect?“. It’s unacceptable for a pilot to become aware of a strong tailwind on base when wind information was provided in an ATIS report or from forecast conditions. An arsenal of pre-circuit knowledge will alert the crew to planned power settings and the expected configuration. Optimism bias stems from the cognitive mechanisms that guide our judgements and decisions – which means, simply, that the more we know, the less likely we are to make an error. This applies to reactive decisions we make in the air, but it also means that a well-read, briefed or educated pilot is simply better equipped to make superior decisions before they even leave the ground. In the situations above, a simple pre-circuit briefing would have armed the pilots with sufficient knowledge to fly the approach proactively; they likely wouldn’t have become high in the first place.
Any instruction from a flying school that encourages a continued unstabilised approach in any condition other than a non-normal situation is essentially validating unsafe flying. It’s negative learning from a student’s perspective and negligence on the part of the teacher.
Professional Pilots Screw Up Too – Fixation
There are only a few accidents more infamous than Eastern Air Lines Flight 401 and its controlled flight into terrain. The entire crew became so fixated on troubleshooting a green gear-indication light that failed to illuminate that they weren’t aware the aircraft had disengaged from the autopilot’s pitch mode into what’s known as Control Wheel Steering, or CWS – a relatively new technology at the time. CWS essentially holds whatever attitude the pilot selects (think of it as a ‘trimming mode’ after manually moving the controls out of autopilot). The aircraft descended without the knowledge of the crew while they attended to the light… and the fully serviceable aircraft crashed into the Florida Everglades, causing 101 fatalities.
Professional Pilots Ignore Audible Warnings Too
On February 19, 1989, a Boeing 747-200 operating as Flying Tiger 66 was flying an international cargo service into Kuala Lumpur. Conducting an unfamiliar NDB approach at night, the crew flew what can only be described as an unstabilised approach – due in part to a complete lack of situational awareness. They crashed into terrain 12 miles from the airport. Numerous warnings were provided by the GPWS (Ground Proximity Warning System) and all were ignored by the flight crew (not unlike the Cessna’s warning horn). All four crew were killed.
It’s argued that the crew ignored the GPWS because the (then-new) technology had a reputation for erroneous callouts. In any case – and despite what was clearly a rushed approach – no crewmember even acknowledged the repeated calls. Not unlike the two gear-up cases, it’s plausible that the crew didn’t acknowledge the calls by virtue of their fixation on the approach they were incorrectly flying. Although it’s important to watch the entire video to fully understand the circumstances surrounding the confusion, the GPWS calls are made at about the 8:50 mark.
An obvious statement: the more you look out for an anomaly or error, and the better you’re trained to identify it, the more obvious it becomes. Situational awareness, SOPs, procedures, training, and a knowledge and understanding of your aircraft, its performance and your operation are paramount. You don’t have to be a paid professional pilot to be a professional aviator.
- NTSB report for Cessna Gear Up Landing.
- NTSB report: Eastern Airlines L-1011, N310EA – PDF
- Flight Safety Foundation ALAR Briefing Notes