The pilots of Asiana Airlines Flight 214, a Boeing 777 that crashed at San Francisco International Airport in July, told investigators the auto-throttle system on their aircraft malfunctioned. They swear it was properly set before they began their approach, and that they assumed the system would adjust the engines' power as necessary to maintain safe flight.
The problem is the auto-throttles didn't work as expected; the airplane got too low and too slow, and the pilots never noticed until it was too late.
The Boeing stalled on short final to SFO's Runway 28 Left and struck a dike near the approach end. The impact tore the aircraft into a number of pieces, killing three passengers and injuring dozens of others. The aircraft was a total write-off.
Pilots on earlier Asiana flights had filed maintenance write-ups for the same problem, a series of "uncommanded auto-throttle disconnects," now cited as a potential culprit in the accident.
My question … so what?
Pilots are trained from their earliest student flight experiences to understand the idiosyncrasies of the aircraft they fly, whether that's a Cessna 172, a Beech King Air or a Boeing 777. Before a pilot is allowed to command, they're expected to understand how each and every system aboard the aircraft functions, as well as the tricks of the trade for identifying when something's amiss.
Does the mix of advanced automation make understanding what an airplane will do next more difficult than the old days? Absolutely.
Some of those system complexities, together with our willingness to accept advanced flight training as it's delivered rather than demanding the depth we need to stay out of trouble, are part of the problem pilots face today. Providers can't cover every unusual situation a crew might encounter in a complex automated airplane, so they train pilots to cope with the most likely scenarios. Every training provider expects the crews it certifies to at least be able to maintain control of their aircraft on a nice day unless something catastrophic occurs … and an auto-throttle failure is not a catastrophe.
The unexpected situations, though? Those answers are left to the crew's judgment, experience and knack for synthesizing what they know into an answer for the unknown. That's a pilot's real job description anyway.
When you look at the number of aircraft crews have wrecked around the world in the past five years alone, the Asiana accident among them, it seems pretty clear we're giving too many pilots too much credit for their ability to fly the airplane in a pinch … or even in good weather.
Loss of control — the crew’s inability to safely fly the aircraft when the unexpected occurs — has become the biggest flying threat the industry faces.
When an onboard system fails, or begins taking the aircraft somewhere dangerous, the captain, the first officer or even someone along for the ride in the jumpseat is expected to speak up … loudly if necessary. They're also expected to recognize the situation early enough to maintain control of the aircraft and overcome the problem as best they can.
Seriously, isn't that why we have humans in the cockpit? Even triple-redundant computers fail, or switch systems off in a sequence no training provider could ever think of demonstrating.
Perhaps too many pilots have forgotten how to fly, or maybe too many of them fail to recognize when the automation’s in charge and when it’s the human’s turn. Perhaps it’s also time to finally stop assuming that crews with lots of hours in their logbooks make the best pilots.
But if pilots are unable to recognize the dangers their aircraft's automation might be leading them into, or unable to convince that automation that the humans up front can do a better job when necessary, maybe we've already taken the first step toward creating the automation-monitor reality we fear so much.
Rob Mark, publisher