Air France Flight 447 crashed into the Atlantic in 2009. Colgan Air Flight 3407 crashed the same year near Buffalo, New York. In 2013, Asiana Flight 214 crashed on approach to San Francisco International Airport, and a few weeks later a UPS cargo flight crashed on approach into Birmingham, Alabama. In each case, pilots were coping with problems in their automated cockpit systems at critical points in the flight and had to fall back on manual skills to fly their aircraft. And in each case, the outcome was tragedy.
Automation has improved by orders of magnitude since its first serious use in WWII and flying safety has improved with it. Success, however, may have diminished one problem (pilot workload) while triggering another. Accident trends indicate that pilots may now be placing too much faith in automation at the expense of their fundamental flying skills. The effect isn’t limited to aviation, either; in an era of driverless cars, automated factories, and drones, we’re all involved with issues of how human agency best fits in with increasingly intelligent technologies.
The upside . . .
The main purpose of cockpit automation is to compensate for the flaws of an imperfect pilot. After all, pilots often get tired, distracted, or overloaded. Automation doesn’t. Like any engineered system, however, cockpit automation is a product of the human mind and the technology has had its share of growing pains. Over time, new functions meant new systems added to older ones. More capability meant more things to monitor, set, or switch – a growing complexity that often increased operator workload and presented new opportunities for pilot error.
Decades of evolving engineering practice and human factors analysis have dramatically refined our design approaches. Improved operating logic and information displays have made modern cockpit automation easier to understand and easier to use. “Easier,” however, doesn’t mean simple; at least one current airliner cockpit has 38 automated systems, including an autothrottle with 5 operating modes.
. . . and the downside
The growing effectiveness of cockpit automation has led to an expansion in its use. Today, pilots rely on automation even when they shouldn’t, a behavior that even has a name: “automation addiction.” After all, modern systems can fly a plane over every phase of its route under automated control, so why not use them? If something works, you eventually learn to trust it, especially if it does some jobs better than you do. This confidence is shared throughout the aviation establishment, too; in some cases, airline pilots are required to keep their autopilot on.
There’s a hazard in this attitude, however, one that may have revealed itself in some of the air accidents of recent years. If something works consistently, you don’t pay as much attention to it, which means you’re less likely to detect a problem – or to respond to it quickly – when one finally occurs.
Airline pilot performance standards are rigorous, and flight personnel get regular training in manual flying skills. Because operational flights rely so heavily on automation, however, the daily demands on those skills are often nil, and the skills atrophy. A 2011 FAA study documented this phenomenon and its likely consequences for air safety. Another FAA study found that in two thirds of the 48 “loss of control” accidents it examined, pilots had trouble manually flying the plane or made mistakes with automated flight controls.
Interestingly, the problem may grow worse as younger people, raised on and comfortable with omnipresent computer technologies, move into airline cockpits. These new pilots may be even more eager to yield flying tasks to automation.
It’s not just for pilots anymore
The hazards of automation use are relevant to everyone, not just a select and highly trained population like airline pilots. Automation is already woven into our lives: automatic bill pay, Google searches, DVR programming, elevators. The effect is the same – the better the automation, the weaker the original human skill it replaced. When does this reliance become “addiction”? Already some drivers trust their car’s automation more than their own senses, allowing their GPS systems to lead them down strange routes. In Australia, for example, some drivers had to be rescued from the outback by police after following GPS directions based on Apple Maps.
So, how do we provide for the human skills required to ensure the continued viability of our automated systems? How do we best assure the human role in a technological society? We’re still navigating our way toward those answers.