Are Humans the Weakest Link in the Aviation System?

An article in the August 22nd edition of USA Today opens with the statement:[1]

"Automated flight controls in airline cockpits have become so reliable that safety experts say pilots could become inattentive to rare malfunctions that can lead to crashes."
The article cites a 1994 NTSB study of US commercial airline accidents between 1978 and 1990. The NTSB found that failures in monitoring and challenging were "pervasive", occurring in 31 of the 37 accidents reviewed.[2] "Challenging" involves the non-flying pilot challenging the decision of the flying pilot. In most instances, the failure to challenge was on the part of the First Officer, who did not challenge the Captain's decision(s). The NTSB has not repeated a similar study more recently, but inadequate monitoring continues to be a factor in more recent accidents.[1]

It is reasonable to assume that a similar "failure to challenge" would occur with automated flight control systems: an assumption that the automated system never fails, so whatever action it is taking must be right. Indeed, one pilot quoted by USA Today asks: "What happens when we see something work correctly 99 times? What do we do on that 100th time? Are we monitoring it with the same level? The answer is no." Having been conditioned by a system that works correctly, the human brain assumes that the system will continue to work. This is known in the literature as "automation bias" - the assumption that the automated system is correct. Automation bias leads to "complacency" - reduced monitoring and checking of the automated solution.[3]

The USA Today article concludes with comments from the Air Line Pilots Association's director of human factors, Helena Reidemar: "The brain is not wired to reliably monitor instruments that rarely fail. We're not robots. We can't just sit there and stare at the instruments for hours on end."

This phenomenon is well recognised. Parasuraman (1987) notes that the "results of laboratory and simulation studies suggest that vigilance effects can limit performance in complex monitoring tasks" and that "performance deficits may occur because of either vigilance decrement over time or sustained low levels of vigilance".[4]

Vigilance effects can arise from two opposing causes: boredom, and the high workload associated with high levels of vigilance.[5] These are exactly the issues arising when pilots monitor near-perfect systems. Vigilance decreases over time because of at least three separate factors: (1) the human brain learns that the system will not fail; (2) boredom sets in when nothing happens; and (3) the workload associated with sustaining high levels of vigilance causes fatigue, leading to decreased levels of vigilance. Parasuraman concludes that "in assessing the impact of increased automation the beneficial effects on mental workload have to be traded off against possible adverse effects on vigilance".

Removing the human from direct participation in the system under control also reduces performance. Molloy and Parasuraman (1996) show that "monitoring for a single failure of automation control was poorer than when participants monitored engine malfunctions under manual control."[6]

This raises the question: if the failure rate of systems is so low that the ability of humans to effectively monitor those systems is compromised, is it safer to remove humans from those systems? Rather than Unmanned Aerial Vehicles (UAVs) being riskier than piloted aircraft, are we approaching the point where UAVs will be the safer option? For the time being, a remotely-piloted UAV may bring the benefits of increased operator vigilance, at least while the remote pilot has control over the aircraft. But for long-range (semi-) autonomous UAVs monitored from afar, the same issues of pilot monitoring arise as exist for manned commercial aircraft.[7]

Perhaps robots are the best way to monitor the systems in modern aircraft and detect malfunctions, whether those aircraft are technically manned, remotely piloted, or unmanned. Malfunctions in the monitoring systems themselves could be overcome by multiple parallel monitoring systems, in the same way that a piloted aircraft may have multiple pilots. This system of redundancy was developed for the Space Shuttle some four decades ago,[8] with no fatal incidents ever attributed to the flight control systems.
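The redundancy idea described above can be illustrated with a minimal majority-voting sketch: several independent monitoring channels each report a status, and a voter masks a single faulty channel, much as agreeing crew members can outvote a mistaken one. This is an illustrative toy, not the Shuttle's actual redundancy management scheme; all names here are hypothetical.

```python
# Toy sketch of majority voting across redundant monitoring channels.
# Three independent channels report a status; a voter masks any single
# faulty channel. Names and structure are illustrative only, not drawn
# from the Space Shuttle design.
from collections import Counter

def vote(readings):
    """Return the majority value among redundant channel readings.

    With three channels, two agreeing channels outvote one faulty
    channel. If no strict majority exists, the voter flags the fault
    rather than guessing.
    """
    value, count = Counter(readings).most_common(1)[0]
    if count <= len(readings) // 2:
        raise RuntimeError("no majority: multiple channels disagree")
    return value

# One faulty channel is masked by the two healthy ones.
status = vote(["nominal", "nominal", "fault"])
print(status)  # nominal
```

The design choice worth noting is the failure mode: when the channels cannot form a majority, the voter refuses to pick an answer, surfacing the disagreement instead of silently propagating a possibly faulty value.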

[1] Bart Jansen, "Pilots' focus in the cockpit under scrutiny", USA Today, 22 August 2013.

[2] National Transportation Safety Board, A Review of Flightcrew-Involved, Major Accidents of U.S. Air Carriers, 1978 through 1990, Safety Study NTSB/SS-94/01, January 1994.

[3] For a comprehensive review of automation bias see M.L. Cummings, "Automation Bias in Intelligent Time Critical Decision Support Systems", AIAA 1st Intelligent Systems Technical Conference, American Institute of Aeronautics and Astronautics, September 2004.

[4] Raja Parasuraman, "Human-Computer Monitoring", Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 29, no. 6, December 1987, pp. 695-706.

[5] Joel S. Warm, Raja Parasuraman, and Gerald Matthews, "Vigilance Requires Hard Mental Work and Is Stressful", Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 50, no. 3, June 2008, pp. 433-441.

[6] Robert Molloy and Raja Parasuraman, "Monitoring an Automated System for a Single Failure: Vigilance and Task Complexity Effects", Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 38, no. 2, June 1996, pp. 311-322.

[7] M.L. Cummings, C. Mastracchio, K.M. Thornburg, and A. Mkrtchyan, "Boredom and Distraction in Multiple Unmanned Vehicle Supervisory Control", Interacting with Computers, first published online 6 February 2013, doi:10.1093/iwc/iws011.

[8] J.R. Sklaroff, "Redundancy Management Technique for Space Shuttle Computers", IBM Journal of Research and Development, January 1976, pp. 20-28.