The perils of automation – Aviation, Tesla and the Australian Open
The Australian Open Tennis Championship has fallen into the same automation trap that the aviation industry has been suffering from for the last ten years.
The problem is with technical human-machine systems that are not resilient.
The problem boils down to this – the more you automate a process, the less operators retain their focus on, and intuitive skills for, that process.
Technology and Security are enemies
Automation must only ever be a tool to assist the human – never the only means of control. Ignore this rule at your peril.
Put together, the terms “high technology” and “security” form an oxymoron. That’s because technology and security are enemies of each other.
In his book “Beyond Fear”, security expert Bruce Schneier wrote (page 9) about technology:
Technology is an enabler. Security is the opposite: it tries to prevent something from happening. That’s why technology doesn’t work in security the way it does elsewhere, and why overreliance on technology often leads to bad security.
Whilst technology is the enabler, security is the brake, always trying to stop things from happening.
That’s why technology can’t synergise with security the way it does in other spheres (accuracy, cost, speed, time). And that’s why over-reliance on technology often leads to bad security and increased risks.
Humans must learn to survive without automation, and when the automation fails. When the technology doesn’t do what we want, we must be confident enough to turn it off and either continue the task manually or stop the task altogether.
Aviation – Auto Thrust
Pilots must never rely on automation.
For example, auto thrust systems in aircraft are simple systems, designed to support pilots but never to outvote or overpower them.
In Boeing and Airbus aircraft, the auto thrust systems can be overpowered or disabled simply by moving the thrust levers. When the auto thrust system fails, pilots simply need to move the thrust levers by hand, in the same way that a driver of a car presses on the accelerator pedal.
Practice makes habits. The more you automate a process, the less operators retain their focus on, and intuitive skills for, that process.
Yet many fatal B777 and B737 accidents over the past few years have occurred because the pilots trusted the computers too much, didn’t understand how they worked, and lacked the confidence to disconnect or override them by brute force. This overconfidence in automation, and underconfidence to disconnect it, started to appear in aviation in the early 2000s.
Curiously, many Boeing pilots who criticised Airbus auto thrust systems failed to appreciate the underpinning logic and Human Factors (ergonomics) of the Airbus design, which always requires the pilots to manually advance the thrust levers to commence a take off or a go-around. That first instinctive (hippocampal) physical forward movement triggers the auto thrust system to engage and assist the pilot. If the Airbus auto thrust fails, the engine thrust setting reverts to the selected thrust lever angle, which will (except in the case of engine failure) always produce sufficient thrust.
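The engage-and-fall-back logic is easy to picture in code. Here is a minimal sketch of that behaviour; the names, units and thresholds are my assumptions for illustration, not Airbus’s actual implementation:

```python
# Illustrative sketch only: names, units and thresholds are assumptions,
# not Airbus's actual design.

TAKEOFF_DETENT_DEG = 35.0  # assumed lever angle that commands takeoff/go-around thrust


def thrust_from_lever_angle(angle_deg: float) -> float:
    """Map the physical thrust-lever angle to a thrust fraction (0.0 to 1.0)."""
    return max(0.0, min(1.0, angle_deg / 45.0))


class AirbusStyleAutoThrust:
    def __init__(self) -> None:
        self.engaged = False
        self.failed = False

    def on_levers_advanced(self, angle_deg: float) -> None:
        # The pilot's instinctive forward push is what engages the system:
        # the automation assists only after the human has acted.
        if angle_deg >= TAKEOFF_DETENT_DEG:
            self.engaged = True

    def commanded_thrust(self, computed: float, lever_angle_deg: float) -> float:
        # On failure (or before engagement), thrust reverts to the setting
        # implied by the physical lever angle, so the pilot's hand position
        # always remains a sufficient fallback.
        if self.engaged and not self.failed:
            return computed
        return thrust_from_lever_angle(lever_angle_deg)
```

The design choice worth noticing is that the automation never acts first, and its failure mode degrades to whatever the pilot’s hands have already set.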
Furthermore, some Boeing pilots rely too much on automation. Some just press buttons to command the auto thrust to advance the thrust levers for a take off or go-around procedure, rather than pressing the buttons to select the desired automation mode and then physically pushing the thrust levers forward to ensure the thrust actually does increase. These pilots trust the auto thrust to physically advance the thrust levers for them – a trust that has, in one recent event, ended in misery. Putting one hundred percent trust in the auto thrust can have fatal consequences when the auto thrust is operating in a mode the pilot does not expect or understand, or when the auto thrust fails.
This problem is due to a lack of Knowledge, Training and Experience – three of the elements of resilience that I detailed in FLY!.
This problem must be fixed.
Note: In my book QF32 (page 257), I documented my problems with the autopilot that disconnected many times whilst I flew the final approach to land. When it became more of a nuisance than a help, I kept the autopilot disconnected and flew the rest of the approach manually.
https://www.airlineratings.com/news/maintenance-pilots-focus-indonesian-737-crash-investigators/
Tesla – Autopilot
Drivers must never rely on automation.
Tesla’s Autopilot and other automotive autopilots are not replacements for the sentient human mind. On SAE’s scale of “Levels of Driving Automation”, a scale from zero to five, Tesla’s Autopilot ranks at just Level 2. (Current Boeing and Airbus aircraft would also satisfy just Level 2.)

Any autopilot is only as good as its extremely limited sensors, computers and actuators.
Humans have four million input nerves to the brain (senses), 86 billion neurons and 100 trillion synapses (computer), and 650 skeletal muscles to move controls – vastly more than any current autopilot. Tesla’s automated cars are only resilient because the human must take over control when the simple autopilot fails.
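To make that division of responsibility concrete, here is a minimal sketch of the Level 2 principle; the class and parameter names are my own assumptions, not Tesla’s software:

```python
# Illustrative sketch of the SAE Level 2 principle: the automation may steer,
# but the human remains the fallback at all times. All names are assumptions.

class Level2DriverAssist:
    def __init__(self) -> None:
        self.active = False

    def update(self, sensors_healthy: bool, driver_hands_on: bool) -> str:
        # At Level 2 the system never owns the driving task: any fault,
        # or an inattentive driver, hands control straight back to the human.
        if not sensors_healthy:
            self.active = False
            return "DISENGAGED: sensor fault - take over manually now"
        if not driver_hands_on:
            self.active = False
            return "DISENGAGED: driver attention lost - manual control"
        self.active = True
        return "ASSISTING: driver remains responsible"
```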
See also: https://landline.media/study-the-more-you-automate-driving-the-less-drivers-focus/
Tennis Australia – LET Sensor
Tennis Australia, organiser of the Australian Open (AO), has also fallen into the automation trap.
The Australian Open tennis courts have new sensors to detect the ball bouncing outside the lines and to detect LETs during the serve.
Unfortunately, the LET sensor has no redundancy when it behaves spuriously, as it was suggested to have done last night. Nick Kyrgios, believing the sensors to be incorrect, begged the umpire to “turn the machine off”, then stated: “I’m not playing until you turn it off”.
Whether the sensors were correct or not is not the issue. The issue is that the sensors were suspected to be faulty and there were no procedures to validate the sensor, or to fix or mitigate a failure. (There were no video cameras to check the sensor’s accuracy, and AO appeared to have no procedures to refer to when the sensor was suspected faulty.)
A risk assessment for the new technology should have resulted in procedures being developed for the umpire to validate the sensor, or turn the sensor off and manually take over – something as simple as assigning a person to listen to and feel the net.
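The missing procedure could have been as simple as the sketch below; the names and decisions are hypothetical, not Tennis Australia’s actual system:

```python
# Hypothetical adjudication procedure - not Tennis Australia's actual system.

def adjudicate_let(sensor_triggered: bool,
                   net_judge_felt_let: bool,
                   sensor_suspected_faulty: bool) -> str:
    if sensor_suspected_faulty:
        # Turn the machine off and take over manually: the independent
        # human observer at the net becomes the sole authority.
        return "LET" if net_judge_felt_let else "PLAY ON"
    if sensor_triggered != net_judge_felt_let:
        # Disagreement between sensor and human is itself the trigger
        # to validate the sensor before trusting it again.
        return "VALIDATE SENSOR, then fall back to the net judge"
    return "LET" if sensor_triggered else "PLAY ON"
```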

Lessons
There are three key lessons in these three cases.
- Firstly, as the number and complexity of machines in our lives increases, designers must ensure that there are always procedures for humans to shut machines down, take over manually and survive. The only constraint on this disconnect switch is that it must not itself become a single point of critical failure (see the note below, and the sketch after this list).
- For pilots, Tesla drivers and indeed every person using complex machinery, it’s critical that humans have the Knowledge, Training and Experience to operate the machine in the manner it was intended to be operated. When the automation is not performing as expected, operators must also have the confidence, and the knowledge of how, to disconnect the automation and resume manual control.
- Practice makes habits. The more you automate a process, the less operators retain their focus on, and intuitive skills for, that process.
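Here is a minimal sketch of the first lesson, under assumed names: a manual takeover that works through multiple independent disconnect paths, so that no single failed switch can trap the human with the automation:

```python
# A sketch of the disconnect principle (assumed names): the human can always
# shut the automation down, and the disconnect path is redundant so it
# cannot itself become a single point of critical failure.

class Automation:
    def __init__(self) -> None:
        self.enabled = True


class DisconnectPath:
    """One independent means for the human to shut the automation down."""

    def __init__(self, healthy: bool = True) -> None:
        self.healthy = healthy

    def disconnect(self, automation: Automation) -> bool:
        if self.healthy:
            automation.enabled = False
        return not automation.enabled


def human_takeover(automation: Automation, paths: list[DisconnectPath]) -> bool:
    # Try each independent disconnect path until one succeeds: one failed
    # switch can never lock the human out of manual control.
    for path in paths:
        if path.disconnect(automation):
            return True
    return False


# Example: the first switch has failed, but the second still works.
autopilot = Automation()
assert human_takeover(autopilot, [DisconnectPath(healthy=False), DisconnectPath()])
```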

Coral and I wish you a safe, healthy and happy COVID recovery. We wish you the best to FLY!
Note: Today’s Airbus and Boeing aircraft contain multiple flight control computers that host the Fly By Wire flight control system logic. The two manufacturers have different methods to degrade the control laws from Normal Law (when everything is working) to Direct Law (when dependent systems fail). Boeing provides one switch to command Direct Law. Airbus requires at least two switches (in many optional combinations) to be turned off to effect a change to Direct Law. Which method is best? Let me know if you are interested in discussing it.
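The two philosophies can be contrasted in a few lines. This sketch uses assumed names and deliberately ignores real type-specific detail:

```python
# Contrasting the two degradation philosophies (assumed names only).

def boeing_style_direct_law(disconnect_switch_selected: bool) -> bool:
    # One dedicated switch commands Direct Law.
    return disconnect_switch_selected


def airbus_style_direct_law(computers_switched_off: set[str]) -> bool:
    # At least two flight control computers must be switched off
    # (in one of several valid combinations) to degrade to Direct Law.
    return len(computers_switched_off) >= 2
```

One switch is simpler under stress; two switches make an inadvertent degradation less likely. That trade-off is the question posed above.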
Updated 16 Feb 2021