Tesla and the risk of technological complacency

The recall of Tesla's Autopilot software illustrates the risk of human complacency that grows as more and more of the tasks we are used to performing become automated.

Years ago, people learning to drive had to master the hill start, using their left foot to ride the clutch to its biting point to hold the car on a hill and stop it rolling backwards. It was part of the UK driving test. But these days, modern cars have mechanisms and electronics that automatically prevent them from rolling backwards when stopped on a hill. Anti-lock brakes and traction control have added to car safety too, by preventing the brakes from locking up and the car from skidding uncontrollably.

The driverless tech seen in Teslas is similar to the lane assist and adaptive cruise control functions available in many modern cars. All are designed primarily as safety features, to avoid collisions during motorway driving. Some cars offer a version for reversing, which applies the brakes if the system detects a rear shunt is about to happen.

All of these features represent progress and aim to give drivers a safer driving experience. But they also risk making us lazy. One can imagine a carmaker executive pitching autopilot-style systems as an enabler of multitasking while driving.

The US National Highway Traffic Safety Administration (NHTSA) found that distracted driving caused 3,522 deaths and 362,415 injuries in 2021. It has now deemed Tesla's Autopilot unsafe. According to a Reuters report, NHTSA found that Tesla's Autopilot "can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse."

In effect, drivers using Autopilot are less likely to focus on actually driving their cars. Did the developers coding the system consider this shortcoming? Possibly, but the luxury that cocoons the driver of a high-spec Tesla is something the carmaker will want to keep refining. Tesla will want being in one of its cars to be a fabulous experience. But that is not the same as the driving experience.

Tesla will have to push a software update to all its US cars to try to make drivers using Autopilot more engaged in actually driving. Other carmakers are likely to look very closely at what Tesla does, to avoid NHTSA scrutiny of their own.

But Tesla’s woes have wider implications for the use of AI and automation to enhance day-to-day life and work. Is AI rewriting what we value as humans? As we approach the end of 2023, we should take some time to reflect on this.
