Tesla’s full self-driving beta is anything but assuring – here’s why

The software is impressive and advanced, but can still get drivers into sticky situations.
16 July 2021

(Photo by NICOLAS ASFOURI / AFP)

  • The company released the beta version 9 of its Full Self-Driving system to users in its early access program.
  • It came with a warning to drivers that the software “may do the wrong thing at the worst time.”
  • Despite its name, the system does not allow the car to drive itself, although there is no doubt it is an advanced driver-assistance system.

In 2018, Tesla CEO Elon Musk said that beta version 9 of the company’s Full Self-Driving (FSD) system was coming. He doubled down on that promise in 2019, and most recently said it would arrive no later than this past June. Three years and three postponements later, the company finally released beta version 9 at the beginning of this month. However, it came with a warning.

In screenshots of the update’s release notes shared on social media, Tesla hailed improvements in driving visualization, which will display “additional surrounding information” on the in-car display. There are also improvements to the cabin camera above the rearview mirror, which Tesla says will determine “driver inattentiveness” and provide audible alerts to remind drivers to keep their eyes on the road when Autopilot features are engaged.

Musk attributed the delay to generalized self-driving being a harder problem than he had anticipated. He also said the company had been working on fixing many known issues, and that something could still go wrong. “Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid,” Musk said on Twitter. “Safety is always top priority at Tesla.”

FSD, which doesn’t make a Tesla car fully autonomous, currently costs a one-off US$10,000. It has all the features of Tesla’s Autopilot — which brakes, accelerates, and steers automatically — plus it allows cars to change lanes, park themselves, and recognize stop signs and traffic lights.

Beta version 9 is currently accessible only to those in the company’s early access program, which Electrek reports comprises about 2,000 Tesla owners, the majority of whom are Tesla employees. It is also important to note that, despite its name, the system does not allow the car to drive itself, although there is little doubt it is still an advanced driving system.

In fact, it is technically a Level 2 driver-assistance system under the Society of Automotive Engineers’ classification, which means the automated system handles steering, braking, and accelerating, but a human must stay alert and be ready to take over at any moment.

The company’s Autopilot systems have come under heightened scrutiny following several fatal crashes. Since Tesla introduced Autopilot in 2015, at least 11 people have died in nine US crashes involving the driver-assistance system; internationally, there have been at least another nine deaths in seven additional crashes. In April 2021, two men died in Texas after a Tesla crashed into a tree and burst into flames. Authorities believe no one was in the driver’s seat, even though Tesla requires a driver to remain behind the wheel and attentive at all times.

Such incidents have led regulators at the National Highway Traffic Safety Administration to issue a rule requiring companies to report accidents involving driver-assistance or autonomous systems within one day of learning about them. Tesla activated the cabin camera above the rearview mirror in May; it is designed to detect “driver inattentiveness” in Model 3 and Model Y vehicles when they are on Autopilot, screenshots of the release notes showed. After the latest FSD beta v9 update, the camera can also provide “audible alerts” to help drivers keep their eyes on the road, the release notes said.