The truth about self-driving: Are autonomous vehicles really safe?

Motorists and experts in the US and China worry that driver-assist functions are being oversold after several fatal incidents.
2 September 2021

Self-driving vehicle with low-cost cameras? Toyota and Tesla are on it (Photo by NICOLAS ASFOURI / AFP)

  • The US federal agency in charge of road safety is opening an official investigation into Tesla’s “self-driving” Autopilot system
  • China’s Ministry of Industry and Information Technology has also just issued new regulations governing vehicles with driver-assist functions
  • There is no common definition of what “autopilot” means, and experts say that is a problem

A controversy is gaining attention around the world: numerous automakers are presenting their driver-assistance technology as capable of full self-driving. The reality, however, is starkly different and more nuanced than the industry openly admits. Experts reckon this premature obsession with autonomous vehicles is undermining one of their main selling points: road safety.

Indeed, last month the US federal agency in charge of road safety opened an official investigation into Tesla’s “self-driving” Autopilot system. According to the National Highway Traffic Safety Administration (NHTSA), the investigation will cover roughly 765,000 Tesla cars built since 2014, spanning the entire current range: the Model Y, Model X, Model S and Model 3.

The NHTSA said it was acting in response to 11 Tesla crashes since 2018 involving emergency vehicles; in some cases, the Teslas “crashed directly into the vehicles of first responders”.

The US is not alone in this. Just days after the NHTSA’s decision, China’s Ministry of Industry and Information Technology issued new regulations governing vehicles with driver-assist functions. While heavily focused on cybersecurity, the rules also require manufacturers to clearly inform buyers of a vehicle’s capabilities and limitations, and of drivers’ responsibilities.

On top of that, the regulations state that technological measures should be adopted to make sure drivers do not take their hands off the steering wheel. A draft of the new rules was published in April. Beyond the regulations, however, experts and auto industry players in China are calling for standardized self-driving claims.

Recently, Nikkei Asia quoted Li Xiang, founder of Chinese EV startup Li Auto, calling on the auto industry and the media to adopt a set of standardized, straightforward terms, rather than industry jargon, to describe vehicles’ automation capabilities and so avoid risky (and, yes, even fatal) misunderstandings. Many reckon that imprecise, sometimes misleading claims made by car manufacturers and their salespeople lead drivers to believe it is safe to let the machine steer without a human paying attention, when even the most advanced passenger car currently on the market is nowhere near that level of autonomy.

When can we really categorize a vehicle as self-driving?

China has adopted a 0-to-5 scale similar to the one used in the US, under which vehicle automation is graded from Level 0 (no automation) to Level 5 (full automation). Autopilot-style systems sit at Level 2, meaning they work only in very limited conditions and drivers are still obliged to supervise and to retake control quickly.
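
For reference, the sketch below summarises those widely used SAE J3016 automation levels as a minimal Python mapping. The level descriptions follow the published standard; the helper function and its name, driver_must_supervise, are purely illustrative and not taken from any manufacturer’s software.

```python
# Illustrative summary of the SAE J3016 driving-automation levels.
# Descriptions follow the published standard; the helper function
# `driver_must_supervise` is a hypothetical name used here only to
# make the Level 2 distinction concrete.

SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed control, driver supervises",
    2: "Partial automation: steering AND speed control, driver supervises",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: no driver needed within a limited operating domain",
    5: "Full automation: no driver needed anywhere, in any conditions",
}

def driver_must_supervise(level: int) -> bool:
    """At Level 2 and below, the human remains responsible at all times."""
    return level <= 2

# Systems such as Tesla's Autopilot and NIO's NOP are Level 2 on this scale.
print(SAE_LEVELS[2])
print("Driver must supervise:", driver_must_supervise(2))
```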

Often referred to as “autopilot” or some variation of the term, this driver-assist mode is available on most cars launched by Chinese electric vehicle startups. In addition to automatically controlling the vehicle’s speed, the function allows the car to steer, accelerate, and brake within its own lane.

However, some brands and social media influencers often conflate this mode with the more advanced “autonomous driving”, which would free the motorist from driving altogether. To make matters worse, the recent death in China of a 31-year-old entrepreneur driving a NIO SUV caused an outcry because the accident occurred while the car’s Navigate on Pilot (NOP) driver-assist function was engaged.

Thousands of NIO owners believe the company has not given customers clear enough information about the NOP function and its limitations. The incident has sparked a fierce online debate about the safety of NIO’s driver-assist system, and could become a stumbling block for the meteoric rise of Chinese electric vehicle startups, which are emerging as major beneficiaries of Beijing’s push for carbon neutrality.

Perhaps someday Tesla and other Level 2 vehicles will graduate to higher grades of safe automation, but for now that day still seems some way off. In fact, back in March last year, a survey from the American Automobile Association found that some 90% of Americans are dubious about self-driving cars. Another, similar survey came to the same conclusion: Americans either don’t understand or don’t trust vehicle autonomy.

This uncertainty likely stems from a severe lack of awareness and murky education about these vehicles. Perhaps automakers should stop ‘autonowashing’: sharing misleading information about self-driving cars to oversell their capabilities and make them appear more technologically advanced than they presently are.

Even Tesla’s ‘Autopilot’ and ‘Full Self-Driving’ features, for example, are neither of those things. As US Senators Richard Blumenthal and Edward Markey put it: “Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road.” Yet, because these names are treated as branded terminology rather than literal descriptions, there seems to be little to stop Tesla from continuing to use them.