Technology

Is Tesla's Autopilot Getting Worse?

For transparency’s sake, and as a strategy against “unfair” media scrutiny, Tesla has been publishing quarterly reports to show how safe its Autopilot is. But how safe is it, really?

These new numbers show that although Tesla's autopilot may be safer than human drivers, it may not be safe enough just yet. ¦ Image via Tesla

While the media generally report the good and the bad about Tesla, CEO Elon Musk thinks his company is unfairly scrutinized.

After a Tesla on Autopilot crashed a few months ago, Musk tweeted:

“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage.”

Autopilot Safety Reports: Tesla’s Move Against Bad Publicity

The strongest card Tesla and self-driving car enthusiasts can play is to set crash statistics for human-driven and Autopilot-engaged miles side by side.

This is also the approach Tesla is following in its Autopilot Safety Reports.

To showcase the superiority of its models over average cars, the electric carmaker has been releasing quarterly safety data reports starting with last year’s third quarter.

For Q3 2018, Tesla registered one accident or crash-like event for every 3.34 million miles driven with Autopilot engaged, compared to one per 1.92 million miles when humans took the wheel.

“By comparison, the NHTSA’s most recent data shows that in the United States, there is an automobile crash every 492,000 miles.”

For Q4 2018, the Autopilot accident rate worsened to one event for every 2.91 million miles, and human-driven crashes followed suit at one accident per 1.58 million miles driven. Tesla again cited the NHTSA’s figures, this time updated to one auto crash every 436,000 miles driven.

This week, Tesla released the Q1 2019 autopilot safety report.

“In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot, we registered one accident for every 1.76 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 436,000 miles.”
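The quarterly figures quoted above are easier to compare when inverted into accidents per million miles. Here is a minimal sketch using only the numbers Tesla reported; the dictionary layout and helper function name are my own, not from Tesla's reports:

```python
# Miles driven per accident, as stated in Tesla's quarterly safety reports.
MILES_PER_ACCIDENT = {
    "Q3 2018": {"autopilot": 3.34e6, "human": 1.92e6},
    "Q4 2018": {"autopilot": 2.91e6, "human": 1.58e6},
    "Q1 2019": {"autopilot": 2.87e6, "human": 1.76e6},
}
# NHTSA's U.S. average, as cited in the Q4 2018 and Q1 2019 reports.
NHTSA_MILES_PER_CRASH = 436_000

def accidents_per_million_miles(miles_per_accident):
    """Invert 'miles per accident' into an accident rate per million miles."""
    return 1e6 / miles_per_accident

for quarter, data in MILES_PER_ACCIDENT.items():
    ap = accidents_per_million_miles(data["autopilot"])
    hu = accidents_per_million_miles(data["human"])
    print(f"{quarter}: Autopilot {ap:.3f} vs. human {hu:.3f} accidents per million miles")
```

Inverting the figures makes the trend the article questions explicit: the Autopilot rate climbs from roughly 0.30 to 0.34 to 0.35 accidents per million miles across the three quarters, even though every quarter still beats both the human-driven rate and the NHTSA average (about 2.29 crashes per million miles).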

Read More: Tesla To Demonstrate Full Self-Driving Capabilities This Month

Between Safety Tests and Safety Numbers

AI-powered driver-assist technology has been making big progress, and one way to track that progress is to tally and report autopilot-related crashes.

If anything, Tesla’s own numbers don’t place its Autopilot system in a good light. How else should we interpret Autopilot crash rates that rise quarter after quarter?

The same also goes for cars with humans at the wheel, but humans are humans, whereas an autopilot is supposed to get better over time — or at least not get worse.

Per Tesla, you’re less likely to be injured inside a Model 3 sedan than inside any other vehicle ever tested by the U.S. National Highway Traffic Safety Administration (NHTSA).

The Model S sedan and Model X sport utility vehicle take the other two top spots for the lowest probability of injury.

Although the NHTSA doesn’t actually rank brands and models within the five-star category, Tesla cars earn many points for safety and performance.

But even a perfect 5-star safety rating doesn’t make a model crash-proof.

In the case of Tesla, which has announced the release of its new compact SUV Model Y next year, any autopilot mishap means a field day for driverless car opponents.

Safety tests say that you’re safer in a Tesla with Autopilot on than in any other car, driverless or not. But the safety numbers show that you’re not as safe as you might think.

That being said, publishing Autopilot crash reports is a laudable move, and other carmakers should perhaps follow Tesla’s lead toward making it a standard metric. It will be a long battle before autonomous systems are finally trusted to take the wheel from human drivers, but transparency reports are certainly a great way of pushing this movement forward.

Read More: Future Self-Driving Cars Won’t Let Drunk Drivers Take Control


Zayan Guedim

Trilingual poet, investigative journalist, and novelist. Zed loves tackling the big existential questions and all-things quantum.
