Self-driving vehicles need to crawl before they walk, and walk before they run

A few days ago I wrote about a pair of crashes: one involving an autonomous Uber vehicle that struck and killed a pedestrian, and another involving a Tesla Model X that apparently swerved into a lane divider, killing the driver.  Yesterday an SF Bay Area TV station, ABC7 News, offered further reporting on this issue, suggesting the Tesla Autopilot system has been swerving cars toward lane dividers for months, and that Tesla's engineering team should have known about this problem with Autopilot.

For the report, ABC7 reporter Dan Noyes talked with experts and other Tesla drivers.  He learned of an accident in the Hayward area under similar circumstances – a car that drove itself into a barrier.  Fortunately, in that case the collision attenuator was intact and the driver walked away without injury.

ABC7’s report identified a common attribute in both incidents: the car was being driven into the sun, making lane markings difficult to see.  The Tesla Autopilot system detects lane position partly by processing information from video streams.  Clearly, periods of low visibility can produce poor-quality video, which feeds incomplete data to the Autopilot system and leads to problems.
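To make the failure mode concrete, here is a minimal sketch of why glare matters to a vision-based lane detector.  This is a hypothetical illustration, not Tesla's actual algorithm: it treats the contrast between the brightest pixels (lane paint) and the median road surface as a crude confidence measure, and hands control back to the driver when that contrast collapses – as it does when driving into the sun.

```python
import numpy as np

def lane_confidence(frame: np.ndarray) -> float:
    """Crude proxy for lane-marking visibility: the contrast between
    the brightest pixels (candidate lane paint) and the median road
    surface.  Hypothetical illustration only -- not Tesla's method."""
    paint = np.percentile(frame, 99)   # brightest 1% of pixels
    road = np.median(frame)            # typical asphalt brightness
    return float((paint - road) / 255.0)

def should_hand_back_control(frame: np.ndarray, threshold: float = 0.2) -> bool:
    """A defensive system would alert the driver rather than guess
    when lane markings are barely distinguishable from the road."""
    return lane_confidence(frame) < threshold

# Simulated 8-bit grayscale camera frames, 100x100 pixels.
clear_day = np.full((100, 100), 90, dtype=np.uint8)
clear_day[:, 48:52] = 230          # crisp white lane stripe

into_the_sun = np.full((100, 100), 215, dtype=np.uint8)
into_the_sun[:, 48:52] = 235       # stripe washed out by glare

print(should_hand_back_control(clear_day))     # False: stripe stands out
print(should_hand_back_control(into_the_sun))  # True: contrast too low
```

The point of the sketch is the design choice, not the math: when the input data degrades, a system can either guess at the lane position or escalate to the human.  The crashes described above suggest what happens when the system keeps guessing.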

The ABC7 article reiterates an important point – Tesla Motors considers its Autopilot feature to be Beta quality.  It requires that the driver stay engaged with the driving process at all times.  A driver who is not engaged – like that Uber driver who was obviously fiddling with a phone – would be unable to take over when Autopilot makes a bad choice.


Taking this a step further, it means all Tesla drivers are Beta testers.  In the software development world, “Beta” is a phase where you know there are bugs, but you give the software to a small sample of your customer base so those customers can help ready it for commercial release.  Tesla is abusing this phase of software development by releasing Beta-quality (or worse) software to its entire customer base.

An important question is whether Tesla made adequately clear to its customers what Beta means.

Let’s close this with a few videos about Tesla Autopilot.

That video shows several instances of driving into the sun in which Autopilot repeatedly steered the car off the road.  The test was on a quiet country road with soft grass shoulders, so no harm was done.  It clearly shows the Tesla Autopilot failing to properly detect lane boundaries.


This one was taken at the same location as last month’s fatal Model X crash.  The driver says the car was clearly steering itself to the left, but he intervened to avoid a crash.

This is the same YouTube user, cppdan, whose video was featured in my previous posting.  After an Autopilot software update, he drove through the same location, and the car simply continued on through without veering into the lane divider.

A collection of instances where Tesla Autopilot prevented crashes by anticipating dangerous traffic situations.

A complete idiot proving his idiocy by first believing “The Tesla Drives Itself” and then eating a hamburger while sitting in a friend’s Tesla Model X as it drives a winding mountain road.


About David Herron

David Herron is a writer and software engineer living in Silicon Valley. He primarily writes about electric vehicles, clean energy systems, climate change, peak oil and related issues. When not writing he indulges in software projects and is sometimes employed as a software engineer. David has written for sites like PlugInCars and TorqueNews, and worked for companies like Sun Microsystems and Yahoo.

