A few days ago I wrote about a pair of crashes: one involving an autonomous Uber vehicle that struck and killed a pedestrian, and another involving a Tesla Model X that apparently swerved into a lane divider, killing the driver. Yesterday an SF Bay Area TV station, ABC7 News, offered further reporting on this general issue, suggesting that Tesla's Autopilot system has been swerving cars into lane dividers for months, and that Tesla's engineering team should have known about the problem.
For the report, ABC7 reporter Dan Noyes talked with experts and other Tesla drivers. He learned of an accident in the Hayward area under similar circumstances: a car drove itself into a barrier, but in that case the collision attenuator was intact and the driver walked away uninjured.
ABC7's report noted that both incidents shared a common attribute: the car was being driven into the sun, making lane markings difficult to see. The Tesla Autopilot system relies on processed video streams to detect lane location, so periods of low visibility produce poor-quality video, feeding the Autopilot incomplete data and leading to problems.
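To make that failure mode concrete, here is a minimal sketch in Python, not anything resembling Tesla's actual code, of why glare is so damaging to camera-based lane detection: when the sun washes out a frame, the contrast between lane paint and asphalt collapses, and a sensible system would treat its lane estimate as low-confidence and hand control back to the driver rather than steer on bad data. The function names, thresholds, and the confidence formula here are all my own illustrative assumptions.

```python
import numpy as np

def lane_confidence(frame: np.ndarray, contrast_floor: float = 12.0) -> float:
    """Crude 0..1 confidence that lane markings are detectable in a grayscale frame.

    contrast_floor is an assumed minimum standard deviation of pixel intensity
    needed before lane paint stands out against the asphalt.
    """
    contrast = float(frame.std())              # washed-out frames have low variance
    saturated = float((frame > 240).mean())    # fraction of near-white (glare) pixels
    confidence = min(1.0, contrast / contrast_floor) * (1.0 - saturated)
    return max(0.0, confidence)

def steering_decision(frame: np.ndarray) -> str:
    """Decide whether to trust the vision-based lane estimate for this frame."""
    if lane_confidence(frame) < 0.5:
        # Degraded visibility: better to alert the driver and hold course
        # than to steer on an unreliable lane estimate.
        return "alert driver / hold course"
    return "follow detected lane"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.integers(40, 180, size=(480, 640), dtype=np.uint8)  # typical road scene
    glare = np.full((480, 640), 250, dtype=np.uint8)                 # sun directly in the lens
    print(steering_decision(normal))   # follow detected lane
    print(steering_decision(glare))    # alert driver / hold course
```

The point of the toy example is only this: a vision-only system has to recognize when its own input is too degraded to trust, and the ABC7 report suggests that in these crashes it did not.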
The ABC7 article reiterates an important point: Tesla Motors considers its Autopilot feature to be Beta quality, and requires that the driver stay engaged with the driving process at all times. A driver who is not engaged – like the Uber driver who was obviously fiddling with his phone – would be unable to take over when Autopilot makes a bad choice.
Taking this a step further, it means all Tesla drivers are Beta testers. In the software development world, "Beta" is a phase where you know there are bugs, but you release the software to a small sample of your customer base so those customers can help get it ready for commercial release. Tesla is therefore abusing this phase of software development by releasing Beta-quality (or worse) software to its entire customer base.
An important question is whether Tesla has made it adequately clear to its customers what Beta means.
Let's close this with a few videos about Tesla Autopilot.
The first video shows several instances of driving into the sun, where Autopilot repeatedly drove the car off the road. The test was on a quiet country road with soft grass shoulders, so no harm was done, but it clearly shows Tesla Autopilot failing to properly detect lane boundaries.
This one was taken at the same location as last month's fatal Model X crash. The driver says the car clearly steered itself to the left, but he intervened to avoid a crash.
This is the same YouTube user, cppdan, whose video was featured in my previous post. He drove through the same location after an Autopilot software update, and the car simply continued through without veering into the lane divider.
A collection of instances where Tesla Autopilot prevented crashes by anticipating dangerous traffic situations.
A complete idiot proving his idiocy by first believing "the Tesla drives itself" and then eating a hamburger while sitting in a friend's Tesla Model X as it drives on a demanding mountain road.