Misplaced trust in driver assist (Tesla Autopilot) systems can cause huge problems

Tesla may be rushing Autopilot to market too quickly, or else not doing a good enough job of educating Tesla owners on its safe use.  In light of several crashes that occurred while Autopilot was engaged, and reports from Tesla owners of situations where Autopilot clearly does the wrong thing, we have to take Tesla Autopilot with a grain of salt.  Field reports make clear that Autopilot can be easily confused, and testing by Thatcham Research demonstrates the same problem under controlled conditions.

In a video by Thatcham Research staffers, we are told repeatedly that Tesla Autopilot is an “assisted driving” system, one of a class of systems that “help keep you safe, help support the driver, but they don’t do the driving for you, you are still in control.”

The scenario shown in Thatcham Research’s video is a multi-lane highway: the Tesla is following another car, and that car suddenly changes lanes to reveal a stopped car ahead.  That’s a frequent problem, isn’t it?  A stalled car stuck in the middle of a highway gets hit from behind by another car.  In the video, the Tesla is unable to detect the stalled car because of the intervening car, and it collides with the stalled car at pretty much full speed.

Fortunately the stalled car in this case is just an inflatable mock-up, so the collision damages nothing and injures no one.  But this scenario could easily happen for real on the highway.

Will over-confident car owners do crazy things?

This video states my concern very well.  Are owners of these cars over-confident in the technology?  Are the manufacturers overselling the driver assist features?  Are owners believing a false story that these cars drive autonomously, when it’s really a driver assist system?  Will over-confident owners do crazy things?

For Tesla’s part, according to Jalopnik, the NHTSA report clearing Tesla of responsibility in the 2016 fatal crash noted that Tesla had weighed the likelihood that their customers would be idiots with the Autopilot feature.  As a result, Tesla designed in all kinds of warnings and systems to detect driver abuse of Autopilot.  Specifically, quoting from the report:

It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above – i.e., through mode confusion, distracted driving, and use of the system outside preferred environments and conditions. Included in the types of driver distraction that Tesla engineers considered are that a driver might fail to pay attention, fall asleep, or become incapactitated [sic] while using Autopilot. The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product. It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse.

The warnings haven’t stopped owners of Tesla cars from seeing how far they can go in letting Autopilot do the driving.

Just today, the US Dept of Transportation issued a cease-and-desist order to stop sales of a product that lets Tesla owners defeat the safeguards that detect driver abuse of Autopilot.  The product, billed as a “Tesla Autopilot Nag Reduction Device,” is a magnet that defeats the sensor which detects whether the driver’s hands are on the steering wheel.

“A product intended to circumvent motor vehicle safety and driver attentiveness is unacceptable,” NHTSA Deputy Administrator Heidi King said in a statement. “By preventing the safety system from warning the driver to return hands to the wheel, this product disables an important safeguard, and could put customers and other road users at risk.”

And it shows the lengths people will go to in order to prove their idiocy.  That may seem harsh, but if someone dies while proving they can override Autopilot safety systems, they’re an idiot.

Other demonstrated risks

In my previous reports on this topic, I’ve found videos from individual car owners demonstrating some problems:

For example, in March a Tesla Model X owner died when his car drove directly into the tip of the gore point where an off-ramp splits off from US 101 in Mountain View.  Several Tesla owners have demonstrated that Autopilot tends to drive the car into the gore point in similar circumstances.

In the Uber case, a supposedly fully autonomous car driving a test route failed to recognize a pedestrian crossing the street.  The system cycled through several misidentifications until it was too late, and the car slammed into the pedestrian, who died at the hospital shortly afterward.  The safety driver was inattentive, as dashcam video clearly shows; presumably an attentive driver would have seen the pedestrian even though the computer did not understand what it was seeing.  A person is dead because of those failures.

Recent crashes

In May, a Tesla Model S in Salt Lake City collided with a fire truck stopped in the road.  The driver had not touched the steering wheel for 80 seconds before the collision.  She told police she believed the car would detect other vehicles and stop safely, and that she was not watching the road but was using the map feature on her cell phone to compare routes.  It appears her car was following another car at 55 miles/hr.  When that car changed lanes, the Model S sped up, seemingly to return to its 60 miles/hr set point; it failed to detect the stopped fire truck, failed to slow down, and hit the fire truck at full speed.

In other words, this is exactly the situation demonstrated by Thatcham Research’s video.

In May, a Tesla Model S in Laguna Beach, CA collided with a police car that was simply parked on the side of the road.  Reports say the Tesla was in Autopilot mode at the time, but nothing was said about extenuating circumstances.  FWIW the Tesla was not badly damaged and its driver sustained minor injuries, while the police car was unoccupied but was totaled.

In January, a Tesla Model S in Culver City, CA collided with a fire truck whose crew was working a motorcycle accident on the I-405 freeway.  The Tesla was on Autopilot at the time of the collision, and again no extenuating circumstances were given.  Reading between the lines of the article, it appears this could be another instance where the Tesla was following another car; when that car swerved to avoid the fire truck, the Tesla failed to detect it and slammed into the fire truck at full speed (65 miles/hr).

A Wired report in January discussed the Culver City accident, as well as an issue Volvo documents in its driver manual.  Namely: “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed,” meaning it not only fails to avoid hitting a stopped car that suddenly appears in front, it might actually speed up.  That’s exactly what happened in the Salt Lake City accident.
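
To make the failure mode concrete, here is a minimal Python sketch of how a radar-based adaptive cruise system that filters out stationary returns can end up behaving this way.  This is my own simplified illustration, not any manufacturer’s actual code; the names, threshold, and numbers are invented for the example.

```python
# Hypothetical, highly simplified sketch of adaptive-cruise-control logic.
# It only illustrates why a system that discards stationary radar returns can
# accelerate toward a stopped vehicle once the moving lead car it was
# tracking changes lanes.

SET_SPEED_MPH = 60          # driver's stored cruise set point
STATIONARY_THRESHOLD = 2.0  # targets below this speed (mph) are treated as clutter

def choose_action(own_speed_mph, targets):
    """targets: list of (distance_ft, speed_mph) radar returns in our lane."""
    # Many radar-based systems discard near-stationary returns, because signs,
    # bridges, and parked cars alongside the road would otherwise cause
    # constant false braking.
    moving_targets = [t for t in targets if t[1] > STATIONARY_THRESHOLD]

    if not moving_targets:
        # No moving lead car detected: resume the stored set speed,
        # even if a stopped vehicle is actually sitting in the lane.
        return "accelerate" if own_speed_mph < SET_SPEED_MPH else "hold"

    # Otherwise keep following the nearest moving lead car.
    nearest = min(moving_targets, key=lambda t: t[0])
    return ("follow", nearest)

# Lead car at 55 mph is being tracked and followed...
print(choose_action(55, [(120, 55)]))
# ...lead car changes lanes, revealing a stopped fire truck (0 mph return):
print(choose_action(55, [(150, 0)]))   # -> "accelerate" toward the set point
```

Once the moving lead car leaves the lane, the controller has no tracked target left, so it simply resumes the stored set speed even though a stopped vehicle is sitting directly ahead.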

In 2016, a man driving a Tesla Model S in China struck a road-cleaning vehicle when the car he had been following swerved away.  The Model S did not slow down; it struck the road-cleaning vehicle at full speed, killing the driver.

Bottom line

These are not autonomous driving systems.  They are driver assist systems, in which car owners are placing too much trust.

I believe the automakers are telling car owners the right thing.  They’re not using the words “autonomous” or “self-driving”.  But it seems some car owners are going overboard.

About David Herron

David Herron is a writer and software engineer living in Silicon Valley. He primarily writes about electric vehicles, clean energy systems, climate change, peak oil and related issues. When not writing he indulges in software projects and is sometimes employed as a software engineer. David has written for sites like PlugInCars and TorqueNews, and worked for companies like Sun Microsystems and Yahoo.
