Tesla Autopilot under US Dept of Transportation scrutiny

Several times now, Tesla vehicles operating on Autopilot have struck stationary first responder vehicles attending to incidents on the road. The concern is whether Tesla’s Autopilot system worsens known human factors problems, undermining the driver’s ability to supervise the vehicle. Remember that Autopilot is not a full-fledged driving automation system (it is not Full Self Driving); it is meant to assist the driver by taking over some driving tasks. Figures show that Autopilot reduces the risk of certain kinds of traffic accidents. But there is another issue: the “passive vigilance” effect, where an automated system lulls drivers into thinking the car will take care of the driving, leaving them free to do other things.


I last wrote about this issue in 2020, when the NTSB issued its report on the March 23, 2018 crash on US 101 in Mountain View. In that incident, a Tesla Model X on Autopilot veered into the gore point where the Highway 85 exit splits off from US 101. The driver had not been paying attention to the road, and the car had issued frequent driver inattention alerts. The “passive vigilance” effect describes a driver who, believing the car is handling the driving, stops watching the road and ends up in a fatal collision. In the case of the Uber test driver whose vehicle struck a pedestrian, the driver was found to have been streaming a television show on their phone, and was captured on camera with a look of horrified shock at realizing it was too late to avoid the collision.

In this case, the National Highway Traffic Safety Administration (NHTSA) has upgraded a Preliminary Evaluation (PE21-020) to an Engineering Analysis (EA 22-002). The initial evaluation, focused on Autopilot and first responder scenes, asked whether Autopilot adequately handles stationary first responder vehicles.

There’s a class of people who work on the road, and who sometimes must park a vehicle in a traffic lane in response to a condition on the road. They might be construction workers, emergency crews dealing with another traffic accident, or workers handling any of a number of other incidents. Workers on the road are frequently struck, and sometimes killed, while performing these on-road services. Such an incident happened to me nearly 40 years ago. At the time, I worked as a tow truck driver. On the night of October 9, 1983, I was picking up a car that had broken down in a lane of traffic. While I was performing that service, another car plowed into the scene, and I happened to be standing at the point of impact, as I wrote about on Quora. I’ve since seen multiple news reports of tow truck drivers being struck and killed in circumstances similar to my incident.


Therefore, the NHTSA is to be applauded for studying risks to those who work on the road.

If Tesla’s Autopilot system is to live up to its hype, it should correctly detect a condition such as a stationary vehicle in the roadway. It should especially detect vehicles with flashing red/blue or yellow lights.
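To make that concrete, here is a toy sketch of the kind of decision a driver assistance system must make: a stationary object in the vehicle’s lane, especially one showing emergency lighting, should trigger a warning and then braking. This is purely illustrative; the data structure, thresholds, and function names are my own assumptions and have nothing to do with Tesla’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical perception output; real ADAS object tracks are far richer.
@dataclass
class TrackedObject:
    in_ego_lane: bool          # object lies in the vehicle's path
    speed_mps: float           # object's speed; near zero means stationary
    has_flashing_lights: bool  # red/blue or yellow emergency lighting detected
    distance_m: float          # range from the ego vehicle

def response_for(obj: TrackedObject, ego_speed_mps: float) -> str:
    """Return a simplified response level for an object ahead."""
    if not obj.in_ego_lane:
        return "monitor"
    # Time until reaching the object if neither vehicle changes speed.
    closing_speed = max(ego_speed_mps - obj.speed_mps, 0.1)
    time_to_reach_s = obj.distance_m / closing_speed
    stationary = obj.speed_mps < 0.5
    if stationary and obj.has_flashing_lights:
        # Emergency scene ahead: warn early, brake if the gap keeps closing.
        return "brake" if time_to_reach_s < 4.0 else "warn"
    if stationary:
        return "brake" if time_to_reach_s < 2.5 else "warn"
    return "monitor"

# Example: a stopped fire truck with lights on, 150 m ahead, ego at 31 m/s (~70 mph)
print(response_for(TrackedObject(True, 0.0, True, 150.0), 31.0))  # -> "warn"
```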

NHTSA preliminary investigation report

The NHTSA document says the initial goal was this:

The investigation opening was motivated by an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes.

They would also use this to study other circumstances where Tesla vehicles were involved in collisions while Autopilot was engaged.


The initial evaluation identified eleven incidents, occurring between January 2018 and July 2021, in which Tesla vehicles struck vehicles at first responder scenes; as the excerpt below notes, the analysis now covers sixteen such crashes.

Here are some key statements from the preliminary analysis:

The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.

All subject crashes occurred on controlled-access highways. Where incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact. Additional forensic data available for eleven of the collisions indicated that no drivers took evasive action between 2-5 seconds prior to impact, and the vehicle reported all had their hands on the steering wheel leading up to the impact. However, most drivers appeared to comply with the subject vehicle driver engagement system as evidenced by the hands-on wheel detection and nine of eleven vehicles exhibiting no driver engagement visual or chime alerts until the last minute preceding the collision (four of these exhibited no visual or chime alerts at all during the final Autopilot use cycle).

First, on average, Autopilot aborted vehicle control less than one second prior to the collision. That’s curious, and I’ll get to that later.

Second, where incident video was available, the first responder scene was visible for an average of about 8 seconds before the crash. A driver properly engaged in the task of driving would have had ample time to begin braking and avoid the collision. Instead, these drivers took no evasive action, despite having their hands on the wheel and appearing, to the car, to be engaged with driving.
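To put that 8 seconds in perspective, here is a rough back-of-the-envelope calculation. The 70 mph speed, 1.5 second reaction time, and 0.7 g braking are my assumptions, not figures from the NHTSA report; the point is only that 8 seconds of visibility at highway speed is roughly twice the distance an attentive driver needs to stop.

```python
# Rough numbers; the speed, reaction time, and deceleration are assumptions, not NHTSA data.
speed_mps = 70 * 0.44704                  # 70 mph is about 31.3 m/s
visibility_s = 8                          # average visibility per the NHTSA report
visible_distance_m = speed_mps * visibility_s            # ~250 m to the scene

reaction_time_s = 1.5                     # typical alert-driver reaction time
decel_mps2 = 0.7 * 9.81                   # firm braking, ~6.9 m/s^2
stopping_distance_m = speed_mps * reaction_time_s + speed_mps**2 / (2 * decel_mps2)

print(f"Distance available: {visible_distance_m:.0f} m")        # ~250 m
print(f"Distance needed to stop: {stopping_distance_m:.0f} m")   # ~118 m
```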


The report also mentions an analysis of other incidents not involving first responder scenes. In many of those cases, driver inattention was a factor, despite the driver having their hands on the wheel and passing the attention checks.

As the NHTSA report goes on to note, a system like Tesla Autopilot is not a panacea. The driver is still responsible for the safe operation of their vehicle, and it is important that drivers remain engaged with the task of driving.

Passive vigilance and driver distraction

I mentioned “passive vigilance” earlier. The NTSB wrote about this effect in its report on the Mountain View crash:

Research shows that drivers often become disengaged from the driving task for both momentary and prolonged periods during automated phases of driving.

NTSB: Collision Between a Sport Utility Vehicle Operating With Partial Driving Automation and a Crash Attenuator, Mountain View, California, March 23, 2018

Yes, I can imagine the experience of using Autopilot lulling someone into disengaging from actively driving their car. Back in 2019, Elon Musk claimed that Tesla Autopilot reduces driver fatigue. Sure, a driver who isn’t actively engaged with the task of driving, distracted by a false sense that Autopilot will take care of everything, could feel less fatigued.

Autopilot aborts control just before a collision

There’s a niggling detail in the NHTSA report to discuss. In many cases, Autopilot “aborted vehicle control” less than one second before the collision. Why was that?

Christiaan Hetzner, writing in Fortune, discussed the possibility that Tesla cars purposely abort vehicle control just before an impending collision so that Tesla can claim Autopilot was not active at the time of the crash. If Autopilot is not engaged, Tesla has deniability, because full control had been returned to the driver.

There isn’t proof to verify that theory. I suggest putting this idea in the back of your mind as a possibility.

Conclusion

To reiterate, drivers are responsible for safely operating their car. As fabulous as Tesla’s Autopilot is — I’ve seen hundreds of reports of Autopilot avoiding collisions — we must not be lulled into a false sense of safety.

Without proper vigilance on the road, your last seconds of life could be shock and horror as you realize your car is about to hit something.

About David Herron

David Herron is a writer and software engineer living in Silicon Valley. He primarily writes about electric vehicles, clean energy systems, climate change, peak oil and related issues. When not writing he indulges in software projects and is sometimes employed as a software engineer. David has written for sites like PlugInCars and TorqueNews, and worked for companies like Sun Microsystems and Yahoo.
