Chinese Model S Autopilot crash says we shouldn’t rely on Autopilot mode, yet

There have been one or two fatalities this year of Tesla Model S drivers who were perhaps overly trusting Autopilot mode.  In January 2016, well before the well-known crash in Florida, a fatality occurred when a Tesla Model S drove into the back of a street sweeper truck.  Going by a video report on Chinese TV, the collision occurred on a grey, rainy day; the Model S was driving in the high-speed lane (far left); the driver of the car ahead of the Model S saw the street sweeper truck and switched lanes, but the Model S did not.  The street sweeper was of course moving slowly, in order to sweep the street, and the Model S slammed into its rear at full speed.  The driver, a 23-year-old man, was killed instantly.

The above image is a screen capture from dashcam video taken moments before the collision.  Judging by the video, the car did not slow down before the collision, so Autopilot seems not to have detected the street sweeper truck.  (See the video on video.sina.com.cn.)

According to a Google Translate rendering of an article on money.163.com, the accident occurred on the Beijing–Hong Kong–Macau Expressway, and traffic police from the Ci County brigade in Handan, Hebei, responded.

The police investigation found no brake marks, suggesting the car did nothing to slow down.  The video does show some tire marks at the accident site, but those evidently resulted from the post-collision movement of the Model S.

The driver had previously served in the Chinese military and had worked as a driver of large trucks, which suggests he had solid driving skills.

According to a posting on TheStreet, Tesla Motors says it has tried to work with “our customer” (presumably the family of the dead man) to investigate the root cause.  Unfortunately the car was too heavily damaged to transmit vehicle logs to Tesla Motors, so it has not been verified that this Model S was driving under Autopilot at the time of the collision.  Going by what I see in the video, it looks to me as though it was.  The driver should have seen the truck had he been paying attention, as the driver of the car just ahead did.  In the last few moments before the crash the truck is clearly visible; had I been driving I would have slammed on the brakes and steered to avoid a collision.  The dashcam video shows none of that, suggesting he wasn’t paying attention and was instead trusting the car’s Autopilot mode to take care of things.


The incident gives us a couple of data points with which to gauge whether the Tesla Autopilot system should be trusted.  First – the Model S clearly failed to recognize the street sweeper truck and failed to take action.  Second – the driver should have maintained attention on the road, as Tesla Motors warns drivers to do.  That the driver of the preceding car saw the truck and avoided it indicates that both the Model S and its driver should have done so as well.

A couple of days ago Tesla Motors held a conference call explaining an Autopilot software update that would roll out in a few days.  According to a Reuters report, Elon Musk said during that call, “Perfect safety is really an impossible goal. It’s about improving the probability of safety. There won’t ever be zero fatalities, there won’t ever be zero injuries.”


Between that statement and the Tesla Motors blog post describing the Autopilot 8.0 update, I’m reminded of a science fiction short story I read a long time ago.  Titled “The Cold Equations,” it concerned a young woman who stowed away on a spaceship running supplies to a remote colony on another planet.  Shortly after take-off the pilot’s instruments warned him that there was extra weight on board, and calculations showed the ship would crash unless that extra weight were ejected.  The weight was of course that young woman, who had snuck on board so she could visit her brother on that planet.

The pilot had “no choice,” because the cold hard facts of the situation said that many people would die if the cargo ship did not arrive at its destination.  Therefore the woman had to go out the airlock so that the ship could arrive safely.

I mention this because Tesla Motors frames its choices of Autopilot algorithms, and indeed the decision whether to offer the Autopilot feature at all, in terms of the statistical probabilities of collisions and of collisions avoided.
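To make that statistical framing concrete, here is a minimal sketch (in Python, with illustrative numbers standing in for whatever figures Tesla actually uses) of the per-mile fatality-rate arithmetic behind claims that Autopilot improves the probability of safety.

```python
# Illustrative per-mile fatality-rate comparison, the kind of arithmetic
# behind "improving the probability of safety."  The numbers below are
# placeholders chosen for illustration, not verified Tesla or NHTSA data.

def fatalities_per_million_miles(fatalities: int, miles_driven: float) -> float:
    """Return the fatality rate per million miles driven."""
    return fatalities / (miles_driven / 1_000_000)

# Hypothetical figures: one fatality over 130 million miles driven on
# Autopilot, versus roughly one fatality per 94 million miles overall.
autopilot_rate = fatalities_per_million_miles(1, 130_000_000)
overall_rate = fatalities_per_million_miles(1, 94_000_000)

print(f"Autopilot:   {autopilot_rate:.4f} fatalities per million miles")
print(f"All driving: {overall_rate:.4f} fatalities per million miles")
print(f"Ratio:       {autopilot_rate / overall_rate:.2f}x the overall rate")
```

The trouble, as the cold-equations analogy suggests, is that an aggregate rate like this says nothing about whether any particular crash, such as the one described above, could have been prevented.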

What do we make of the fact that no other automaker has gone this far with an Autopilot-style driver assist feature?  Other automakers have added driver assist features, such as automatically matching speed with a leading vehicle, or lane keeping, but none have gone as far as Tesla has.  Does this mean the other automakers are more risk averse?  More conservative about what they’ll supply to their customers?



Is Tesla Motors taking a risk, jumping the gun in its eagerness to deliver this technology?  I understand that Tesla Motors repeatedly warns Model S/X drivers to stay in control of their cars, and is rigging the software so drivers are forced to pay attention.  But it’s easy to find indications that drivers are letting Autopilot take over while they do other things.  Therefore I have to ask: by supplying Autopilot in its current form, is Tesla Motors deluding customers into thinking it is more than it actually is?

Tesla Motors is careful to use the name Autopilot to convey the idea that this is not a self-driving car, but an advanced form of driver assist.  But aren’t some Model S/X drivers clearly putting too much trust in Autopilot mode?

As a software engineer with decades of experience (I don’t just write insightful blog posts), I’m leery of self-driving cars.  All software has bugs.  The adage “never trust version 1.0 of anything” applies as much to self-driving car software as it does to computer applications.

About David Herron

David Herron is a writer and software engineer living in Silicon Valley. He primarily writes about electric vehicles, clean energy systems, climate change, peak oil and related issues. When not writing he indulges in software projects and is sometimes employed as a software engineer. David has written for sites like PlugInCars and TorqueNews, and worked for companies like Sun Microsystems and Yahoo.

2 Comments

  1. I agree with your thesis, but a few minor points.

    You open with — “There have been several fatalities this year of Tesla Model S drivers who were perhaps overly trusting the autopilot ”

    It should be noted that there has been ONE confirmed fatality, and now ONE additional suspected autopilot fatality, for a total of TWO (under the assumption the second one was with autopilot).

    “Several”, by definition, means more than two but less than “many”.

    “Several” probably isn’t the most accurate word to choose considering there are at most two fatalities.

    Secondly, to your point about other automakers, it should be noted that Mercedes went so far (beyond Tesla) as to advertise their autopilot features as “SELF-DRIVING”.  This is an autopilot system that independent testing has found to be much worse than Tesla’s own effort.  Amid a firestorm of criticism, Mercedes had to pull both their print and television advertisements.

    So Mercedes, at least, was neither risk averse nor conservative with its own promotions, which in fact exceeded any claims ever made by Tesla.

    http://fortune.com/2016/07/29/mercedes-withdraws-us-ad-touting-self-driving-car/
    http://time.com/4431956/mercedes-benz-ad-confusion-self-driving/

    BTW, interesting comparison of Mercedes vs Tesla autopilots. You can find both verdicts at the bottom.

    http://www.thedrive.com/tech/4591/the-war-for-autonomous-driving-2017-mercedes-benz-e-class-vs-2017-tesla-model-s

    • Thank you for the clarification. In my mind I may have conflated a couple of other fatalities, such as the driver in Belgium, but in that case he was almost certainly not using Autopilot. Lots of guys die driving fast cars at high speed on winding country roads, and the driver in Belgium was unfortunately such a case.
