There have been one or two fatalities this year among Tesla Model S drivers who were perhaps overly trusting of Autopilot mode. In January 2016, well before the well-known crash in Florida, a fatality occurred when a Tesla Model S drove into the back of a street sweeper truck. Going by a video report on Chinese TV, the collision occurred on a grey, rainy day. The Model S was driving in the high-speed lane (far left); the driver of the car ahead of the Model S saw the street sweeper truck and switched lanes, but the Model S did not. The street sweeper truck was of course driving slowly, in order to sweep the street, and therefore the Model S slammed into its back at full speed. The driver, a 23-year-old man, was killed instantly.
The above image is a screen capture from dashcam video moments before the collision. Judging by the video, the car did not slow down before the collision, and therefore Autopilot mode seems not to have detected the street sweeper truck. (See the video on video.sina.com.cn.)
According to a Google Translate rendering of an article on money.163.com, the accident occurred “within the jurisdiction of Hong Kong and Macao high-speed section” and was responded to by “Handan, Hebei Ci County detachment battalion, brigade Ci County police.”
The police investigation found no brake marks, suggesting the car did nothing to slow down. The video does show some tire marks at the point of the accident, but those apparently resulted from the post-collision movement of the Model S.
The driver had previously served in the Chinese military and had worked as a driver of large trucks, which suggests he had good driving skills.
According to a TheStreet posting, Tesla Motors says it has tried to work with “our customer” (presumably the family of the dead man) to investigate the root cause. Unfortunately, the car in question was too heavily damaged to transmit vehicle logs to Tesla Motors. In particular, it has not been verified that this Model S was driving under Autopilot at the time of the collision. Going by what I see in the video, it looks to me as though it was in Autopilot mode. The driver should have seen that truck had he been paying attention, as did the driver just ahead of him. In the last few moments prior to the crash, the truck is clearly visible, and had I been driving I would have slammed on the brakes and steered to avoid a collision. The dashcam video shows none of that, suggesting he wasn’t paying attention and was instead trusting the car’s Autopilot mode to take care of things.
The incident gives us a couple of data points with which to gauge whether the Tesla Autopilot system should be trusted. First, the Model S clearly failed to recognize the street sweeper truck and failed to take action. Second, the driver should have maintained attention on the road, as Tesla Motors warns drivers to do. That the driver of the preceding car saw the truck and avoided it indicates that both the Model S and its driver should have done so as well.
A couple of days ago, Tesla Motors held a teleconference explaining an Autopilot software update that would roll out in a few days. According to a Reuters report, Elon Musk said during that call: “Perfect safety is really an impossible goal. It’s about improving the probability of safety. There won’t ever be zero fatalities, there won’t ever be zero injuries.”
Between that statement and the Tesla Motors blog post describing the Autopilot 8.0 update, I’m reminded of a science fiction short story I read a long time ago. Titled “The Cold Equations,” it concerned a young woman who stowed away on a spaceship running supplies to a remote colony on another planet. Shortly after takeoff, the pilot’s instruments warned him that there was extra weight on board, and calculations showed the ship would crash unless that extra weight were ejected. The weight was, of course, the young woman who had snuck on board so she could visit her brother on that planet.
The pilot had “no choice,” because the cold hard facts of the situation said that many people would die if the cargo ship did not arrive at its destination. Therefore, the woman had to leave through the airlock so that the ship could arrive safely.
I bring this up because Tesla Motors is likewise invoking the statistical probabilities of collisions, and of avoided collisions, to justify its choices of Autopilot algorithms, and indeed its choice to continue supplying the Autopilot feature at all.
What do we make of the fact that no other automaker is going this far with an Autopilot-style driver assist feature? Other automakers have added driver assist features, such as automatically matching speed with a leading vehicle or lane-keeping, but none have gone as far as Tesla has. Does it mean the other automakers are more risk-averse? More conservative about what they’ll supply to their customers?
Maybe Tesla Motors is taking a risk, jumping the gun in its eagerness to deliver this technology. I understand that Tesla Motors repeatedly warns Model S/X drivers to stay in control of their car, and has rigged the software so drivers are forced to pay attention. But it’s easy to find indications that drivers are letting Autopilot take over while they do other things. Therefore, I have to ask: by supplying Autopilot in its current form, is Tesla Motors deluding customers into thinking Autopilot is more than what it actually is?
Tesla Motors is careful to use the name Autopilot to convey the idea that this is not a self-driving car but an advanced form of driver assist. Still, aren’t some Model S/X drivers clearly putting too much trust in Autopilot mode?
As a software engineer with decades of experience (I don’t just write insightful blog posts), I’m leery of self-driving cars. All software has bugs. The adage “never trust version 1.0 of anything” applies as much to self-driving car software as it does to computer applications.
- Chinese Model S auto-pilot crash says we shouldn’t rely on autopilot mode, yet - September 14, 2016