Tesla Model S autopilot crash in Netherlands raises the buggy software question

Any autonomous driving system relies on its software functioning correctly.  As we all know, all software has bugs in it, even software that has seen "lots" of testing.  The rush to add autonomous driving to cars ("zero fatalities, zero emissions" is the Nissan/Renault pledge) may produce an unexpected result: injury or fatalities due to software malfunction.  I've raised this issue in previous postings, and we now have an example in the form of a Tesla Model S crash in the Netherlands while the car had its autopilot mode activated.

The report comes from autoblog.nl and prinsenbeeknieuws.nl (both are written in Dutch, so read them using Chrome to get a translation).

The articles say that employees of Gebroeders Houtepen, a potted-plant company from Prinsenbeek, were driving a company truck through Utrecht when they felt a thump.  In the mirror they saw a car, a Tesla Model S, hooked underneath the truck's rear bumper.

The driver claimed to have been driving on autopilot.  No further information is given in the report, and it's not even a complete certainty that the car was in autopilot mode.  For the sake of discussion, let's treat this as a hypothetical crash involving autopilot mode, because Tesla Model S drivers themselves recognize that the current autopilot software is not ready to be relied upon to completely take over driving.

I've seen discussion by Model S owners that the current software isn't good enough to be relied on.  The autopilot videos posted on YouTube mostly show the software doing amazing things, including automated avoidance of what would have been a nasty head-on collision at night on a rainy road, but it's imperfect.

Tesla Motors positions this as “autopilot” mode, not “autonomous driving” mode, specifically because the company recognizes this software is not ready to completely take over driving.

The bottom line is that all software has bugs, and software teams always do an incomplete testing job.  Before taking on this work as a writer/blogger/journalist, I worked in software quality engineering for over 10 years.  That field has a saying: "Testing is never finished, only exhausted."  Every software team has a testing budget and only exceeds that budget for particularly bad errors.  The history of software engineering is full of instances of highly regarded software spectacularly failing in the field.
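To make the point concrete, here is a deliberately simplified, entirely hypothetical Python sketch (none of this resembles any real vehicle's code; every name and number is invented for illustration).  A brake-decision function passes a small, plausible test suite within budget, yet an untested edge case — every sensor timing out at once — crashes it outright:

```python
BRAKING_DECEL = 6.0  # m/s^2, an assumed maximum deceleration


def stopping_distance(speed_mps):
    # Classic kinematics: d = v^2 / (2a)
    return speed_mps ** 2 / (2 * BRAKING_DECEL)


def nearest_obstacle(readings):
    # Bug: if every sensor times out, readings is empty and min()
    # raises ValueError, crashing the control loop -- an edge case
    # the "within budget" tests below never exercise.
    return min(readings)


def should_brake(readings, speed_mps):
    # Brake when the nearest obstacle is inside the stopping distance.
    return nearest_obstacle(readings) < stopping_distance(speed_mps)


# A typical test suite covers only plausible sensor values...
assert should_brake([12.0, 40.0], 20.0) is True    # obstacle too close
assert should_brake([80.0, 95.0], 20.0) is False   # obstacle far away
# ...and never should_brake([], 20.0), which blows up in the field.
```

The fix is a one-line guard on the empty list, but the point is that nobody writes that guard until the testing budget happens to reach that case — or until it fails on the road.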

Case in point: Java, the software I used to work on (10+ years on the Java SE Software Quality Engineering team).  A few years ago some seriously nasty bugs were found in the fundamentals of the Java Plugin and Java Web Start, and the whole software industry had to recommend, in the most urgent tones possible, completely abandoning Java-in-the-browser.  This is Java, a software platform created by some of the best engineers in the world, with a widely regarded security model.  Java is still extremely useful in other areas, such as server-side application software, but it is not to be used in browser applications.

The question is — pure speculation, mind you — what unknown misbehaviors exist in the Tesla autonomous driving software?  And will anybody die demonstrating those misbehaviors?

Reportedly, Tesla Motors is asking drivers to stay involved with the driving process even when autopilot mode is engaged, since drivers may need to take over at any time.  That makes me wonder just how fully autonomous driving will work in the future.

Supposedly those cars will be set up to handle all driving functions, but presumably the occupants will still have to take over driving in some circumstances.  In 30 years, will there be anybody left who knows how to drive?  What if there's a 5-year-old alone in such a car?  What if the person has gone to sleep, trusting the car to take care of them?  What stories will exist in the future of people dying because of software bugs?

About David Herron

David Herron is a writer and software engineer living in Silicon Valley. He primarily writes about electric vehicles, clean energy systems, climate change, peak oil and related issues. When not writing he indulges in software projects and is sometimes employed as a software engineer. David has written for sites like PlugInCars and TorqueNews, and worked for companies like Sun Microsystems and Yahoo.
Autonomous Vehicles, Self-Driving Cars, Tesla Autopilot.


One Comment

  1. I also used to be a software tester, and I agree completely with the comments about that process. It is very scary to think that lives depend on a software testing process. The encouraging news is that, even with some mistakes, the automation apparently makes fewer mistakes than humans given the same number of tests.
