Any autonomous driving system relies on its software functioning correctly. As we all know, all software has bugs in it, even software that has seen "lots" of testing. The rush to add autonomous driving to cars ("zero fatalities, zero emissions" is the Nissan/Renault pledge) may produce an unexpected result: injury or fatalities due to software malfunction. I've raised this issue in previous postings, and we now have an example in the form of a Tesla Model S crash in the Netherlands while the car had its autopilot mode activated.
The articles say that employees of Gebroeders Houtepen, a potted-plant business in Prinsenbeek (the Dutch report calls it the "Prinsenbeekse potplantenbedrijf Gebroeders Houtepen"), were driving a company truck through Utrecht and felt a thump. In the mirror they saw a car, a Tesla Model S, hooked underneath the truck's rear bumper.
The driver claimed to have been driving on autopilot. No further information is given in the report, and it's not even certain that autopilot mode was actually engaged. For the sake of discussion, let's treat this as a hypothetical crash involving autopilot mode, because even Tesla Model S drivers recognize that the current autopilot software is not ready to be relied on to completely take over driving.
I've seen discussion among Model S owners that the current software isn't good enough to be relied on. And in the autopilot videos posted on YouTube, while most of them show the software doing amazing things, including automated avoidance of what would have been a nasty head-on collision at night on a rainy road, it's clearly imperfect.
Tesla Motors positions this as “autopilot” mode, not “autonomous driving” mode, specifically because the company recognizes this software is not ready to completely take over driving.
The bottom line is that all software has bugs, and software teams always do an incomplete testing job. Before taking on this work as a writer/blogger/journalist, I worked in software quality engineering for over 10 years. That field has a saying: "Testing is never finished, only exhausted." Every software team has a testing budget and exceeds that budget only for particularly bad errors. The history of software engineering is full of instances of highly regarded software failing spectacularly in the field.
Case in point: Java, the software I used to work on (10+ years on the Java SE Software Quality Engineering team). A few years ago some seriously nasty bugs were found in the fundamentals of the Java Plugin and Java Web Start, and the whole software industry had to recommend, in the most urgent tones possible, completely abandoning Java-in-the-browser. This is Java, a software platform created by some of the best engineers in the world, with a widely regarded security model. Java is still extremely useful in other areas, such as server-side application software, but it should not be used for browser applications.
The question is – pure speculation, mind you – what unknown misbehaviors exist in the Tesla autonomous driving software? And will anybody die demonstrating those misbehaviors?
Reportedly, Tesla Motors asks drivers to stay involved in the driving process even when autopilot mode is engaged, because they may need to take over driving at any time. Which makes me wonder just how fully autonomous driving will work in the future.
Supposedly those cars will be set up to take over all driving functions, but presumably the car's occupants will still have to take over driving in some circumstances. But in 30 years, will there be anybody left who knows how to drive? What if there's a 5-year-old alone in such a car? What if the person has gone to sleep, thinking the car will take care of them? What stories will the future hold of people dying because of software bugs?