Any autonomous driving system relies on its software functioning correctly. As we all know, all software has bugs, even software that has seen "lots" of testing. The rush to add autonomous driving to cars ("zero fatalities, zero emissions" is the Nissan/Renault pledge) may produce an unexpected result: injuries or fatalities due to software malfunction. I've raised this issue in previous postings, and we now have an example in the form of a Tesla Model S crash in the Netherlands while the car had its autopilot mode activated.
The articles say that employees of Gebroeders Houtepen, a potted-plant company based in Prinsenbeek, were driving a company truck through Utrecht when they felt a thump. In the mirror they saw a car, a Tesla Model S, hooked underneath the truck's rear bumper.
The driver claimed to have been driving on autopilot, but the report gives no further information; it's not even certain that autopilot mode was engaged. For the sake of discussion, let's treat this as a hypothetical crash involving autopilot mode, because even Tesla Model S drivers recognize that the current autopilot software is not ready to be relied upon to completely take over driving.
I've seen discussion among Model S owners that the current software isn't good enough to be relied on. And in the autopilot videos posted on YouTube, while most show the software doing amazing things, including automated avoidance of what would have been a nasty head-on collision at night on a rainy road, it's imperfect.
Tesla Motors positions this as “autopilot” mode, not “autonomous driving” mode, specifically because the company recognizes this software is not ready to completely take over driving.
The bottom line is that all software has bugs, and software teams always do an incomplete testing job. Before taking on this work as a writer/blogger/journalist, I worked in software quality engineering for over 10 years. That field has a saying: "Testing is never finished, only exhausted." Every software team has a testing budget, and exceeds that budget only for particularly bad errors. The history of software engineering is full of instances of highly regarded software failing spectacularly in the field.
A case in point is Java, the software I used to work on (10+ years on the Java SE Software Quality Engineering team). A few years ago some seriously nasty security bugs were found in the fundamentals of the Java Plugin and Java Web Start, and the whole software industry had to recommend, in the most urgent tones possible, completely abandoning Java-in-the-browser. This is Java, a software platform created by some of the best engineers in the world, with a widely regarded security model. Java is still extremely useful in other areas, such as server-side application software, but it is no longer fit for browser applications.
The question is, pure speculation mind you: what unknown misbehaviors exist in the Tesla autonomous driving software? And will anybody die demonstrating those misbehaviors?
Reportedly, Tesla Motors asks drivers to stay involved with the driving process even when autopilot mode is engaged, because the driver may need to take over at any time. Which makes me wonder just how fully autonomous driving will work in the future.
Supposedly those cars will be set up to take over driving functions, but presumably the occupants will still have to take over driving in some circumstances. In 30 years, though, will there be anybody left who knows how to drive? What if there's a 5-year-old alone in such a car? What if the person has gone to sleep, trusting the car to take care of them? What stories will the future hold of people dying because of software bugs?