March’s pedestrian fatality from an Uber self-driving car can be blamed on Uber’s faulty software, an inattentive operator, and an inattentive pedestrian, according to the NTSB’s preliminary report on the accident. There is blame to go around, though perhaps Uber’s software bears the most. I also don’t understand the actions of the pedestrian, whose lack of attention contributed to the accident.
On-board sensors detected an “object” (the pedestrian) 6 seconds before the impact. The system made varying classifications of that object until 1.3 seconds before impact, when it determined that an emergency braking maneuver was required. Except that the emergency braking system was disabled, because Uber expected the vehicle operator to be paying attention and to intervene as needed.
The operator (a test engineer) was paying most of her attention to information displays on the dashboard, and very little attention to the road. At about 1 second before impact, the operator finally noticed the pedestrian and began a steering maneuver, but it was too late.
The pedestrian was wearing dark clothing, crossing an expressway in a poorly lit area, and apparently not paying attention to traffic. The pedestrian may also have been impaired by methamphetamine and marijuana.
In other words, there is plenty of blame to share between Uber’s software, the inattentive driver/operator, and the inattentive pedestrian.
Summarizing the NTSB report
The accident occurred about 9:58 p.m. on Sunday, March 18, 2018. The location was northbound N. Mill Ave in Tempe, Arizona. The vehicle was a modified 2017 Volvo XC90 equipped with Uber’s prototype self-driving vehicle system, and it was occupied by a person who appears to be an Uber test engineer. The car left Uber’s garage at 9:14 p.m. to run an established test route, and it was on its second loop through that route. The car had been under computer control since 9:39 p.m., about 19 minutes. In other words, this vehicle was not in service shuttling passengers around; the system was being tested.
The autonomous driving package consisted of forward- and side-facing cameras, radars, LIDAR, navigation sensors, and a computing and data storage unit integrated into the vehicle. There were 10 additional cameras mounted on the vehicle recording video of the trip, including the view of the driver. The car also included Volvo’s driver assist system including a collision avoidance function with automatic emergency braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. Uber’s system disabled Volvo’s system whenever the car was under control of Uber’s computers, so at the time of the accident those systems were disabled.
The operator is supposed to remain attentive and intervene if necessary. The operator is also expected to monitor a set of instruments in the center dashboard column, marking anything that requires later review.
According to the NTSB report, the operator was monitoring the information displays on the dashboard. Video from on-board cameras shows the operator spending most of her time glancing downward, and paying little attention to the road. It would therefore be easy to conclude the operator was playing with her cell phone, but both her work and personal phones were elsewhere in the car and were not used until after the accident.
This video is from one of the on-board cameras:
According to data recorded by the self-driving system, the pedestrian was detected 6 seconds before the accident. However, the computer did not have a positive identification of what object had been detected. Instead it classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before collision, the computer decided an emergency braking maneuver was required.
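That 1.3-second window raises the question of whether emergency braking at that point could have stopped the car in time. Here is a back-of-the-envelope stopping-time sketch; the speed (40 mph) and deceleration (7 m/s², roughly a hard stop on dry pavement) are illustrative assumptions, not figures from the NTSB report.

```python
# Rough stopping-time sketch for the 1.3-second emergency-braking window.
# The 40 mph speed and 7 m/s^2 deceleration are assumptions for
# illustration only -- they are not values taken from the NTSB report.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_time_s(speed_mph: float, decel_ms2: float = 7.0) -> float:
    """Time to brake from speed_mph to a full stop at constant deceleration."""
    return (speed_mph * MPH_TO_MS) / decel_ms2

def stopping_distance_m(speed_mph: float, decel_ms2: float = 7.0) -> float:
    """Distance covered while braking from speed_mph to a full stop."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel_ms2)

if __name__ == "__main__":
    v = 40.0  # assumed speed, mph
    print(f"Stopping time:     {stopping_time_s(v):.2f} s")
    print(f"Stopping distance: {stopping_distance_m(v):.1f} m")
```

Under these assumptions a full stop takes roughly two and a half seconds, about double the 1.3 seconds available, so even an enabled emergency braking system might only have reduced the impact speed rather than prevented the collision. Braking at the 6-second mark, by contrast, would have left ample margin.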
The design assumes the operator is attentive and will override the computer as needed. It’s not clear from the report whether the vehicle initiated an emergency braking maneuver. Instead the report says that “emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior”, that the operator is expected to intervene, and that “The system is not designed to alert the operator”.
The operator began to intervene less than a second before impact, using the steering wheel, and then began braking less than a second after the impact.
This picture is from the NTSB report, and the NTSB caption says it all.
Video recorded by the car indicates the pedestrian wore dark clothing and chose to cross Mill Ave in an area with little illumination. The pedestrian did not look in the direction of the Uber vehicle. While the bicycle had front and rear reflectors and a headlight, they were pointed perpendicular to the Uber vehicle’s path. There were no side reflectors, which might have made the bicycle more visible. The last thing noted in the NTSB report is that toxicology tests of the pedestrian were positive for methamphetamine and marijuana.
The accident location was N. Mill Ave in Tempe, Arizona, and the NTSB report notes that the pedestrian could have used a marked crosswalk 360 feet away at the intersection with Curry Rd.
This map tries to show the accident location, but we’ll need a bigger-scale map to better understand the location.
This picture is from Google Maps [link]. Curry Rd is the intersection at the top of the image. At the bottom is a freeway overpass, and beyond the freeway is a river. Mill Ave crosses that river on two bridges.
The big X in the middle is a paved section that looks like it’s meant to be a crosswalk. Except there are signs at every section of that X warning pedestrians to use the crosswalk at Curry Rd.
I don’t understand why the pedestrian was in that location, or what she was doing. Reportedly she had several grocery bags on the bicycle, and presumably was taking groceries home. But there’s no store in the vicinity, and it’s hard to imagine where she could have been coming from. If she were heading north on the southbound side of Mill Ave, she would have just crossed the bridge, but there’s nothing on the other side of the bridge she could have come from.
Maybe she had just gotten off the bus – there’s a bus stop on southbound N. Mill Ave – and needed to cross Mill Ave. But if it were me, I’d have walked up to the intersection with Curry Rd before crossing Mill Ave.
There’s a whole lot of nothing in that immediate vicinity: a few office buildings, a couple of parks, and a riverfront. None of these are likely places for the pedestrian to have been at that time of night.
Why did the city put paving across that median if it was not intended to be used by pedestrians? Did that contribute to the pedestrian deciding to cross the road at that location rather than at the intersection? And where was she coming from, and where was she going, that she needed to cross the road at that location?
The easiest target to blame in this case is Uber, whose systems were perhaps misconfigured. These are prototype test vehicles and should perhaps be configured to be more cautious. The six seconds during which the on-board computer was confused about identifying that pedestrian could have made the difference between life and death.
Clearly the test engineer was not doing her job very well. Maybe Uber should have had two engineers on board, one who could concentrate on driving, the other on collecting data. It may be that Uber’s engineer was overloaded dealing with data collection and did not have enough attention for the road.
That pedestrian, however, was being irresponsible. There are any number of things that pedestrian could have done to avoid being hit, such as looking at the road before crossing, or walking to the intersection first.