The back-up driver of an Uber self-driving car that killed a pedestrian in the US has been charged with negligent homicide.
The victim, 49-year-old Elaine Herzberg, was hit by the self-driving Volvo as she wheeled a bicycle across a road in Tempe, Arizona, in 2018.
Rafaela Vasquez was in the driver’s seat of the Volvo and had the role of safety driver to take control of the vehicle in an emergency. But according to investigators, Ms. Vasquez had been streaming a television show at the time of the collision.
She pleaded not guilty and the trial has been scheduled for February 2021.
Uber will not face criminal charges, after a decision last year that there was “no basis for criminal liability” for the corporation.
The accident was the first recorded pedestrian death involving a self-driving car, and resulted in Uber ending its testing of the technology in Arizona.
Thatcham Research’s director of insurance research, Matthew Avery, has shared his thoughts on the fatal incident.
Avery believes the Arizona case highlights key liability and safety considerations at a time when the UK government is running a consultation on the adoption of UN Regulations on Automated Lane Keeping Systems (ALKS), which would allow hands-free driving on UK roads as early as January 2021.
“The Uber case throws the spotlight on the UK government’s wish to implement automated driving within the next 12 months and highlights significant liability and safety questions that need to be ironed out quickly,” Avery says.
“In automated mode, will the ALKS be able to steer the car to avoid a person or debris in the road? And if the driver is unresponsive and can’t take back control, will it be able to find a ‘safe harbour’ off the carriageway, or will it be a danger to itself and other road users if it comes to a stationary halt in the road?”
Avery added: “During the last four years, Thatcham Research and the ABI have worked closely with UK government and global legislators to define 12 Principles that ensure safe automated driving. These principles highlight the fact that any systems that require driver intervention cannot be classified as automated.
“The judgements in the Uber case, and the current UK government consultation on ALKS, highlight the significant challenges legislators, law makers, insurers and vehicle manufacturers face to ensure the safe adoption of automated driving.”