Uber Is Backpedaling After A Self-Driving Fatality
An Arizona woman may have the grim distinction of being the first pedestrian killed by a self-driving car, following a collision in mid-March. The SUV in question was an Uber test vehicle doing 40 in a 35 zone, nothing extreme, but still fast enough to kill her as she walked her bike across the road at night.
By all indications, the Uber SUV didn't so much as slow down before the collision. The vehicle was in autonomous mode, but it also had a safety driver sitting behind the wheel. That driver was there precisely to prevent this sort of accident, and since it happened anyway, it's all but certain the driver also failed to notice the pedestrian until it was too late.
This recent accident in Arizona once again raises the question of who exactly is at fault when an autonomous vehicle crashes or hurts someone. Should the person behind the wheel have been paying closer attention and reacted fast enough to stop or mitigate the accident? Do the companies that designed and implemented the self-driving software bear all the responsibility? Automakers would like to answer that second question with a "no," yet at the same time companies like Nissan are designing concept cars whose front seats swivel 180 degrees so the driver can ignore the road entirely and face the other passengers.
The last high-profile case happened right here in Florida, when a man driving a Tesla gave his Autopilot system free rein. The software misidentified the white side of a truck, and the man died when his vehicle ran into it at full speed. Tesla, however, was able to avoid paying damages by arguing that the program was in beta and required strict supervision, supervision the driver didn't provide.
Uber could potentially make the same argument, since their vehicle's autonomous system was in the middle of a test at the time of the accident. Still, the company probably won't get away unscathed: in this case the driver works for Uber directly and was testing the system in real-world conditions on Uber's behalf. That's likely why the company has suspended its testing for a second time (the first suspension came after a self-driving car managed to roll itself in 2017).
Autonomy Still Bears Responsibility
This accident illustrates once again that completely autonomous vehicles won’t be appearing in dealerships quite as soon as many people assume. Even if a self-driving car is statistically safer than a human driver, it’s going to have to be a lot safer than that before a major manufacturer will start selling it to the public.
Human drivers cause millions of serious accidents every year, but each driver who causes an accident pays for it using their personal insurance policy. That distributes the damages, and unless there’s a serious design defect the car company doesn’t have to pay a cent. But a self-driving car that screws up, damages property, and hurts people? That’s definitely a design flaw. So if humans cause 5 million accidents per year and self-driving cars are 20 percent safer, that still means automakers will suddenly have to pay for 4 million accidents annually.
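The arithmetic above can be made concrete. Here's a minimal sketch using the article's illustrative numbers (5 million accidents, 20 percent safer); the function name and the assumption that automakers bear liability for every software-caused crash are for illustration only.

```python
# Illustrative liability math: even if autonomous cars are modestly
# safer than humans, the absolute number of at-fault accidents
# (now the automaker's problem) stays enormous.

def automaker_accident_load(human_accidents_per_year: int,
                            safety_improvement: float) -> int:
    """Accidents per year an automaker would answer for, assuming
    self-driving cars fully replace human drivers and the maker
    bears liability for every crash its software causes."""
    return round(human_accidents_per_year * (1 - safety_improvement))

# The article's numbers: 5 million accidents, cars 20 percent safer.
print(automaker_accident_load(5_000_000, 0.20))  # 4000000
```

Even a car twice as safe as a human driver would, under these assumptions, leave the manufacturer answering for 2.5 million accidents a year, which is why "statistically safer" isn't the bar automakers are actually aiming for.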
As a result, vehicles will most likely still need drivers behind the wheel even as self-driving software becomes sophisticated enough to handle bad weather and dense urban traffic instead of just dry Arizona highways. Even if the driver never has to touch the controls during the ride, keeping a human nominally in charge lets self-driving companies shoulder less of the liability for accidents. And that's how the situation will likely stay, at least until self-driving cars become virtually flawless and fully integrated into city traffic networks.
This subject is particularly important here in Florida, a state with some of the deadliest highways in America and far more rain than Arizona sees in a year. If and when fully autonomous vehicles appear on showroom floors, Florida is going to be a true stress test of the technology. That's why settling the liability question matters now, before those cars arrive.