For technology to advance, there must be testing. A lot of testing. It's not the kind of thing that can be done only in a lab, especially when we're talking about AI. The biggest developments we've seen lately are self-driving vehicles operating on public roads without human intervention. Believe it or not, there are probably hundreds, if not thousands, of self-driving vehicles being tested on real roads right now. That sounds ripe for an accident, which is why having a personal injury attorney Gilbert AZ residents trust is all the more important.
These self-driving road tests have been going on for several years. Back in 2018, one of Uber's self-driving cars killed a woman named Elaine Herzberg. She was walking her bike across a street when the car hit her. The National Transportation Safety Board (NTSB) investigated the case and found that Uber's software was still deeply flawed, allowing the accident that cost Elaine her life. It's a horrible story that makes us all feel less safe. The problem is, there have been many more accidents that we don't hear about.
Reuters reported that in one 18-month span, Uber's autonomous test vehicles alone were involved in 37 crashes. That gives motorists and pedestrians one more reason to be careful on the road. Any of those accidents could have been fatal, and there will likely be more fatalities as more companies adopt this technology in the transportation industry. There are even semi-trucks on the road today running autonomous software that, at such an early stage, no doubt has flaws in some capacity.
You might be asking yourself what kind of legal problems autonomous vehicles present when they cause an accident, injury, or death. Who is really liable in these situations? In Uber's specific case, the company wasn't found criminally liable even though the death was caused by flaws in its software. While the vehicles drive on their own, they aren't completely unmanned, at least while being tested. There is supposed to be a fail-safe at work.
Typically, an employee sits in the driver's seat, ready to take over if the car does something erratic. In Elaine's case, the driver should have been able to spot the bicycle when the software didn't and stop in time to prevent the accident. The problem is that the driver was reportedly watching something on their cellphone and wasn't paying attention to what was happening in front of them.
There are several other considerations here. First, Elaine crossed outside of a legal crosswalk; second, it was dark out. So who is to blame for the accident? With so many questionable pieces, it's unclear what will happen. The technology is so new that even the State of Arizona lacks real regulations for autonomous vehicles. The fact that someone died and Uber won't be charged is heartbreaking, so hopefully something will change in the future.
If you find yourself the victim of any type of accident, including one involving an autonomous vehicle, be sure to contact a personal injury lawyer at www.theazaccidentinjuryattorney.com/gilbert
"*" indicates required fields
"*" indicates required fields