In March, the trial of 46-year-old Rafaela Vasquez began in Maricopa County Superior Court. Four years ago, Vasquez worked as a safety driver for Uber’s autonomous test cars in Tempe, Arizona. She has pleaded not guilty to negligent homicide after the self-driving car she oversaw struck and killed Elaine Herzberg, a 49-year-old pedestrian.

Vasquez was behind the wheel as the robot car drove at a steady 40 miles per hour down the street on a clear Sunday night. Herzberg was pushing her bike across the same street. The automated Volvo XC90 SUV did not slow down until after it struck and killed her. Herzberg was the first pedestrian to be killed by self-driving technology.

The prosecution will argue that Vasquez was distracted, watching The Voice on her phone at the time of the accident. They will say that anyone who climbs behind the wheel of a car has a responsibility to operate their vehicle safely and lawfully.

The defense will deny wrongdoing on Vasquez’s part, claiming she was checking messages from her employer, Uber, when the collision occurred.

Arizona prosecutors declined to criminally charge Uber, even though a National Transportation Safety Board investigation found that the collision occurred because of both human error and Uber’s “inadequate safety culture.”

After the crash, Uber suspended the testing program in Tempe and its other test markets: Pittsburgh, San Francisco, and Toronto. Major tech companies and state legislatures slowed the rollout of several autonomous vehicle initiatives. But within a year, Uber’s robot cars were rolling again, at lower regulated speeds and under tighter restrictions.

Uber does not have a monopoly on the dangerous robot car market. In 2016, Tesla Motors disclosed the first self-driving death, when a Model S’s Autopilot system failed to slow the car before it smashed into the trailer of an 18-wheel truck. The 40-year-old Tesla driver was killed.

The National Highway Traffic Safety Administration (NHTSA) began an investigation after at least four deaths and 29 serious crashes were linked to Tesla’s Autopilot feature. That investigation led to another probe examining more than 750,000 Tesla vehicles manufactured since 2014, which were unexpectedly crashing into emergency vehicles while Autopilot was engaged. Since 2018, at least 11 Teslas have crashed into emergency vehicles without explanation.

Back in 2015, tech companies like Uber, Lyft, and Waymo (Google’s self-driving car unit) migrated away from California’s tightening restrictions to the dry climate and wide-open roads of Arizona. To lure Big Tech operations and the revenue that comes with them, officials declared the state a regulation-free zone.

These robot cars of the future were supposed to remove distracted drivers from the traffic equation, but this trial serves as a reminder that self-driving technology is still in development. It also illustrates that state and local governments still need a system of regulation.

Despite these incidents, federal and state lawmakers, including California’s, are taking a lenient approach with autonomous-car developers and loosening safety standards.

Defining Self-Driving Technology

Self-driving does not necessarily mean fully autonomous. All the self-driving cars on U.S. roads have autonomous elements but still rely on a human driver.

According to the NHTSA, self-driving automation falls into one of five levels:

  • Level 1: Advanced Driver Assistance System (ADAS) – Automation that assists a human driver with steering, braking, or accelerating.
  • Level 2: Advanced Driver Assistance System (ADAS) – At the second level of ADAS, vehicles go beyond assisting the driver to simultaneously control steering, braking, and acceleration in certain situations. The system still needs a human driver to monitor closely and perform some driving tasks.
  • Level 3: Automated Driving System (ADS) – In some situations, the vehicle can perform all driving tasks, but a human driver must be present and available to take over when the system prompts them.
  • Level 4: Automated Driving System (ADS) – In this more advanced version of ADS, the vehicle can monitor the driving environment and conditions and handle all driving in certain circumstances. A human driver can still take over but is not needed within those circumstances.
  • Level 5: Automated Driving System (ADS) – The most advanced level of self-driving technology. The vehicle does all the driving in every condition, and humans are hands-off passengers.

There is no doubt the robots are coming. All the major car manufacturers are following Tesla and Waymo’s lead, developing, testing, and producing autonomous cars for eager consumers. More than 80 U.S. companies are conducting self-driving tests in 1,400 vehicles.

Self-driving cars are in their most productive period yet, with more than 54 million vehicles projected to be using at least Level 1 ADAS by 2024. The robot-car market is expected to reach $36 billion by 2025.

When to Blame the Robot’s Creator

According to the National Law Review, injuries sustained in self-driving car crashes are usually less severe than those in crashes involving human drivers, but autonomous cars are involved in crashes at a higher rate than human-driven vehicles: an average of 9.1 accidents per million miles driven, versus 4.1 per million miles for human drivers. Put simply, humans crash less than half as often as robots.

It gets even trickier when car companies make duplicitous claims. Tesla markets its Autopilot function as autonomous. But after an accident, the company insists it warns Tesla owners that active driver supervision is mandatory even when Autopilot is engaged.

Determining whether a collision occurred because of a vehicle’s automated elements or because of human error is an arduous task. A car company can be found negligent, and face a liability case, if it fails to install critical updates to the vehicle’s software or does not provide vital maintenance.

In Louisiana, a new wave of driverless delivery vehicles prompted new state legislation. Senate Bill 147 builds a framework of requirements, rules, and regulations to introduce self-driving vehicles safely and legally into traffic. If these robot travelers hurt humans because their systems or sensors fail, the blame will not fall on the robots. The fault will find its way back to their creators.


