Self-Driving Car Accidents: What You Should Know

AUTHOR: A.J. Bruning | March 21, 2020

WHAT’S DRIVING THE DISCUSSION ON AUTONOMOUS VEHICLES?

For decades now, movies, TV shows, and science-fiction writers have envisioned the future of human travel. Often, those depictions involve vehicles that fly, respond to voice commands, and even talk back. In some ways, those predictions are coming true.

With technologies like Amazon Echo, Echo Dot, and Tap being integrated into vehicles, drivers really can talk to their cars, instructing them to turn on the lights at home or look up directions to the nearest gas station. And although we’re not commuting to work in high-flying vehicles just yet, the dawn of a new era of transportation has definitely begun with the birth of self-driving cars.

The idea of kicking back and letting your car handle the stress of the interstate at rush hour may seem ideal. But the convenience of self-driving cars raises many questions, including many revolving around the causes of crashes.

WE HAVE BEEN EXPLORING THESE QUESTIONS AND REVIEWING THE LATEST RESEARCH TO PREDICT HOW THIS TECHNOLOGY WILL CHANGE THE WAY WE TRAVEL.

Here's What the Experts Have to Say

WILL SELF-DRIVING CARS BE PROGRAMMED TO MAKE MORAL DECISIONS?

If a family of four is in a self-driving car and a distracted pedestrian steps into the road, should the car be programmed to swerve and possibly crash to avoid the pedestrian? What if there’s only one person in the car and it’s a group of schoolchildren who step into the road?

How a vehicle should weigh the greater good in its decisions was the subject of a recent study outlined in Science Magazine. The study’s authors found that:

“Even though participants approve of autonomous vehicles that might sacrifice passengers to save others, respondents would prefer not to ride in such vehicles. … Respondents would also not approve regulations mandating self-sacrifice, and such regulations would make them less willing to buy an autonomous vehicle.”

HOW WILL THE MORALITY OF SELF-DRIVING CARS BE LEGISLATED?

The law clearly must evolve with new technology. For example, as smartphones posed new types of safety risks to drivers, many states enacted laws on texting while driving. Thus, we can expect new laws to address issues raised by autonomous vehicles.

But how far will those laws go, and will they put buyers of self-driving cars in the hot seat? The authors of the study in Science Magazine raise an interesting point:

“If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm’s decisions?”

HOW MUCH CONTROL SHOULD BE GIVEN UP TO A MACHINE?

The way these autonomous vehicles (AVs) should process information to make decisions is also up for debate. Experts are split on whether AVs should make decisions based on simple logic or on “deep learning.”

The logic approach involves programming the machine with the many driving rules humans are supposed to abide by, such as stopping at red lights and going on green. But we all know humans are adaptable and can bend or break rules depending on the situation. That’s where deep learning comes in.

Deep learning involves feeding the machine countless scenarios and allowing it to detect patterns and make decisions based on those patterns. That approach, however, gives a great deal of power to the machine and makes it impossible to trace how it arrives at any given decision. As a recent analysis in Forbes points out:

“As well as deep learning networks may perform at driving 99.9% of the time, this lack of interpretability becomes a real concern on those rare occasions when an AV makes the wrong decision and causes an accident. In those situations, humans have no way to explain what went wrong and no way to troubleshoot the error. Using deep learning in AV decision making, then, entails ceding control and even understanding to the machine. Not everyone thinks this tradeoff is worth it.”
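To make the contrast concrete, here is a minimal, purely illustrative sketch in Python. It is not drawn from any real AV system, and every name in it is hypothetical; it only shows why a rule-based controller can be audited line by line while a learned model is, from the outside, just an opaque function from sensor inputs to an action.

```python
# Illustrative contrast between rule-based and learned AV decision-making.
# All names are hypothetical; no real autonomous-vehicle code is shown.

def rule_based_decision(light_color: str, obstacle_ahead: bool) -> str:
    """Every rule is explicit, so any decision can be traced to a line of code."""
    if obstacle_ahead:
        return "brake"
    if light_color == "red":
        return "stop"
    if light_color == "green":
        return "go"
    return "slow"  # e.g., yellow light or unknown state: default to caution


class LearnedDecisionModel:
    """Stand-in for a deep learning model: sensor inputs go in and an action
    comes out, but the learned weights in between offer no human-readable
    explanation of *why* that action was chosen."""

    def predict(self, sensor_inputs: list[float]) -> str:
        # A real model would run the inputs through a trained neural network.
        # Here we fake a result to show the interface, not the mechanism.
        return "go" if sum(sensor_inputs) > 0 else "brake"


if __name__ == "__main__":
    print(rule_based_decision("red", obstacle_ahead=False))  # -> "stop"
    print(LearnedDecisionModel().predict([0.2, -0.5, 0.9]))  # -> "go"
```

The sketch captures the concern in the quote above: when the rule-based function misbehaves, an engineer can point to the offending rule; when the learned model misbehaves, there is no comparable line to point to.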

WHAT WOULD HAPPEN IF HACKERS TOOK THE WHEEL?

Handing over control of your car to a computer means potentially handing it over to hackers as well. Experts have raised the possibility of vehicle systems being held for ransom, with hackers forcing owners to pay to take back control of their cars. Back in 2015, hackers also proved that they could take control of a Jeep through the vehicle’s software and crash it.

Even though such acts are criminal, questions of liability may still fall on the manufacturers. As a report in Left Lane points out:

“Automakers have faced criticism from the computer security industry in recent years as vehicles become more electronically integrated and come equipped with wireless communications systems. Together, these features provide pathways for hackers to remotely access a vehicle and control its systems, potentially including steering and acceleration.”

HOW COULD INFRASTRUCTURE PITFALLS CAUSE PROBLEMS FOR SELF-DRIVING CARS?

Self-driving cars cannot operate in a vacuum. They need certain support mechanisms that can only be provided through infrastructure. As an article in Government Technology magazine explains:

“Because of the radical change that AVs will bring to the current system of transportation, infrastructure pitfalls will become a glaring need. Often, AVs need clear lane striping, places to store the data collected by driving and if they run on electricity a more robust charging network. Without properly anticipating the sometimes opaque challenges, the system could be crippled in its infancy.”

HOW WILL SAFETY LEGISLATION AFFECT CONTINUED INNOVATION?

As consumers, automakers, and legislators work through the scenarios of who could be held liable in self-driving car accidents, experts say there’s a fine balance between keeping consumers safe and benefiting from advances in technology. As Harry Lightsey of General Motors told Government Technology magazine recently:

“I think the key is going to be providing room for folks to be innovative and to try new things. At the same time, a vehicle is a product that is so strongly tied to safety that we can’t just ignore that. Safety has to be at the top of everybody’s list in terms of how this change occurs.”

WHO WILL BE RESPONSIBLE FOR SELF-DRIVING CAR CRASHES?

Although we may see new laws develop to account for the new challenges autonomous vehicles bring, there are many laws already in place that should apply to crashes involving self-driving cars. As attorney A.J. Bruning explains:

“Product liability laws hold car manufacturers, designers, and others in the chain of distribution responsible for defective systems and parts within a vehicle that end up causing harm to the consumer. Still, drivers could also be held responsible if their negligent actions somehow led the autonomous vehicle to crash, for example, by failing to properly maintain the vehicle to make sure it was fit to be on the road. We are also likely to see situations where a car company and a driver share in the blame for a crash, with arguments arising over to what degree each party is responsible.”

Author's Bio

A.J. Bruning

Founder

I was born and raised to represent individuals who have been needlessly injured. I mean that literally. At a young age my father would tell me about the clients he was representing. I would meet them and take pride in their admiration of my father. I always knew I wanted to be a lawyer and represent clients that needed my help.
