Are Auto-Drive Cars Safe? MO Car Accident Lawyers


As with most things, there are fans and foes, pros and cons when it comes to the auto-drive car industry, with dramatic examples from 2016 of the upside and the downside.

For example, a Missouri man used the Autopilot on his new Tesla Model X to reach a hospital more than 20 miles away after suffering a pulmonary embolism that doctors say could have killed him. On the flip side, the driver of a 2015 Tesla Model S left the vehicle in Autopilot and was killed when it allegedly failed to brake and collided with a tractor-trailer in Florida.

Those who say auto-drive cars cause accidents might argue that the Missouri man put others at risk by allowing the car to drive itself while he was unable to safely monitor it. Even he said it might have been a good idea to call an ambulance.

Proponents of semi-autonomous cars likely would point to the National Highway Traffic Safety Administration’s findings on the fatal crash in Florida. The NHTSA concluded that a “safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.”

Naysayers get the final word on the fatal crash, though, because the NHTSA report also says that closing the investigation does not rule out the possibility that there was a defect.

Auto-Drive Safety Features and Operation

The Tesla in the fatal Florida crash was a 2015 Model S. These vehicles are on the high end of the driver-assist car spectrum in both price and technology. But you don’t have to spend more than $100,000 to get a car with driver-assist functions. Consumer Reports online identifies 10 semi-autonomous car safety features that can be found on a variety of affordable vehicles:

  • Forward-collision warning systems use laser, radar, or cameras to “see” and alert drivers to a looming collision with objects in front of the vehicle. Some systems can even identify pedestrians.
  • Automatic emergency braking is paired with forward-collision warning systems and will brake if the driver does not promptly respond to warnings.
  • A blind-spot warning system scans areas to the left and right and warns the driver not to change lanes or merge if there is an obstacle.
  • A rear cross-traffic alert system senses a risk and notifies the driver if traffic could enter the vehicle’s path as it backs up.
  • Backup cameras activate when a vehicle is put in reverse, with the image displayed in the rearview mirror or on the console.
  • Automatic high-beam headlights enhance nighttime visibility.
  • Lane-keeping assist senses when the vehicle is drifting out of the lane and makes a steering correction.
  • Adaptive cruise control uses lasers, radar, cameras, or a mix thereof to adjust speed and keep the vehicle at a safe distance from traffic in front of the car.
  • Parking-assist systems alert the driver, at low speeds, to the proximity of objects.
  • Lane-departure warning systems use cameras, lasers, or infrared sensors to alert the driver that the vehicle is drifting out of its lane.

Technological Challenges Identified

Tesla has been a trailblazer in the semi-autonomous car market, which likely explains why it has some accidents on its ledger. It answered those crashes with upgrades to its Autopilot system, including measures to ensure drivers don’t take their hands off the steering wheel for too long. Nissan is doing that, too, and GM has been developing eye-tracking software intended to ensure the driver is paying attention to the road at all times.

One big issue is users’ tendency to push the technology beyond its intended use. These cars are not designed to drive themselves unassisted. They are designed to assist a driver who is supposed to be alert whenever the wheels are turning. That’s why critics argue that even the terminology used by manufacturers is deceptive. Some take particular exception to Tesla’s use of the word “Autopilot.”

Before technology catches up with a driver’s desire or need to nap at 65 mph, and does so at an affordable price, several challenges must be overcome:

  • When the car recognizes a scenario it can’t handle, there must be a handoff from machine to human. In the Florida crash that killed a Tesla driver, the car did not “see” the problem or alert the driver, and it is possible there would not have been time for the driver to respond if alerted. A lot of distance can be covered in the seconds it takes an inattentive driver to respond to a warning.
  • Semi-autonomous cars rely on varying arrays of GPS, laser, camera, radar, and other technologies to “see.” For now, even the best mechanical eyes a vehicle can put on the road can’t cope with all weather, lighting conditions, physical road hazards, poor road markings, or mistakes made by drivers of other cars.
  • How do you address ethics with algorithms? In short, is the mathematics used to program these machines adequate to the task of distinguishing between, say, a boulder, a deer, and a pedestrian if all three are in the path of a car and it must decide which to strike? And what if the choice comes down to striking a grandmother or a toddler?
  • What about hacking? Auto-drive functions are overseen by computers. Could a terrorist hack and disable a metropolitan area’s semi-autonomous cars during rush hour? Could a stalker commit murder by hacking your vehicle and running you off a cliff?
  • There’s fallibility to cope with, too. Ninety-four percent of crashes are attributed to human error, so fans of auto-drive technology have a good argument that crash numbers will fall as self-driving technology spreads. But if computers crash, so will cars. Where do you assign blame if the accident becomes a courtroom issue? Was it the computer maker? The car manufacturer? A vendor who made a part? Was driver error the reason for the technology’s failure? Did the road cause the accident?

Here’s another curve in the long road to infallible self-driving cars: If you are involved in a crash and one or all of the vehicles were driving themselves, who loses if it becomes a courtroom battle?

The Bruning Law Firm Stays on the Cutting Edge of Technology

In Missouri, the St. Louis car accident attorneys of The Bruning Law Firm pride themselves on being on the cutting edge of evolving law, and that includes any legal precedents that will be set as self-driving cars continue to roll out new legal challenges and physical risks.

Vehicle accidents with auto-drive cars are uncharted territory. But our dedicated car accident lawyers stand ready to represent crash victims in a variety of scenarios. Contact us today to schedule a free consultation.