Car accidents are common—in 2020, there were more than 5.2 million crashes in the U.S., resulting in 38,824 fatalities. Because human error is often a factor in car accidents, proponents of self-driving cars (or autonomous vehicles) believe that reliable self-driving technology could someday save millions of lives by eliminating human drivers altogether.
But in the meantime, what happens in a self-driving car accident? What happens if technology that’s intended as a boon to safety is involved in an accident that results in injury or death?
In this article, we’ll take a look at self-driving technology and its impact on determining liability in car accidents.
In any discussion of self-driving cars, it’s important to distinguish between full automation—in which no driver is needed—and driver-assistance technologies—automotive features that support drivers or take over some driving responsibilities temporarily but do not eliminate the need for a licensed driver at the wheel.
The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 (no automation) all the way to Level 5 (full automation). Although most major auto manufacturers are actively developing self-driving technologies, they are still years away from marketing a fully automated car that can be purchased from a dealership.
Some driver-assistance technologies have been in use for decades, including features like power steering, anti-lock brakes, and cruise control. Car makers have also been incorporating many newer driver-assistance features into late-model vehicles over the last decade.
Many manufacturers have developed SAE Level 2 cars, which combine assistance with steering and speed control. Prominent examples include Tesla Autopilot and General Motors Super Cruise. Although “autopilot” seems to imply that no driver is needed, SAE Level 2 cars are only semi-autonomous; they require the driver’s attention at all times.
To date, only one car manufacturer has brought an SAE Level 3 car to market—the Honda Legend (available only in Japan). SAE Level 3 cars can drive on their own in certain conditions. For instance, the Honda Sensing Elite system allows Legend drivers to relax while the automation navigates traffic jams.
According to the Governors Highway Safety Association, 38 states have passed laws or issued executive orders regarding autonomous vehicles. California is one of twenty states that allow both testing and use of autonomous vehicles without a human operator. For example, the autonomous ride-hailing services Waymo and Cruise operate in San Francisco.
Once again, when considering the question of who is liable in a self-driving car accident, it’s crucial to distinguish between autonomous vehicles (SAE Level 3 and above) and the Level 2 semi-autonomous vehicles that are on the road today.
First, let’s talk about fully autonomous vehicles. There aren’t many of these cars on the road yet, so you’re unlikely to be involved in an accident with one.
In the vast majority of autonomous vehicle accidents reported over the last nine years, the automation was not at fault. If there is evidence that an accident was caused by a malfunction in an autonomous driving system—for example, a mechanical problem, sensor error, or software glitch—the vehicle’s manufacturer (and/or the manufacturer of the part that failed) might be liable for any damage or injuries caused by the accident.
All vehicles equipped with Level 2 semi-autonomous driving technology must have an attentive licensed driver at the wheel at all times.
A new California law that went into effect at the beginning of 2023 illustrates how state laws are evolving to account for semi-autonomous vehicles. The law stipulates that manufacturers cannot market semi-autonomous driving features “using language that implies or would otherwise lead a reasonable person to believe, that the feature allows the vehicle to function as an autonomous vehicle.” The law is likely to impact Tesla’s use of the marketing terms “autopilot” and “full self-driving.”
So, what happens if a Tesla crashes while Autopilot is engaged? And who is liable for the resulting damage or injuries?
Because current technology (like Tesla’s) requires driver attention at all times, in most semi-autonomous car accidents, the driver can be held liable if they were negligent in operating the vehicle. For example, if Autopilot prompts the driver to take control of the vehicle but the driver is distracted and doesn’t respond, any resulting accident is likely to be considered the driver’s fault.
If the driver is operating the vehicle properly and something goes wrong with a driver-assistance feature, the vehicle manufacturer may be liable. For example, suppose a vehicle’s self-driving feature locks the accelerator and doesn’t allow the driver to take control. If an accident results, the manufacturer is likely to be held liable.
Self-driving car lawsuits, like self-driving car technology, are a new phenomenon. As technology develops and more cars with autonomous capabilities take to the roads, society will continue to adapt.
State legislatures and federal agencies are passing new laws and issuing new regulations, and consumers and personal injury lawyers are taking to the courts to work out the legal implications of self-driving car accidents.
If you’ve been injured in an accident with an autonomous or semi-autonomous vehicle, determining fault can be challenging. Although few lawyers represent self-driving accident victims, the experienced Bakersfield car accident lawyers at Chain | Cohn | Clark are up to the challenge.
If you need a self-driving accident attorney, contact us today for a free consultation.