Self-driving cars are on the road in an ever-growing number of cities in the United States. What sounded like science fiction just a decade or two ago—"autonomous vehicles," "driverless vehicles," "robotaxis"—is now a reality.
As interest and investment in this new technology rise, so too do questions about autonomous vehicle safety and liability. Are self-driving car systems trustworthy? What regulation is needed? And who is legally responsible when an autonomous vehicle crashes or interferes with emergency responders?
Here's an overview of the self-driving car legal landscape. The law in this area is rapidly evolving along with the technology. If you're in a car accident involving a driverless car, talk with a personal injury lawyer who has experience with product liability claims.
A self-driving car is a car that is capable of operating without human involvement. Very few cars on the road are fully self-driving now, but many cars have some level of automation.
The Society of Automotive Engineers (SAE) defines six levels of driving automation ranging from 0 (no driving automation) to 5 (fully automated).
Level 0: No Driving Automation. A human driver is fully responsible for operating a Level 0 vehicle at all times. Some Level 0 vehicles may have driver support features that provide warnings and momentary assistance, like blind spot and lane departure warnings.
Level 1: Driver Assistance. A Level 1 vehicle has at least one driving automation system that can control speed or direction, such as adaptive cruise control or lane centering, but a human driver must constantly supervise the system and perform all other driving tasks.
Level 2: Partial Driving Automation. A Level 2 vehicle has two or more driving automation systems that can control speed and direction, but a human driver is still in control of the systems and all other driving tasks. GM's Super Cruise system and Tesla's Autopilot are classified as Level 2 partial driving automation technology.
Level 3: Conditional Driving Automation. A Level 3 vehicle is capable of handling all driving tasks, but drivers have to be available to take the wheel if the driving automation system fails. In January 2023, Mercedes-Benz became the first company to receive approval for a Level 3 autonomous system.
Level 4: High Driving Automation. A Level 4 vehicle can perform all driving functions under certain conditions or in certain locations. For example, Cruise and Waymo use Level 4 vehicles as robotaxis on specific routes in a growing number of cities.
Level 5: Full Driving Automation. A Level 5 vehicle is capable of operating without human involvement under all conditions and on all roadways.
Most vehicles on the road are SAE levels 0-2. Traditional rules of car accident liability typically apply when these types of vehicles crash. But as the level of automation increases, liability shifts away from human drivers toward car manufacturers and software developers.
One of the goals of automated driving systems is to improve vehicle safety. According to the National Highway Traffic Safety Administration (NHTSA), 38,824 people died in car crashes in 2020. Many of these fatal crashes involved human error, such as drunk driving (11,654) and speeding (11,258). Proponents of driverless cars argue that removing tired, distracted, impatient, and intoxicated drivers from behind the wheel will dramatically improve the safety of highways and roads.
Critics of driverless cars argue that the technology isn't reliable enough yet and that automated systems can fail. Safety experts also question whether drivers are overly confident in the technology and unprepared to quickly transition from hands-off to hands-on driving.
The data on self-driving car safety is limited. In 2021, NHTSA issued a standing general order that requires carmakers to report crashes involving cars that use advanced driver-assistance technology. According to NHTSA, automated vehicles were involved in 130 crashes from July 2021 to May 2022.
Other reports suggest that autonomous vehicle collisions are increasingly frequent. As of August 2023, the California Department of Motor Vehicles alone had received 649 Autonomous Vehicle Collision reports. And, according to Car and Driver, Tesla Autopilot (SAE Level 2) was involved in 736 crashes from 2019 to June 2023, including 17 fatal accidents.
NHTSA is the federal agency responsible for motor vehicle safety. NHTSA is part of the U.S. Department of Transportation (USDOT). NHTSA typically regulates the safety of vehicles themselves, while state and local governments regulate the use of those vehicles.
According to the National Conference of State Legislatures (NCSL), lawmakers in 29 states have passed laws related to autonomous vehicles. Governors in 10 other states have issued executive orders on the topic. To see the latest legal developments concerning self-driving cars in your state, check NCSL's autonomous vehicles legislative database.
For now, human drivers are still behind the wheel of nearly all vehicles on the road, and driver negligence is the basis of liability in the overwhelming majority of car accident insurance claims and lawsuits.
But as driverless cars move from experiment to reality, the legal system will have to find new ways to compensate victims of autonomous car accidents. Liability for an accident involving a self-driving car may fall into one or more of the following categories:
Vehicle operator. Car accidents that involve vehicles in the zone between fully manual and fully automated (SAE levels 2 and 3, for example) will probably continue to be decided based on traditional negligence rules. For example, let's say Chris is sitting behind the wheel of a Level 3 vehicle. The system alerts Chris to take control of the vehicle but Chris is zoned out on his phone and fails to react. If the car crashes and someone gets hurt, the injured person can sue Chris and get compensation for medical bills and other losses (called "damages"). But if the automation system fails to alert Chris in time to avoid the accident, liability likely shifts to the maker of the car or autonomous driving system.
Vehicle owner. If the owner of the self-driving vehicle isn't the operator, the owner may share some responsibility for the accident. For example, Uber has partnered with Waymo to use driverless vehicles to pick up passengers and deliver food in certain areas. If someone is injured by a self-driving car dispatched through the Uber platform, Uber may have to pay for some of the injuries and losses caused by the crash.
Vehicle manufacturer. If a self-driving car crash happens because of a defect in the vehicle's design or manufacturing, an injured person may sue the carmaker based on a "product liability" theory of fault. Product liability cases involving driverless vehicles will likely focus on design defects in the automation systems (such as sensor placement), manufacturing defects, and inadequate instructions and warnings.
Software developers. Software plays a central role in autonomous vehicles. Self-driving cars use software to collect and process information about the environment. The vehicles use a combination of sensors, cameras, radar, and artificial intelligence (AI) to operate without a human driver. If a software defect causes a crash, the software developer may be on the hook for injuries and damage caused by the accident.
Maintenance and service providers. Proponents of autonomous vehicles envision fleets of robotaxis and driverless delivery trucks. If that dream becomes a reality, the company responsible for maintaining and servicing those fleets may share liability if something goes wrong with one of the autonomous vehicles and it crashes.
Other people on the road. Not all accidents involving self-driving vehicles are caused by the self-driving vehicle. If someone else on the road—another driver, bicyclist, or pedestrian—negligently causes an accident, that person is legally responsible for the damage. For example, if you're a passenger in a robotaxi that gets rear-ended, you'll make a claim against the human driver who hit you, not the robotaxi owner or maker.
Proving fault after a self-driving car accident is similar to proving fault for any other vehicle accident, but data may help tell the story of how the accident happened.
In 2023, the Institute of Electrical and Electronics Engineers (IEEE) created the world's first standard for automated vehicle data storage systems. IEEE has said that the data generated must be available for crash investigations and for use by software and hardware designers to make improvements.
If you're involved in an accident with a self-driving car, getting your hands on sensor data, camera footage, and system logs may be the car accident evidence you need to prove fault for the accident.
It's too early to say how driverless cars may impact the car insurance industry. For now, state car insurance laws apply to cars with and without automation systems.
If self-driving cars ultimately reduce the number of car crashes in the future, insurance rates for autonomous vehicles may be lower. But, for now, vehicles with autonomous driving features are more expensive to insure because they cost more to buy and repair.
If you've been in an accident involving a car with an autonomous system, talk to a lawyer. The law in this area is complex and rapidly changing. A lawyer can help you figure out who you can sue for the accident and how to get the information you need to prove your case.
Learn more about how an attorney can help with your car accident claim. When you're ready, you can connect with a lawyer directly from this page for free.