Who Is Liable in a Self-Driving Car Crash?

As the technology for autonomous vehicles continues to develop, so do car accident liability rules.

By , Attorney UC Law San Francisco
Updated by Dan Ray, Attorney University of Missouri–Kansas City School of Law
Updated 7/09/2025

Self-driving cars are on the road in an ever-growing number of cities across the United States. What sounded like science fiction just a decade or two ago—"autonomous vehicles," "driverless vehicles," "robotaxis"—is now a reality.

As interest and investment in this new technology rise, so too do questions about autonomous vehicle safety and liability. Are self-driving car systems trustworthy? What regulation is needed? And who's legally responsible when an autonomous vehicle crashes or interferes with emergency responders?

Here's an overview of the self-driving car legal landscape. The law in this area is rapidly evolving along with the technology. If you're in a car accident involving a driverless car, talk with a personal injury lawyer who has experience with product liability claims.

What Is a Self-Driving Car?

A self-driving car is a car that's capable of operating—to some extent—without human involvement. Very few cars on the road are fully self-driving now, but many have some level of automation.

The Society of Automotive Engineers (SAE) defines six levels of driving automation, ranging from 0 (no driving automation) to 5 (fully automated). Federal auto safety regulations generally adopt this automated driving system (ADS) classification scheme.

Level 0: No Driving Automation. A human driver is fully responsible for operating a Level 0 vehicle at all times. Some Level 0 vehicles might have driver support features that provide warnings and momentary assistance, like blind spot and lane departure warnings.

Level 1: Driver Assistance. A Level 1 vehicle has at least one driving automation system that can control speed or direction, such as adaptive cruise control or lane centering, but a human driver must constantly supervise the system and perform all other driving tasks.

Level 2: Partial Driving Automation. A Level 2 vehicle has two or more driving automation systems that can control speed and direction, but a human driver is still in control of the systems and all other driving tasks. GM's Super Cruise system—not to be confused with GM's Cruise robotaxi service, which the company shut down—and Tesla's Autopilot are classified as Level 2 partial driving automation technology.

Level 3: Conditional Automation. A Level 3 vehicle is capable of handling all driving tasks, but only under a narrow range of driving conditions. Drivers have to be available to take the wheel if the driving automation system fails. Mercedes-Benz's Drive Pilot, the first Level 3 system certified for consumer use in the United States, is now offered on some S-Class and EQS models in California and Nevada. The system operates at speeds up to 40 mph in heavy traffic conditions and allows drivers to legally take their eyes off the road.

Level 4: High Driving Automation. A Level 4 vehicle can perform all driving functions under certain conditions or in certain locations. For example, Waymo—owned by Google parent company Alphabet—uses Level 4 vehicles as robotaxis on specific routes in a growing number of cities. In June 2025, Tesla launched a similar robotaxi service in Austin, Texas. Not to be outdone, Amazon-owned Zoox has ramped up production of its robotaxis in anticipation of offering commercial taxi services in California and Nevada beginning in 2025 and 2026.

Level 5: Full Driving Automation. A Level 5 vehicle is capable of operating without human involvement under all conditions and on all roadways.

Most vehicles on the road are SAE levels 0-2. Traditional car accident liability rules typically apply when these vehicles crash. But as the level of automation increases, liability shifts away from human drivers toward car manufacturers and software developers. We discuss liability issues in more detail below.
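The SAE scheme described above can be sketched as a simple lookup table. This is an illustration only: the level names are paraphrased from the article, and the "primary responsibility" column is a rough simplification of the liability discussion, not a legal rule.

```python
# Illustrative sketch of the SAE J3016 automation levels described above.
# The "responsibility" entries are a simplification for illustration,
# not a statement of legal liability.
SAE_LEVELS = {
    0: ("No Driving Automation", "human driver"),
    1: ("Driver Assistance", "human driver"),
    2: ("Partial Driving Automation", "human driver"),
    3: ("Conditional Automation", "human driver (must take over on request)"),
    4: ("High Driving Automation", "automated system (within its operating domain)"),
    5: ("Full Driving Automation", "automated system"),
}

def describe(level: int) -> str:
    """Return a one-line summary for a given SAE level."""
    name, responsible = SAE_LEVELS[level]
    return f"Level {level} ({name}): primary responsibility rests with the {responsible}"

print(describe(2))
```

As the table suggests, the dividing line for liability purposes falls roughly between Levels 2-3 (human supervises) and Levels 4-5 (system drives).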

Are Self-Driving Cars Safe?

One of the goals of automated driving systems is to improve vehicle safety. The National Highway Traffic Safety Administration (NHTSA) reports that 40,901 people died in car crashes in 2023. Many of these accidents involved human error, such as drunk driving (12,429 deaths) and speeding (11,775 fatalities).

Proponents of driverless cars argue that taking tired, distracted, impatient, and intoxicated drivers from behind the wheel will dramatically improve the safety of highways and roads. Critics argue that the technology isn't reliable enough yet and that automated systems can fail. Safety experts also question whether drivers are overly confident in the technology and unprepared to quickly transition from hands-off to hands-on driving.

Limited Data Sources

Data on self-driving car safety is limited. In 2021, NHTSA issued a standing general order that requires carmakers to report crashes involving cars that use advanced driver-assistance technology. In most cases, a crash involving an ADS-equipped vehicle must be reported within five calendar days after the company receives notice of it.
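The five-calendar-day clock is simple date arithmetic. Here's a sketch using a hypothetical notice date (the deadline rule comes from NHTSA's standing general order as described above; the specific date is made up):

```python
from datetime import date, timedelta

# Hypothetical date the company receives notice of a qualifying crash.
notice_received = date(2025, 7, 1)

# Under the standing general order, the report is generally due within
# five calendar days (not business days) of receiving notice.
report_due = notice_received + timedelta(days=5)
print(report_due)  # 2025-07-06
```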

As of July 3, 2025, the California Department of Motor Vehicles has received 833 autonomous vehicle collision reports. And, after a lengthy investigation of 956 crashes reported up to August 30, 2023, NHTSA concluded that a "critical safety gap" in Tesla's Autopilot system (SAE level 2) contributed to at least 467 collisions involving 15 deaths and 54 injuries.

New Data Suggest Safety Improvements

Data from robotaxi service Waymo suggest that driverless vehicles offer appreciable safety improvements over human-driven vehicles. Through March 2025, Waymo had driven 71 million driverless miles. Comparing accident data to that for human-driven vehicles in the same cities, Waymo had:

  • 88% fewer serious injury crashes
  • 79% fewer airbag deployment crashes
  • 78% fewer injury-resulting crashes, and
  • 93% fewer pedestrian collisions.
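Figures like "88% fewer" are rate comparisons, not raw counts. A minimal sketch of what such a reduction implies: the 71 million miles and 88% reduction come from the text above, but the baseline crash rate used here is purely a made-up placeholder, not actual human-driver data.

```python
# Hypothetical illustration of what an "88% fewer" rate comparison means.
# baseline_rate is a placeholder, NOT an actual human-driver crash rate.
baseline_rate = 1.0    # hypothetical serious-injury crashes per million miles (human drivers)
reduction = 0.88       # "88% fewer serious injury crashes" (from Waymo's comparison)
miles_millions = 71    # driverless miles driven through March 2025

automated_rate = baseline_rate * (1 - reduction)
expected_crashes = automated_rate * miles_millions
print(f"{automated_rate:.2f} crashes per million miles -> "
      f"~{expected_crashes:.1f} expected over {miles_millions}M miles")
```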

In addition, the nonprofit Insurance Institute for Highway Safety (IIHS) has published data showing the safety benefits of automated crash avoidance and safety features like frontal crash prevention systems, automatic braking, and lane departure warning and prevention technology.

Self-Driving Car Regulations: Recent Developments

NHTSA, the federal agency responsible for motor vehicle safety, is part of the U.S. Department of Transportation. NHTSA typically regulates the safety of vehicles and their components, while state and local governments regulate the use of those vehicles.

The federal government has announced a new initiative to encourage more rapid development of self-driving technology, including a regulatory framework designed to prioritize safety, promote innovation, and enhance deployment of automated vehicles. Among other things, the initiative expands on an existing regulatory exemption to allow for research and testing on American-made self-driving vehicles. Updates to federal motor vehicle safety standards are also expected.

According to the IIHS, as of July 2025, 36 states plus the District of Columbia have laws regulating specific aspects of autonomous vehicle testing or operation. To see the latest legal developments concerning self-driving cars in your state, check the National Conference of State Legislatures' autonomous vehicles legislative database.

Who's Legally Responsible for Self-Driving Car Crashes?

For now, human drivers are still behind the wheel of nearly all vehicles. Driver negligence is the basis of liability in the overwhelming majority of car accident insurance claims and lawsuits.

Emerging Legal Questions

As driverless cars move from experiment to reality, the foundations of our present-day liability system—negligence, product defects, and causation—will remain. But the law will have to make room for new rules as well.

For example, the day isn't far off when courts and juries will have to decide whether collision avoidance hardware and software properly detected a hazard, correctly assessed the nature and severity of the danger, and responded appropriately. Similarly, driverless cars must communicate wirelessly with servers and global positioning satellites. Who's to blame when a wreck happens because a satellite uplink fails or wireless service is interrupted?

The legal system will have to confront these and other questions in the coming years.

Potentially Liable Parties in a Self-Driving Car Accident

Liability for an accident involving a self-driving car will fall into one or more of these categories.

Vehicle operator. Car accidents that involve vehicles in the zone between fully manual and fully automated (SAE levels 2 and 3, for example) will mostly continue to be decided based on traditional negligence rules.

For example, say Chris is at the wheel of a Level 3 vehicle. The system alerts Chris to take control of the vehicle, but Chris is preoccupied with his phone and fails to react. If someone gets hurt in a crash, the injured person can sue Chris and get compensation (called "damages"). But if the automation system fails to alert Chris in time to avoid the accident, liability likely shifts to the maker of the car or autonomous driving system.

Vehicle owner. If the owner of a self-driving vehicle isn't the operator, the owner might share some responsibility for the accident. For example, Uber has partnered with Waymo to use driverless vehicles to pick up passengers and deliver food in certain areas. If someone is injured by a self-driving car hailed through the Uber platform, Uber could be on the hook for injuries and losses caused by the crash.

Vehicle manufacturer. When a self-driving car crash happens because of a defect in the vehicle's design or manufacturing, an injured person can sue the car's maker based on a "product liability" theory of fault. Product liability cases involving driverless vehicles will likely focus on defects in the automation systems' design (such as sensor placements), manufacturing, and inadequate instructions and warnings.

Software developers. Software plays a critical role in autonomous vehicles. Self-driving cars use a combination of sensors, cameras, radar, light detection and ranging (LIDAR), and artificial intelligence (AI) to operate without a human driver. To drive safely, the vehicle's software receives and processes data from all these systems and directs the car to speed up, slow down, avoid traffic, and more.

When a crash is caused by defective software, the developer might be legally responsible for injuries. Judges and juries will be asked to assess the sufficiency of software design, AI training data, and how both training and real-time data were processed in the split-seconds leading up to a crash.

Maintenance and service providers. The future will bring fleets of robotaxis and driverless delivery trucks. Vehicle manufacturers and others who are responsible for maintaining and servicing those fleets will share liability if something goes wrong due to maintenance or service shortcomings, including failures to timely update software systems.

Other people on the road. Not all accidents involving self-driving vehicles are caused by the self-driving vehicle. If someone else on the road—another driver, a bicyclist, or a pedestrian—negligently causes an accident, that person can be held liable for the damage. For example, if you're a passenger in a robotaxi that gets rear-ended, you'll make a claim against the human driver who hit you, not the robotaxi owner or maker.

Proving Fault in Self-Driving Car Accidents

Proving fault after a self-driving car accident is similar to proving fault for any other vehicle accident. Unlike the wrecks of yesteryear, though, computer data will help tell the story of how many of these accidents happened.

In 2023, the Institute of Electrical and Electronics Engineers (IEEE) published the world's first standard for data storage systems in automated vehicles. IEEE has said that the data generated must be available for crash investigations and for use by software and hardware designers to make improvements.

If you're involved in an accident with a self-driving car, getting your hands on sensor data, camera footage, and system logs could be the car accident evidence you need to prove who's to blame for your injuries.

Can Driverless Cars Get Traffic Tickets?

One potential way to prove that another driver was at fault for a car accident is to show that a driver was cited for a traffic violation. For example, if you're involved in a side-impact (t-bone) accident and the other driver gets ticketed for running a red light, you'll almost certainly be able to prove that the other driver caused the wreck.

But can the police ticket a vehicle with no driver?

In California, the answer is no. Under existing law, fully self-driving vehicles can't be cited for traffic violations. Even though driverless cars have blocked emergency responders, hit a cyclist, and failed to yield to pedestrians, there's not much law enforcement can do when these vehicles break the traffic rules in California.

But all that's going to change. Under a new law scheduled to take effect in the summer of 2026, police officers who witness traffic infractions by driverless cars will be able to issue "notices of autonomous vehicle noncompliance." The car's owner, in turn, must report all citations to the California DMV. More reliable data will allow the DMV to more closely monitor safety and traffic law compliance, and to better regulate driverless vehicle permits.

Other states have taken steps to update their traffic laws to evolve with technology. For example, in Texas, the owner of an automated driving system is the operator and can be cited for violating traffic laws regardless of whether the operator is physically present in the vehicle. (Tex. Transp. Code § 545.453 (2025).) The same is true in Arizona. (Ariz. Rev. Stat. § 28-9702 (2025).)

How Does Car Insurance Work for Self-Driving Cars?

Driverless cars will, without question, impact both car insurance law and the car insurance industry. At this point, two things are certain. First, state car insurance laws apply, and will continue to apply, to cars with and without automation systems. Second, while it's too early to predict how car insurance law and practice will change, we can anticipate some of the questions that need to be answered.

State Insurance Law Requirements

State insurance requirements for autonomous vehicles vary significantly. Many states don't distinguish between automated and non-automated vehicles for liability insurance purposes. About half of all states have enacted specific coverage requirements for Level 4 and Level 5 vehicles.

California, for example, requires a minimum of $5 million in liability insurance for Level 3, Level 4, and Level 5 vehicles. Nevada requires the same amount. In Florida, a fully autonomous vehicle must have liability coverage of at least $1 million. Coverage amounts of $1 million or $5 million are common in other states with minimum insurance requirements.

Questions About Insurance and Autonomous Vehicles

For the most part, state insurance laws center on traditional concepts of fault. All states require drivers to carry liability insurance to cover damages caused by their negligence. In a few states, no-fault laws add a step to the process, mandating that drivers first look to their own personal injury protection (PIP) insurance to pay for medical bills, lost wages, and sometimes other losses. As more driverless or fully-autonomous vehicles take to the roads, this fault-centric model will need to adapt.

For example, uninsured and underinsured motorist coverages pay for an injured person's damages when an at-fault uninsured or underinsured driver causes a collision. Will these coverages still apply if the uninsured or underinsured vehicle is fully automated or driverless? In such cases, the wreck might be caused not by operator negligence but by a defective sensor or software program, calling the basis for coverage into doubt.

For the same reason, no-fault laws might need to be re-examined. When fault is attributable to a defective satellite link or dropped wireless coverage, will no-fault PIP coverage still apply?

If data eventually show that self-driving cars reduce the number of car crashes, insurance rates for autonomous vehicles might go down. But for now, vehicles with autonomous driving features are more expensive to insure because they cost more to buy and repair, and because the risks of operating them are still uncertain.

Talk to a Lawyer

If you've been in an accident involving a car with an autonomous system, talk to a lawyer. The law in this area is complex and rapidly changing. A lawyer can help you figure out who you can sue for the accident and how to get the information you need to prove your case.

Learn more about how an attorney can help with your car accident claim. When you're ready, here's how to find a lawyer who's right for you.
