Jurisdiction – USA
Due to a lack of legislation and judicial precedent, there is no clear answer to the question of who is liable when an autonomous car is involved in a collision. This article analyzes the different approaches that can be adopted for attributing liability in such cases.
Last year, there were two reported instances of Tesla cars being involved in accidents while operating in autopilot mode – a collision in Florida which resulted in the death of the driver, and another in China where it is believed that the crash occurred while the autopilot was engaged. This raises the obvious issue of liability. Elon Musk has been quoted as saying that unless an accident occurred because of a design defect, Tesla would not be liable. Companies like Google, Mercedes and Volvo, on the other hand, have taken a completely different stand, stating that they would accept full liability whenever their cars are in autonomous mode.
This post will explain who would be liable if an autonomous car is involved in an accident.
Levels of Automation
Liability would depend on the degree of automation of the car – the more automated the car the higher the chances of the carmaker being held responsible for the accident. The Society of Automotive Engineers (SAE) has established an international standard for classification of automated driving systems –
- Level 0 (No Automation) – The driving of the car depends entirely on the human driver.
- Level 1 (Driver Assistance) – Has a single driver assistance system; automation is limited to steering or acceleration/deceleration.
- Level 2 (Partial Automation) – Has one or more driver assistance systems; both steering and acceleration/deceleration can be automated.
- Level 3 (Conditional Automation) – All aspects of driving are automated with the condition that the human driver will intervene if necessary. Such automation only extends to some driving modes.
- Level 4 (High Automation) – Automation extends to all aspects of driving without the need for human intervention. However, this can only be used in some driving modes.
- Level 5 (Full Automation) – Everything is automated in all driving modes. It most closely resembles a perfect human driver.
In Levels 0 – 2, the human driver is required to monitor the driving environment; in Levels 3 – 5, the automated driving system does so.
The Florida Crash
The automation in the Tesla car involved in the Florida crash was Level 2 or Level 3, which means that the driver was required to keep his hands on the wheel and be ready to take control of the car if the need arose. In response to the crash, Tesla posted an official statement on its website explaining that when the autopilot feature is turned on in a car, the driver is informed that the autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times.” The driver has to maintain control of and responsibility for the vehicle even while using the autopilot feature.
When the crash took place, the driver was watching a movie instead of being ready at the steering wheel. Though no lawsuit has been instituted against Tesla, if one were, the following arguments could be made under the current legal regime –
Strict Liability – The legal representatives of the deceased could argue that Tesla should bear strict liability for any accidents in which its cars are involved while in autopilot mode. Strict liability applies even when the manufacturer has exercised all possible care to create a safe product but the product nonetheless contains a defect. A manufacturer can be held strictly liable when it sells an article that proves to have a defect which causes injury to a human being [Greenman v. Yuba Power Products Inc.].
Accordingly, carmakers can be held liable for autopilot accidents even if they exercised due care while designing their technology. This would be especially relevant in Tesla’s case, since the company admits that the accident occurred because “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.” Despite Tesla’s best efforts to ensure that the technology was foolproof, the failure of the autopilot to detect the tractor trailer makes this a fit case for imposing strict liability on the manufacturer.
As mentioned above, Volvo, Mercedes and Google have already consented to a strict liability regime by accepting responsibility for accidents that occur when the car is in autonomous mode. The strict liability argument would be even stronger for high-level autonomous cars that operate completely independently of human drivers.
Design Defects – The claimants could also argue that the accident occurred because of a design defect. A product is considered defective in design when its potential harms could have been reduced or avoided by adopting a reasonable alternative design [Restatement (Third) of Torts]. The crux of this argument would be that the design of the car was defective in some manner, such as the failure of the autopilot to warn the driver early enough to take control and avoid the collision. Since the car was Level 2 or 3, its design could be considered defective if it requested the driver to take control without giving the driver adequate time to respond to the situation.
This argument, however, is unlikely to succeed after the National Highway Traffic Safety Administration (NHTSA) found in its investigation that the circumstances of the accident were outside the capabilities of the Autopilot and the Automatic Emergency Braking (AEB) system. Tesla had already provided warnings about the limitations of the technology and therefore required the driver to be engaged at all times (which also protects it from liability for misrepresentation). The report concluded that there were no safety-related defects in the design or performance of the AEB or Autopilot system.
Driving Towards Fully-Autonomous Cars
Within a decade, Level 4 and Level 5 fully-autonomous cars may predominate on our streets. This is good news, because self-driving cars could significantly improve road safety and save thousands of lives every year – a marked improvement on the roughly 1.2 million road deaths that occur worldwide annually.
As has been explained above, if the accident occurs due to a defect with the vehicle, strict liability could be imposed upon manufacturers of such vehicles. Manufacturers could also be held liable for negligence if it can be proved that due care was not exercised in designing the car. For instance, a case for negligence against the manufacturer can be made if the car had a faulty braking system which resulted in the collision.
Some have also argued that imposing strict liability on manufacturers of autonomous cars could slow innovation in this area. It has therefore been suggested that the user/owner of the car should be held liable instead – to make this case, a paper has drawn a parallel between liability for injury caused by a pet dog and by an autonomous car. As with pet dogs, the owner of the object causing the injury – the autonomous car – should be held liable for damage caused to third parties. The owner of the car can defray these costs through insurance, which would require the creation of an appropriate insurance framework covering autonomous cars [as has been suggested here].
The question of liability can only be accurately answered on the facts of each case. The legal principles stated in this article, however, need to be enshrined in legislation around the world so that this issue can be resolved conclusively.
This Brookings study provides a detailed exposition of how the products liability regime in the US can address the problem.
Note: “Driving mode” refers to different driving scenarios such as traffic jams, high-speed cruising, etc.