Liability for Hilton Head Car Accidents When Autopilot System was in Control
A driver was recently killed in an auto accident that drew an investigation by the National Highway Traffic Safety Administration. The circumstances of the crash were unique: the collision happened in a Tesla while its autopilot system was engaged.
The 40-year-old driver was likely watching Harry Potter, according to the LA Times, when his Tesla drove underneath a big rig that was turning left across a highway. The driver died, prompting NHTSA to investigate both this incident and other accidents involving Tesla's autopilot function.
What Happens After Car Accidents with Autopilot Engaged?
NHTSA determined that there was no need for a recall of Tesla cars after the investigation. The agency found that drivers are still ultimately responsible for ensuring that a car accident does not happen, even when the vehicle is equipped with an autopilot system.
Tesla cooperated with the investigation. The CEO of Tesla commented that there have been fewer accidents per mile driven in Tesla cars as compared with standard vehicles. The idea that automated driving systems are safer is a common one. Many argue that crashes will become less frequent as driverless cars take over, because automated systems can eliminate human error.
However, questions are being raised about what happens when accidents do occur. The key issue: will the manufacturer of the autopilot system be held liable for accidents, or will drivers?
On the current autopilot system, Tesla's on-screen instructions warn drivers that they remain ultimately in control of the vehicle. NHTSA affirmed this position when it indicated that drivers are still saddled with the ultimate responsibility for avoiding crashes.
However, Scientific American suggests that as autopilot systems advance and more cars become self-driving, and as car manufacturers start to market these cars as a driving alternative, the manufacturers could be the ones held liable when crashes happen.
According to Scientific American: "When a computerized driver replaces a human one, experts say the companies behind the software and hardware sit in the legal liability chain, not the car owner or the person's insurance company. Eventually, and inevitably, the car makers will have to take the blame."
If a car maker tells a consumer that the autopilot system can handle the driving, then it stands to reason that the car maker would have a duty to ensure the system actually prevents accidents effectively. A failure to do so could point to a defect in the system.
Proving this type of case could be very complicated, especially over the next few years as the law evolves and the apportionment of liability is hashed out in the courts. Crash victims will need an attorney who stays up to date on how the law develops regarding liability for crashes in cars featuring autopilot technology.