In Massachusetts, the Arlington Board of Selectmen has acknowledged that the future of driving likely lies in autonomous vehicles. The town recently authorized the town manager to pave the way for its streets to serve as a testing ground for autonomous, or self-driving, cars. Currently, two autonomous car companies operate in the Boston area: nuTonomy and Optimus Ride. A number of companies are applying to test their vehicles in Arlington, which requires them to submit a test plan for approval.
To date, 21 states have enacted laws related to self-driving vehicles. Uber, Google, and Tesla have been testing and operating autonomous vehicles in California and elsewhere, and Tesla vehicles even have a self-driving feature. But all self-driving vehicles still require a human driver behind the wheel for safety purposes in case the vehicle malfunctions. So far, there have been only a handful of mishaps, although last year a driver behind the wheel of one was killed in California.
Are Autonomous Cars Safer?
According to the National Highway Traffic Safety Administration (NHTSA), human error accounts for 94% of all accidents. The causes in the overwhelming majority of reported accidents include speeding, drunk driving, distracted driving, unsafe lane changes, running red lights, and other conduct related to human factors. An autonomous vehicle is designed to eliminate human error as the main factor in an accident. Although no machine is perfect, experts and futurists hope that car accidents will eventually become nonexistent, at least if all vehicles become autonomous.
The NHTSA, which is the sole federal agency that approves new vehicle technologies, has adopted six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation). Levels 1 and 2 are primarily human-driven, while Level 5 is total automation. At Levels 1 and 2, any accident caused by the vehicle is considered the fault of the human driver. At the higher levels, the car manufacturer is to be held liable.
Liability in an Autonomous Car Accident
As indicated, no autonomous vehicle can operate in any state without a human driver behind the controls. If the vehicle fails to brake in time, runs a red light, speeds, or fails to notice a pedestrian in a crosswalk, the human driver is supposed to spring into action and take over the controls. If, for some reason, the driver was not paying attention, fell asleep, or was otherwise distracted, then fault will inevitably fall on the driver, although both the driver and the injured party would likely make claims against the vehicle manufacturer as well.
Manufacturers could presumably seek to minimize their liability by clearly instructing those behind the wheel, who must be licensed drivers, to stay focused on driving and be prepared to take over immediately if the vehicle malfunctions.
But technology may eventually produce vehicles programmed never to speed, tailgate, or run red lights and stop signs, and to identify pedestrians and other hazards on the road. When that happens, what responsibility, if any, will lie with the human occupant? Will the law still require a person to be capable of taking control of the car if a malfunction occurs, even if the likelihood of that happening falls far below 1%? Or will the technology allow occupants to drink, text, read, eat, and sleep, relieved of any and all responsibility for how the car operates, since that seems to be the point?
But totally autonomous vehicles with approved total safety programming are not quite here yet. Until then, primary liability for an accident caused by a malfunction should fall on the manufacturer, though a driver could still be held at fault if he or she failed to act in time once the malfunction became apparent and still had some control over the car (Level 1 or 2).
Both driver and manufacturer are required to have liability insurance at this time, but when cars become fully autonomous, total liability should fall on the manufacturer. Insurance premiums may not be that expensive if predictions about how safe these vehicles will be pan out. Further, although states are responsible for regulating their own motorists and vehicles, this role could shift to the federal government, since the NHTSA issues regulations on new technologies that must be uniform and apply to all vehicles.
However, states will still be issuing licenses to drivers of human-driven vehicles, which will likely be around for decades to come. Observers seem to think that manufacturers will pay more in insurance, but most future accidents should be the fault of humans driving their own vehicles.
Although accidents involving autonomous vehicles are few, they can still occur due to a defect or malfunction. Injury claimants should seek an autonomous car accident lawyer to advise them on liability issues and the responsible parties. If the parties disagree on the cause of the accident, it may still take independent witnesses, video recordings, an investigation of the scene, and an examination of the autonomous vehicle to uncover the alleged malfunction. Your autonomous car accident lawyer will also still have to prove your damages.
Damages in an Autonomous Vehicle Accident
Your damages depend on the nature and extent of your injuries. All must be proved by credible and sufficient supporting documentation and witness testimony to satisfy the standard of proof. Usual damages in a car accident claim consist of:
- Past and future medical expenses
- Past and future income loss
- Diminished or lost earning capacity
- Permanent disability
- Permanent disfigurement
- Pain and suffering
- Diminished enjoyment of life
- Emotional trauma
- Spousal loss of consortium
Retain the Law Office of Burns and Jain
Accidents involving autonomous vehicles are rare, since few of these vehicles are on our city streets and highways. This may well change in the next few years, and the autonomous car accident lawyers at the Law Office of Burns and Jain will be there to assist you. Call our office for an in-depth analysis of any injury claim you may have.