
Leav and Steinberg Team

Self-Driving Cars: Autopilot-Assist Vehicles and the Legal Liability Discussion

As an attorney who frequently represents car accident victims, the recent news of two Tesla vehicles that crashed while in self-driving or autonomous mode raised a lot of questions.  From the attorney's perspective, we are taught to evaluate an accident by applying two major elements.  The first is negligence: did the operator of the motor vehicle drive in a way that was unreasonable under the circumstances?  The failure to operate your car in a reasonable manner is the first element.  The second element is proving that this failure was a substantial factor, or proximate cause, of the accident.  The lines are blurred when we consider modern technology; we must now ask whether the vehicle itself was negligent.

Tesla advises all of its owners that its Autopilot feature DOES NOT mean the vehicle is operating itself.  However, marketing and news stories alike describe the benefits of placing the car in Autopilot mode and allowing the car and its sensors to detect the road, vehicles, and obstructions around it.  Given this dichotomy, one must ask whether the warning given is sufficient to free Tesla of any responsibility.

Joshua Brown’s death was the first reported fatality involving a vehicle operating in Autopilot mode.  This month, his vehicle collided with an 18-wheeler when, amid sun glare, the Autopilot features failed to react in a reasonable and timely manner.  The National Transportation Safety Board is investigating the accident with a focus on whether the vehicle’s Autopilot features were at fault.

It appears Tesla CEO Elon Musk anticipated this years ago, telling drivers to keep their hands on the wheel because they will be accountable if the car crashes while on Autopilot.  As with most navigation systems and other car features, buyers must activate the Autopilot software, and when doing so they must acknowledge that the technology is a beta platform and is not meant to be used as a substitute for the driver.  When the NTSB began investigating this recent accident, Tesla reiterated its position, stating: “Autopilot is an assist feature. You need to maintain control and responsibility of your vehicle.”

As an attorney, I see such statements as a feigned effort at best to protect the company from legal liability.  Certainly a driver has to understand that the features are ever evolving.  But given the marketing tactics and the recurrent advertisements suggesting that the Autopilot feature allows the vehicle to take over the responsibility of driving, where will the line be drawn as to whether the driver or the car itself is responsible?  I suggest a hybrid approach: given the facts of individual cases and accidents, it is certainly possible to argue that the manufacturer of a self-driving assisted vehicle can and should bear responsibility.

Just as with many so-called smart features, such as anti-lock brakes and electronic stability control, telling drivers in fine print or in a three-second pre-activation warning that Autopilot might not prevent an accident won’t help Tesla in court if the technology is found to be defective.  Simply giving warnings is no excuse for a design problem.

In addition to the Florida fatality, several accidents around the country have been reported.  In a Pennsylvania accident, Tesla reported that the Autopilot feature had not been turned on, despite the driver believing he had activated it.  In a Montana accident, the driver claims the car lost control and hit a guardrail; there, Tesla stated that evidence shows the driver had not had his hands on the steering wheel for more than two minutes.  Though I appreciate the need to defend a position, if the company is going to market its car as having an Autopilot feature and encourage its use, how can it then defend the car’s failure by claiming the driver let the Autopilot feature operate for too long?

Last Thursday, Consumer Reports called on Tesla to disable Autopilot on more than 70,000 vehicles.  “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports.

Hopefully, states will continue to develop and pass laws that protect all parties involved.  However, because the safety of the individual is paramount, we must make sure that the advancement of technology is not pushed so quickly as to sacrifice the safety of those on the road.