The cases against automaker Tesla involving its Autopilot system are reportedly growing, fueling concerns over the technology's shortcomings, specifically the many car crashes it has been involved in. According to the New York Times, the incidents and the resulting lawsuits also call into question the development of similar systems used by rival carmakers.
The latest incident that is now the subject of a lawsuit against the company involves Benjamin Maldonado and his teenage son. As they drove back from a soccer tournament on a California freeway in August 2019, their Ford Explorer pickup was hit by a Tesla Model 3 traveling about 60 miles per hour on Autopilot, the system that can steer, brake and accelerate a car on its own. A six-second video captured by the Tesla, along with data the car recorded, shows that neither Autopilot nor the driver slowed the vehicle until a fraction of a second before the crash. Mr. Maldonado's 15-year-old son, Jovani, who had been riding in the front passenger seat without his seatbelt, was thrown from the vehicle and died.
As the Times noted in its article, “as cars take on more tasks previously done by humans, the development of these systems could have major ramifications — not just for the drivers of those cars but for other motorists, pedestrians, and cyclists.”
Tesla has built a brand and a devoted base of fans and customers around setting a new standard for electric vehicles, but the crashes involving Autopilot could threaten the company's standing and force regulators to take action against it.
The National Highway Traffic Safety Administration reportedly has about two dozen active investigations into crashes involving Autopilot. Since 2016, at least three Tesla drivers have died in crashes in which Autopilot was engaged and failed to detect obstacles in the road. In two instances, the system did not brake for tractor-trailers crossing highways. In the third, it failed to recognize a concrete barrier.
In June, the federal traffic safety agency released a list showing that at least 10 people have been killed in eight accidents involving Autopilot since 2016. However, that list did not include the crash that killed Jovani Maldonado.
Autopilot is not an autonomous driving system, but a suite of driver-assistance software that uses cameras and sensors to help prevent accidents by taking over many aspects of driving a car, even changing lanes. Tesla executives have claimed that handing these functions off to computers will make driving safer because human drivers are prone to mistakes and distractions. While Autopilot is in control, drivers can relax, but they are also supposed to keep their hands on the steering wheel and their eyes on the road, ready to take over in case the system becomes confused or fails to recognize objects or hazardous traffic scenarios.
Despite this, it seems drivers have been unable to resist the temptation to let their attention wander while Autopilot is on. The Times noted that videos of drivers reading or even sleeping at the wheel of Teslas have been posted on social media. The company, for its part, has often faulted the drivers of its cars, blaming them in some cases for failing to keep their hands on the steering wheel and their eyes on the road while using Autopilot.
However, the National Transportation Safety Board, which has completed investigations into accidents involving Autopilot, has reportedly said the system lacks safeguards to prevent misuse and does not effectively monitor drivers. And while the National Highway Traffic Safety Administration has not forced Tesla to change or disable Autopilot, in June it said it would require all automakers to report accidents involving such systems.
Similar driver-assistance systems offered by General Motors, Ford and other automakers use cameras to track a driver's eyes and issue warnings when the driver looks away from the road. After a few warnings, G.M.'s Super Cruise system shuts down and requires the driver to take control.
Earlier this year, Tesla reportedly told owners of some of its cars that a camera would begin to "detect and alert driver inattentiveness." But auto safety experts reportedly said the cars only sometimes issue warnings and do not appear to reliably turn off Autopilot when drivers remain distracted. The system sometimes continues operating even if drivers keep their hands on the steering wheel for only a few seconds at a time.
Despite all the problems with Autopilot, the company's chief executive, Elon Musk, has reportedly said on multiple occasions that Tesla is close to perfecting Full Self-Driving, a technology that would allow cars to drive autonomously in most circumstances. Other auto and technology companies have reportedly said that capability is still years away.