Tesla crash lawsuit alleges “systematic fraud” on autopilot • The Register
Five Texas residents have filed a lawsuit against Tesla and a local restaurant after an allegedly drunk driver smashed a Model X into the back of two parked police cars.
The complaint [PDF] accuses the company of "flaws in Tesla's safety features", the capabilities of which it says have been "vastly and irresponsibly overstated" to "inflate Tesla's share price and sell more cars".
According to the complaint, filed by the five police officers involved in the incident, the unnamed driver crashed his Tesla Model X at 70 mph (112 km/h) into the back of two police cars in February; the cars were parked after the officers had stopped a fourth vehicle to investigate suspected drug offenses.
Although there were no fatalities, the lawsuit alleges the officers were "seriously injured" and seeks compensation for "the serious injuries and permanent disabilities they suffered as a result of the accident," which saw the parked vehicles pushed towards "six people and a German Shepherd."
Canine officer Kodiak "had to visit the vet" while the five officers and a civilian were taken to hospital. The parked cruisers "were declared a total loss," the lawsuit says.
"Even though Autopilot was engaged at the time and the police cars had their flashing lights on, the Tesla failed to activate Autopilot's safety features to avoid the crash," the complaint continues. "The vehicle did not apply its 'automatic emergency braking' to slow down in order to avoid or mitigate the accident."
The lawsuit names two defendants: Tesla, for shipping what is alleged to be an overhyped and faulty safety system with a glaring blind spot for emergency vehicles with their flashing lights on; and Pappas Restaurants, over allegations that the Tesla driver had "consumed alcohol to the point where he was clearly intoxicated, and presented a clear danger to himself and to others" but "Pappasito's Cantina continued to serve him alcohol".
Not named in the lawsuit, likely due to an inability to contribute significantly to the $20 million in combined damages sought by the plaintiffs, is the driver of the vehicle, who apparently made the decision to drive while allegedly drunk, and failed to engage his own emergency braking system – his foot – before allegedly slamming into the back of the parked vehicles.
The lawsuit claims the Model X, and by extension all of the company's other vehicles, are faulty. "Tesla's claims [about Autopilot and Automatic Emergency Braking] have been shown to be vastly and irresponsibly overstated, if not outright false," the complainants allege, citing comments from both Tesla itself and CEO Elon Musk.
Musk appeared to react positively to the news that a couple had filmed themselves having sex in their Tesla while it was under Autopilot control in 2019 – an eventuality predicted by Canadian auto safety expert Barrie Kirk three years earlier.
"Tesla is engaged in systematic fraud to pump the Tesla share price and sell more cars, while hiding behind disclaimers telling drivers that the system cannot be trusted," the lawsuit alleges. "Tesla knows that Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves, which can result in serious injury or death."
While some would argue it is fairer to blame the driver for the crash, the lawsuit claims the problem is widespread – highlighting 11 other cases of Tesla vehicles slamming into the backs of parked emergency vehicles, apparently because the camera-based vision system was confused or even blinded by their flashing lights.
"It is inconceivable that defendant Tesla has not seen the publicly available reports of numerous accidents involving its vehicles and emergency vehicles equipped with flashing lights," the lawsuit reads, noting that Tesla's CEO even called one of the company's driver assistance systems "not great." Defendant Tesla, Inc and CEO Elon Musk were aware of many incidents involving the "Autopilot" system, it alleges, but did not recall the cars or fix the problem.
Tesla intentionally decided not to address these issues and should be held responsible, the lawsuit argues, especially given its detailed knowledge of the risks and dangers associated with its Autopilot system. Tesla has admitted that Autopilot does sometimes fail to identify a stopped emergency vehicle, yet it chose not to recall any of its cars despite knowing the system was faulty and posed an inherent risk of injury to the public, including first responders and Tesla drivers.
It's a charge that the United States' National Highway Traffic Safety Administration (NHTSA) Office of Defects Investigation is currently engaged with. Earlier this year, it announced an investigation into the apparent tendency of Tesla Model S, X, 3, and Y vehicles operating under Autopilot or Traffic-Aware Cruise Control to fail to identify and avoid parked emergency vehicles marked with "vehicle lights, flares, an illuminated arrow board, and road cones."
National Transportation Safety Board chairman Robert Sumwalt, meanwhile, summed up his organization's investigation into a fatal Model X crash in 2018 as revealing "system limitations" in Autopilot, stating that "it's time to stop allowing drivers in any partially automated vehicle to pretend that they have driverless cars."
In the face of such criticism, Musk's usual defense is that Tesla's Autopilot system is "ten times safer" than a vehicle under manual control, with one accident recorded per 4.19 million miles driven compared to the NHTSA average of one accident per 484,000 miles – a comparison the lawsuit alleges is "apples to oranges" because Autopilot is used almost exclusively on the freeway, while "a large percentage of crashes found in NHTSA data occur off the freeway."
"Also, when you consider that Musk excludes data where Autopilot was used immediately before a crash but was disengaged at some point before impact," the lawsuit continues, "Musk's claims are not just unconvincing, they are misleading."
The lawsuit seeks a jury trial against the two defendants, and $10 million in "actual damages for pecuniary loss, loss of wages, loss of earning capacity, mental anguish and past, present and future medical costs" plus an additional $10m in exemplary damages. However, it does not detail the complainants' actual injuries.
"Defendant Tesla, Inc designed, sold, and distributed the Autopilot system at issue. The product at issue was defective and unreasonably dangerous in its manufacture and marketing when it left the control of Tesla, Inc," the lawsuit concludes.
"The system in question did not perform safely, as an ordinary consumer would expect when using it in an intended and/or reasonably foreseeable manner. The risk of danger inherent in the design of the Autopilot system outweighed the benefits of that design. At all relevant times, and at the time of the injury, it was reasonably foreseeable to the defendant that the Autopilot system would malfunction."
Neither the plaintiffs' attorney nor Tesla responded to a request for comment by the time of publication. ®