Tesla Autopilot System Death Toll Rises

The death toll from accidents involving the failure of Tesla’s much-maligned Autopilot semi-autonomous driving technology has reached 17, according to new data from the National Highway Traffic Safety Administration.

Tesla’s semi-autonomous Autopilot tech came under suspicion after this January 2018 crash.

The Washington Post said it gleaned the rising numbers from reports compiled by NHTSA. Despite the rising number of fatalities, Tesla CEO Elon Musk continues to defend the two technologies, Autopilot and Full Self-Driving, routinely prodding Tesla owners to use them. 

“There’ll be a little bit of two steps forward, one step back between releases for those trying the beta. But the trend is very clearly towards full self-driving, towards full autonomy. And I hesitate to say this, but I think we’ll do it this year. So that’s what it looks like. Yes,” he said during a conference call with analysts and investors back in April when asked about the status of FSD, which is the more advanced of the two systems.

Autopilot under scrutiny

However, Autopilot is at the center of an ongoing federal safety investigation. The Post reported over the weekend there have been 736 crashes and 17 fatalities in the U.S. since 2019 involving Teslas in Autopilot mode, far more than previously reported.

The figures come from the Post’s analysis of NHTSA data, which also showed Teslas were involved in the vast majority of the more than 800 accidents tallied in the report.

Tesla’s two semi-autonomous driving technologies are or have been under investigation by state and federal authorities.

The number of such crashes has surged in the past four years, the data shows, reflecting the hazards associated with increasing use of Tesla’s driver-assistance technology as well as the growing presence of Teslas on the nation’s highways as sales of the company’s electric vehicles have steadily increased, according to the Post.

“When authorities first released a partial accounting of accidents involving Autopilot in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 of them since May 2022, and five serious injuries,” the Post reported.

Tesla, which has no media relations or public relations department, had no comment on the Post report.

Tesla ducks responsibility

While the Tesla website carries a disclaimer stating, “Current Autopilot features require active driver supervision and do not make the vehicle autonomous,” the company’s branding has been accused of misleading drivers about their vehicles’ capabilities. 

Tesla also recently prevailed in a lawsuit in which the plaintiff tried to blame the company’s Autopilot program for a 2019 crash. 

This Model X crash reveals the shortcomings of Autopilot and, potentially, other similarly named advanced driver-assistance systems.

The jurors in the California case found the software wasn’t at fault in a crash where the car turned into a median on a city street while Autopilot was engaged. They essentially upheld the legal precedent, developed during the past century of motoring, that any human driver is responsible for the operation of their vehicle.

Critics argue that by choosing names like Autopilot and Full Self-Driving, Musk and Tesla give drivers a false sense of security. Other automakers that now offer similar technology, such as General Motors, Ford and Mercedes-Benz, are careful to avoid hyping the safety benefits of their driver-assistance features, which allow the driver to remove their hands from the wheel under certain circumstances.

Musk has insisted cars using FSD are safer than those piloted solely by human drivers, citing crash rates when the two modes of driving are compared, a claim no other automaker makes. 

But the Post found four of the fatal accidents involved a motorcycle, while another involved an emergency vehicle, which, in theory, the system has been taught to avoid.

Musk also has repeatedly defended his decision to push driver-assistance technologies to Tesla owners, arguing that the benefit outweighs the harm.

“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they know.”

Last year, though, the “Dawn Project” bought a full-page ad in the New York Times that described “Full Self-Driving” software as “the worst software ever sold by a Fortune 500 company.”


