NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths



The National Highway Traffic Safety Administration (NHTSA) has concluded its investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that led to 14 deaths. The agency ruled that these accidents were due to driver misuse of the system.

However, the NHTSA also found that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software didn’t prioritize driver attentiveness. Drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged,” because Tesla “did not adequately ensure that drivers maintained their attention on the driving task.”

The agency investigated nearly 1,000 crashes from January 2018 through August 2023, accounting for 29 total deaths. The NHTSA found that there was “insufficient data to make an assessment” for around half (489) of these crashes. In some incidents, the other party was at fault or the Tesla drivers weren’t using the Autopilot system.

The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path,” and these were often linked to Autopilot or FSD. These incidents led to 14 deaths and 49 serious injuries. The agency found that in 78 of these incidents, drivers had enough time to react but didn’t. These drivers failed to brake or steer to avoid the hazard, despite having at least five seconds to make a move.

That’s where complaints against the software come into play. The NHTSA says that drivers would simply become too complacent, assuming that the system would handle any hazards. When it came time to react, it was too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the agency wrote. The imbalance between driver expectation and the operating capabilities of Autopilot resulted in a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”

The NHTSA also took issue with the branding of Autopilot, calling it misleading and suggesting that it lets drivers assume the software has total control. Rival companies, by contrast, tend to use branding with words like “driver assist.” Autopilot indicates, well, an autonomous pilot. California’s attorney general and the state’s Department of Motor Vehicles are also investigating Tesla for misleading branding and marketing.

Tesla, on the other hand, says that it warns customers that they need to pay attention while using Autopilot and FSD. The company says the software features regular indicators that remind drivers to keep their hands on the wheel and eyes on the road. The NHTSA and other safety groups have said that these warnings do not go far enough and were “insufficient to prevent misuse.” Despite these statements, CEO Elon Musk recently promised that the company will go “balls to the wall for autonomy.”

The findings may represent only a small fraction of the actual number of crashes and accidents related to Autopilot and FSD. The NHTSA indicated that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes.” Tesla only receives data from certain types of crashes, with the NHTSA estimating that the company collects data on around 18 percent of crashes reported to police.

With all of this in mind, the agency has opened a new investigation into Tesla. This one looks into a software fix issued in December after two million vehicles were recalled. The NHTSA will evaluate whether the Autopilot recall fix that Tesla implemented is effective enough.




