getting around

Did Tesla Just Admit Its Self-Driving Feature Is Dangerous?


On the recommendation of the National Highway Traffic Safety Administration, Tesla will recall 362,758 vehicles equipped with its self-driving feature, known as Full Self Driving. In documents posted to its website on Thursday, NHTSA officials said the feature “may allow the vehicle to act unsafe around intersections,” citing problems negotiating lane changes and proceeding through a “stale” yellow light, and warned that it “could increase the risk of a collision if the driver does not intervene.” NHTSA’s notice also said Tesla had received 18 warranty claims that could be related to the recall. Tesla disputed that assessment, according to Bloomberg, but agreed to deploy a software update “out of an abundance of caution.” (As with many of Tesla’s recalls, the software update is the recall; no cars will have to be physically returned.)

The recall is only a small part of a wider investigation NHTSA opened in 2021 to probe Tesla crashes that occurred while Full Self Driving mode was engaged. The investigation also covers the previous version of the company’s driver-assist software, known as Autopilot. NHTSA has identified 273 crashes involving Teslas in self-driving mode and is currently investigating 41 of them, including 14 that resulted in fatalities. For years, safety officials, including state DMV leaders and a federal transportation advisor, have criticized the company’s self-driving features because they encourage users to test an unproven and unregulated autonomous-driving technology on public streets. Even the names “full self driving” and “autopilot” are not technically accurate: Tesla’s own disclaimer says “the currently enabled features do not make the vehicle autonomous” and warns that drivers should be “prepared to take over at any moment.” But only in California is Tesla no longer allowed to use the term “full self driving.”

Videos recorded by users show Teslas in self-driving mode failing to stop and swerving into oncoming traffic, and drivers have reported that the malfunctioning software has caused crashes, including an eight-vehicle pileup in San Francisco last year. But this particular action taken by federal officials seems to have been prompted by a tweet (of course) in which a Tesla driver suggested that the “steering wheel nag” — a monitoring system that confirms drivers are paying attention by telling them to put their hands on the wheel — be removed for Full Self Driving beta testers who have driven over 10,000 miles. Elon Musk replied: “Agreed. Update coming in Jan.”

The recall affects every vehicle that currently has the Full Self Driving beta installed — software that cost those Tesla owners $15,000. The company is also facing a class-action lawsuit from Full Self Driving customers who claim it doesn’t work as advertised. (Tesla’s lawyers argued it’s not fraud but “failure.”) Tesla will be required to address the problem with an over-the-air update by April 15. As of this writing, Musk has not yet made a comment through his public-relations firm (Twitter).
