Tesla recalls 362,000 U.S. vehicles over Full Self-Driving software

WASHINGTON, Feb. 16 (Reuters) – Tesla Inc (TSLA.O) said it would recall 362,000 U.S. vehicles to update its Full Self-Driving (FSD) Beta software after U.S. regulators said on Thursday the driver assistance system did not adequately adhere to traffic safety laws and could cause crashes.

The National Highway Traffic Safety Administration said the Tesla software allows a vehicle to “exceed speed limits or proceed through intersections in an illegal or unpredictable manner, increasing the risk of an accident.”

Tesla will fix the vehicles with a free over-the-air (OTA) software update, and the electric vehicle maker said it is not aware of any injuries or deaths that may be related to the recall issue. The automaker said it had received 18 warranty claims.

Tesla shares fell 1.6% to $210.76 Thursday afternoon.

The recall affects 2016-2023 Model S, Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with FSD Beta software or pending installation.


NHTSA asked Tesla to recall the vehicles, and the company said that although it did not concur with NHTSA's analysis, it agreed to the recall. The move is a rare intervention by federal regulators in a real-world testing program the company sees as critical to the development of self-driving cars. FSD Beta is used by hundreds of thousands of Tesla customers.

The setback for Tesla’s automated driving efforts comes about two weeks before the company’s Investor Day on March 1, at which Chief Executive Elon Musk is expected to promote the EV maker’s artificial intelligence and plans to expand its vehicle lineup.

Tesla was not immediately available for comment.

NHTSA has had an investigation open since 2021 into 830,000 Tesla vehicles with the Autopilot driver assistance system, following a series of crashes involving parked emergency vehicles. NHTSA is reviewing whether Tesla vehicles adequately ensure that drivers are paying attention. The agency said Thursday that despite the FSD recall, its “investigation into Tesla’s Autopilot and related vehicle systems remains open and active.”

Tesla said that in “certain rare circumstances … the feature could potentially violate local traffic laws or be used while performing certain driving maneuvers.”

Potential situations in which the problem could occur include traveling straight through or turning at certain intersections during a yellow traffic light, and changing lanes out of certain turn-only lanes to continue straight, NHTSA said.

NHTSA said, “The system may not respond adequately to changes in posted speed limits or may not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits.”

Last year, Tesla recalled nearly 54,000 U.S. vehicles with FSD Beta software that allowed some models to perform “rolling stops” without coming to a complete stop at some intersections, posing a safety risk, NHTSA said.

Tesla and NHTSA say FSD’s advanced driving features do not make the cars autonomous and that drivers must remain attentive.

Reporting by David Shepardson in Washington Additional reporting by Joseph White in Detroit Edited by Ben Klayman, Peter Henderson and Matthew Lewis


