
Tesla has issued a recall of its Full Self-Driving software after US government regulators said the technology could allow vehicles to act unsafely and cause accidents on the road.

The electric car company said it is recalling certain Model S and Model X vehicles manufactured between 2016 and 2023 that are equipped with Full Self-Driving (FSD) Beta software, as well as those with installation pending.

The recall, posted in a notice on the National Highway Traffic Safety Administration (NHTSA) website, affects as many as 362,758 vehicles equipped with the AI-powered software.

“The FSD system may allow the vehicle to act unsafe around intersections, such as travelling through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal,” the notice from the NHTSA read.

Tesla sold 1.3 million vehicles last year, each of which came with a standard driver-assistance system branded Autopilot that still requires human input.

For an additional $15,000, owners can purchase FSD, which includes a number of AI-powered automated driving features but still requires the driver to be ready to take control of the vehicle at all times.  

CEO Elon Musk has long told investors that this technology will one day allow Tesla to deliver cars with fully autonomous driving capabilities as it becomes more advanced.

But the recently discovered problems in the AI-powered system indicate that such a day may be much further away than Musk once thought. 

Musk’s big conundrum 

The findings come as Tesla confronts declining growth in new-car deliveries and sales, slashing prices as demand no longer outpaces production. 

The safety issues with Tesla’s FSD system were discovered during the NHTSA’s engineering analysis of the technology. 

The regulator said it found that, in certain situations, Tesla's automated steering led to unreasonable risks to safety because of insufficient adherence to traffic safety laws.

Tesla did not agree with this analysis, but “decided to administer a voluntary recall out of an abundance of caution,” according to the notice. 

The tech firm plans to release a free, over-the-air software update to fix the problem. This update “will improve how FSD beta negotiates certain driving manoeuvres,” the notice said.

Musk did not respond directly to the decision but said in a tweet on Thursday that the media's use of the word "recall" for Tesla's update was "anachronistic and just flat wrong!"

The billionaire told reporters in Brazil last May that he believed Tesla would have self-driving cars without the need for human drivers behind the wheel for supervision around May 2023. 

“The Investigation Remains Open and Active”

The NHTSA said that the recall does not address the full scope of its investigation into Tesla’s autopilot and associated vehicle systems, which remains open and active. 

Tesla’s self-driving AI technology has long attracted scrutiny from regulators, who are concerned the technology does not yet meet road safety standards.


Richard Blumenthal and Ed Markey, Democrats on the Senate transportation committee, praised the recall, stating that: "Tesla must finally stop overstating the real capabilities of its vehicles."

"We have long warned that there are critical flaws with Tesla's software, including the rolling stops feature, which puts the public at grave risk," they explained in a statement.

In January, the electric automaker disclosed in its 10-K filing that the US Department of Justice (DoJ) had requested documents related to its FSD and Autopilot advanced driver-assistance systems.

The form highlighted that "no government agency in any ongoing investigation has concluded that any wrongdoing occurred." 

But data released by the NHTSA last June showed Tesla vehicles were involved in over 70 per cent of reported accidents among cars equipped with advanced driver-assistance systems.