US to investigate Tesla’s camera-only self-driving feature in 2.4mn EVs following 4 low visibility collisions

FP Staff October 21, 2024, 10:41:36 IST

The investigation will focus on whether Tesla’s FSD, which relies on a camera-only system, is failing under low-visibility conditions such as sun glare, fog, or dust clouds. Tesla began removing radar sensors from its vehicles in 2021 and ultrasonic sensors in 2022.

Tesla has faced legal and regulatory challenges in the past over its autonomous driving features. The FSD software was implicated in at least two fatal accidents. Image Credit: Reuters

The US National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla’s Full Self-Driving (FSD) software, covering over 2.4 million vehicles the EV giant has already sold, after four reported crashes in low-visibility conditions, one of them fatal.

The NHTSA’s investigation covers the 2016-2024 Model S and Model X, the 2017-2024 Model 3, the 2020-2024 Model Y, and the recently launched Cybertruck.

According to the NHTSA, one of the crashes resulted in a pedestrian fatality, while another caused injuries. The inquiry is still in its early stages, but if the vehicles are found to pose a safety risk, a recall could follow.

Tesla’s Full Self-Driving software, as described on the company’s website, requires active driver supervision and does not make the vehicle fully autonomous. However, the NHTSA is now reviewing whether the system’s engineering adequately accounts for reduced visibility on the road.

Investigators are seeking more information about similar incidents involving FSD and whether Tesla has modified the software to improve its performance in challenging weather conditions.

This scrutiny comes at a crucial time for Tesla, with CEO Elon Musk doubling down on autonomous technology and robotaxis amid growing competition and slowing demand for electric cars.

Last week, Tesla revealed a new “Cybercab” concept—an AI-powered, two-seater robotaxi without a steering wheel or pedals. However, the vehicle would need approval from regulators before it could be deployed.

Tesla has faced legal and regulatory challenges in the past over its autonomous driving features. The FSD software was implicated in at least two fatal accidents, including one earlier this year in Seattle, where a motorcyclist was killed by a Model S operating in self-driving mode.

Some industry experts have raised concerns about Tesla’s decision to rely solely on cameras for its autonomous systems, unlike competitors that supplement cameras with sensors such as radar and lidar to improve navigation in low-visibility conditions.

The NHTSA previously investigated Tesla’s Autopilot system, leading to a large-scale recall of over two million vehicles in December 2023. This new inquiry will assess whether Tesla’s recent updates to FSD are effective or if further action is required to ensure safety on the road.

With self-driving technology now in the spotlight, the outcome of this investigation could play a pivotal role in shaping Tesla’s future in autonomous mobility. The company has yet to comment on the matter, though its shares dipped slightly following the announcement.
