US to probe Tesla’s ‘Full Self-Driving’ system after pedestrian killed

DETROIT – The US government’s highway safety agency is once again investigating Tesla’s “Full Self-Driving” system, this time after reports of accidents in poor visibility, including one in which a pedestrian was killed.

The National Highway Traffic Safety Administration said in documents it opened the investigation Thursday, with the company reporting four crashes after Teslas entered areas of poor visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash resulted in injuries, the agency said.

Investigators will examine whether “Full Self-Driving” is able to “detect and appropriately respond to reduced visibility conditions on the road, and if so, what conditions caused these accidents.”

The investigation covers approximately 2.4 million Teslas from model years 2016 to 2024.

A message seeking comment was left early Friday with Tesla, which has repeatedly said the system cannot drive itself and that human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk said the company plans to have fully autonomous vehicles without human drivers next year, and to have robotaxis available by 2026.

The agency also said it would investigate whether other similar crashes involving “Full Self-Driving” have occurred in low-visibility conditions, and that it will seek information from the company on whether any software updates have affected the system’s performance under those conditions.

“Specifically, this review will assess the timing, purpose and capabilities of such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla has recalled “Full Self-Driving” twice under pressure from the agency, which sought information from police and the company in July after a Tesla using the system hit and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at low speeds and because the system violated other traffic rules.

Critics have said Tesla’s system, which uses only cameras to detect hazards, lacks the sensors needed to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor-visibility conditions.