DETROIT — Tesla's recall of more than 2 million of its electric vehicles — an effort to help drivers using its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.
Tesla, the leading electric vehicle maker, reluctantly agreed to the recall last week after a two-year investigation by the U.S. National Highway Traffic Safety Administration found that Tesla's system for monitoring drivers was defective and needed a fix.
The system sends alerts to drivers when it fails to detect torque from their hands on the steering wheel, a method experts describe as ineffective.
Government documents filed by Tesla say the online software change will increase warnings for drivers to keep their hands on the wheel. It could also limit the areas where the most commonly used versions of Autopilot can be used, although that isn't entirely clear in Tesla's documents.
NHTSA began its investigation in 2021 after receiving 11 reports of Teslas using the partially automated system crashing into parked emergency vehicles. Since 2016, the agency has sent investigators to at least 35 crashes in which Teslas, suspected of running on a partially automated driving system, struck parked emergency vehicles, motorcyclists or tractor-trailers that crossed the vehicles' paths, resulting in a total of 17 deaths.
But research from NHTSA, the National Transportation Safety Board and other researchers shows that simply measuring steering wheel torque does not guarantee that drivers are paying enough attention. Experts say night vision cameras are needed to monitor drivers' eyes and make sure they are looking at the road.
“I do have concerns about the solution,” said Jennifer Homendy, chair of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor-trailers. “The technology, the way it worked, including with the steering torque, was not enough to keep the driver's attention, and the driver was distracted.”
Additionally, an NHTSA study found that in 37 of the 43 crashes it examined with detailed data, drivers had their hands on the wheel in the last second before impact yet still crashed, a sign that hands on the wheel alone do not mean a driver is paying attention.
“People are bad at monitoring automated systems and intervening when something goes wrong,” said Donald Slavik, an attorney for plaintiffs in three lawsuits against Tesla over Autopilot. “That is why the human factors studies have shown a significantly delayed response under these conditions.”
Missy Cummings, a professor of engineering and computer science at George Mason University who studies automated vehicles, said it is widely accepted by researchers that monitoring hands on the wheel is insufficient to keep a driver's attention on the road.
“It's a measure of attention and it's a bad measure of attention,” she said.
A better solution, experts say, would be to require Tesla to use cameras to monitor drivers' eyes to make sure they are looking at the road. Some Teslas do have inward-facing cameras, but they don't see well at night, unlike those in General Motors' or Ford's driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.
Koopman noted that older Teslas do not have such cameras.
Tesla's recall documents say nothing about increased use of cameras. Tesla, which does not have a media relations department, did not answer emailed questions about its software release notes or other recall issues.
Tesla's website states that Autopilot and more advanced “Full Self Driving” software cannot drive themselves and that drivers must be prepared to intervene.
Experts say that while limiting Autopilot's capabilities to controlled-access highways would help, it's unclear whether Tesla will do so with the recall.
In the recall documents it filed with NHTSA, Tesla says the standard Autopilot includes systems called Autosteer and Traffic Aware Cruise Control. The documents state that Autosteer is intended for use on highways and will not work if a driver activates it under the wrong conditions. The software update, the documents say, will have “additional controls when enabling Autosteer and while using the feature off controlled-access highways and when approaching traffic controls.”
Cummings noted that the documents don't specifically say Tesla will restrict Autopilot to limited-access highways, a practice known as geofencing.
“When they say conditions, it doesn't say geofence anywhere,” she said.
Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that were not controlled-access roads while testing a Tesla Model S that received the software update. But it's difficult, she said, to test everything else in the recall because Tesla has been vague about what exactly is changing.
Homendy, the NTSB chair, said she hopes NHTSA has reviewed Tesla's solution to determine whether it does what the agency intended.
The NTSB, which can only make recommendations, will investigate if it sees a problem with Teslas that received the recall repairs, Homendy said.
Veronica Morales, communications director for NHTSA, said the agency does not pre-approve recall solutions because federal law puts the burden on the automaker to develop and implement repairs. But she said the agency is keeping its investigation open and will monitor Tesla's software and hardware solutions to make sure they work, testing them at NHTSA's research and testing center in Ohio, where it has several Teslas available.
The agency only received the software update on its vehicles a few days ago and has yet to evaluate it, Morales said. The solution should also address crashes on all roads, including highways, the agency said.
Cummings, a former NHTSA special counsel who will be an expert witness for the plaintiff in an upcoming Florida lawsuit against Tesla, said she expects Tesla's added warnings will deter only a small number of drivers from abusing Autopilot. The company's problems, she said, won't end until it limits where the system can operate and fixes its computer vision system so it can better detect obstacles.