A driver’s test for autonomous vehicles? A leading expert says US should have one

ANN ARBOR, Mich. — At a time of growing concern about the safety of self-driving vehicles, a leading expert is calling on the federal government to develop a national driving test that such vehicles must pass before being allowed to drive on public roads.

Such a regulation would set minimum standards to ensure that the vehicles demonstrate basic skills and competence in traffic situations in which their manufacturers intend to use them, said Henry Liu, who directs the University of Michigan’s Autonomous Vehicle Testing Center.

“Ensuring safety is important for consumers, for autonomous vehicle developers and also for the federal government,” Liu said in an interview. “The federal government has a responsibility to help set the minimum standard, to provide guidance on safety testing.”

Autonomous vehicles have been involved in a number of high-profile accidents in recent years, and studies have shown widespread public uncertainty about their safety. Successfully testing the vehicles’ ability to master a variety of traffic situations would boost public confidence in them, according to Liu.

Liu said much research is still needed before autonomous vehicles can be safely rolled out nationwide. But he said he agreed with their manufacturers that self-driving vehicles could potentially save lives and improve the efficiency of the nation’s transportation system in the long term.

Currently, there are no specific federal regulations for self-driving vehicles, and only a few states have their own such requirements. The National Highway Traffic Safety Administration, part of the Department of Transportation, has collected data on crashes involving autonomous vehicles. But so far it has only issued voluntary guidelines that do not include driving tests.

Messages seeking comment were left with the Transportation Department on Tuesday.

Self-driving cars still must meet federal safety standards that apply to all passenger vehicles, meaning the government investigates them only after serious incidents.

“Our current vehicle safety regulations are reactive, so we rely on self-regulation,” Liu said.

At the University of Michigan’s testing center, Liu runs a mock town called Mcity, complete with a traffic light and a roundabout, that companies and the government use to test self-driving vehicles.

A regulation, or perhaps a voluntary test, is needed because “we don’t want to create a public danger,” said Liu, who made his comments Tuesday as he announced that Mcity can now be used remotely by researchers.

Liu suggested that a driving test should be able to determine whether a self-driving vehicle can turn left at an intersection without the protection of a traffic light with a green arrow. He said it should also ensure that the vehicle stops at a stop sign and detects and yields to a small pedestrian crossing the road.

A test, he said, would prevent an underperforming robotic vehicle from being unleashed on society, just as a test of a human driver would keep an incompetent driver off the road. But he acknowledged that no single test can prevent all accidents involving self-driving vehicles.

The driving tests, Liu said, would help developers of robotic vehicles “so that when they move to the US, in certain cities, they will face less resistance from the cities.”

Tesla CEO Elon Musk has long complained that federal regulations hinder innovation. Tesla is developing a robotaxi system called “Full Self-Driving,” but the system cannot yet drive itself, and Tesla owners who use it must be prepared to intervene at any time.

Liu said basic driving standards would actually contribute to innovation and improve the deployment of autonomous vehicles. If companies are confident enough to deploy their systems at scale, he said, a basic competency test should be a “small piece of cake” to pass.

“So why might this be a barrier to deployment?” he asked.

Europe and China, Liu noted, already have basic tests involving third-party testing of autonomous vehicles. But the US has continued to rely on corporate self-certification.

Liu said he is proposing the driving test now because autonomous vehicles are making progress in using machine learning to make decisions on the road. He predicts they will be widely deployed on American roads in five to 10 years.

“Large-scale implementation is on the horizon, which is why the federal government must take action,” Liu said.

Waymo, the autonomous vehicle unit of Alphabet Inc., is already transporting passengers in vehicles without human safety drivers in Phoenix and other areas. General Motors’ Cruise self-driving unit had been running robotaxis in San Francisco until a crash last year involving one of its vehicles.

Aurora Innovation also said it will begin hauling freight in fully autonomous semis on Texas highways by the end of the year. Another autonomous semi company, Gatik, plans to transport freight autonomously by the end of 2025.

Autonomous vehicle crashes in recent years include a 2018 incident in which an Uber self-driving SUV with a human backup driver struck and killed a pedestrian in Arizona, and another in which an autonomous Cruise Chevrolet Bolt dragged a pedestrian to the side of the road, causing serious injuries. The pedestrian had been struck by a human-driven vehicle and ended up in the Bolt’s path.
