Tesla’s Autopilot system is coming under fire again for its safety record – here’s why

Despite a recent court ruling in Tesla's favor over its Autopilot semi-autonomous driving feature, a new report has revealed problems that run deeper than driver irresponsibility.

In previous court proceedings, the jury agreed that ultimate responsibility for Tesla's Autopilot and Full Self-Driving Beta technology lay with the human behind the wheel.

However, The Washington Post recently obtained exclusive footage of a 2019 crash in which a Tesla ran through a T-intersection at around 75 mph and slammed into a parked vehicle, killing one of its occupants and seriously injuring another.

In exclusive police body camera footage obtained by the outlet, the shaken driver says he was “driving” and had taken his eyes off the road when he dropped his phone.

While driver inattention is the most obvious problem here, The Post also points out that this incident is one of several fatal or serious accidents involving Tesla Autopilot in road scenarios where the system was never intended to be used.

According to research conducted by The Post, Tesla has acknowledged that Autosteer, Autopilot's key feature, is “intended for use on controlled-access highways” with “a center divider, clear lane markings and no cross traffic.” This has been noted in manuals, legal documents, and even communications with federal regulators.

In fact, according to Tesla's own website, it is “extremely unlikely that using Autosteer will work as intended” when “visibility is poor (heavy rain, snow, fog, etc.) or when weather conditions interfere with sensor operation”.

Hills, curves, “excessively rough” roads, bright lighting conditions, and toll booths can have similarly negative effects on the technology.

Even though the company has the technical capability to limit Autopilot's availability by geography, Tesla has taken few definitive steps to restrict use of the software, The Post claims.

Whistleblower expresses further concerns about Tesla's technology

(Image credit: Tesla)

In addition to The Post's independent findings, a former Tesla employee recently told the BBC that he doesn't believe the Autopilot hardware or software is ready yet.

Lukasz Krupski leaked data, including customer complaints about Tesla's braking and self-driving software, to the German newspaper Handelsblatt in May, after his concerns were ignored by his employer.

According to the BBC, Mr Krupski said he had found evidence in company records showing that requirements regarding the safe operation of vehicles with a certain level of autonomous or assistive driving technology had not been met.

“It affects us all because we are essentially experiments on public roads. So even if you don't have a Tesla, your kids are still walking on the sidewalk,” he told the BBC.

According to reporting from The Post, the US National Highway Traffic Safety Administration (NHTSA) has yet to take action on Autopilot's shortcomings, despite strongly worded advice from its counterpart agency, the National Transportation Safety Board (NTSB).

In an interview with The Post earlier this year, NTSB Chair Jennifer Homendy said, “If the manufacturer isn't going to take safety seriously, it's up to the federal government to make sure they stand up for others to ensure safety.”

“Safety doesn't seem to be the priority when it comes to Tesla,” she added.
