Criminals could hack into self-driving cars to launch terror attacks or frauds, insurers warn

  • Motor Insurers’ Bureau (MIB) has expressed concern about possible use of AVs
  • It outlines fears that terror attacks could become ‘much more common’

Insurers have warned that criminals could hack into self-driving cars and use them to carry out terror attacks – adding that they fear attacks like the one on Westminster Bridge in 2017 could become ‘much more common’.

The Motor Insurers’ Bureau (MIB) raised concerns about the potential misuse of automated driving systems in its submission on the regulatory framework for automated vehicles.

The submission forms part of the evidence presented to the Transport Select Committee as it examines how self-driving cars should be regulated before they are allowed on UK roads, and how these cars will interact with pedestrians and other road users.

The warning comes after the Department for Transport, which had aimed for automated driving systems to appear on roads in 2021, pushed the date back to 2023, with fully autonomous vehicles now expected to be in service by 2025.

But there are now serious concerns that these driving systems could be hacked and used to carry out ‘cash for crash’ fraud or, worse, terror attacks.

Khalid Masood drove a car into pedestrians on the pavement along the south side of Westminster Bridge and Bridge Street, London, on March 22, 2017, injuring more than 50 people – four of them fatally

While the Department for Transport aimed for them to hit the road in 2021, it has pushed the date back to 2023 and hopes fully autonomous vehicles will be in service by 2025

The MIB – which compensates victims of accidents involving uninsured and untraced drivers under agreements with the DfT – set out its concerns in detail to the select committee.

The MIB’s submission raised concerns that self-driving vehicles could be hacked to indicate that a car was at the scene of a collision when it was not.

And other insurers warned that hacked vehicles could be controlled remotely and could lead to incidents like the 2017 Westminster Bridge attack.

In the horror incident, on March 22, 2017, a Hyundai 4×4 drove over Westminster Bridge, near the Palace of Westminster in London, knocking down pedestrians.

Khalid Masood, a 52-year-old Briton, drove a car into pedestrians on the pavement along the south side of Westminster Bridge and Bridge Street, injuring more than 50 people – four of them fatally. He was then shot by an armed police officer and died at the scene.

On the use of hacked self-driving vehicles in terror attacks, the MIB said they could allow ‘terrorists to use it as a deadly weapon without endangering their own lives.

“This, in turn, could lead to a much higher number of horrific incidents, such as the 2017 terrorist attacks on London Bridge and Westminster Bridge.”

This is because an automated driving system (ADS) may be configured to allow remote control of a vehicle.

And in terms of ‘cash for crash’ offences, the MIB outlined that a hacked ADS could also be manipulated externally.

The MIB wrote: ‘The MIB is deeply concerned about the possibility of the Automated Driving System (ADS) being hacked and reset to leave a false and misleading data trail.

“For example, in an accident scenario, a hacked ADS could leave data indicating that the vehicle was at the scene of an accident when in fact it was nowhere near it, placing the blame on an innocent party.”

They added that there are several different fraud scenarios that this could apply to.

Meanwhile, remotely controlled automated vehicles could be used for fraud such as ‘cash for crash’ scams, which could involve either a staged accident, where fraudsters crash their own vehicles, or an induced accident, where innocent motorists are tricked into causing a collision.

The MIB report outlined: ‘Remotely controlled AVs (autonomous vehicles) can also be used for fraud, such as “crash for cash” incidents, where the perpetrator sets up a traffic accident to make a fraudulent claim against the other party’s insurance.’

In light of this, the MIB has advised regulators to ensure the UK’s digital infrastructure is up to the challenges posed by AVs.