The King has presented the first bills and legislative plans of the newly elected Labour government to the British Parliament, including several relating to technology.
Among them, the King said his government “will seek to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models.”
The government also plans to introduce a Cyber Security and Resilience Bill to strengthen the cyber resilience of public bodies such as the Ministry of Defence (MoD) and the NHS.
A step towards regulation, or a brake on innovation?
The response from experts to the announcement of both bills was mixed, but one point drew consistent agreement: public services in the UK rely on outdated and insecure IT systems, leaving both those services and the public at greater risk of disruption and data leaks.
In January 2024, a report revealed that the MoD has 11 ‘red-classified’ IT systems that are vulnerable to cyber attacks and data breaches, or are simply too inefficient or unsuitable to use. In May, the MoD was reportedly hit by a Chinese state-sponsored cyber attack in which the personal information of current and former employees was stolen.
In the same month, a report found that the British public has little confidence in NHS cybersecurity and data handling, with almost half (49%) believing the service could mishandle their data and more than four in five (82%) saying they are concerned in some way about cyber attacks on NHS systems. In June, Synnovis, a pathology provider for Guy’s and St Thomas’ NHS Foundation Trust, was hit by a cyber attack in which patients’ blood test data was stolen.
Publicly available information from a subsequent board meeting revealed that concerns had been raised about third-party service providers, with directors indicating that an IT modernisation programme was needed to improve the NHS’s cyber resilience.
Commenting on the unveiling of the Cyber Security and Resilience Bill, Dominic Trott, Director of Strategy and Alliances at Orange Cyberdefense, said: “Any steps to further strengthen our defences and protect more essential digital services than ever before are to be welcomed. Over the past year, we have seen a series of attacks on organisations that provide critical services to the UK. In the healthcare sector, for example, the pressures facing hospitals have been exacerbated by the growing threat from cybercriminals who have brazenly targeted the critical systems of the most vulnerable.”
“According to our own data, there were 69 cyber extortion attacks on healthcare companies in the first quarter of this year, an increase of more than 100% compared to the first quarter of 2023. To combat this, organisations must optimise access to skills, implement appropriate processes and use technology appropriately to achieve cyber resilience. It is encouraging to see that the bill will update the existing regulatory framework by expanding its scope to protect supply chains, which are an increasingly important threat vector for attackers,” Trott concluded.
AI growth
As for the AI bill, experts are concerned that if passed too quickly, the bill could impose overly strict regulations on a technology that has already shown the potential to improve productivity and efficiency. They also fear that the legislation could unfairly stifle innovation.
David Shepherd, Senior Vice President of EMEA at Ivanti, said: “The announcement of intentions to legislate on AI is a positive step, but the details matter. For regulation to truly succeed and drive innovation while ensuring safety, clear, transparent and globally consistent guardrails are crucial, particularly when it comes to protecting workers – a key focus of the new Labour government.”
“While regulation cannot be rushed, timely action is essential,” he added. “Delays could lead to an increase in AI bias and ethical issues such as potential job losses, a concern the new administration clearly wants to address.”
“As Labour’s plans unfold, concrete regulatory detail will be essential. To ensure that no group is disproportionately affected, the development process must include diverse views to ensure that AI is set up for businesses and workers to thrive harmoniously. This commitment will be critical to the success of the AI Bill and ultimately to the safety of workers and the overall success of business.”