What does the AI plan mean for NHS patient data and is there cause for concern?

UK ministers have committed to setting up a national data library for building artificial intelligence models, as part of an AI action plan.

The library will contain state-audited data, compiling at least five “high-impact” public datasets. Prime Minister Keir Starmer indicated on Monday that patient data from the National Health Service could be part of the library.

Health data is a sensitive issue in an age of criminal hackers, cyber espionage by rogue states, and general concerns about the robustness of AI tools. Here we answer some questions about the potential use of NHS data.


What does the AI action plan say about health data?

The plan, written by tech investor Matt Clifford, does not explicitly refer to health data but calls for a national data library that can be used by tech startups and researchers to train new models.

However, Starmer was more explicit on Monday about the use of NHS data, saying there was a “huge opportunity” to improve healthcare. He said: “I don’t think we should take a defensive stance here that will get in the way of the kind of breakthroughs we need.”

NHS trusts have already used patient data to develop AI models that predict conditions such as high blood pressure and eye disease.


What are the concerns about using health data for AI?

Personal health data is by nature highly sensitive and its vulnerability in a digital environment has already been highlighted by recent ransomware attacks that have hit NHS trusts.

Andrew Duncan, the director of fundamental AI at Britain’s Alan Turing Institute, says even anonymised health data can be manipulated to identify a patient through a process known as “re-identification”, in which supposedly anonymised data is linked to other available information to identify someone.

“Once you start narrowing things down, you can easily re-identify people,” he says. Duncan adds that AI models can be trained in a way that prevents re-identification, although “the caveat is that this all has to be done very carefully.”

MedConfidential, which campaigns for confidentiality in healthcare, also wants clarity on whether a health dataset will respect patients who have signed an opt-out preventing their data being used for research and planning in England. Around 6% of NHS patients have signed the opt-out.


Will the data be used for commercial purposes?

The plan states that public and private datasets will enable “innovation by UK startups”, indicating that private companies will have access to the material. Government officials have not ruled out the data being used for profit.

However, the plan makes it clear that ministers and officials must take into account issues such as “public trust, national security, privacy, ethics and data protection”.

In 2017, a partnership between the NHS and a private AI company ran into trouble with the UK data regulator, which found that London’s Royal Free hospital had failed to comply with data protection law when it transferred the personal data of 1.6 million patients to Google’s AI unit, DeepMind. The data transfer was part of a trial of a system for diagnosing acute kidney injury.


What could the data be used for?

In his speech on Monday, Starmer used the example of AI being deployed during a medical emergency last year to identify the exact location of a blood clot in the brain of a stroke victim. He said patient data could be used through AI to “predict and prevent” strokes in the future.

The AI studies being carried out by NHS trusts also point to a range of applications, from predicting which patients are most likely to visit A&E to identifying people at risk of type 2 diabetes.


Is using the data legal?

Anonymised data does not fall under the General Data Protection Regulation (GDPR), so using the data would be less legally problematic if it fell into that category.

If the data is not fully anonymised, the GDPR would apply, as would the common law duty of confidentiality, meaning the patient’s consent would be required to use it – although there is a public interest exception.

Kate Brimsted, a partner at the law firm Shoosmiths, said: “True and effective anonymisation means the UK GDPR would not apply. It would also overcome confidentiality restrictions. However, achieving robust anonymisation is not an easy task; it is much more complex than simply removing names and other obvious identifiers.”