NIST drafts privacy protection guidance for AI-driven research
The US National Institute of Standards and Technology, tasked with implementing parts of the Biden administration's Executive Order on AI, released draft guidelines on Monday for evaluating data privacy protections for use with artificial intelligence.
WHY IT MATTERS
NIST announced the new guidance on differential privacy guarantees, stating on its website that its goal is to help data-centric organizations balance privacy and accuracy.
“Differential privacy is one of the more mature privacy-enhancing technologies used in data analytics, but a lack of standards can make it difficult to deploy effectively, potentially creating a barrier for users,” the agency said in its announcement.
To illustrate, NIST presented a “thorny situation”: health researchers would like to access consumer fitness tracker data to help improve medical diagnostics.
“How do researchers obtain useful and accurate information that can benefit society while keeping individual privacy intact?”
NIST said the Draft NIST Special Publication (SP) 800-226, Guidelines for Evaluating Differential Privacy Guarantees, is designed for federal agencies, as directed by the order, but it is also a tool for software developers, business owners, and policymakers to “understand and think more consistently about differential privacy claims.”
The guidance grew out of last year's prize challenges for privacy-enhancing technologies, which had a combined US-UK prize pool of $1.6 million for using federated learning to generate new cryptography to keep data encrypted during AI model training.
PETs can be used with new cryptography to tackle money laundering and predict the locations of public health emergencies. More than seventy solutions were subjected to red-team attacks to see if the raw data could be protected.
“Privacy-enhancing technologies are the only way to solve the dilemma of how to harness the value of data while protecting people's privacy,” Arati Prabhakar, assistant to the president for science and technology and director of the White House Office of Science and Technology Policy, said at a White House announcement about the winners in March.
Differential privacy, which uses a central aggregator or multiple aggregators to add statistical noise to data, is still maturing, says Damien Desfontaines, staff scientist at differential privacy company Tumult Labs, on his personal blog.
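The central-aggregator approach the article describes can be illustrated with the classic Laplace mechanism: clamp each person's value to a known range so one individual's influence on a statistic is bounded, then add noise scaled to that bound. The sketch below is a minimal, hypothetical illustration of NIST's fitness-tracker scenario; the function names, step-count range, and epsilon value are illustrative assumptions, not drawn from the draft publication.

```python
import random
import statistics

def laplace_noise(scale, rng=random):
    # Laplace(0, scale) sampled as the difference of two exponentials.
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def private_mean(values, lower, upper, epsilon, rng=random):
    """Differentially private mean of `values` via the Laplace mechanism.

    Clamping each value to [lower, upper] bounds the "sensitivity" of the
    mean: adding or removing one person changes it by at most
    (upper - lower) / n. Noise scaled to sensitivity / epsilon then hides
    any single individual's contribution.
    """
    clipped = [min(max(v, lower), upper) for v in values]
    sensitivity = (upper - lower) / len(clipped)
    return statistics.fmean(clipped) + laplace_noise(sensitivity / epsilon, rng)

# Hypothetical fitness-tracker data: daily step counts for 1,000 users.
steps = [random.randint(2_000, 20_000) for _ in range(1_000)]
print(private_mean(steps, lower=0, upper=25_000, epsilon=1.0))
```

Smaller epsilon values add more noise (stronger privacy, less accuracy); larger values do the reverse, which is exactly the privacy-versus-accuracy balance the NIST guidance aims to help organizations reason about.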
There are risks, according to Naomi Lefkovitz, manager of NIST's Privacy Engineering Program and editor of the draft.
“We want this publication to help organizations evaluate differential privacy products and get a better idea of whether the claims made by their makers are accurate,” she said in the NIST announcement.
The competition showed “that differential privacy is the best method we know of for providing robust privacy protection against attacks after the model has been trained,” Lefkovitz said.
“It won't prevent all types of attacks, but it can add a layer of defense.”
This places the responsibility on developers to assess whether a privacy guarantee holds in practice.
To properly assess a claim of differential privacy, they must understand multiple factors, which NIST has identified and organized into a “differential privacy pyramid.” The top level contains the most direct measures of privacy guarantees, the middle level contains factors that can undermine differential privacy guarantees, and the bottom level consists of underlying factors such as the data collection process.
NIST is asking for public comment through January 25, 2024, and the final version is expected to be published later next year.
THE BIG TREND
Powerful AI models built with quantum computing expand the attack surface for organizations, such as those in healthcare, that store large amounts of protected data. Any use of encrypted protected health information could be vulnerable to such an attack.
In September, NIST released draft standards for quantum-resistant cryptography, soliciting feedback on three algorithms designed to withstand quantum-powered cyberattacks, with comments due just before Thanksgiving.
In the not-too-distant future, quantum computers could quickly crack today's encryption.
“The devil is in the details,” says Dan Draper, founder and CEO of CipherStash.
Protecting data that relies on public key cryptography is of primary importance, he explained in a preview of data encryption trends for 2024.
“There are organizations that are capturing a lot of encrypted traffic – secure messages, secure Zoom calls,” to store now for future nefarious use.
Draper told Healthcare IT News last month that while the final NIST quantum-safe public key cryptography standards are still being finalized, they “look promising” in their ability to defend against quantum attacks.
He also noted that despite the progress, there looms a Y2K-like race to update software for quantum security.
“We're going to have to rush to update that quickly,” he said.
ON THE RECORD
“We're showing the math involved, but we're trying to focus on making the document accessible,” Lefkovitz said in the announcement. “We don't want you to have to be a math whiz to use differential privacy effectively.”
Andrea Fox is editor-in-chief of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a HIMSS Media publication.