NIST Finalizes Guidelines for Evaluating ‘Differential Privacy’ Guarantees to De-Identify Data




How can we glean useful insights from databases containing confidential information while protecting the privacy of the individuals whose data they contain? Differential privacy, a way of defining privacy in a mathematically rigorous manner, can help strike this balance. Newly updated guidelines from the National Institute of Standards and Technology (NIST) are intended to help organizations make the most of differential privacy’s capabilities.
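In the differential privacy literature (this is the standard academic definition, paraphrased here rather than quoted from the NIST publication), a randomized algorithm M is ε-differentially private if, for every pair of datasets D and D′ that differ in a single person’s record and for every set of possible outputs S,

    Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S].

The parameter ε bounds how much any one person’s data can change what the algorithm reveals; smaller values correspond to a stronger privacy guarantee.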

Differential privacy, or DP, is a privacy-enhancing technology used in data analytics. In recent years, it has been successfully deployed by large technology corporations and the U.S. Census Bureau. While it is a relatively mature technology, a lack of standards can create challenges for its effective use and adoption. For example, a DP software vendor may offer guarantees that if its software is used, it will be impossible to re-identify an individual whose data appears in the database. NIST’s new guidelines aim to help organizations understand and think more consistently about such claims.
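To make the idea concrete, here is a minimal sketch of the Laplace mechanism, one standard way to satisfy differential privacy for a counting query. The example is illustrative only; the dataset, function names, and parameter values are invented for this sketch and are not drawn from the NIST guidelines.

import numpy as np

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one person's
    # record changes the true count by at most 1, so Laplace noise with
    # scale 1/epsilon yields an epsilon-differentially-private answer.
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: a private estimate of how many people are 65 or older.
records = [{"age": 34}, {"age": 71}, {"age": 68}, {"age": 29}]
print(private_count(records, lambda r: r["age"] >= 65, epsilon=0.5))

With epsilon set to 0.5, the added noise has scale 2, so repeated runs of the query give slightly different answers. Navigating this trade-off between accuracy and privacy, and the claims vendors make about it, is the kind of decision the new guidelines are meant to help organizations approach consistently.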


IN CASE YOU MISSED IT


NIST Finalizes Updated Guidelines for Protecting Sensitive Information

May 14, 2024
To do business with the federal government, contractors and other organizations are required to follow NIST guidelines for protecting the sensitive information they handle.
