What is Differential Privacy?
Differential privacy is a rigorous mathematical definition of privacy which guarantees that the inclusion or exclusion of any single individual's record does not significantly affect the output of a data analysis. This is achieved by adding carefully calibrated random noise to query results, or to the data itself, before release.
The strength of the guarantee is quantified by the epsilon parameter: lower epsilon values provide stronger privacy but reduce data utility. Differential privacy is increasingly used by technology companies and government agencies for census data, analytics, and machine learning, where aggregate insights are needed without identifying individuals.
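The "calibrated noise" idea can be illustrated with the Laplace mechanism applied to a counting query. The sketch below is a minimal, illustrative example (function names are hypothetical, not from any particular library): a count has sensitivity 1, so adding Laplace noise with scale 1/epsilon yields an epsilon-differentially-private release.

```python
import numpy as np

def dp_count(values, predicate, epsilon):
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one record changes
    the count by at most 1), so noise drawn from Laplace(scale=1/epsilon)
    suffices for an epsilon-DP guarantee.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative usage: count records with value >= 50 in a toy dataset.
ages = list(range(100))
noisy = dp_count(ages, lambda v: v >= 50, epsilon=1.0)
```

Note the privacy/utility trade-off: with epsilon=0.1 the noise scale is 10, so the released count is far less accurate than with epsilon=2.0 (scale 0.5), but the privacy guarantee is correspondingly stronger.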
Relevant Regulations
How IQWorks Helps
Related Terms
Data Anonymization
Anonymization irreversibly transforms personal data so that individuals can no longer be identified, even by the data controller, which generally takes the data outside the scope of privacy regulation.
Synthetic Data
Synthetic data is artificially generated data that statistically resembles real data but contains no actual personal information, useful for testing, development, and analytics.
Privacy-Enhancing Technologies (PETs)
PETs are technologies designed to protect personal data privacy while enabling data processing, analysis, and sharing for legitimate purposes.