Published on Mon Jul 24 2017

Per-instance Differential Privacy

Yu-Xiang Wang

We consider a refinement of differential privacy --- per-instance privacy. We show that it is a strict generalization of standard DP. An individual has stronger privacy if he/she has a small "leverage score" with respect to the data set.

Abstract

We consider a refinement of differential privacy --- per-instance differential privacy (pDP), which captures the privacy of a specific individual with respect to a fixed data set. We show that this is a strict generalization of standard DP and inherits all of its desirable properties, e.g., composition, invariance to side information, and closure under post-processing, except that they all hold for every instance separately. When the data is drawn from a distribution, we show that per-instance DP implies generalization. Moreover, we provide explicit calculations of the per-instance DP for output perturbation on a class of smooth learning problems. The result reveals an interesting and intuitive fact: an individual has stronger privacy if he/she has a small "leverage score" with respect to the data set and if he/she can be predicted more accurately using the leave-one-out data set. Our simulations show a privacy-utility trade-off that is several orders of magnitude more favorable when we consider the privacy of only the users in the data set. In a case study on differentially private linear regression, we provide a novel analysis of the One-Posterior-Sample (OPS) estimator and show that when the data set is well-conditioned it provides (ε, δ)-pDP for any target individual and matches the exact lower bound up to a multiplicative factor. We also demonstrate how a "pDP-to-DP conversion" step can be used to design AdaOPS, which uses adaptive regularization to achieve the same results with (ε, δ)-DP.
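To make the leverage-score intuition concrete, here is a minimal sketch (not from the paper) of the statistical leverage score h_i = x_i^T (X^T X + λI)^{-1} x_i for each row of a design matrix. The abstract's claim is that individuals with small leverage enjoy stronger per-instance privacy; the regularization strength `lam` and the synthetic data below are illustrative assumptions.

```python
import numpy as np

def leverage_scores(X, lam=1e-3):
    """Ridge-regularized leverage score h_i = x_i^T (X^T X + lam*I)^{-1} x_i
    for every row x_i of X. Low leverage corresponds, per the paper's
    intuition, to a smaller per-instance privacy loss."""
    d = X.shape[1]
    G_inv = np.linalg.inv(X.T @ X + lam * np.eye(d))
    # Batched quadratic form x_i^T G_inv x_i for every row i.
    return np.einsum('ij,jk,ik->i', X, G_inv, X)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
X[0] *= 10.0  # one outlying, high-leverage individual
h = leverage_scores(X)
# The outlier's leverage dwarfs the typical row's, so under the paper's
# analysis it would receive a weaker per-instance privacy guarantee.
print(h[0], h[1:].mean())
```

For the unregularized hat matrix, the scores sum to the rank of X (here 5), which gives a quick sanity check on the computation.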

Wed Dec 24 2014
Machine Learning
Differential Privacy and Machine Learning: a Survey and Review
The objective of machine learning is to extract useful information from data, while privacy is preserved by concealing information. We explore the interplay between machine learning and differential privacy. We also describe some theoretical results that address what can be learned differentially privately.
Wed Mar 07 2018
Machine Learning
Revisiting differentially private linear regression: optimal and adaptive prediction & estimation in unbounded domain
We revisit the problem of linear regression under a differential privacy constraint. We propose simple modifications of two existing DP algorithms. Both AdaOPS and AdaSSP outperform the existing techniques on nearly all data sets.
Fri Nov 22 2019
Machine Learning
Privacy-preserving parametric inference: a case for robust statistics
Sat May 16 2020
Machine Learning
Near Instance-Optimality in Differential Privacy
Thu Sep 19 2019
Machine Learning
Differentially Private Regression and Classification with Sparse Gaussian Processes
A continuing challenge for machine learning is providing methods to perform computations on data while ensuring the data remains private. In this paper we build on the provable guarantees of differential privacy, which has previously been combined with Gaussian processes.
Thu Dec 05 2019
Machine Learning
Element Level Differential Privacy: The Right Granularity of Privacy
Differential Privacy (DP) provides strong guarantees on the risk of compromising a user's data in statistical learning applications. We propose element level differential privacy, which extends DP to provide protection against leaking information about any particular "element" a user has.