Published on Thu Apr 16 2020

Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity

Haim Kaplan, Yishay Mansour, Uri Stemmer, Eliad Tsfadia

We present a differentially private learner for halfspaces over a finite grid. The building block for our learner is a new algorithm for approximately solving the linear feasibility problem.

Abstract

We present a differentially private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $\approx d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a $d^2$ factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: Given a feasible collection of $m$ linear constraints of the form $Ax \geq b$, the task is to privately identify a solution $x$ that satisfies most of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution $x$.
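To make the iterative idea concrete, here is a minimal sketch of a coordinate-by-coordinate private solver. It is not the paper's actual algorithm; every design choice below is an assumption for illustration: candidate values come from one shared grid, each coordinate is selected with the exponential mechanism under an even $\epsilon/d$ budget split, and a candidate's score counts the constraints that remain satisfiable when the still-free coordinates range over the grid's bounding box.

```python
import numpy as np

def private_feasibility_sketch(A, b, grid, eps, rng=None):
    """Hypothetical sketch: privately build x one coordinate at a time.

    A : (m, d) array and b : (m,) array encoding constraints A @ x >= b.
    grid : 1-D array of candidate values, assumed shared by all coordinates.
    eps : total privacy budget; d rounds of (eps/d)-DP selection compose to
          eps-DP under add/remove of a single constraint (basic composition).
    """
    rng = np.random.default_rng() if rng is None else rng
    m, d = A.shape
    lo, hi = grid.min(), grid.max()
    eps_step = eps / d
    x = np.zeros(d)
    fixed = np.zeros(m)              # contribution of already-fixed coordinates

    for j in range(d):
        scores = np.empty(len(grid))
        for t, v in enumerate(grid):
            partial = fixed + A[:, j] * v
            # Largest value the remaining (free) coordinates can still add,
            # assuming each lies in [lo, hi]: take the best endpoint per term.
            rest = A[:, j + 1:]
            slack = np.maximum(rest * lo, rest * hi).sum(axis=1)
            # Score = number of constraints still satisfiable after x_j = v.
            scores[t] = np.sum(partial + slack >= b)
        # Exponential mechanism; score sensitivity is 1, since adding or
        # removing one constraint changes each count by at most 1.
        logits = eps_step * scores / 2.0
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        x[j] = grid[rng.choice(len(grid), p=probs)]
        fixed += A[:, j] * x[j]
    return x

# Toy usage: 3-D constraints satisfied by x = (1, 1, 1) by construction.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 3))
b = A @ np.ones(3) - 0.1
grid = np.linspace(-2.0, 2.0, 9)
x = private_feasibility_sketch(A, b, grid, eps=5.0, rng=rng)
print(x, "satisfies", int(np.sum(A @ x >= b)), "of", len(b), "constraints")
```

The box-relaxation score is a stand-in for whatever per-coordinate quality measure the paper actually uses; the point of the sketch is only the structure the abstract describes: a private selection step per coordinate, composed over $d$ iterations.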

Related Papers

Wed Feb 27 2019
Artificial Intelligence
Private Center Points and Learning of Halfspaces
We present a private learner for halfspaces over an arbitrary finite domain. The building block for this learner is a differentially private algorithm for locating an approximate center point. We also provide a lower bound on the sample complexity for privately finding a point in the convex hull.
Sun Feb 24 2019
Machine Learning
Efficient Private Algorithms for Learning Large-Margin Halfspaces
The sample complexity of our algorithms depends only on the margin of the data, and not on the dimension. We complement our results with a lower bound, showing that the dependence of our upper bounds on the margin is optimal.
Wed Feb 13 2019
Machine Learning
Differentially Private Learning of Geometric Concepts
We present differentially private efficient algorithms for learning union of polygons in the plane. Our algorithms achieve $(\alpha,\beta)$-PAC learning and $(\epsilon,\delta)$-differential privacy.
Fri Oct 25 2019
Machine Learning
Limits of Private Learning with Access to Public Data
The goal is to design a learning algorithm that satisfies differential privacy only with respect to the private examples. We show that any hypothesis class of VC-dimension $d$ can be agnostically learned up to an excess error of $\alpha$ using only (roughly) $d/\alpha$ public examples and $d/\alpha^2$ private labeled examples.
Mon Feb 10 2014
Machine Learning
Characterizing the Sample Complexity of Private Learners
In 2008, Kasiviswanathan et al. defined private learning as a combination of PAC learning and differential privacy. We give a combinatorial characterization of the sample size sufficient and necessary to privately learn a class of concepts $C$. We show that any private learning algorithm for $C$ with sample complexity $m$ implies $\mathrm{RepDim}(C)=O(m)$, and that there is a private learning algorithm with sample complexity $O(\mathrm{RepDim}(C))$.
Thu May 30 2019
Machine Learning
Private Hypothesis Selection
We provide a differentially private algorithm for hypothesis selection. We apply our hypothesis selection algorithm to give learning algorithms for a number of natural distribution classes. We describe an application to private distribution-free PAC learning.