Open Access · Book Chapter

Our data, ourselves: privacy via distributed noise generation

TL;DR: This paper proposes efficient distributed protocols for generating shares of random noise, secure against malicious participants; the noise generation is used to build a distributed implementation of the privacy-preserving statistical databases described in recent papers.
Abstract
In this work we provide efficient distributed protocols for generating shares of random noise, secure against malicious participants. The purpose of the noise generation is to create a distributed implementation of the privacy-preserving statistical databases described in recent papers [14,4,13]. In these databases, privacy is obtained by perturbing the true answer to a database query by the addition of a small amount of Gaussian or exponentially distributed random noise. The computational power of even a simple form of these databases, when the query is just of the form $\sum_i f(d_i)$, that is, the sum over all rows i in the database of a function f applied to the data in row i, has been demonstrated in [4]. A distributed implementation eliminates the need for a trusted database administrator. The results for noise generation are of independent interest. The generation of Gaussian noise introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches (reduced by a factor of n). The generation of exponentially distributed noise uses two shallow circuits: one for generating many arbitrarily but identically biased coins at an amortized cost of two unbiased random bits apiece, independent of the bias, and the other to combine bits of appropriate biases to obtain an exponential distribution.
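To make the query model concrete: the databases above answer queries of the form $\sum_i f(d_i)$ and perturb the true answer with a small amount of random noise. The sketch below is a minimal, centralized (trusted-curator) illustration of that idea, not the paper's distributed protocol; the names noisy_sum_query and sigma, and the choice of noise scale, are assumptions for illustration.

```python
import random

def noisy_sum_query(rows, f, sigma):
    """Answer the sum query sum_i f(d_i) with a small Gaussian perturbation.

    Minimal trusted-curator sketch; the distributed protocols in the paper
    produce shares of comparable noise without any trusted party.
    """
    true_answer = sum(f(d) for d in rows)   # sum over all rows i of f(d_i)
    noise = random.gauss(0.0, sigma)        # Gaussian noise with std. dev. sigma
    return true_answer + noise

# Example: a noisy count of rows whose "age" field exceeds 30 (sigma chosen arbitrarily).
database = [{"age": 34}, {"age": 51}, {"age": 29}]
print(noisy_sum_query(database, lambda d: 1 if d["age"] > 30 else 0, sigma=2.0))
```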


Citations
Posted Content

Lower Bounds for Locally Private Estimation via Communication Complexity

TL;DR: Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems, and it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.
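For a rough sense of scale, the stated minimax rate $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$ can be evaluated directly. The hypothetical helper below only illustrates how the bound grows with dimension and shrinks with sample size and privacy budget; universal constants are omitted.

```python
def ldp_mean_mse_lower_bound_scale(d, n, epsilon):
    """Evaluate d/n * d/min(eps, eps^2), the scaling of the stated minimax MSE bound.

    Hypothetical helper for illustration; constants are omitted, so this gives
    the order of the bound rather than an exact value.
    """
    return (d / n) * (d / min(epsilon, epsilon ** 2))

# Example: estimating a 100-dimensional mean from 10,000 samples at epsilon = 0.5.
print(ldp_mean_mse_lower_bound_scale(100, 10_000, 0.5))  # 4.0, up to constants
```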
Journal Article

A Survey on Differentially Private Machine Learning [Review Article]

TL;DR: This work provides a comprehensive survey of existing work that combines differential privacy with machine learning, so-called differentially private machine learning, and categorizes it into two broad categories according to the differential privacy mechanism used: the Laplace/Gaussian/exponential mechanism and the output/objective perturbation mechanism.
Proceedings Article

Differentially private subspace clustering

TL;DR: This work builds on the framework of differential privacy and presents two provably private subspace clustering algorithms; it demonstrates, via both theory and experiments, that one of the presented methods enjoys formal privacy and utility guarantees, asymptotically preserves differential privacy, and performs well in practice.
Posted Content

Extremal Mechanisms for Local Differential Privacy

TL;DR: In this paper, the authors studied the fundamental trade-off between local differential privacy and utility, and showed that for any utility function and any privacy level, the privacy-utility maximization problem is equivalent to solving a finite-dimensional linear program, the outcome of which is the optimal staircase mechanism.
Posted Content

Private Empirical Risk Minimization, Revisited.

TL;DR: This paper provides new algorithms and matching lower bounds for private ERM assuming only that each data point’s contribution to the loss function is Lipschitz bounded and that the domain of optimization is bounded, and implies that algorithms from previous work can be used to obtain optimal error rates.
References
Book Chapter

Calibrating noise to sensitivity in private data analysis

TL;DR: In this article, the authors show that for several particular applications substantially less noise is needed than was previously understood to be the case, and they prove separation results showing the increased value of interactive sanitization mechanisms over non-interactive ones.
Journal Article

The Byzantine Generals Problem

TL;DR: The Byzantine Generals Problem posed in this paper requires a group of generals, some of whom may be traitors, to agree on a common battle plan while communicating only by messenger; the traitors may send conflicting information to different parts of the group.
Book ChapterDOI

The Byzantine generals problem

TL;DR: In this article, a group of generals of the Byzantine army camped with their troops around an enemy city must agree upon a common battle plan; using only oral messages, this is shown to be possible if and only if more than two-thirds of the generals are loyal, so a single traitor can confound two loyal generals.
Journal Article

Calibrating noise to sensitivity in private data analysis

TL;DR: The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the maximum amount by which any single argument to f can change its output.
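A common instantiation of this calibration is to add Laplace noise whose scale is the sensitivity divided by the privacy parameter $\varepsilon$. The sketch below is a minimal illustration under that assumption; the name laplace_mechanism and the example values are illustrative, not taken from the paper.

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return true_value plus Laplace noise with scale sensitivity / epsilon.

    Sketch only: the sensitivity is the most any single row can change the
    query's output, so more sensitive queries (or smaller epsilon) get more noise.
    """
    scale = sensitivity / epsilon
    # A Laplace(0, scale) variate is the difference of two independent
    # exponential variates with mean `scale`.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

# Example: a counting query has sensitivity 1; release a count of 42 at epsilon = 0.1.
print(laplace_mechanism(42, sensitivity=1.0, epsilon=0.1))
```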
Proceedings Article

How to play ANY mental game

TL;DR: This work presents a polynomial-time algorithm that, given as input the description of a game with incomplete information and any number of players, produces a protocol for playing the game that leaks no partial information, provided a majority of the players are honest.