Open Access · Book Chapter (DOI)

Our data, ourselves: privacy via distributed noise generation

TL;DR: This paper proposes distributed protocols for generating shares of random noise, secure against malicious participants; the noise generation serves to build a distributed implementation of the privacy-preserving statistical databases described in recent papers.
Abstract
In this work we provide efficient distributed protocols for generating shares of random noise, secure against malicious participants. The purpose of the noise generation is to create a distributed implementation of the privacy-preserving statistical databases described in recent papers [14,4,13]. In these databases, privacy is obtained by perturbing the true answer to a database query by the addition of a small amount of Gaussian or exponentially distributed random noise. The computational power of even a simple form of these databases, when the query is just of the form ∑if(di), that is, the sum over all rows i in the database of a function f applied to the data in row i, has been demonstrated in [4]. A distributed implementation eliminates the need for a trusted database administrator. The results for noise generation are of independent interest. The generation of Gaussian noise introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches (reduced by a factor of n). The generation of exponentially distributed noise uses two shallow circuits: one for generating many arbitrarily but identically biased coins at an amortized cost of two unbiased random bits apiece, independent of the bias, and the other to combine bits of appropriate biases to obtain an exponential distribution.
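The abstract's claim of generating arbitrarily biased coins at an amortized cost of two unbiased bits apiece rests on a classic trick: compare a stream of fair bits against the binary expansion of the bias, stopping at the first disagreement. Below is a minimal single-machine sketch of that trick (the function name `biased_coin` is illustrative; the paper's actual contribution is the secure, distributed, verifiable version of this computation, which is far more involved):

```python
import random

def biased_coin(p, rng=None):
    """Sample a coin with Pr[True] = p from fair bits.

    A uniform U in [0,1) is revealed bit by bit and compared to the
    binary expansion of p; the first position where the two expansions
    differ decides whether U < p. The expected number of fair bits
    consumed is 2, independent of p -- the amortized cost cited in
    the abstract.
    """
    rng = rng or random.Random()
    while True:
        # next bit of the binary expansion of p
        p *= 2
        p_bit = 1 if p >= 1 else 0
        p -= p_bit
        # next fair bit of U
        u_bit = rng.getrandbits(1)
        if u_bit != p_bit:
            # u_bit < p_bit means U < p: the coin comes up True
            return u_bit < p_bit
```

Each loop iteration halts with probability 1/2, so the expected number of iterations (and fair bits) is 2 regardless of how many bits of precision p has.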


Citations
Book

The Algorithmic Foundations of Differential Privacy

TL;DR: The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
Journal Article

Calibrating noise to sensitivity in private data analysis

TL;DR: The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
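Calibrating the noise scale to the sensitivity of f is the core of what is now called the Laplace mechanism. A minimal sketch, assuming a numeric query and using the fact that the difference of two i.i.d. exponentials is Laplace-distributed (the function name and signature are illustrative, not from the cited paper):

```python
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return true_answer perturbed by Laplace(0, sensitivity/epsilon) noise.

    Sensitivity is the maximum change any single row can induce in the
    query answer; scaling the noise to sensitivity/epsilon yields
    epsilon-differential privacy for that query.
    """
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    # difference of two i.i.d. Exponential(1/scale) draws is Laplace(0, scale)
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return true_answer + noise
```

For a counting query the sensitivity is 1, so `laplace_mechanism(count, 1.0, epsilon)` adds noise of scale 1/epsilon.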
Book Chapter (DOI)

Differential privacy: a survey of results

TL;DR: This survey recalls the definition of differential privacy and two basic techniques for achieving it, and shows some interesting applications of these techniques, presenting algorithms for three specific tasks and three general results on differentially private learning.
Proceedings Article (DOI)

Deep Learning with Differential Privacy

TL;DR: In this paper, the authors develop new algorithmic techniques for learning and a refined analysis of privacy costs within the framework of differential privacy, and demonstrate that they can train deep neural networks with nonconvex objectives, under a modest privacy budget, and at a manageable cost in software complexity, training efficiency, and model quality.
Proceedings Article (DOI)

Practical Secure Aggregation for Privacy-Preserving Machine Learning

TL;DR: In this paper, the authors propose a secure aggregation protocol for high-dimensional data in federated deep learning, which allows a server to compute the sum of large, user-held data vectors from mobile devices without learning any user's individual contribution.
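The central idea behind such secure aggregation is pairwise mask cancellation: each pair of users agrees on a random mask that one adds and the other subtracts, so the masks vanish in the server's sum. A toy single-process sketch (assumption: the masks here come from a shared RNG; the real protocol derives them via key agreement and handles dropouts with secret sharing):

```python
import random

def mask_inputs(values, rng=None, modulus=1 << 32):
    """Pairwise-mask each user's value so only the sum is recoverable.

    For every pair (i, j) with i < j, a shared random mask r is added
    to user i's value and subtracted from user j's, all modulo a fixed
    modulus. Individually the masked values look uniformly random, but
    the masks cancel when the server sums them.
    """
    rng = rng or random.Random()
    masked = [v % modulus for v in values]
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            r = rng.randrange(modulus)
            masked[i] = (masked[i] + r) % modulus
            masked[j] = (masked[j] - r) % modulus
    return masked
```

The server then computes `sum(mask_inputs(values)) % modulus`, which equals the sum of the original values modulo the modulus.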
References

On the Utility of Privacy-Preserving Histograms

TL;DR: This work develops a method for computing a privacy-preserving histogram sanitization of "round" distributions, such as the uniform distribution over a high-dimensional ball or sphere, and gives techniques for randomizing the histogram constructions for both the hypercube and the hypersphere.
Proceedings Article

On privacy-preserving histograms

TL;DR: The scope of the sanitizing techniques of Chawla et al. is extended to a broad and rich class of distributions, specifically mixtures of high-dimensional balls, spheres, Gaussians, and other "nice" distributions, allowing various quantities of interest to be approximated in a privacy-preserving fashion.
Proceedings Article (DOI)

Distributed pseudo-random bit generators—a new way to speed-up shared coin tossing

TL;DR: A new paradigm for obtaining shared coins is introduced, along with the construction of a distributed pseudo-random bit generator (D-PRBG) whose amortized cost per coin (computation and communication) is significantly lower than that of any "from-scratch" shared-coin generation protocol.