
Jeremiah Blocki

Researcher at Purdue University

Publications: 117
Citations: 2052

Jeremiah Blocki is an academic researcher at Purdue University. He has contributed to research on topics including passwords and computer science, has an h-index of 23, and has co-authored 103 publications receiving 1626 citations. His previous affiliations include Microsoft and Carnegie Mellon University.

Papers
Proceedings Article

Locally Differentially Private Protocols for Frequency Estimation

TL;DR: This paper introduces a framework that generalizes several LDP protocols proposed in the literature and yields a simple, fast aggregation algorithm whose accuracy can be precisely analyzed. Optimizing within the framework results in two new protocols that provide better utility than previously proposed protocols.
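As an illustrative sketch only (not the paper's actual protocols), generalized randomized response is one of the simplest LDP frequency-estimation mechanisms that fits such a framework: each user reports their true value with probability p and a uniformly random other value otherwise, and the aggregator inverts the perturbation to obtain unbiased frequency estimates. The function names and parameters below are my own.

```python
import math
import random

def grr_perturb(value, domain_size, epsilon, rng):
    """Generalized randomized response: report the true value with
    probability p = e^eps / (e^eps + d - 1), else a uniform other value."""
    p = math.exp(epsilon) / (math.exp(epsilon) + domain_size - 1)
    if rng.random() < p:
        return value
    other = rng.randrange(domain_size - 1)
    # Skip over the true value so the "lie" is uniform on the rest.
    return other if other < value else other + 1

def grr_aggregate(reports, domain_size, epsilon):
    """Unbiased frequency estimates: f_hat(v) = (count(v)/n - q) / (p - q),
    where q = (1 - p) / (d - 1) is the chance a wrong value maps to v."""
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + domain_size - 1)
    q = (1 - p) / (domain_size - 1)
    counts = [0] * domain_size
    for r in reports:
        counts[r] += 1
    return [(c / n - q) / (p - q) for c in counts]

# Example: 1000 users over a domain of size 3 with true frequencies 0.7/0.2/0.1.
rng = random.Random(0)
true_values = [0] * 700 + [1] * 200 + [2] * 100
reports = [grr_perturb(v, 3, 2.0, rng) for v in true_values]
estimates = grr_aggregate(reports, 3, 2.0)
```

The estimates recover the true frequencies up to sampling noise; the framework view in the paper is what lets such estimators' variance be analyzed and compared precisely.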
Proceedings ArticleDOI

Differentially private data analysis of social networks via restricted sensitivity

TL;DR: This paper introduces the notion of restricted sensitivity as an alternative to global and smooth sensitivity for improving accuracy in differentially private data analysis. It resembles global sensitivity, except that instead of quantifying over all possible datasets, it takes advantage of any beliefs a querier may have about the dataset to quantify over only a restricted class of datasets.
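A minimal sketch of why the choice of sensitivity matters (a generic illustration, not the paper's algorithm): the standard Laplace mechanism adds noise with scale sensitivity/epsilon, so replacing a large global sensitivity with a smaller restricted sensitivity directly shrinks the noise at the same privacy level. The function name below is my own.

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Release true_answer + Lap(sensitivity / epsilon).
    Calibrating to a smaller (e.g. restricted) sensitivity means less noise."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5
    # Inverse-CDF sampling of a Laplace(0, scale) variate.
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_answer + noise

# Example: the same query answered under a global sensitivity of 50
# versus a (hypothetical) restricted sensitivity of 1.
rng = random.Random(1)
noisy_global = laplace_mechanism(42.0, 50.0, 0.5, rng)
noisy_restricted = laplace_mechanism(42.0, 1.0, 0.5, rng)
```

On social-network queries such as subgraph counts, global sensitivity can be enormous, which is exactly the gap that restricting to plausible datasets is meant to close.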
Posted Content

The Johnson-Lindenstrauss Transform Itself Preserves Differential Privacy

TL;DR: This paper proves that an "old dog", namely the classical Johnson-Lindenstrauss transform, "performs new tricks": it gives a novel way of preserving differential privacy.
Proceedings ArticleDOI

The Johnson-Lindenstrauss Transform Itself Preserves Differential Privacy

TL;DR: In this article, the authors apply the Johnson-Lindenstrauss transform to the problem of estimating the number of edges crossing an (S, \bar{S})-cut in a graph.
Book ChapterDOI

Efficiently Computing Data-Independent Memory-Hard Functions

TL;DR: This paper defines a new complexity measure capturing the amount of energy (i.e., electricity) required to compute a function, and argues that, in practice, this measure is at least as important as the more traditional AT complexity.