
Showing papers by "Moni Naor published in 2010"


Proceedings ArticleDOI
05 Jun 2010
TL;DR: This work identifies the problem of maintaining a counter in a privacy-preserving manner and shows its wide applicability to many different problems.
Abstract: Differential privacy is a recent notion of privacy tailored to privacy-preserving data analysis [11]. Up to this point, research on differentially private data analysis has focused on the setting of a trusted curator holding a large, static data set; thus every computation is a "one-shot" object: there is no point in computing something twice, since the result will be unchanged, up to any randomness introduced for privacy. However, many applications of data analysis involve repeated computations, either because the entire goal is one of monitoring, e.g., of traffic conditions, search trends, or incidence of influenza, or because the goal is some kind of adaptive optimization, e.g., placement of data to minimize access costs. In these cases, the algorithm must permit continual observation of the system's state. We therefore initiate a study of differential privacy under continual observation. We identify the problem of maintaining a counter in a privacy-preserving manner and show its wide applicability to many different problems.
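To make the counter problem concrete, here is a minimal Python sketch of the standard binary (dyadic-interval) mechanism for releasing a running count of a 0/1 stream under differential privacy. It is one textbook approach to continual counting rather than necessarily the paper's exact construction or parameters, and the function name and interface (continual_counter, epsilon) are illustrative. Each stream element affects at most one partial sum per dyadic level, so the error of every release grows only polylogarithmically in the stream length.

```python
import math
import random

def continual_counter(stream, epsilon):
    """Differentially private running count of a 0/1 stream.

    A sketch of the binary (dyadic-interval) mechanism: each prefix count is
    assembled from at most one noisy partial sum per dyadic level, so every
    stream element affects only O(log T) noisy values.
    """
    T = len(stream)
    levels = max(1, math.ceil(math.log2(T + 1)))
    eps_per_level = epsilon / levels            # simple composition across levels

    def laplace(scale):                         # Laplace noise via two exponentials
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    exact = {}      # (level, index) -> exact sum over that dyadic interval
    noisy = {}      # (level, index) -> noised sum, fixed once the interval closes
    outputs = []

    for t, x in enumerate(stream, start=1):
        # Add x to every dyadic interval that contains position t.
        for lvl in range(levels):
            idx = (t - 1) >> lvl
            exact[(lvl, idx)] = exact.get((lvl, idx), 0) + x
            if t % (1 << lvl) == 0:             # interval just closed: noise it once
                noisy[(lvl, idx)] = exact[(lvl, idx)] + laplace(1.0 / eps_per_level)
        # Decompose [1, t] into closed dyadic intervals (binary representation of t).
        estimate, start = 0.0, 0
        for lvl in reversed(range(levels)):
            size = 1 << lvl
            if t - start >= size:
                estimate += noisy[(lvl, start >> lvl)]
                start += size
        outputs.append(estimate)
    return outputs
```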

675 citations


Book ChapterDOI
30 May 2010
TL;DR: The first public-key encryption scheme in the Bounded-Retrieval Model (BRM) was constructed in this article; in this model, the adversary is allowed to learn arbitrary information about the decryption key, subject only to the constraint that the overall amount of leakage is at most l bits.
Abstract: We construct the first public-key encryption scheme in the Bounded-Retrieval Model (BRM), providing security against various forms of adversarial “key leakage” attacks. In this model, the adversary is allowed to learn arbitrary information about the decryption key, subject only to the constraint that the overall amount of “leakage” is bounded by at most l bits. The goal of the BRM is to design cryptographic schemes that can flexibly tolerate arbitrarily large leakage bounds l (a few bits or many gigabytes), by increasing only the size of the secret key proportionally, while keeping all the other parameters — including the size of the public key, ciphertext, encryption/decryption time, and the number of secret-key bits accessed during decryption — small and independent of l. As our main technical tool, we introduce the concept of an Identity-Based Hash Proof System (IB-HPS), which generalizes the notion of hash proof systems of Cramer and Shoup [CS02] to the identity-based setting. We give three different constructions of this primitive based on: (1) bilinear groups, (2) lattices, and (3) quadratic residuosity. As a result of independent interest, we show that an IB-HPS almost immediately yields an Identity-Based Encryption (IBE) scheme which is secure against (small) partial leakage of the target identity’s decryption key. As our main result, we use IB-HPS to construct public-key encryption (and IBE) schemes in the Bounded-Retrieval Model.
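The flavor of the model can be conveyed with a toy symmetric-key sketch in Python: the secret key is a large random table, yet every operation touches only a few pseudorandomly chosen positions, so running time is independent of the key size and a bounded amount of leakage still leaves plenty of entropy in the untouched portion. This is only an illustration of the locality idea, not the paper's public-key construction from identity-based hash proof systems; all names and parameters below are made up.

```python
import os
import hashlib
import hmac

# Toy illustration of the Bounded-Retrieval idea: a huge key, of which each
# operation reads only LOCALITY positions.  NOT the paper's construction.

KEY_WORDS = 1 << 16      # ~1 MB of key material (16-byte words); scale up for a real leakage bound
LOCALITY = 16            # key positions touched per operation

def keygen():
    return [os.urandom(16) for _ in range(KEY_WORDS)]

def _positions(nonce):
    # Derive LOCALITY pseudorandom positions from the nonce.
    return [int.from_bytes(hashlib.sha256(nonce + i.to_bytes(4, "big")).digest()[:4], "big") % KEY_WORDS
            for i in range(LOCALITY)]

def _session_key(big_key, nonce):
    material = b"".join(big_key[p] for p in _positions(nonce))
    return hashlib.sha256(nonce + material).digest()     # stand-in for a randomness extractor

def _keystream(k, n):
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(k + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(big_key, message):
    nonce = os.urandom(16)
    k = _session_key(big_key, nonce)
    ct = bytes(m ^ s for m, s in zip(message, _keystream(k, len(message))))
    tag = hmac.new(k, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def decrypt(big_key, nonce, ct, tag):
    k = _session_key(big_key, nonce)
    if not hmac.compare_digest(tag, hmac.new(k, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("authentication failed")
    return bytes(c ^ s for c, s in zip(ct, _keystream(k, len(ct))))
```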

219 citations


Journal ArticleDOI
TL;DR: The goal is to design encryption schemes for mass distribution of data, which make it possible to deter users from leaking their personal keys, to trace the identities of users whose keys were used to construct illegal decryption devices, and to revoke these keys so as to render the devices dysfunctional.
Abstract: Our goal is to design encryption schemes for mass distribution of data, which make it possible to (1) deter users from leaking their personal keys, (2) trace the identities of users whose keys were used to construct illegal decryption devices, and (3) revoke these keys so as to render the devices dysfunctional. We start by designing an efficient revocation scheme, based on secret sharing. It can remove up to t parties, is secure against coalitions of up to t users, and is more efficient than previous schemes with the same properties. We then show how to enhance the revocation scheme with traitor tracing and self-enforcement properties. More precisely, we show how to construct schemes in which (1) each user’s personal key contains some sensitive information about that user (e.g., the user’s credit card number), making users reluctant to disclose their keys; (2) an illegal decryption device discloses the identity of the users whose keys were used to construct it; and (3) the keys of corrupt users can be revoked. For the last point, it is important to be able to do so without publicly disclosing the sensitive information.
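The core secret-sharing idea behind the revocation scheme can be sketched in a few lines of Python: each user holds one point on a random degree-t polynomial, and broadcasting the points of the (at most t) revoked users lets every non-revoked user, but no coalition of revoked users, interpolate the new group key P(0). This is a bare-bones, single-revocation illustration with toy parameters; the paper's schemes add efficiency, traitor tracing, and self-enforcement on top.

```python
import random

PRIME = (1 << 61) - 1     # a Mersenne prime, plenty for a toy example

def make_scheme(t, user_ids):
    """Dealer: pick a random degree-t polynomial P; P(0) is the new group key
    and each user u receives the single point (u, P(u)) as a personal key."""
    coeffs = [random.randrange(PRIME) for _ in range(t + 1)]
    def P(x):
        acc = 0
        for c in reversed(coeffs):              # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    personal_keys = {u: (u, P(u)) for u in user_ids}
    return personal_keys, P(0)

def interpolate_at_zero(points):
    """Lagrange interpolation of P(0) from t+1 distinct points."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * (-xj)) % PRIME
                den = (den * (xi - xj)) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

# Toy run: user ids must be distinct and non-zero.
t = 3
users = list(range(1, 11))
personal, group_key = make_scheme(t, users)

revoked = [2, 5, 7]                              # at most t users are revoked
broadcast = [personal[u] for u in revoked]       # the center broadcasts their points

# A non-revoked user combines her own point with the broadcast: t+1 points in
# total, enough to interpolate the new group key P(0).
me = 4
assert interpolate_at_zero(broadcast + [personal[me]]) == group_key
# Revoked users (even all t of them together) hold only t distinct points,
# which leave P(0) completely undetermined.
```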

217 citations


Proceedings Article
01 Jan 2010
TL;DR: Initiates the study of pan-private algorithms, which retain their privacy properties even if their internal state becomes visible to an adversary, with a focus on streaming algorithms in which each datum may be discarded immediately after processing.
Abstract: Collectors of confidential data, such as governmental agencies, hospitals, or search engine providers, can be pressured to permit data to be used for purposes other than that for which they were collected. To support the data curators, we initiate a study of pan-private algorithms; roughly speaking, these algorithms retain their privacy properties even if their internal state becomes visible to an adversary. Our principal focus is on streaming algorithms, where each datum may be discarded immediately after processing.
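As an illustration of pan-privacy (and not of the paper's actual constructions), the following Python sketch estimates the number of distinct users appearing in a stream over a small, known universe. The internal state only ever stores randomized-response bits, so even an adversary who inspects the state once learns little about whether any particular user appeared; the universe-sized state and the specific bias 1/2 + eps/4 are illustrative choices, and the paper's streaming algorithms are far more space-efficient.

```python
import random

def pan_private_density(stream, universe, eps):
    """Toy pan-private estimate of how many distinct users appear in a stream.

    Assumes a known, modest-size universe of user ids and a small eps (say
    eps <= 1).  The internal state holds only randomized-response bits, so a
    single intrusion into the state reveals little about any individual.
    """
    p = 0.5 + eps / 4.0                      # bias of an "honest" randomized-response bit

    def rr(bit):                             # encode a bit with randomized response
        return bit if random.random() < p else 1 - bit

    state = {u: rr(0) for u in universe}     # the state never stores raw bits
    for u in stream:
        state[u] = rr(1)                     # re-randomize on every appearance

    ones = sum(state.values())
    n = len(universe)
    # De-bias: E[ones] = d * p + (n - d) * (1 - p), where d is the true count.
    return (ones - n * (1 - p)) / (2 * p - 1)
```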

168 citations


Journal ArticleDOI
01 Sep 2010
TL;DR: A general impossibility result is given showing that a natural formalization of Dalenius’ goal cannot be achieved if the database is useful, and a variant of the result threatens the privacy even of someone not in the database.
Abstract: In 1977 Tore Dalenius articulated a desideratum for statistical databases: nothing about an individual should be learnable from the database that cannot be learned without access to the database. We give a general impossibility result showing that a natural formalization of Dalenius’ goal cannot be achieved if the database is useful. The key obstacle is the side information that may be available to an adversary. Our results hold under very general conditions regarding the database, the notion of privacy violation, and the notion of utility. Contrary to intuition, a variant of the result threatens the privacy even of someone not in the database. This state of affairs motivated the notion of differential privacy [15, 16], a strong ad omnia privacy which, intuitively, captures the increased risk to one’s privacy incurred by participating in a database.

133 citations


Proceedings ArticleDOI
23 Oct 2010
TL;DR: In this article, the authors present a dynamic dictionary that stores n elements using only (1 + o(1))B bits, where B is the information-theoretic lower bound for representing a set of size n from a universe of size u, while guaranteeing constant-time operations in the worst case with high probability.
Abstract: The performance of a dynamic dictionary is measured mainly by its update time, lookup time, and space consumption. In terms of update time and lookup time there are known constructions that guarantee constant-time operations in the worst case with high probability, and in terms of space consumption there are known constructions that use essentially optimal space. However, although the first analysis of a dynamic dictionary dates back more than 45 years (to Knuth's 1963 analysis of linear probing), the trade-off between these aspects of performance is still not completely understood. In this paper we settle two fundamental open problems. First, we construct the first dynamic dictionary that enjoys the best of both worlds: it stores n elements using (1 + ε)n memory words and guarantees constant-time operations in the worst case with high probability. Specifically, for any ε = Ω((log log n / log n)^{1/2}) and for any sequence of polynomially many operations, with high probability over the randomness of the initialization phase, all operations are performed in constant time which is independent of ε. The construction is a two-level variant of cuckoo hashing, augmented with a "backyard" that handles a large fraction of the elements, together with a de-amortized perfect hashing scheme for eliminating the dependency on ε. Second, we present a variant of the above construction that uses only (1 + o(1))B bits, where B is the information-theoretic lower bound for representing a set of size n taken from a universe of size u, and guarantees constant-time operations in the worst case with high probability, as before. This problem was open even in the amortized setting. One of the main ingredients of our construction is a permutation-based variant of cuckoo hashing, which significantly improves the space consumption of cuckoo hashing when dealing with a rather small universe.
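For context, here is a minimal Python sketch of plain cuckoo hashing, the building block the paper refines: every key has two candidate slots, so lookups probe at most two cells, and insertions evict keys along a short path. This is not the two-level "backyard" construction, its de-amortized worst-case guarantees, or the permutation-based variant; in particular, the full-rehash fallback below is exactly the rare event that the backyard and de-amortization are designed to absorb.

```python
import random

class CuckooHashTable:
    """A sketch of plain cuckoo hashing (insert/lookup only).  Every key lives
    in one of two candidate slots, so lookups probe at most two cells; an
    insertion that keeps cycling triggers a full rehash.  Assumes the load is
    kept well below one half."""

    def __init__(self, capacity=1024):
        self.capacity = capacity
        self.table = [None] * capacity
        self._reseed()

    def _reseed(self):
        self.seeds = (random.getrandbits(64), random.getrandbits(64))

    def _slots(self, key):
        half = self.capacity // 2
        return (hash((self.seeds[0], key)) % half,
                half + hash((self.seeds[1], key)) % half)

    def lookup(self, key):
        return any(self.table[s] == key for s in self._slots(key))

    def insert(self, key):
        if self.lookup(key):
            return
        for _ in range(64):                        # bounded eviction walk
            s0, s1 = self._slots(key)
            for s in (s0, s1):
                if self.table[s] is None:
                    self.table[s] = key
                    return
            victim = random.choice((s0, s1))       # evict and re-insert the occupant
            key, self.table[victim] = self.table[victim], key
        self._rehash_with(key)                     # rare: rebuild with fresh hash functions

    def _rehash_with(self, pending):
        keys = [k for k in self.table if k is not None] + [pending]
        self.table = [None] * self.capacity
        self._reseed()
        for k in keys:
            self.insert(k)
```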

92 citations


Journal ArticleDOI
TL;DR: It is shown that compressibility (say, of SAT) would have vast implications for cryptography, including constructions of one-way functions and collision-resistant hash functions from any hard-on-average problem in NP and cryptanalysis of key agreement protocols in the “bounded storage model” when mixed with (time) complexity-based cryptography.
Abstract: We study compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is whether one can efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not. We want the length of the compressed instance to be polynomial in the length of the witness and polylogarithmic in the length of the original input. Such compression enables succinctly storing instances until a future setting allows solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed. In this paper, we first develop the basic complexity theory of compression, including reducibility, completeness, and a stratification of NP with respect to compression. We then show that compressibility (say, of SAT) would have vast implications for cryptography, including constructions of one-way functions and collision-resistant hash functions from any hard-on-average problem in NP and cryptanalysis of key agreement protocols in the “bounded storage model” when mixed with (time) complexity-based cryptography.
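This notion is close in spirit to kernelization from parameterized complexity. As a concrete, though only loosely analogous, illustration, here is Buss's classical kernel for Vertex Cover in Python: it shrinks an instance to at most k^2 edges while preserving the yes/no answer. The paper's question instead measures the compressed size against the witness length of general NP instances such as SAT, for which no such compression is known.

```python
def vertex_cover_kernel(edges, k):
    """Buss's kernelization for Vertex Cover: the reduced instance has at most
    k*k edges (or the answer is decided outright) and has a vertex cover of
    size <= k iff the original instance does."""
    edges = {frozenset(e) for e in edges if len(set(e)) == 2}
    while k >= 0:
        # Rule 1: a vertex of degree > k must belong to every cover of size <= k.
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        forced = next((v for v, d in degree.items() if d > k), None)
        if forced is None:
            break
        edges = {e for e in edges if forced not in e}   # put `forced` into the cover
        k -= 1
    if k < 0 or len(edges) > k * k:
        # Rule 2: every remaining vertex has degree <= k, so k vertices cover
        # at most k*k edges; more than that means the answer is "no".
        return "NO", set(), k
    return "REDUCED", edges, k

# Example: a star with 5 leaves plus one disjoint edge, budget k = 2.  The
# center (degree 5 > 2) is forced into the cover, leaving the kernel {{7, 8}}.
status, kernel, budget_left = vertex_cover_kernel(
    [(0, i) for i in range(1, 6)] + [(7, 8)], 2)
assert status == "REDUCED" and kernel == {frozenset({7, 8})} and budget_left == 1
```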

87 citations


Journal ArticleDOI
TL;DR: A new voting protocol with several desirable security properties, including “everlasting privacy” (even a computationally unbounded adversary gains no information about specific votes) and distributed trust, meaning a single corrupt authority will not cause voter privacy to be breached; its security is formally proved in the universal composability framework, based on number-theoretic assumptions.
Abstract: In this article, we propose a new voting protocol with several desirable security properties. The voting stage of the protocol can be performed by humans without computers; it provides every voter with the means to verify that all the votes were counted correctly (universal verifiability) while preserving ballot secrecy. The protocol has “everlasting privacy”: Even a computationally unbounded adversary gains no information about specific votes from observing the protocol's output. Unlike previous protocols with these properties, this protocol distributes trust between two authorities: a single corrupt authority will not cause voter privacy to be breached. Finally, the protocol is receipt-free: A voter cannot prove how she voted even if she wants to do so. We formally prove the security of the protocol in the universal composability framework, based on number-theoretic assumptions.
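The "everlasting privacy" property rests on commitments that hide the committed value information-theoretically. As a minimal, standalone illustration (not the protocol itself, and with deliberately tiny toy parameters), here is a Pedersen-style statistically hiding commitment in Python: the published commitment can be opened to any value with suitable randomness, so it carries no information about the vote, while binding relies on the committer not knowing the discrete log of H with respect to G.

```python
import secrets

# Toy parameters: p = 2q + 1 is a tiny safe prime, and G, H generate its
# order-q subgroup.  A real deployment uses a large group and accompanying
# zero-knowledge proofs; everything here is for illustration only.
P, Q = 23, 11
G, H = 4, 9          # the committer must not know the discrete log of H base G

def commit(value, randomness=None):
    r = secrets.randbelow(Q) if randomness is None else randomness
    return (pow(G, value, P) * pow(H, r, P)) % P, r

def open_commitment(commitment, value, r):
    return commitment == (pow(G, value, P) * pow(H, r, P)) % P

c, r = commit(1)                     # commit to the vote "1"
assert open_commitment(c, 1, r)

# Statistical (here: perfect) hiding: the same commitment could be opened to
# the vote "0" with some randomness, so the published value reveals nothing
# about the vote.  Finding that randomness by brute force is possible only
# because the toy group is tiny; binding rests on the hardness of discrete log.
assert any(open_commitment(c, 0, s) for s in range(Q))
```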

59 citations


Journal ArticleDOI
TL;DR: This article formally studies two very intuitive physical models, sealed envelopes and locked boxes, which are often used as illustrations for common cryptographic operations. It considers three variations of tamper-evident seals and shows that under some conditions they can be used to implement oblivious transfer, bit commitment (BC), and coin flipping (CF).

38 citations


Journal ArticleDOI
TL;DR: Two computer scientists have created a video game about mice and elephants that can make computer encryption properly secure, as long as you play it randomly.
Abstract: Two computer scientists have created a video game about mice and elephants that can make computer encryption properly secure, as long as you play it randomly.

20 citations


Proceedings ArticleDOI
04 Oct 2010
TL;DR: This talk will explore a connection between traitor tracing schemes and the problem of sanitizing data to remove personal information while allowing statistically meaningful information to be released.
Abstract: In this talk I will explore a connection between traitor tracing schemes and the problem of sanitizing data to remove personal information while allowing statistically meaningful information to be released. It is based on joint work with Cynthia Dwork, Omer Reingold, Guy N. Rothblum, and Salil Vadhan [5].