
Showing papers by "Oded Regev" published in 2006


Book ChapterDOI
Oded Regev
20 Aug 2006
TL;DR: Some of the recent progress on lattice-based cryptography is described, starting from the seminal work of Ajtai, and ending with some recent constructions of very efficient cryptographic schemes.
Abstract: We describe some of the recent progress on lattice-based cryptography, starting from the seminal work of Ajtai, and ending with some recent constructions of very efficient cryptographic schemes.

562 citations


Journal ArticleDOI
TL;DR: The 2-local Hamiltonian problem is shown to be QMA-complete, settling a long-standing open question; the proof techniques also show that adiabatic computation with 2-local interactions on qubits is equivalent to standard quantum computation.
Abstract: The k-local Hamiltonian problem is a natural complete problem for the complexity class QMA, the quantum analogue of NP. It is similar in spirit to MAX-k-SAT, which is NP-complete for k ≥ 2. It was known that the problem is QMA-complete for any k ≥ 3. On the other hand, 1-local Hamiltonian is in P and hence not believed to be QMA-complete. The complexity of the 2-local Hamiltonian problem has long been outstanding. Here we settle the question and show that it is QMA-complete. We provide two independent proofs; our first proof uses only elementary linear algebra. Our second proof uses a powerful technique for analyzing the sum of two Hamiltonians; this technique is based on perturbation theory and we believe that it might prove useful elsewhere. Using our techniques we also show that adiabatic computation with 2-local interactions on qubits is equivalent to standard quantum computation.
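
As a side illustration of what an instance of the local Hamiltonian problem looks like (unrelated to the completeness proof itself), the sketch below builds a small 2-local Hamiltonian on four qubits as a sum of one- and two-qubit terms and computes its ground-state energy by exact diagonalization. The particular couplings, field strength, and qubit count are arbitrary choices for illustration.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def embed(ops, n):
    """Kronecker product of n single-qubit operators, identity where unspecified.
    `ops` maps qubit index -> 2x2 matrix."""
    out = np.array([[1.0]])
    for q in range(n):
        out = np.kron(out, ops.get(q, I2))
    return out

# An arbitrary 2-local Hamiltonian on 4 qubits: ZZ couplings on a line plus X fields.
n = 4
H = np.zeros((2 ** n, 2 ** n))
for i in range(n - 1):
    H += embed({i: Z, i + 1: Z}, n)       # two-qubit (2-local) terms
for i in range(n):
    H += 0.5 * embed({i: X}, n)           # single-qubit terms (trivially 2-local)

# Exact diagonalization only scales to a handful of qubits.
print("ground-state energy:", np.linalg.eigvalsh(H).min())
```

Deciding whether this minimum eigenvalue lies below one threshold or above a slightly larger one is exactly the promise problem shown in the paper to be QMA-complete.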

526 citations


Journal ArticleDOI
TL;DR: NICD, a generalization of noise sensitivity previously considered in [5, 31, 39], is extended to trees; the reverse Bonami-Beckner inequality is used to prove a new isoperimetric inequality for the discrete cube and a new result on the mixing of short random walks on the cube.
Abstract: In this paper we study non-interactive correlation distillation (NICD), a generalization of noise sensitivity previously considered in [5, 31, 39]. We extend the model to NICD on trees. In this model there is a fixed undirected tree with players at some of the nodes. One node is given a uniformly random string and this string is distributed throughout the network, with the edges of the tree acting as independent binary symmetric channels. The goal of the players is to agree on a shared random bit without communicating.
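
A small hedged simulation may make the model concrete: a root node draws a uniformly random string, the string travels along the tree edges through independent binary symmetric channels, and each player outputs one bit computed from its noisy copy. The path-shaped tree, the flip probability, and the majority-vote strategy below are illustrative assumptions, not choices analyzed in the paper.

```python
import random

def bsc(bits, eps):
    """Pass a bit string through a binary symmetric channel with flip probability eps."""
    return [b ^ (random.random() < eps) for b in bits]

def majority(bits):
    return int(sum(bits) * 2 >= len(bits))

def nicd_on_path(k=101, eps=0.1, players=3, trials=5000):
    """NICD on a path: the root sits at one end, the players at the next nodes.
    Each player applies majority to its noisy copy; estimate P[all players agree]."""
    agree = 0
    for _ in range(trials):
        s = [random.randint(0, 1) for _ in range(k)]   # uniformly random root string
        outputs = []
        copy = s
        for _ in range(players):
            copy = bsc(copy, eps)                      # one more BSC edge along the path
            outputs.append(majority(copy))
        agree += (len(set(outputs)) == 1)
    return agree / trials

if __name__ == "__main__":
    print("estimated agreement probability:", nicd_on_path())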

129 citations


Book ChapterDOI
28 May 2006
TL;DR: This work proposes an alternative method to attack signature schemes a la GGH, by studying the following learning problem: given many random points uniformly distributed over an unknown n-dimensional parallelepiped, recover the parallelepiped or an approximation thereof; the problem is transformed into a multivariate optimization problem that can be solved by a gradient descent.
Abstract: Lattice-based signature schemes following the Goldreich-Goldwasser-Halevi (GGH) design have the unusual property that each signature leaks information on the signer's secret key, but this does not necessarily imply that such schemes are insecure. At Eurocrypt '03, Szydlo proposed a potential attack by showing that the leakage reduces the key-recovery problem to that of distinguishing integral quadratic forms. He proposed a heuristic method to solve the latter problem, but it was unclear whether his method could attack real-life parameters of GGH and NTRUSign. Here, we propose an alternative method to attack signature schemes a la GGH, by studying the following learning problem: given many random points uniformly distributed over an unknown n-dimensional parallelepiped, recover the parallelepiped or an approximation thereof. We transform this problem into a multivariate optimization problem that can be solved by a gradient descent. Our approach is very effective in practice: we present the first successful key-recovery experiments on NTRUSign-251 without perturbation, as proposed in half of the parameter choices in NTRU standards under consideration by IEEE P1363.1. Experimentally, 90,000 signatures are sufficient to recover the NTRUSign-251 secret key. We are also able to recover the secret key in the signature analogue of all the GGH encryption challenges, using a number of signatures which is roughly quadratic in the lattice dimension.
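
The learning problem at the core of this attack can be sketched numerically. The outline below follows the general recipe stated in the abstract, i.e. transform the samples and run a gradient descent on a moment-based objective, but the concrete choices (covariance-based "cubing" of the samples, a fourth-moment objective, step size, and iteration count) are illustrative assumptions rather than a faithful reproduction of the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the hidden-parallelepiped learning problem: the "secret" is a
# random basis V, and we only get to see points x @ V with x uniform in [-1,1]^n.
n, N = 10, 200_000
V = rng.normal(size=(n, n))                  # secret basis (rows span the parallelepiped)
X = rng.uniform(-1.0, 1.0, size=(N, n))      # hidden coefficients
samples = X @ V                              # the only data the "attacker" sees

# Step 1: estimate the Gram matrix V^T V.  For x uniform on [-1,1]^n the
# coefficient covariance is I/3, so the average outer product of the samples
# approximates V^T V / 3.
G = 3.0 * (samples.T @ samples) / N
eigval, Q = np.linalg.eigh(G)
G_inv_half = Q @ np.diag(eigval ** -0.5) @ Q.T   # G^{-1/2}
G_half = Q @ np.diag(eigval ** 0.5) @ Q.T        # G^{1/2}

# Step 2: map the parallelepiped (approximately) onto a rotated hypercube.
C = samples @ G_inv_half

# Step 3: minimise the empirical fourth moment  mom(w) = E[<c, w>^4]  over the
# unit sphere by projected gradient descent; its minimisers point along the
# hypercube's axes, i.e. along the transformed rows of V.
w = rng.normal(size=n)
w /= np.linalg.norm(w)
eta = 1.0
for _ in range(300):
    proj = C @ w
    grad = 4.0 * (proj ** 3) @ C / N         # Euclidean gradient of the 4th moment
    grad -= (grad @ w) * w                   # keep only the component tangent to the sphere
    w -= eta * grad
    w /= np.linalg.norm(w)

# Step 4: undo the transform of step 2; the result should be close to +/- a row of V.
guess = w @ G_half
err = min(min(np.linalg.norm(guess - r), np.linalg.norm(guess + r)) for r in V)
print("distance to the nearest +/- row of V:", err)
```

On this toy instance the recovered vector typically lands close to one of the rows of V (up to sign); the real attack must additionally cope with the discrete, structured distributions arising from actual GGH/NTRUSign signatures.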

121 citations


Proceedings ArticleDOI
21 May 2006
TL;DR: The reductions imply that the Shortest Vector Problem in the l1 norm and the Closest Vector Problem with Preprocessing in the l∞ norm are hard to approximate to within any constant (and beyond).
Abstract: We present reductions from lattice problems in the l2 norm to the corresponding problems in other norms such as l1, l∞ (and in fact in any other lp norm where 1 ≤ p ≤ ∞). We consider lattice problems such as the Shortest Vector Problem, Shortest Independent Vector Problem, Closest Vector Problem and the Closest Vector Problem with Preprocessing. Most reductions are simple and follow from known constructions of embeddings of normed spaces. Among other things, our reductions imply that the Shortest Vector Problem in the l1 norm and the Closest Vector Problem with Preprocessing in the l∞ norm are hard to approximate to within any constant (and beyond). Previously, the former problem was known to be hard to approximate to within 2 − ε, while no hardness result was known for the latter problem.
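
One classical embedding that reductions of this kind can build on is the embedding of l2 into l1 by random Gaussian projections: for A with i.i.d. N(0,1) entries, each coordinate of Ax is Gaussian with standard deviation ||x||2, so a suitably scaled l1 norm of Ax concentrates around ||x||2. The numerical check below is only meant to convey this idea; the dimensions are arbitrary, and the reductions in the paper need embeddings that also respect the lattice structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Embed l2^n into l1^m via a random Gaussian matrix: since E|g| = sqrt(2/pi) for
# g ~ N(0,1), we have E ||A x||_1 = m * sqrt(2/pi) * ||x||_2, and concentration
# makes the scaled l1 norm a low-distortion proxy for the l2 norm.
n, m = 50, 4000
A = rng.normal(size=(m, n))
scale = np.sqrt(np.pi / 2.0) / m

for _ in range(5):
    x = rng.normal(size=n)
    l2 = np.linalg.norm(x)
    l1_embedded = scale * np.linalg.norm(A @ x, ord=1)
    print(f"||x||_2 = {l2:8.4f}   scaled ||Ax||_1 = {l1_embedded:8.4f}   ratio = {l1_embedded / l2:.4f}")
```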

84 citations


Proceedings ArticleDOI
21 May 2006
TL;DR: It is proved that the problem ALMOST-3-COLORINGε is hard for any constant ε>0, assuming Khot's Unique Games conjecture, and the result is based on bounding various generalized noise-stability quantities using the invariance principle of Mossel et al.
Abstract: We study the APPROX-COLORING(q, Q) problem: given a graph G, decide whether χ(G) ≤ q or χ(G) ≥ Q. We derive conditional hardness for this problem for any constants 3 ≤ q < Q. We also prove that the problem ALMOST-3-COLORING(ε) is hard for any constant ε > 0, assuming Khot's Unique Games conjecture. This is the problem of deciding, for a given graph, between the case where one can 3-color all but an ε fraction of the vertices without monochromatic edges, and the case where the graph contains no independent set of relative size at least ε. Our result is based on bounding various generalized noise-stability quantities using the invariance principle of Mossel et al. [MOO'05].

52 citations


Proceedings ArticleDOI
21 May 2006
TL;DR: A direct product theorem is proved: if one is given two such problems, with optimal probabilities a and b, respectively, and the states in the first problem are pure, then the optimal probability for the joint bounded-error state identification problem is O(ab).
Abstract: We consider the problem of bounded-error quantum state identification: given either state α0 or state α1, we are required to output '0', '1' or 'DONO' ("don't know"), such that conditioned on outputting '0' or '1', our guess is correct with high probability. The goal is to maximize the probability of not outputting 'DONO'. We prove a direct product theorem: if we're given two such problems, with optimal probabilities a and b, respectively, and the states in the first problem are pure, then the optimal probability for the joint bounded-error state identification problem is O(ab). Our proof is based on semidefinite programming duality and may be of wider interest. Using this result, we present two exponential separations in the simultaneous message passing model of communication complexity. First, we describe a relation that can be computed with O(log n) classical bits of communication in the presence of shared randomness, but needs Ω(n^(1/3)) communication if the parties don't share randomness, even if communication is quantum. This shows the optimality of Yao's recent exponential simulation of shared-randomness protocols by quantum protocols without shared randomness. Second, we describe a relation that can be computed with O(log n) classical bits of communication in the presence of shared entanglement, but needs Ω((n/log n)^(1/3)) communication if the parties share randomness but no entanglement, even if communication is quantum. This is the first example in communication complexity where entanglement buys you much more than quantum communication does.

47 citations


Journal ArticleDOI
TL;DR: Combinatorial algorithms for the unsplittable flow problem (UFP) that either match or improve the previously best results are provided, and some can even be used in the online setting.
Abstract: We provide combinatorial algorithms for the unsplittable flow problem (UFP) that either match or improve the previously best results. In the UFP we are given a (possibly directed) capacitated graph with n vertices and m edges, and a set of terminal pairs each with its own demand and profit. The objective is to connect a subset of the terminal pairs each by a single flow path subject to the capacity constraints such that the total profit of the connected pairs is maximized. We consider three variants of the problem. First is the classical UFP in which the maximum demand is at most the minimum edge capacity. It was previously known to have an O(√m) approximation algorithm; the algorithm is based on the randomized rounding technique and its analysis makes use of the Chernoff bound and the FKG inequality. We provide a combinatorial algorithm that achieves the same approximation ratio and whose analysis is considerably simpler. Second is the extended UFP in which some demands might be higher than edge capacities. Our algorithm for this case improves the best known approximation ratio. We also give a lower bound that shows that the extended UFP is provably harder than the classical UFP. Finally, we consider the bounded UFP in which the maximum demand is at most 1/K times the minimum edge capacity for some K > 1. Here we provide combinatorial algorithms that match the currently best known algorithms. All of our algorithms are strongly polynomial and some can even be used in the online setting.
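
To make the problem statement concrete, here is a small self-contained greedy heuristic for the classical variant: sort the requests by profit and route each one, if possible, along a fewest-edges path whose residual capacities still accommodate its demand. This is only an illustration of the problem's shape; it is not the paper's algorithm and carries no approximation guarantee.

```python
from collections import deque

def bfs_path(graph, residual, src, dst, demand):
    """Fewest-edges path from src to dst using only edges whose residual
    capacity is at least `demand`; returns the path or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path, v = [], dst
            while v is not None:
                path.append(v)
                v = prev[v]
            return list(reversed(path))
        for v in graph.get(u, ()):
            if v not in prev and residual[(u, v)] >= demand:
                prev[v] = u
                queue.append(v)
    return None

def greedy_ufp(edges, requests):
    """edges: dict (u, v) -> capacity (directed).  requests: list of
    (src, dst, demand, profit).  Greedily routes requests in order of
    decreasing profit along unsplittable paths; returns (total_profit, routed)."""
    graph = {}
    for (u, v) in edges:
        graph.setdefault(u, []).append(v)
    residual = dict(edges)
    total, routed = 0, []
    for src, dst, demand, profit in sorted(requests, key=lambda r: -r[3]):
        path = bfs_path(graph, residual, src, dst, demand)
        if path is None:
            continue
        for u, v in zip(path, path[1:]):
            residual[(u, v)] -= demand       # reserve capacity along the chosen path
        total += profit
        routed.append((src, dst, path))
    return total, routed

if __name__ == "__main__":
    # A tiny directed instance: two requests compete for the edge ('a', 'b').
    edges = {('s', 'a'): 2, ('a', 'b'): 1, ('b', 't'): 2, ('s', 'b'): 1}
    requests = [('s', 't', 1, 10), ('s', 't', 1, 3)]
    print(greedy_ufp(edges, requests))
```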

43 citations


Proceedings ArticleDOI
16 Jul 2006
TL;DR: The first hardness result for the covering radius problem on lattices is given in this article, where it is shown that for any large enough p ≤ ∞ there exists a constant c_p > 1 such that CRP in the l_p norm is Π_2-hard to approximate to within any constant less than c_p.
Abstract: We provide the first hardness result for the covering radius problem on lattices (CRP). Namely, we show that for any large enough p ≤ ∞ there exists a constant c_p > 1 such that CRP in the l_p norm is Π_2-hard to approximate to within any constant less than c_p. In particular, for the case p = ∞, we obtain the constant c_∞ = 1.5. This gets close to the constant 2 beyond which the problem is not believed to be Π_2-hard. As part of our proof, we establish a stronger hardness of approximation result for the ∀∃-3-SAT problem with bounded occurrences. This hardness result might be useful elsewhere.
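
For intuition about the quantity involved: the covering radius of a lattice is the largest distance from any point of space to the lattice. The brute-force sketch below estimates it for a tiny two-dimensional lattice by gridding the fundamental parallelepiped and measuring each grid point's distance to nearby lattice vectors in a chosen l_p norm. The basis, grid resolution, and search range are arbitrary illustrative choices, and the method is exponential in the dimension, consistent with the problem's place high in the polynomial hierarchy.

```python
import itertools
import numpy as np

def covering_radius_estimate(B, p=2, grid=60, coeff_range=4):
    """Brute-force estimate of the covering radius of the lattice with basis
    rows B, in the l_p norm: the maximum, over target points t in the
    fundamental parallelepiped, of the distance from t to the lattice.
    Exponential-time and approximate -- purely for intuition on tiny examples."""
    n = B.shape[0]
    # Candidate lattice points near the fundamental parallelepiped.
    coeffs = itertools.product(range(-coeff_range, coeff_range + 1), repeat=n)
    lattice_pts = np.array([np.array(c) @ B for c in coeffs])
    # Targets: a grid over the fundamental parallelepiped {x B : x in [0,1)^n}.
    ticks = np.linspace(0.0, 1.0, grid, endpoint=False)
    worst = 0.0
    for x in itertools.product(ticks, repeat=n):
        t = np.array(x) @ B
        dist = np.min(np.linalg.norm(lattice_pts - t, ord=p, axis=1))
        worst = max(worst, dist)
    return worst

if __name__ == "__main__":
    B = np.array([[2.0, 0.0], [1.0, 2.0]])     # arbitrary 2D basis
    for p in (1, 2, np.inf):
        print(f"estimated covering radius in l_{p}: {covering_radius_estimate(B, p):.3f}")
```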

17 citations


Journal Article
TL;DR: In this paper, the authors propose an alternative method to attack signature schemes a la GGH, by studying the following learning problem: given many random points uniformly distributed over an unknown n-dimensional parallelepiped, recover the parallelepiped or an approximation thereof; this problem is transformed into a multivariate optimization problem that can be solved by a gradient descent.
Abstract: Lattice-based signature schemes following the Goldreich-Goldwasser-Halevi (GGH) design have the unusual property that each signature leaks information on the signer's secret key, but this does not necessarily imply that such schemes are insecure. At Eurocrypt '03, Szydlo proposed a potential attack by showing that the leakage reduces the key-recovery problem to that of distinguishing integral quadratic forms. He proposed a heuristic method to solve the latter problem, but it was unclear whether his method could attack real-life parameters of GGH and NTRUSign. Here, we propose an alternative method to attack signature schemes a la GGH, by studying the following learning problem: given many random points uniformly distributed over an unknown n-dimensional parallelepiped, recover the parallelepiped or an approximation thereof. We transform this problem into a multivariate optimization problem that can be solved by a gradient descent. Our approach is very effective in practice: we present the first successful key-recovery experiments on NTRUSign-251 without perturbation, as proposed in half of the parameter choices in NTRU standards under consideration by IEEE P1363.1. Experimentally, 90,000 signatures are sufficient to recover the NTRUSign-251 secret key. We are also able to recover the secret key in the signature analogue of all the GGH encryption challenges, using a number of signatures which is roughly quadratic in the lattice dimension.

15 citations