
Showing papers in "Computational Complexity in 2004"


Journal ArticleDOI
TL;DR: This paper addresses the issue of proving strong direct product assertions, that is, ones in which s' ≈ ks and is in particular larger than s, for decision trees and communication protocols.
Abstract: A fundamental question of complexity theory is the direct product question. A famous example is Yao's XOR-lemma, in which one assumes that some function f is hard on average for small circuits (meaning that every circuit of some fixed size s which attempts to compute f is wrong on a non-negligible fraction of the inputs) and concludes that every circuit of size s' only has a small advantage over guessing randomly when computing f⊕k(x1,...,xk) = f(x1) ⊕...⊕ f(xk) on independently chosen x1,...,xk. All known proofs of this lemma have the property that s' < s. In words, the circuit which attempts to compute f⊕k is smaller than the circuit which attempts to compute f on a single input! This paper addresses the issue of proving strong direct product assertions, that is, ones in which s' ≈ ks and is in particular larger than s. We study the question of proving strong direct product theorems for decision trees and communication protocols.
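The k-fold XOR construction f⊕k from the abstract is straightforward to write down. As a minimal sketch (not from the paper), the snippet below evaluates f⊕k on independently chosen inputs; the predicate f here is a hypothetical stand-in (bit parity), since the lemma concerns an arbitrary function that is hard on average:

```python
from functools import reduce
from operator import xor

def xor_k(f, xs):
    """f⊕k(x1,...,xk) = f(x1) XOR ... XOR f(xk) on independent inputs."""
    return reduce(xor, (f(x) for x in xs))

# Hypothetical stand-in for a hard-on-average Boolean function f.
f = lambda x: bin(x).count("1") % 2  # parity of bits, for illustration only

print(xor_k(f, [0b101, 0b011, 0b110]))  # f⊕3 on three independent inputs → 0
```

The point of the direct product question is quantitative, not computational: computing f⊕k is trivial given f, but the lemma asks how the circuit size s' needed to approximate f⊕k relates to the size s for f.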

93 citations


Journal ArticleDOI
TL;DR: It is proved that any space s quantum stochastic process from this class can be simulated probabilistically with unbounded error in space O(s), and therefore deterministically in space O(s²).
Abstract: This paper studies the space-complexity of predicting the long-term behavior of a class of stochastic processes based on evolutions and measurements of quantum mechanical systems. These processes generalize a wide range of both quantum and classical space-bounded computations, including unbounded error computations given by machines having algebraic number transition amplitudes or probabilities. It is proved that any space s quantum stochastic process from this class can be simulated probabilistically with unbounded error in space O(s), and therefore deterministically in space O(s²).

69 citations


Journal ArticleDOI
TL;DR: This paper shows how to extend the argument due to Bonet, Pitassi and Raz to show that bounded-depth Frege proofs do not have feasible interpolation, assuming that factoring of Blum integers or computing the Diffie–Hellman function is sufficiently hard.
Abstract: In this paper, we show how to extend the argument due to Bonet, Pitassi and Raz to show that bounded-depth Frege proofs do not have feasible interpolation, assuming that factoring of Blum integers or computing the Diffie-Hellman function is sufficiently hard. It follows as a corollary that bounded-depth Frege is not automatizable; in other words, there is no deterministic polynomial-time algorithm that will output a short proof if one exists. A notable feature of our argument is its simplicity.

47 citations


Journal ArticleDOI
TL;DR: It is proved that computing the maximum number of vertex-disjoint l-bounded s,t-paths is APX-complete for any length bound l ≥ 5, and that for l ≤ 4 both problems are polynomially solvable.
Abstract: Let G = (V,E) be a simple graph and s and t be two distinct vertices of G. A path in G is called l-bounded for some l ∈ N if it does not contain more than l edges. We prove that computing the maximum number of vertex-disjoint l-bounded s, t-paths is APX-complete for any l ≥ 5. This implies that the problem of finding k vertex-disjoint l-bounded s, t-paths with minimal total weight for a given number k ∈ N, 1 ≤ k ≤ |V| - 1, and nonnegative weights on the edges of G is NPO-complete for any length bound l ≥ 5. Furthermore, we show that these results are tight in the sense that for l ≤ 4 both problems are polynomially solvable, assuming that the weights satisfy a generalized triangle inequality in the weighted problem. Similar results are obtained for the analogous problems with path lengths equal to l instead of at most l and with edge-disjointness instead of vertex-disjointness.
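To make the objective concrete, here is a brute-force sketch (not the paper's algorithm, which concerns hardness) that enumerates all l-bounded s,t-paths and then finds the maximum number of internally vertex-disjoint ones; the adjacency-list graph representation is an assumption for illustration, and the exponential search is only viable on tiny instances:

```python
from itertools import combinations

def l_bounded_paths(adj, s, t, l):
    """All simple s-t paths with at most l edges (adj: dict of neighbor lists)."""
    paths, stack = [], [(s, [s])]
    while stack:
        v, path = stack.pop()
        if v == t:
            paths.append(path)
            continue
        if len(path) - 1 == l:  # edge budget exhausted, cannot extend
            continue
        for w in adj[v]:
            if w not in path:   # keep the path simple
                stack.append((w, path + [w]))
    return paths

def max_disjoint(paths, s, t):
    """Max number of internally vertex-disjoint paths, by exhaustive search."""
    for r in range(len(paths), 0, -1):
        for combo in combinations(paths, r):
            inner = [set(p) - {s, t} for p in combo]
            if all(a.isdisjoint(b) for a, b in combinations(inner, 2)):
                return r
    return 0

# Tiny example: two internally disjoint 2-bounded s-t paths.
adj = {'s': ['a', 'b'], 'a': ['t'], 'b': ['t'], 't': []}
paths = l_bounded_paths(adj, 's', 't', 2)
print(max_disjoint(paths, 's', 't'))  # → 2
```

The APX-completeness result for l ≥ 5 says that no polynomial-time algorithm can approximate this maximum arbitrarily well unless P = NP, which is why only such exhaustive search is given here.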

43 citations


Journal ArticleDOI
TL;DR: In this paper, it was shown that either Arthur-Merlin protocols are very strong, or else every language in AM has a polynomial-time nondeterministic algorithm such that it is infeasible to come up with inputs on which the algorithm fails; moreover, if Arthur-Merlin protocols are not very strong, then AM ∩ coAM = NP ∩ coNP.
Abstract: Impagliazzo and Wigderson proved a uniform hardness vs. randomness "gap theorem" for BPP. We show an analogous result for AM: Either Arthur-Merlin protocols are very strong and everything in E = DTIME(2^O(n)) can be proved to a subexponential time verifier, or else Arthur-Merlin protocols are weak and every language in AM has a polynomial time nondeterministic algorithm such that it is infeasible to come up with inputs on which the algorithm fails. We also show that if Arthur-Merlin protocols are not very strong (in the sense explained above) then AM ∩ coAM = NP ∩ coNP. Our technique combines the nonuniform hardness versus randomness tradeoff of Miltersen and Vinodchandran with "instance checking". A key ingredient in our proof is identifying a novel "resilience" property of hardness vs. randomness tradeoffs.

37 citations


Journal ArticleDOI
TL;DR: It is shown that Simon’s problem and its extended version can be deterministically solved in a simpler and more concrete way than that proposed by G. Brassard and P. Høyer.
Abstract: D. R. Simon stated a problem, the so-called Simon's problem, whose computational complexity is in the class BQP^φ but not in BPP^φ, where φ is the function or oracle given in the problem. This result indicates that BPP may be strictly included in its quantum counterpart, BQP. Later, G. Brassard and P. Høyer showed that Simon's problem and its extended version can be solved by a deterministic polynomial time quantum algorithm. That is, these problems are in the class EQP^φ. In this paper, we show that Simon's problem and its extended version can be deterministically solved in a simpler and more concrete way than that proposed by G. Brassard and P. Høyer.
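For readers unfamiliar with the problem statement: Simon's oracle φ is a 2-to-1 function satisfying φ(x) = φ(y) iff x ⊕ y ∈ {0, s} for a hidden period s, and the task is to find s. The sketch below (an illustration, not the paper's quantum algorithm) builds such an oracle and recovers s by a classical collision search, which takes exponentially many queries in the worst case; the quantum algorithms cited above need only polynomially many:

```python
def simon_oracle(s):
    """A function with f(x) = f(y) iff x ^ y in {0, s}; 2-to-1 when s != 0."""
    return lambda x: min(x, x ^ s)

def find_period(f, n):
    """Classical brute-force collision search for the hidden period s.

    Queries f on inputs of n bits until two inputs collide; their XOR is s.
    Exponential in n in the worst case, which is exactly the gap the
    quantum algorithm closes.
    """
    seen = {}
    for x in range(2 ** n):
        y = f(x)
        if y in seen:
            return seen[y] ^ x
        seen[y] = x
    return 0  # no collision: f is injective, so s = 0
```

Usage: `find_period(simon_oracle(0b101), 3)` recovers the hidden period `0b101`.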

16 citations


Journal ArticleDOI
Qi Cheng1
TL;DR: The results suggest that the Torsion Theorem may be viewed as a lower bound result in algebraic complexity, and a lot can be learned from the proof of the Uniform Boundedness Theorem to construct the proofs of the WL-conjecture or even the L-conjecture.
Abstract: In this paper, we show several connections between the L-conjecture, proposed by Bürgisser, and the boundedness theorem for the torsion points on elliptic curves. Assuming the WL-conjecture, which is a much weaker version of the L-conjecture, a sharper bound is obtained for the number of torsion points over extensions of k on an elliptic curve over a number field k, which improves Masser's result. It is also shown that the Torsion Theorem for elliptic curves follows directly from the WL-conjecture. Since the current proof of the Torsion Theorem for elliptic curves uses considerable machinery from arithmetic geometry, and the WL-conjecture differs from the trivial lower bound only at a constant factor, these results provide an interesting example where increasing the constant factor in a trivial lower bound of straight-line complexity is very difficult. Our results suggest that the Torsion Theorem may be viewed as a lower bound result in algebraic complexity, and a lot can be learned from the proof of the Uniform Boundedness Theorem to construct the proofs of the WL-conjecture or even the L-conjecture.

15 citations


Journal ArticleDOI
TL;DR: An exponential lower bound on the size of a decision tree for this function is obtained, and an asymptotic formula, having a linear main term, is derived for its average sensitivity.
Abstract: We study various combinatorial complexity measures of Boolean functions related to some natural arithmetic problems about binary polynomials, that is, polynomials over F2. In particular, we consider the Boolean function deciding whether a given polynomial over F2 is squarefree. We obtain an exponential lower bound on the size of a decision tree for this function, and derive an asymptotic formula, having a linear main term, for its average sensitivity. This allows us to estimate other complexity characteristics such as the formula size, the average decision tree depth and the degrees of exact and approximative polynomial representations of this function. Finally, using a different method, we show that testing squarefreeness and irreducibility of polynomials over F2 cannot be done in AC0[p] for any odd prime p. Similar results are obtained for deciding coprimality of two polynomials over F2 as well.
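The Boolean function under study has a classical algebraic characterization: a polynomial f over F2 is squarefree iff gcd(f, f') = 1, where f' is the formal derivative (if f' = 0 then f is a perfect square, since g(x²) = g(x)² over F2). As a minimal sketch (the representation of polynomials as integer bitmasks is an assumption for illustration, not from the paper):

```python
def f2_mod(a, b):
    """Remainder of polynomial a modulo b over F2 (polys as bitmasks)."""
    db = b.bit_length() - 1
    while a and a.bit_length() - 1 >= db:
        a ^= b << (a.bit_length() - 1 - db)  # cancel the leading term
    return a

def f2_gcd(a, b):
    """Euclidean algorithm for polynomials over F2."""
    while b:
        a, b = b, f2_mod(a, b)
    return a

def f2_derivative(a):
    """Formal derivative over F2: only odd-degree terms survive.

    Shifting drops each coefficient one degree; the mask keeps exactly
    those that came from odd powers (valid for degrees below 64).
    """
    return (a >> 1) & 0x5555555555555555

def is_squarefree(a):
    """Squarefreeness test over F2: f is squarefree iff gcd(f, f') = 1."""
    d = f2_derivative(a)
    if d == 0:                      # f is a polynomial in x^2, i.e. a square
        return a.bit_length() <= 1  # only the constants survive
    return f2_gcd(a, d) == 1
```

For example, `is_squarefree(0b111)` (x² + x + 1, irreducible) is True, while `is_squarefree(0b101)` (x² + 1 = (x + 1)²) is False. The paper's point is that, although this test is algebraically simple, the corresponding Boolean function is hard for the combinatorial complexity measures listed above.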

6 citations


Journal ArticleDOI
TL;DR: This work characterize iterated log depth circuit classes between AC0 and AC1 by Cobham-like bounded recursion schemata and gives alternative characterizations which utilize the safe recursion method developed by Bellantoni and Cook.
Abstract: We characterize iterated log depth circuit classes between AC0 and AC1 by Cobham-like bounded recursion schemata. We also give alternative characterizations which utilize the safe recursion method developed by Bellantoni and Cook.

4 citations