Book Chapter DOI

On the Power of Randomized Pushdown Automata

TLDR
The power of randomization for pushdown automata is investigated, including whether error probabilities can be decreased, and it is shown that deterministic pushdown automata are weaker than Las Vegas pushdown automata, which in turn are weaker than one-sided-error pushdown automata.
Abstract
Although randomization is now a standard tool for the design of efficient algorithms or for building simpler systems, we are far from fully understanding the power of randomized computing. Hence it is advisable to study randomization for restricted models of computation. We follow this approach by investigating the power of randomization for pushdown automata. Our main results are as follows. First we show that deterministic pushdown automata are weaker than Las Vegas pushdown automata, which in turn are weaker than one-sided-error pushdown automata. Finally, one-sided-error pushdown automata are weaker than (nondeterministic) pushdown automata. In contrast to many other fundamental models of computing, there are no known methods of decreasing error probabilities. We show that such methods do not exist by constructing languages which are recognizable by one-sided-error pushdown automata with error probability 1/2, but not by one-sided-error pushdown automata with error probability p < 1/2. On the other hand, we construct languages which are not deterministic context-free (resp. not context-free) but are recognizable with arbitrarily small error by one-sided-error (resp. bounded-error) pushdown automata.
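To make the one-sided-error model concrete, here is a minimal illustrative sketch, not the construction from the paper: a toy randomized pushdown automaton for the non-context-free language {a^n b^n c^n}, where a single coin flip decides whether the stack checks #a = #b or #b = #c. The function name randomized_pda_run and the choice of language are assumptions made for illustration only.

```python
import random


def randomized_pda_run(word: str) -> bool:
    """One run of a toy one-sided-error randomized PDA for {a^n b^n c^n}.

    A single coin flip chooses which equality the stack verifies:
    heads -> check #a == #b, tails -> check #b == #c.
    Members of the language are accepted on every run; non-members
    are accepted on at most half of the runs (one-sided error 1/2).
    """
    check_ab = random.random() < 0.5   # the only random choice
    stack = []
    state = "a"                        # finite control enforces the shape a*b*c*
    for ch in word:
        if ch == "a" and state == "a":
            if check_ab:
                stack.append("X")      # count the a's
        elif ch == "b" and state in ("a", "b"):
            state = "b"
            if check_ab:
                if not stack:
                    return False       # more b's than a's: reject
                stack.pop()
            else:
                stack.append("X")      # count the b's
        elif ch == "c" and state in ("b", "c"):
            state = "c"
            if not check_ab:
                if not stack:
                    return False       # more c's than b's: reject
                stack.pop()
        else:
            return False               # violates the a*b*c* pattern
    return not stack                   # accept iff the checked counts match


# A non-member such as "aabc" passes only the #b == #c check, so it is
# accepted with probability exactly 1/2; members are always accepted.
```

Note that a one-way pushdown automaton cannot simply rerun itself on the same input to shrink the error, which is consistent with the paper's observation that error probabilities cannot in general be decreased.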


Citations
Journal Article DOI

On probabilistic pushdown automata

TL;DR: Deterministic pushdown automata (pda) are shown to be weaker than Las Vegas pda, which in turn are weaker than one-sided-error pda; bounded-error (two-sided-error) pda and nondeterministic pda are incomparable, and error probabilities cannot in general be decreased arbitrarily.
Journal Article DOI

Expressive Power of Quantum Pushdown Automata with Classical Stack Operations under the Perfect-Soundness Condition

TL;DR: This paper investigates the power of quantum pushdown automata whose stacks are assumed to be implemented as classical devices, and shows that they are strictly more powerful than their classical counterparts under the perfect-soundness condition.
Journal Article DOI

On the power of randomized multicounter machines

TL;DR: It is shown that polynomial-time one-way multicounter machines, with error probability tending to zero with growing input length, can recognize languages that cannot be accepted by polynomial-time nondeterministic two-way multicounter machines with a bounded number of reversals.
Journal Article

A Separation of Determinism, Las Vegas and Nondeterminism for Picture Recognition

TL;DR: In this article, a strong separation between the power of determinism, Las Vegas randomization, and nondeterminism is proved for a computing model; the models considered are finite automata with two-dimensional input tapes.
Book Chapter DOI

Pushdown automata and multicounter machines, a comparison of computation modes

TL;DR: Here, the polynomial-time classes of multicounter machines with a constant number of reversals are considered and the computational power of nondeterminism, randomization and determinism is separated.
References
Book

Communication Complexity

TL;DR: This chapter surveys the theory of two-party communication complexity and presents results for the following models of computation: finite automata, Turing machines, decision trees, ordered binary decision diagrams, VLSI chips, and networks of threshold gates.
Proceedings Article DOI

Some complexity questions related to distributive computing (Preliminary Report)

TL;DR: The quantity of interest, which measures the information exchange necessary for computing a function f, is the minimum number of bits exchanged by any algorithm.
Journal Article DOI

Computational Complexity of Probabilistic Turing Machines

TL;DR: It is shown that every nondeterministic machine can be simulated in the same space by a probabilistic machine with small error probability.
Proceedings Article DOI

Las Vegas is better than determinism in VLSI and distributed computing (Extended Abstract)

TL;DR: A new method is presented for proving lower bounds on the complexity of VLSI computations and, more generally, distributed computations; earlier methods of this kind apply only to deterministic computations.