Topic

Las Vegas algorithm

About: Las Vegas algorithm is a research topic. A Las Vegas algorithm is a randomized algorithm that always produces a correct result; only its running time is a random variable. Over the lifetime, 130 publications have been published within this topic, receiving 4,340 citations.
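The defining property above (always-correct output, random running time) can be illustrated with randomized quickselect, a textbook Las Vegas algorithm. This sketch is purely illustrative and is not drawn from any of the papers listed below.

```python
import random

def las_vegas_select(items, k):
    """Return the k-th smallest element (0-indexed) via randomized
    quickselect, a classic Las Vegas algorithm: the answer is always
    correct; only the running time is random (expected O(n))."""
    assert 0 <= k < len(items)
    items = list(items)
    while True:
        pivot = random.choice(items)          # random guess
        lower = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        if k < len(lower):                    # answer lies in the lower part
            items = lower
        elif k < len(lower) + len(equal):     # the pivot itself is the answer
            return pivot
        else:                                 # answer lies in the upper part
            k -= len(lower) + len(equal)
            items = [x for x in items if x > pivot]

print(las_vegas_select([9, 1, 7, 3, 5], 2))  # -> 5
```

However the random pivot choices fall, the returned value is always the true k-th smallest; bad pivots only cost extra iterations.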


Papers
Proceedings ArticleDOI
23 Oct 2005
TL;DR: This work considers Nash equilibria in 2-player random games and analyzes a simple Las Vegas algorithm for finding an equilibrium, which runs in time O(m²n log log n + n²m log log m) with high probability.
Abstract: We consider Nash equilibria in 2-player random games and analyze a simple Las Vegas algorithm for finding an equilibrium. The algorithm is combinatorial and always finds a Nash equilibrium; on m × n payoff matrices, it runs in time O(m²n log log n + n²m log log m) with high probability. Our main tool is a polytope formulation of equilibria.

31 citations

Posted Content
TL;DR: The complexity of solving a polynomial system is decreased from $\widetilde{O}(D^3)$ to $\widetilde{O}(D^\omega)$, where $D$ is the number of solutions of the system, via new algorithms that rely on fast linear algebra.
Abstract: Polynomial system solving is a classical problem in mathematics with a wide range of applications. This makes its complexity a fundamental problem in computer science. Depending on the context, solving has different meanings. In order to stick to the most general case, we consider a representation of the solutions from which one can easily recover the exact solutions or a certified approximation of them. Under generic assumptions, such a representation is given by the lexicographical Gröbner basis of the system and consists of a set of univariate polynomials. The best known algorithm for computing the lexicographical Gröbner basis runs in $\widetilde{O}(d^{3n})$ arithmetic operations, where $n$ is the number of variables and $d$ is the maximal degree of the equations in the input system. The notation $\widetilde{O}$ means that we neglect polynomial factors in $n$. We show that this complexity can be decreased to $\widetilde{O}(d^{\omega n})$, where $2 \leq \omega < 2.3727$ is the exponent in the complexity of multiplying two dense matrices. Consequently, when the input polynomial system is either generic or reaches the Bezout bound, the complexity of solving a polynomial system is decreased from $\widetilde{O}(D^3)$ to $\widetilde{O}(D^\omega)$, where $D$ is the number of solutions of the system. To achieve this result we propose new algorithms which rely on fast linear algebra. When the degrees of the equations are bounded uniformly by a constant we propose a deterministic algorithm. In the unbounded case we present a Las Vegas algorithm.

31 citations

Proceedings ArticleDOI
05 Aug 2011
TL;DR: A fast and memory-sparing probabilistic top k selection algorithm on the GPU that always gives a correct result and always terminates.
Abstract: We implement here a fast and memory-sparing probabilistic top k selection algorithm on the GPU. The algorithm proceeds via an iterative probabilistic guess-and-check process on pivots for a three-way partition. When the guess is correct, the problem is reduced to selection on a much smaller set. This probabilistic algorithm always gives a correct result and always terminates. Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
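The guess-and-check loop described in the abstract can be sketched on the CPU as follows. The function name and structure here are illustrative assumptions, not the authors' GPU implementation: a pivot is guessed at random, a three-way partition checks whether the top-k boundary has been found, and a wrong guess only shrinks the problem, so the result is always correct.

```python
import random

def top_k_guess_and_check(values, k):
    """Illustrative CPU sketch (hypothetical name) of probabilistic
    guess-and-check top-k selection: guess a pivot, three-way partition,
    and check where the top-k boundary fell. A wrong guess reduces the
    problem; the returned multiset is always the true top k."""
    vals = list(values)
    assert 0 <= k <= len(vals)
    result = []
    while k > 0:
        pivot = random.choice(vals)                # probabilistic guess
        upper = [x for x in vals if x > pivot]
        equal = [x for x in vals if x == pivot]
        if len(upper) >= k:                        # top-k lies entirely above pivot
            vals = upper                           # smaller problem; guess again
        elif len(upper) + len(equal) >= k:         # boundary falls in 'equal' bucket
            result += upper + equal[:k - len(upper)]
            k = 0
        else:                                      # keep upper+equal, continue below
            result += upper + equal
            k -= len(upper) + len(equal)
            vals = [x for x in vals if x < pivot]
    return result

print(sorted(top_k_guess_and_check([5, 1, 9, 3, 7, 7], 3), reverse=True))
```

On a GPU, the partition and check steps map naturally to parallel scans over the data, which is what makes this Las Vegas formulation attractive for processors with limited fast memory.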

30 citations

Proceedings ArticleDOI
26 Jun 2006
TL;DR: There exist queries in XQuery, XPath, and relational algebra such that any (randomized) Las Vegas algorithm that evaluates these queries must perform Ω(log N) random accesses to external memory devices, provided that the internal memory size is at most O(N^{1/4}/log N), where N denotes the size of the input data.
Abstract: We study the randomized version of a computation model (introduced in [9, 10]) that restricts random access to external memory and internal memory space. Essentially, this model can be viewed as a powerful version of a data stream model that puts no cost on sequential scans of external memory (as other models for data streams) and, in addition (like other external memory models, but unlike streaming models), admits several large external memory devices that can be read and written to in parallel. We obtain tight lower bounds for the decision problems set equality, multiset equality, and checksort. More precisely, we show that any randomized one-sided-error bounded Monte Carlo algorithm for these problems must perform Ω(log N) random accesses to external memory devices, provided that the internal memory size is at most O(N^{1/4}/log N), where N denotes the size of the input data. From the lower bound on the set equality problem we can infer lower bounds on the worst-case data complexity of query evaluation for the languages XQuery, XPath, and relational algebra on streaming data. More precisely, we show that there exist queries in XQuery, XPath, and relational algebra such that any (randomized) Las Vegas algorithm that evaluates these queries must perform Ω(log N) random accesses to external memory devices, provided that the internal memory size is at most O(N^{1/4}/log N).
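The abstract distinguishes one-sided-error Monte Carlo algorithms from Las Vegas ones. A minimal illustration of the difference, unrelated to the paper's external-memory model: a Monte Carlo guess can be wrong, but pairing the same random guess with a deterministic check yields a Las Vegas algorithm whose answer is always correct.

```python
import random

def find_one_las_vegas(bits):
    """Las Vegas counterpart of a one-shot Monte Carlo guess: sample
    random indices until the deterministic check succeeds. The returned
    index is always correct; if at least half the entries are 1, the
    expected number of probes is at most 2."""
    assert any(bits), "needs at least one 1"
    while True:
        i = random.randrange(len(bits))   # random guess (Monte Carlo step)
        if bits[i] == 1:                  # deterministic verification
            return i

print(find_one_las_vegas([0, 1, 0, 1, 1, 0, 1, 1]))
```

A single random probe alone would be a one-sided-error Monte Carlo algorithm (it may return a 0-position); the verification loop removes the error at the cost of a random running time.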

30 citations

Journal ArticleDOI
TL;DR: A Las Vegas probabilistic algorithm is presented that finds the Smith normal form of a nonsingular input matrix A ∈ Z^{n×n} using standard integer, polynomial, and matrix arithmetic; the results improve significantly on previous algorithms.

28 citations

Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (74% related)
Pathwidth: 8.3K papers, 252.2K citations (73% related)
Approximation algorithm: 23.9K papers, 654.3K citations (71% related)
Hash function: 31.5K papers, 538.5K citations (69% related)
Line graph: 11.5K papers, 304.1K citations (69% related)
Performance Metrics
No. of papers in the topic in previous years:

Year | Papers
2022 | 1
2021 | 4
2020 | 9
2019 | 9
2018 | 4
2017 | 7