Proceedings ArticleDOI
Fast genetic algorithms
Benjamin Doerr, Huu Phuoc Le, Régis Makhmara, Ta Duy Nguyen
- pp. 777–784
TL;DR: This work proposes a random mutation rate α/n, where α is chosen from a power-law distribution, and proves that the (1+1) EA with this heavy-tailed mutation rate optimizes any Jump_{m,n} function in a time that is only a small polynomial (in m) factor above the one stemming from the optimal rate for this m.
Abstract
For genetic algorithms (GAs) using a bit-string representation of length n, the general recommendation is to use 1/n as the mutation rate. In this work, we discuss whether this is justified for multimodal functions. Taking jump functions and the (1+1) evolutionary algorithm (EA) as the simplest example, we observe that larger mutation rates give significantly better runtimes. For the Jump_{m,n} function, any mutation rate between 2/n and m/n leads to a speedup at least exponential in m compared to the standard choice. The asymptotically best runtime, obtained from the mutation rate m/n and leading to a speedup super-exponential in m, is very sensitive to small changes of the mutation rate: any deviation by a small (1 ± ε) factor leads to a slowdown exponential in m. Consequently, any fixed mutation rate gives strongly sub-optimal results for most jump functions. Building on this observation, we propose to use a random mutation rate α/n, where α is chosen from a power-law distribution. We prove that the (1+1) EA with this heavy-tailed mutation rate optimizes any Jump_{m,n} function in a time that is only a small polynomial (in m) factor above the one stemming from the optimal rate for this m. Our heavy-tailed mutation operator yields similar speedups (over the best known performance guarantees) for the vertex cover problem in bipartite graphs and the matching problem in general graphs. Following the example of fast simulated annealing, fast evolution strategies, and fast evolutionary programming, we propose to call genetic algorithms using a heavy-tailed mutation operator fast genetic algorithms.
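The heavy-tailed operator described in the abstract admits a compact sketch. This is a hypothetical illustration, not the paper's code: it assumes a power-law exponent β = 1.5 and support {1, …, n/2} for α, samples a fresh α for each mutation, and then flips each bit independently with probability α/n:

```python
import random

def power_law_alpha(upper, beta=1.5):
    """Sample alpha from {1, ..., upper} with P(alpha = i) proportional to i**(-beta)."""
    values = range(1, upper + 1)
    weights = [i ** (-beta) for i in values]
    return random.choices(values, weights=weights)[0]

def heavy_tailed_mutation(bits, beta=1.5):
    """Flip each bit of a 0/1 list with probability alpha/n, alpha drawn per call."""
    n = len(bits)
    alpha = power_law_alpha(max(n // 2, 1), beta)
    p = alpha / n
    return [1 - b if random.random() < p else b for b in bits]
```

Because α is usually small, most mutations behave like the standard 1/n operator, but the heavy tail occasionally produces the large multi-bit jumps needed to cross the fitness valley of Jump_{m,n}.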
Citations
Book ChapterDOI
Probabilistic Tools for the Analysis of Randomized Optimization Heuristics.
TL;DR: This chapter collects several probabilistic tools that have proven to be useful in the analysis of randomized search heuristics, including classic material such as the Markov, Chebyshev, and Chernoff inequalities, but also lesser-known topics such as stochastic domination and coupling.
Journal ArticleDOI
Standard Steady State Genetic Algorithms Can Hillclimb Faster Than Mutation-Only Evolutionary Algorithms
Dogan Corus, Pietro S. Oliveto
TL;DR: In this paper, a Markov chain framework is devised to rigorously prove an upper bound on the runtime of standard steady-state GAs hillclimbing the OneMax function.
Book ChapterDOI
Theory of Parameter Control for Discrete Black-Box Optimization: Provable Performance Gains Through Dynamic Parameter Choices
Benjamin Doerr, Carola Doerr
TL;DR: This chapter surveys running-time results for a broad range of different parameter control mechanisms, and puts them into context by proposing an updated classification scheme for parameter control.
Proceedings ArticleDOI
The (1+λ) evolutionary algorithm with self-adjusting mutation rate
TL;DR: It is proved that this dynamic version of the (1 + λ) EA finds the optimum in an expected optimization time (number of fitness evaluations) of O(nλ/log λ + n log n).
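A minimal sketch of a self-adjusting two-rate scheme of the kind summarized above, applied to OneMax. This is an illustrative simplification under stated assumptions: half the offspring mutate with strength r/2 and half with 2r, the strength of the best offspring becomes the new r (the random perturbation step of the published mechanism is omitted), and r is clamped to [2, n/4]:

```python
import random

def one_max(bits):
    """Fitness: number of ones in the bit string."""
    return sum(bits)

def self_adjusting_one_plus_lambda(n=50, lam=8, max_evals=100000):
    """(1+lambda) EA with a simplified two-rate self-adjusting mutation strength."""
    parent = [random.randint(0, 1) for _ in range(n)]
    r = 2.0  # current mutation strength; per-bit rate is r/n
    evals = 0
    while one_max(parent) < n and evals < max_evals:
        best, best_fit, best_strength = None, -1, r
        for i in range(lam):
            strength = r / 2 if i < lam // 2 else 2 * r
            p = strength / n
            child = [1 - b if random.random() < p else b for b in parent]
            evals += 1
            f = one_max(child)
            if f > best_fit:
                best, best_fit, best_strength = child, f, strength
        if best_fit >= one_max(parent):
            parent = best  # elitist selection, ties accepted
        r = min(max(best_strength, 2.0), n / 4)  # adopt the winning strength, clamped
    return parent, evals
```

The idea is that the subpopulation whose rate currently works better pulls r in its direction, so the algorithm tracks a good mutation rate online instead of fixing it in advance.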
Posted Content
Fast Genetic Algorithms
TL;DR: In this article, a heavy-tailed mutation operator is proposed for genetic algorithms using a bit-string representation of length n; on Jump_{m,n} functions it achieves a speedup at least exponential in m compared to the standard mutation rate 1/n.
References
Journal ArticleDOI
Evolutionary programming made faster
Xin Yao, Yong Liu, Guangming Lin
TL;DR: A "fast EP" (FEP) is proposed that uses a Cauchy instead of a Gaussian mutation as the primary search operator; an improved variant (IFEP) combining both mutations is also proposed and tested empirically, showing that IFEP performs better than, or as well as, the better of FEP and CEP on most benchmark problems tested.
Journal ArticleDOI
Fast simulated annealing
TL;DR: In this article, a fast simulated annealing (FSA) algorithm is proposed; it performs a semi-local search with occasional long jumps, and its cooling schedule is inversely linear in time.
Journal ArticleDOI
On the analysis of the (1+1) evolutionary algorithm
TL;DR: A step towards a theory of evolutionary algorithms, in particular the (1+1) evolutionary algorithm, is performed; linear functions are proved to be optimized in expected time O(n ln n), and only mutation rates of order 1/n can ensure this behavior.
Book ChapterDOI
Fast Evolution Strategies
TL;DR: It is shown empirically that the new evolution strategy based on Cauchy mutation outperforms the classical evolution strategy on most of the 23 benchmark problems tested in this paper.
Proceedings Article
Optimal Mutation Rates in Genetic Search
TL;DR: The results indicate that a variation of the mutation rate is useful in cases where the fitness function is a multimodal pseudo-Boolean function, where multimodality may be caused by the objective function as well as by the encoding mechanism.