Proceedings Article

A rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs

Stefan Droste, +2 more
pp. 499-504
TLDR
A rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs is given, and the expected run time of this algorithm is found to be Θ(n ln n) for linear functions with n variables.
Abstract
Evolutionary algorithms (EAs) are randomized heuristic algorithms which, in many impressive experiments, have been shown to perform well on optimization problems of various kinds. In this paper, a rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs is given. The analysis is carried out for different mutation rates. The main contribution of the paper is not the result that the expected run time of the (1+1) evolutionary algorithm is Θ(n ln n) for linear functions with n variables, but the presentation of methods showing how this result can be proven rigorously.
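
The abstract itself contains no pseudocode; a minimal Python sketch of the (1+1) EA on a linear pseudo-Boolean function, assuming the standard mutation rate 1/n and positive weights (so the optimum is the all-ones string), might look as follows. Names and the termination check are illustrative and not taken from the paper.

import random

def one_plus_one_ea(weights, max_iters=100_000):
    # Maximize the linear function f(x) = sum_i weights[i] * x[i] over x in {0,1}^n
    # with the (1+1) EA using the standard mutation rate 1/n.
    n = len(weights)
    f = lambda x: sum(w * b for w, b in zip(weights, x))
    x = [random.randint(0, 1) for _ in range(n)]        # uniform random start
    for t in range(1, max_iters + 1):
        # Mutation: flip each bit independently with probability 1/n.
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        # Selection: keep the offspring if it is at least as good as the parent.
        if f(y) >= f(x):
            x = y
        if all(x):                                      # all-ones optimum for positive weights
            return t                                    # iterations until the optimum was found
    return None

# Expected optimization time is Theta(n ln n) for linear functions, e.g.:
print(one_plus_one_ea([1] * 20))                        # OneMax with n = 20
print(one_plus_one_ea([2 ** i for i in range(20)]))     # BinVal with n = 20
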


Citations
Journal Article

On the analysis of the (1+1) evolutionary algorithm

TL;DR: A step towards a theory of evolutionary algorithms, in particular the so-called (1+1) evolutionary algorithm, is taken: linear functions are proved to be optimized in expected time O(n ln n), but only mutation rates of order 1/n can ensure this behavior.
Journal Article

Drift analysis and average time complexity of evolutionary algorithms

TL;DR: While previous work only considered (1+1) EAs without any crossover, the EAs considered in this paper are fairly general, using a finite population, crossover, mutation, and selection.
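
The TL;DR does not reproduce the drift theorem itself; the additive drift argument underlying this line of work can be sketched roughly as follows (a paraphrase, not the paper's exact statement). If d(X_t) ≥ 0 measures the distance of the current population X_t from the optimum and there is a constant c > 0 such that
\[
\mathbb{E}\bigl[\, d(X_t) - d(X_{t+1}) \mid X_t,\; d(X_t) > 0 \,\bigr] \;\ge\; c,
\]
then the expected first hitting time T of the optimum satisfies
\[
\mathbb{E}[T \mid X_0] \;\le\; \frac{d(X_0)}{c}.
\]
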
Journal Article

A study of drift analysis for estimating computation time of evolutionary algorithms

TL;DR: This paper introduces drift analysis and its application to estimating the average computation time of evolutionary algorithms; based on this analysis, a general classification of problems that are easy or hard for evolutionary algorithms is given.
Journal Article

Towards an analytic framework for analysing the computation time of evolutionary algorithms

TL;DR: This paper takes a first step towards a systematic comparative study of different EAs and their first hitting times; the framework is built on the absorbing Markov chain model of evolutionary algorithms.
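
For orientation only (standard absorbing Markov chain theory, not necessarily the paper's own notation): writing the transition matrix in canonical form gives the expected first hitting times directly,
\[
P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}, \qquad
N = (I - Q)^{-1}, \qquad
\mathbf{t} = N\mathbf{1},
\]
where Q holds the transitions among non-optimal (transient) states and the i-th entry of \mathbf{t} is the expected number of steps until the optimum is first hit when starting from state i.
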
Journal Article

From an individual to a population: an analysis of the first hitting time of population-based evolutionary algorithms

TL;DR: It is shown that a population can have a drastic impact on an EA's average computation time, in some cases reducing an exponential running time to a polynomial one (in the input size).
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: This book presents the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Book

Introduction to Algorithms

TL;DR: The updated edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures. It presents a rich variety of algorithms and covers them in considerable depth while keeping their design and analysis accessible to readers at all levels.
Book

Evolutionary Computation: Towards a New Philosophy of Machine Intelligence

TL;DR: In-depth and updated, Evolutionary Computation shows how to use simulated evolution to achieve machine intelligence, carefully reviews the "no free lunch" theorem, and discusses new theoretical findings that challenge some of the mathematical foundations of simulated evolution.
Journal Article

Predictive models for the breeder genetic algorithm I. Continuous parameter optimization

TL;DR: The numerical performance of the BGA is demonstrated on a test suite of multimodal functions, and the number of function evaluations needed to locate the optimum scales only as n ln(n), where n is the number of parameters.