
Showing papers on "Evolutionary computation published in 1994"


Journal ArticleDOI
TL;DR: The development of each of these procedures over the past 35 years is described and some recent efforts in these areas are reviewed.
Abstract: Natural evolution is a population-based optimization process. Simulating this process on a computer results in stochastic optimization techniques that can often outperform classical methods of optimization when applied to difficult real-world problems. There are currently three main avenues of research in simulated evolution: genetic algorithms, evolution strategies, and evolutionary programming. Each method emphasizes a different facet of natural evolution. Genetic algorithms stress chromosomal operators. Evolution strategies emphasize behavioral changes at the level of the individual. Evolutionary programming stresses behavioral change at the level of the species. The development of each of these procedures over the past 35 years is described. Some recent efforts in these areas are reviewed.
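As a point of reference for the three families described, a minimal sketch of the generate-evaluate-select loop they share (the sphere objective, parameters, and Gaussian variation are illustrative choices, not taken from the survey):

```python
import random

def sphere(x):
    """Illustrative objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def evolve(pop_size=30, dims=5, generations=100, mutation_scale=0.1):
    # Initialize a population of real-valued candidate solutions.
    pop = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        # Variation: Gaussian perturbation (EP/ES-style behavioral change);
        # a GA would instead emphasize recombination of encoded chromosomes.
        offspring = [[v + random.gauss(0, mutation_scale) for v in p] for p in pop]
        # Selection: keep the best pop_size individuals from parents + offspring.
        pop = sorted(pop + offspring, key=sphere)[:pop_size]
    return pop[0]

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", round(sphere(best), 6))
```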

1,549 citations


Book
01 Jan 1994
TL;DR: This third volume of Advances in Genetic Programming highlights many of the recent technical advances in this increasingly popular field.
Abstract: Genetic programming is a form of evolutionary computation that evolves programs and program-like executable structures for developing reliable time- and cost-effective applications. It does this by breeding programs over many generations, using the principles of natural selection, sexual recombination, and mutation. This third volume of Advances in Genetic Programming highlights many of the recent technical advances in this increasingly popular field.

805 citations


Proceedings ArticleDOI
27 Jun 1994
TL;DR: All important selection operators are discussed and quantitatively compared with respect to their selective pressure and it is clarified that only a few really different and useful selection operators exist: proportional selection, linear ranking, tournament selection, and (μ,λ)-selection.
Abstract: Due to its independence of the actual search space and its impact on the exploration-exploitation tradeoff, selection is an important operator in any kind of evolutionary algorithm. All important selection operators are discussed and quantitatively compared with respect to their selective pressure. The comparison clarifies that only a few really different and useful selection operators exist: proportional selection (in combination with a scaling method), linear ranking, tournament selection, and (μ,λ)-selection (respectively (μ+λ)-selection). Their selective pressure increases in the order listed here. The theoretical results are confirmed by an experimental investigation using a genetic algorithm with different selection methods on a simple unimodal objective function.
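To make the comparison concrete, a hedged sketch of two of the operators discussed, tournament selection and (μ,λ)/(μ+λ)-selection (the toy fitness function and parameters are placeholders, not the paper's experimental setup):

```python
import random

def tournament_selection(population, fitness, k=2, n_select=None):
    """Each survivor is the best of k randomly drawn individuals;
    larger k means stronger selective pressure."""
    n_select = n_select or len(population)
    return [max(random.sample(population, k), key=fitness) for _ in range(n_select)]

def mu_lambda_selection(parents, offspring, fitness, mu, plus=False):
    """(mu, lambda)-selection keeps the mu best offspring only;
    (mu + lambda)-selection (plus=True) ranks parents and offspring together."""
    pool = offspring + parents if plus else offspring
    return sorted(pool, key=fitness, reverse=True)[:mu]

if __name__ == "__main__":
    f = lambda x: -(x - 3) ** 2          # toy unimodal fitness, maximum at x = 3
    pop = [random.uniform(-10, 10) for _ in range(20)]
    kids = [x + random.gauss(0, 0.5) for x in pop for _ in range(2)]
    print(sorted(tournament_selection(pop, f, k=3), key=f, reverse=True)[:3])
    print(mu_lambda_selection(pop, kids, f, mu=5))
```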

380 citations


Book
01 Sep 1994
TL;DR: The aim of this book is to talk about the field of evolutionary computation in simple terms, and discuss the simplicity and elegance of its methods on many interesting test cases.
Abstract: From the Publisher: Genetic algorithms are founded upon the principle of evolution, i.e., survival of the fittest. Hence evolution programming techniques, based on genetic algorithms, are applicable to many hard optimization problems, such as optimization of functions with linear and nonlinear constraints, the traveling salesman problem, and problems of scheduling, partitioning, and control. The importance of these techniques is still growing, since evolution programs are parallel in nature, and parallelism is one of the most promising directions in computer science. The aim of this book is to talk about the field of evolutionary computation in simple terms, and discuss the simplicity and elegance of its methods on many interesting test cases. The book may serve as a guide to writing an evolution program, and to making this an enjoyable experience. It is self-contained and the only prerequisite is basic undergraduate mathematics. Aimed at researchers, practitioners, and graduate students, it may serve as a text for advanced courses in computer science and artificial intelligence, operations research, and engineering. This third edition has been substantially revised and extended. Three new chapters discuss the recent paradigm of genetic programming, heuristic methods and constraint handling, and current directions of research. Additional appendices contain test functions for experiments with evolutionary techniques and discuss possible projects for use in a project-oriented course.

374 citations


Journal ArticleDOI
TL;DR: The history and current scope of research on genetic algorithms in artificial life are reviewed, giving illustrative examples in which the genetic algorithm is used to study how learning and evolution interact, and to model ecosystems, immune systems, cognitive systems, and social systems.
Abstract: Genetic algorithms are computational models of evolution that play a central role in many artificial-life models. We review the history and current scope of research on genetic algorithms in artificial life, giving illustrative examples in which the genetic algorithm is used to study how learning and evolution interact, and to model ecosystems, immune systems, cognitive systems, and social systems. We also outline a number of open questions and future directions for genetic algorithms in artificial-life research.

205 citations


Book ChapterDOI
11 Apr 1994
TL;DR: The EP selection model is shown to be equivalent to an ES model in one form, and surprisingly similar to fitness proportionate selection in another; generational models prove remarkably immune to evaluation noise, while models that retain parents are much less so.
Abstract: Selection methods in Evolutionary Algorithms, including Genetic Algorithms, Evolution Strategies (ES) and Evolutionary Programming (EP), are compared by observing the rate of convergence on three idealised problems. The first considers selection only; the second introduces mutation as a source of variation; the third also adds in evaluation noise. Fitness proportionate selection suffers from scaling problems: a number of techniques to reduce these are illustrated. The sampling errors caused by roulette wheel and tournament selection are demonstrated. The EP selection model is shown to be equivalent to an ES model in one form, and surprisingly similar to fitness proportionate selection in another. Generational models are shown to be remarkably immune to evaluation noise, models that retain parents much less so.
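As an illustration of the scaling and sampling issues mentioned, a hedged sketch of fitness-proportionate selection with a simple shift-based scaling, alongside stochastic universal sampling as one standard low-variance alternative to independent roulette-wheel spins (details are illustrative and not taken from the paper):

```python
import random

def linear_scale(fitnesses, floor=0.0):
    """Shift fitnesses so the worst maps to `floor`; without some such scaling,
    proportionate selection loses pressure once all fitnesses become similar."""
    worst = min(fitnesses)
    return [f - worst + floor for f in fitnesses]

def roulette_wheel(population, fitnesses, n):
    """Classic proportionate selection: n independent spins (high sampling variance)."""
    total = sum(fitnesses)
    picks = []
    for _ in range(n):
        r, acc = random.uniform(0, total), 0.0
        for ind, f in zip(population, fitnesses):
            acc += f
            if acc >= r:
                picks.append(ind)
                break
    return picks

def stochastic_universal_sampling(population, fitnesses, n):
    """One spin with n equally spaced pointers: same expected counts as the
    roulette wheel but much lower sampling variance."""
    total = sum(fitnesses)
    step = total / n
    start = random.uniform(0, step)
    cumulative, acc = [], 0.0
    for f in fitnesses:
        acc += f
        cumulative.append(acc)
    picks = []
    for i in range(n):
        pointer = start + i * step
        for ind, c in zip(population, cumulative):
            if pointer <= c:
                picks.append(ind)
                break
        else:
            picks.append(population[-1])             # guard against float rounding
    return picks

if __name__ == "__main__":
    pop = list(range(10))
    fit = linear_scale([x * x for x in pop], floor=1.0)
    print(roulette_wheel(pop, fit, 5))
    print(stochastic_universal_sampling(pop, fit, 5))
```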

149 citations


Book ChapterDOI
09 Oct 1994
TL;DR: It is demonstrated that simple paths to the global optimum can be so long that climbing the path is intractable, which means that a unimodal search space, which consists of a single hill, can be difficult for a hillclimber to optimize.
Abstract: We demonstrate the interesting, counter-intuitive result that simple paths to the global optimum can be so long that climbing the path is intractable. This means that a unimodal search space, which consists of a single hill and in which each point in the space is on a simple path to the global optimum, can be difficult for a hillclimber to optimize. Various types of hillclimbing algorithms will make constant progress toward the global optimum on such long path problems. They will continuously improve their best found solutions, and be guaranteed to reach the global optimum. Yet we cannot wait for them to arrive. Early experimental results indicate that a genetic algorithm (GA) with crossover alone outperforms hillclimbers on one such long path problem. This suggests that GAs can climb hills faster than hillclimbers by exploiting building blocks when they are present. Although these problems are artificial, they introduce a new dimension of problem difficulty for evolutionary computation. Path length can be added to the ranks of multimodality, deception/misleadingness, noise, variance, etc., as a measure of fitness landscapes and their amenability to evolutionary optimization.
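A hedged reconstruction of the kind of long-path landscape described: the recursive path construction and the off-path fitness assignment below follow the usual long-path recipe but are a reconstruction, not necessarily the authors' exact definition:

```python
def long_path(bits):
    """Recursively build a simple path of bit strings (consecutive entries differ
    in exactly one bit) whose length grows roughly as sqrt(2)**bits; bits must be odd."""
    path = ["0", "1"]
    for _ in range((bits - 1) // 2):
        path = (["00" + s for s in path]
                + ["01" + path[-1]]
                + ["11" + s for s in reversed(path)])
    return path

def fitness(s, index):
    """On-path points are ranked by path position (offset above everything else);
    off-path points are pushed toward the all-zeros path start by counting zeros,
    so the landscape has a single hill."""
    if s in index:
        return len(s) + 1 + index[s]
    return s.count("0")

def next_ascent_hillclimb(s, index, max_steps=10**6):
    steps = 0
    while steps < max_steps:
        current = fitness(s, index)
        for i in range(len(s)):                      # single-bit-flip neighborhood
            t = s[:i] + ("1" if s[i] == "0" else "0") + s[i + 1:]
            if fitness(t, index) > current:
                s, steps = t, steps + 1
                break
        else:
            return s, steps                          # no improving neighbor: optimum
    return s, steps

if __name__ == "__main__":
    bits = 11
    path = long_path(bits)
    index = {s: i for i, s in enumerate(path)}
    start = "0" * bits                               # the path start
    opt, steps = next_ascent_hillclimb(start, index)
    # The hillclimber improves at every step, yet must walk the whole path,
    # whose length grows exponentially in the number of bits.
    print("path length:", len(path), "hillclimbing steps:", steps)
```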

143 citations


Journal ArticleDOI
TL;DR: The basic convergence properties of evolutionary optimization algorithms are investigated and it is indicated that the methods studied will asymptotically converge to global optima and genetic algorithms may prematurely stagnate at solutions that may not even be locally optimal.
Abstract: The basic convergence properties of evolutionary optimization algorithms are investigated. Analysis indicates that the methods studied will asymptotically converge to global optima. The results also indicate that genetic algorithms may prematurely stagnate at solutions that may not even be locally optimal. Function optimization experiments are conducted that illustrate the mathematical properties. Evolutionary programming is seen to outperform genetic algorithms in searching two response surfaces that do not possess local optima. The results are statistically significant.

117 citations


Journal ArticleDOI
TL;DR: The present paper demonstrates the applicability of genetic algorithms, as an optimization tool capable of overcoming combinatorial explosion, to the road‐maintenance planning problem at the network level.
Abstract: The present paper demonstrates the applicability of genetic algorithms, as an optimization tool capable of overcoming combinatorial explosion, to the road‐maintenance planning problem at the network level. Genetic algorithms are search algorithms based upon the principles of Darwinian evolution. The concept of the survival of the fittest is used in a structured, yet randomized, information exchange to form a robust search algorithm. Genetic algorithms efficiently exploit historical information to locate search points with improved performance. The theoretical basis and operations of genetic algorithms are presented. A computer model, PAVENET, formulated on the operating principles of genetic algorithms to serve as an analytical aid for pavement maintenance engineers, is introduced. The formulation of the PAVENET model is described in detail. Analyses are conducted to show the characteristics of important operating parameters of the PAVENET program. These parameters include: (1) Parent pool size; (2) mutat...

111 citations


Journal ArticleDOI
TL;DR: Recursive formulae for the GACS in the general infinite population case are derived and their validity is rigorously proven and it is shown how the increment of the population mean is driven by its own diversity and follows a modified Newton's search.
Abstract: This paper aims at establishing fundamental theoretical properties for a class of "genetic algorithms" in continuous space (GACS). The algorithms employ operators such as selection, crossover, and mutation in the framework of a multidimensional Euclidean space. The paper is divided into two parts. The first part concentrates on the basic properties associated with the selection and mutation operators. Recursive formulae for the GACS in the general infinite population case are derived and their validity is rigorously proven. A convergence analysis is presented for the classical case of a quadratic cost function. It is shown how the increment of the population mean is driven by its own diversity and follows a modified Newton's search. Sufficient conditions for monotonic increase of the population mean fitness are derived for a more general class of fitness functions satisfying a Lipschitz condition. The diversification role of the crossover operator is analyzed in Part II. The treatment adds much light to the understanding of the underlying mechanism of evolution-like algorithms.
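A minimal, hedged sketch of a continuous-space population update of the kind analyzed, selection plus Gaussian mutation on a quadratic cost, with the population mean tracked over generations (the proportional weighting used for selection is an illustrative choice, not the paper's exact operator definition):

```python
import random

def quadratic_cost(x):
    return sum(v * v for v in x)            # minimum at the origin

def step(pop, sigma=0.05):
    """One generation: fitness-weighted resampling (selection) + Gaussian mutation."""
    weights = [1.0 / (1.0 + quadratic_cost(x)) for x in pop]
    parents = random.choices(pop, weights=weights, k=len(pop))
    return [[v + random.gauss(0.0, sigma) for v in p] for p in parents]

if __name__ == "__main__":
    dims, n = 3, 200
    pop = [[random.uniform(-2, 2) for _ in range(dims)] for _ in range(n)]
    for g in range(30):
        pop = step(pop)
        mean = [sum(x[i] for x in pop) / n for i in range(dims)]
        if g % 10 == 0:
            # The population mean drifts toward the optimum at a rate tied to
            # the population's own diversity, echoing the paper's analysis.
            print(g, [round(m, 3) for m in mean])
```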

106 citations


Journal ArticleDOI
TL;DR: The process of Darwinian evolution by natural selection was inoculated into four artificial worlds for a comparative study of the rates, degrees and patterns of evolutionary optimizations, showing that many features of the evolutionary process are sensitive to the structure of the underlying genetic language.

01 Jan 1994
TL;DR: An analysis is given of a model of genetic programming dynamics that is supportive of the “Soft Brood Selection” conjecture, which was proposed as a means to counteract the emergence of highly conservative code, and instead favor highly evolvable code.
Abstract: Evolutionary computation systems exhibit various emergent phenomena, primary of which is adaptation. In genetic programming, because of the indeterminate nature of the representation, the evolution of both recombination distributions and representations can emerge from the population dynamics. A review of ideas on these phenomena is presented, including theory on the evolution of evolvability through differential proliferation of subexpressions within programs. An analysis is given of a model of genetic programming dynamics that is supportive of the “Soft Brood Selection” conjecture, which was proposed as a means to counteract the emergence of highly conservative code, and instead favor highly evolvable code.

Journal ArticleDOI
TL;DR: A construction of a new hybrid optimization system, Genocop II, is discussed and its experimental results on a few test cases (nonlinear programming problems) are presented.

Proceedings ArticleDOI
27 Jun 1994
TL;DR: Two approaches to solving the general timetable problem using evolutionary algorithms are described, which allow not only the production of feasible timetables but also the evolution of timetables that are 'good' with respect to some user-specified evaluation function.
Abstract: The general timetable problem, which involves the placing of events requiring limited resources into timeslots, has been approached in many different ways. This paper describes two approaches to solving the problem using evolutionary algorithms. The methods allow not only the production of feasible timetables but also the evolution of timetables that are 'good' with respect to some user-specified evaluation function. A major concern of any approach to the timetable problem is the large proportion of timetables in a search space where some resource is not available for some event. These timetables are said to be infeasible. The methods described transform the search space into one in which the proportion of feasible solutions is greatly increased. This new search space is then searched by an evolutionary algorithm. The chromosomes used are encoded instructions on how to build a timetable in a way that leads to the above-mentioned search space transformation. "Lamarckism", which allows information gained through interpretation of the chromosomes to be written back into the chromosomes, is also used. Test results, working with real-world timetable requirements (for a university department's timetable), show a very fast evolution to a population of chromosomes which build feasible timetables, and subsequently the evolution of chromosomes which build timetables which are optimal or nearly optimal.
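One plausible way to realize the 'chromosome as build instructions' idea is a greedy decoder over a priority ordering of events; the sketch below is illustrative only, and its events, slots, and decoding rules are assumptions rather than the authors' actual encoding:

```python
import random

# Hypothetical problem data: a handful of events and timeslots with limited capacity.
EVENTS = ["maths", "physics", "chemistry", "biology", "cs"]
SLOTS = ["mon-am", "mon-pm", "tue-am", "tue-pm"]
SLOT_CAPACITY = {slot: 2 for slot in SLOTS}          # at most two events per slot

def decode(chromosome):
    """Chromosome = priority order of events. The decoder greedily places each
    event into the first slot with remaining capacity, so every chromosome maps
    to a capacity-feasible timetable (the search-space transformation idea)."""
    load = {slot: 0 for slot in SLOTS}
    timetable = {}
    for event_idx in chromosome:
        event = EVENTS[event_idx]
        for slot in SLOTS:
            if load[slot] < SLOT_CAPACITY[slot]:
                timetable[event] = slot
                load[slot] += 1
                break
    return timetable

def evaluate(timetable):
    """Toy user-specified preference: penalize afternoon slots."""
    return -sum(1 for slot in timetable.values() if slot.endswith("pm"))

if __name__ == "__main__":
    chromosome = list(range(len(EVENTS)))
    random.shuffle(chromosome)                       # a candidate priority ordering
    tt = decode(chromosome)
    print(tt, "score:", evaluate(tt))
```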

Proceedings ArticleDOI
27 Jun 1994
TL;DR: The paper offers sufficient conditions to prove global convergence of non-elitist evolutionary algorithms and if these conditions can be applied they yield bounds of the convergence rate as a by-product.
Abstract: The paper offers sufficient conditions to prove global convergence of non-elitist evolutionary algorithms. If these conditions can be applied they yield bounds on the convergence rate as a by-product. This is demonstrated by an example that can be calculated exactly.

Journal ArticleDOI
TL;DR: This work proposes to decompose the overall task to fit in the behavior-based control architecture, and then to evolve the separate behavior modules and arbitrators using an evolutionary approach, so the job of defining fitness functions becomes more straightforward and the tasks easier to achieve.

Book ChapterDOI
09 Oct 1994
TL;DR: A parallel two-level evolutionary algorithm which evolves genetic algorithms of maximum convergence velocity is presented, which combines principles of evolution strategies and genetic algorithms in order to optimize continuous and discrete parameters of the genetic algorithms at the same time.
Abstract: A parallel two-level evolutionary algorithm which evolves genetic algorithms of maximum convergence velocity is presented. The meta-algorithm combines principles of evolution strategies and genetic algorithms in order to optimize continuous and discrete parameters of the genetic algorithms at the same time (mixed-integer optimization).

Book ChapterDOI
27 Jun 1994
TL;DR: A fuzzy evolutionary algorithm (FEA) is presented by systematically integrating fuzzy expert systems with evolutionary algorithms, and an example shows that a trajectory of a 7-degree-of-freedom robot can be automatically generated using the proposed FEA.
Abstract: A fuzzy evolutionary algorithm (FEA) is presented in this paper by systematically integrating fuzzy expert systems with evolutionary algorithms. Both computer experiments and applications demonstrate that fuzzy evolutionary algorithms can generally search for optimal solutions faster and more effectively than standard genetic algorithms. As a specific application, an FEA is applied to automatic robot trajectory generation without using inverse kinematics. An example is given to show that a trajectory of a 7-degree-of-freedom robot can be automatically generated using the proposed FEA.
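The rule base is not spelled out in the abstract, so the following is only a hedged sketch of the general idea of coupling fuzzy rules to an evolutionary algorithm: fuzzy memberships of a population-diversity measure adapt the mutation step size (the rules and thresholds are invented for illustration):

```python
import random
import statistics

def diversity(pop):
    """Simple diversity measure: mean standard deviation across dimensions."""
    dims = len(pop[0])
    return statistics.mean(statistics.pstdev(x[i] for x in pop) for i in range(dims))

def fuzzy_mutation_scale(div, low=0.05, high=0.5):
    """Two fuzzy sets, 'low diversity' and 'high diversity', with linear memberships
    on [low, high]. Low diversity triggers a larger mutation step to restore
    exploration; high diversity allows a smaller, exploitative step."""
    mu_low = max(0.0, min(1.0, (high - div) / (high - low)))
    mu_high = 1.0 - mu_low
    return mu_low * 0.3 + mu_high * 0.02              # weighted-average defuzzification

def objective(x):
    return sum(v * v for v in x)                      # toy minimization target

if __name__ == "__main__":
    pop = [[random.uniform(-3, 3) for _ in range(4)] for _ in range(40)]
    for _ in range(50):
        sigma = fuzzy_mutation_scale(diversity(pop))
        kids = [[v + random.gauss(0, sigma) for v in p] for p in pop]
        pop = sorted(pop + kids, key=objective)[:40]
    print("best:", round(objective(pop[0]), 6), "final sigma:", round(sigma, 3))
```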

Journal ArticleDOI
TL;DR: Results presented show the success of evolutionary programming in solving an example of a fractal inverse problem, but indicate that a genetic algorithm is not as successful.
Abstract: Over the past 30 years, algorithms that model natural evolution have generated robust search methods. These so-called evolutionary algorithms have been successfully applied to a wide range of problems. This paper discusses two types of evolutionary algorithms and their application to a problem in shape representation. Genetic algorithms and evolutionary programming, although both based on evolutionary principles, each place different emphasis on what drives the evolutionary process. While genetic algorithms rely on mimicking specific genotypic transformations, evolutionary programming emphasizes phenotypic adaptation. Results presented show the success of evolutionary programming in solving an example of a fractal inverse problem, but indicate that a genetic algorithm is not as successful. Reasons for this disparity are discussed.

Proceedings ArticleDOI
27 Jun 1994
TL;DR: An experimental result shows that, through an evolutionary process based on the PGAs, a hardware specification program expands its circuit scale and, as a result, increases its functionality.
Abstract: Production genetic algorithms (PGAs) are proposed to enable grammar structure as well as hardware description language (HDL) programs to evolve, toward an automated hardware design system through an evolutionary process. Evolutionary computation methods make it possible to design hardware that works in unknown and non-stationary environments without explicit design knowledge. In the proposed system, hardware specifications, which produce circuit behaviors, are automatically generated as HDL programs according to a grammar defined as in a rewriting system, and then evolve through the production genetic algorithms (PGAs) also proposed here. The PGAs introduce a new chromosome representation and genetic operators to create self-genesis mechanisms in hardware design similar to those of living systems. An experimental result shows that, through an evolutionary process based on the PGAs, a hardware specification program expands its circuit scale and, as a result, increases its functionality.

Proceedings ArticleDOI
27 Jun 1994
TL;DR: The objective is to bring the systems into balance, and the results indicate the suitability of using EP to evolve neurocontrollers for these two systems.
Abstract: Evolutionary programming (EP) is a stochastic optimization technique that can be used to train neural networks. Unlike many training algorithms, EP does not require gradient information, and this facet increases the applicability of the procedure. The current investigation focuses on evolving neurocontrollers for two difficult nonlinear unstable systems. In the first, two separate poles of varying length are mounted on a cart. In the second, two jointed poles of varying length are mounted on a cart. The objective is to bring the systems into balance. The results indicate the suitability of using EP to evolve neurocontrollers for these two systems.
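The cart-pole dynamics are too long for a short sketch, so the hedged example below shows the core mechanism only: EP-style Gaussian mutation and survival selection applied to a neural network's weight vector, with XOR standing in for the control task:

```python
import math
import random

# XOR stands in for the control task: the point is that the weights are evolved
# by Gaussian mutation and selection, with no gradient information at all.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """2-2-1 feedforward net; w holds 9 weights (hidden and output, incl. biases)."""
    h = [math.tanh(w[0] * x[0] + w[1] * x[1] + w[2]),
         math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])]
    return math.tanh(w[6] * h[0] + w[7] * h[1] + w[8])

def error(w):
    return sum((forward(w, x) - y) ** 2 for x, y in DATA)

if __name__ == "__main__":
    mu = 30
    pop = [[random.gauss(0, 1) for _ in range(9)] for _ in range(mu)]
    for _ in range(300):
        offspring = [[wi + random.gauss(0, 0.3) for wi in w] for w in pop]
        pop = sorted(pop + offspring, key=error)[:mu]   # EP-style (mu + mu) survival
    best = pop[0]
    print("error:", round(error(best), 4))
    print([round(forward(best, x), 2) for x, _ in DATA])
```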

Journal ArticleDOI
TL;DR: The most important features of ESs, namely their self-adaptation, as well as their robustness and potential for parallelization which they share with other evolutionary algorithms, are presented.
Abstract: Evolution strategies (ESs) are a special class of probabilistic, direct, global optimization methods. They are similar to genetic algorithms but work in continuous spaces and have the additional capability of self-adapting their major strategy parameters. This paper presents the most important features of ESs, namely their self-adaptation, as well as their robustness and potential for parallelization which they share with other evolutionary algorithms.
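A hedged, minimal sketch of the self-adaptation mechanism mentioned: each individual carries its own step size, which is mutated log-normally before being used to mutate the object variables, under (μ,λ)-selection on a toy sphere function (the constants are common textbook choices, not taken from this paper):

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def es(mu=5, lam=35, dims=10, generations=200):
    tau = 1.0 / math.sqrt(2.0 * dims)                  # learning rate for the step size
    # Each parent is a pair (object variables, step size).
    parents = [([random.uniform(-5, 5) for _ in range(dims)], 1.0) for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            x, sigma = random.choice(parents)
            # Self-adaptation: mutate the strategy parameter first...
            new_sigma = sigma * math.exp(tau * random.gauss(0, 1))
            # ...then use it to mutate the object variables.
            new_x = [v + new_sigma * random.gauss(0, 1) for v in x]
            offspring.append((new_x, new_sigma))
        # (mu, lambda)-selection: parents are discarded each generation.
        parents = sorted(offspring, key=lambda ind: sphere(ind[0]))[:mu]
    return parents[0]

if __name__ == "__main__":
    best_x, best_sigma = es()
    print("best fitness:", round(sphere(best_x), 6), "sigma:", best_sigma)
```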

Proceedings ArticleDOI
27 Jun 1994
TL;DR: This paper provides a comprehensive and compact overview of hybrid work done in artificial intelligence, and shows the state of the art of combining artificial neural networks and evolutionary algorithms.
Abstract: This paper focuses on the intersection of neural networks and evolutionary computation. It is addressed to researchers from artificial intelligence as well as the neurosciences. It provides a comprehensive and compact overview of hybrid work done in artificial intelligence, and shows the state of the art of combining artificial neural networks and evolutionary algorithms.

Book ChapterDOI
11 Apr 1994
TL;DR: The biological theory of sexual selection is reviewed, along with some possible applications of sexual selection in evolutionary search, optimization, and diversification.
Abstract: Sexual selection through mate choice is a powerful evolutionary process that has been important in the success of sexually-reproducing animals and flowering plants. Over the short term, mate preferences evolve because they improve the outcome of sexual recombination. Over the long term, assortative mate preferences can help maintain genetic diversity, promote speciation, and facilitate evolutionary search through optimal outbreeding; selective mate preferences can reinforce the speed, accuracy, and efficiency of natural selection, can foster the discovery and propagation of evolutionary innovations, and can function as aesthetic selection criteria. These strengths of sexual selection complement those of natural selection, so using both together may prove particularly fruitful in evolutionary computation. This paper reviews the biological theory of sexual selection and some possible applications of sexual selection in evolutionary search, optimization, and diversification. Simulation results are used to illustrate some key points.
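A hedged toy sketch of how a mate preference can be layered on top of ordinary fitness-based survival, here using an assortative (similarity-seeking) preference; the preference rule and parameters are illustrative, not the paper's simulations:

```python
import random

def fitness(x):
    return -sum(v * v for v in x)                      # maximize: optimum at the origin

def similarity(a, b):
    return -sum((u - v) ** 2 for u, v in zip(a, b))    # closer genotypes score higher

def choose_mate(chooser, candidates, n_suitors=5):
    """Mate choice: the chooser examines a few random suitors and picks the most
    similar one (assortative preference), independent of viability selection."""
    suitors = random.sample(candidates, n_suitors)
    return max(suitors, key=lambda c: similarity(chooser, c))

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

if __name__ == "__main__":
    dims, n = 5, 60
    pop = [[random.uniform(-1, 1) for _ in range(dims)] for _ in range(n)]
    for _ in range(100):
        survivors = sorted(pop, key=fitness, reverse=True)[:n // 2]   # natural selection
        children = []
        while len(children) < n:
            mother = random.choice(survivors)
            father = choose_mate(mother, survivors)                   # sexual selection
            child = [v + random.gauss(0, 0.05) for v in crossover(mother, father)]
            children.append(child)
        pop = children
    print("best fitness:", round(fitness(max(pop, key=fitness)), 4))
```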

Proceedings ArticleDOI
27 Jun 1994
TL;DR: Modal mutation schemes for evolutionary algorithms as a generalization of the breeder genetic algorithm mutation scheme are introduced and analyzed for multimodal continuous parameter optimization problems.
Abstract: With this paper, modal mutation schemes for evolutionary algorithms are introduced as a generalization of the breeder genetic algorithm mutation scheme and analyzed for multimodal continuous parameter optimization problems. A new scaling rule for multiple mutations is formalized and compared with a new step-size scaling for evolution strategies. A performance comparison of the multivalued evolutionary algorithm with modal mutations with recently published results concerning the performance of Bayesian/sampling and very fast simulated reannealing techniques for global optimization is given.
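For reference, a hedged sketch of the breeder-genetic-algorithm mutation scheme that the paper generalizes, as it is commonly described: a discrete sum of inverse powers of two scaled by a per-variable mutation range (the constants are conventional defaults, not the paper's):

```python
import random

def bga_mutate(x, lower, upper, precision_bits=16, range_fraction=0.1):
    """Mutate each variable with probability 1/n by +/- range * delta, where
    delta = sum of alpha_i * 2**-i and each alpha_i is 1 with probability
    1/precision_bits. Small steps are therefore far more likely than large
    ones, giving the 'modal' step-size distribution."""
    n = len(x)
    child = list(x)
    for j in range(n):
        if random.random() < 1.0 / n:
            delta = sum(2.0 ** -i for i in range(precision_bits)
                        if random.random() < 1.0 / precision_bits)
            step = range_fraction * (upper[j] - lower[j]) * delta
            child[j] += random.choice((-1.0, 1.0)) * step
            child[j] = min(max(child[j], lower[j]), upper[j])   # keep within bounds
    return child

if __name__ == "__main__":
    lo, hi = [-5.0] * 4, [5.0] * 4
    parent = [random.uniform(-5, 5) for _ in range(4)]
    print(parent)
    print(bga_mutate(parent, lo, hi))
```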

Proceedings ArticleDOI
27 Jun 1994
TL;DR: The results indicate that the GESA technique yields optimal or near-optimal solutions, superior to a version of simulated evolution and a versions of parallel simulated annealing.
Abstract: In this paper, we present a regionally guided approach to function optimization. The proposed technique is called "Guided Evolutionary Simulated Annealing" (GESA). It combines simulated annealing and simulated evolution in a novel way. The technique has a mechanism by which the search focuses on more "promising" areas. The solution is evolved under regional guidance. The characteristics of the proposed technique are given. We illustrate the technique with two examples. The results of both examples indicate that the GESA technique yields optimal or near-optimal solutions, superior to a version of simulated evolution and a version of parallel simulated annealing.

Book ChapterDOI
09 Oct 1994
TL;DR: This paper surveys structured population models, explains and motivates the benefits of generic systems such as RPL2, and describes the suite of applications that have used RPL2 to date.
Abstract: The Reproductive Plan Language RPL2 is an extensible, interpreted language for writing and using evolutionary computing programs. It supports arbitrary genetic representations, all structured population models described in the literature together with further hybrids, and runs on parallel or serial hardware while hiding parallelism from the user. This paper surveys structured population models, explains and motivates the benefits of generic systems such as RPL2 and describes the suite of applications that have used RPL2 to date.

Proceedings ArticleDOI
David Andre1
27 Jun 1994
TL;DR: The results indicate that the approach can evolve programs that store simple representations of their environments and use these representations to produce simple plans.
Abstract: An essential component of an intelligent agent is the ability to observe, encode, and use information about its environment. Traditional approaches to genetic programming have focused on evolving functional or reactive programs with only a minimal use of state. This paper presents an approach for investigating the evolution of learning, planning, and memory using genetic programming. The approach uses a multi-phasic fitness environment that enforces the use of memory and allows fairly straightforward comprehension of the evolved representations. An illustrative problem of 'gold' collection is used to demonstrate the usefulness of the approach. The results indicate that the approach can evolve programs that store simple representations of their environments and use these representations to produce simple plans.

Proceedings ArticleDOI
27 Jun 1994
TL;DR: Results indicate that MASK greatly outperforms GAs in the sense that MASK manages to deal with harder SAT instances at a lower cost.
Abstract: The paper compares two evolutionary methods for model finding in the satisfiability problem (SAT): genetic algorithms (GAs) and the mask method (MASK). The main characteristics of these two methods are that both of them are population-based and use a binary representation. Great care is taken to make sure that the same SAT instances and the same criteria are used in the comparison. Results indicate that MASK greatly outperforms GAs in the sense that MASK manages to deal with harder SAT instances at a lower cost.
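The ingredients the two methods share, a population of binary assignments scored by the number of satisfied clauses, can be sketched as follows; MASK's specific update rule is not described in the abstract, so only a GA-style setup on a toy formula is shown:

```python
import random

# A CNF formula as a list of clauses; each literal is a 1-based variable index,
# negative for negation. This small 3-SAT instance is a toy example.
FORMULA = [(1, -2, 3), (-1, 2, -3), (2, 3, -4), (-2, -3, 4), (1, -3, 4)]
NUM_VARS = 4

def satisfied_clauses(assignment):
    """Fitness: number of clauses satisfied by a binary assignment."""
    count = 0
    for clause in FORMULA:
        if any((assignment[abs(lit) - 1] == 1) == (lit > 0) for lit in clause):
            count += 1
    return count

def mutate(assignment, rate=0.1):
    return [1 - b if random.random() < rate else b for b in assignment]

if __name__ == "__main__":
    pop = [[random.randint(0, 1) for _ in range(NUM_VARS)] for _ in range(20)]
    for _ in range(50):
        pop = sorted(pop, key=satisfied_clauses, reverse=True)
        parents = pop[:10]                                   # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in range(10)]
    best = max(pop, key=satisfied_clauses)
    print(best, satisfied_clauses(best), "of", len(FORMULA), "clauses satisfied")
```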

Journal ArticleDOI
TL;DR: The effectiveness, robustness, and fast convergence of modified genetic algorithms are demonstrated through the results of several examples, and genetic algorithms are more capable of locating the global optimum.
Abstract: This paper presents the applications of genetic algorithms to nonlinear constrained mixed-discrete optimization problems that occur in engineering design. Genetic algorithms are heuristic combinatorial optimization strategies. Several strategies are adopted to enhance the search efficiency and reduce the computational cost. The effectiveness, robustness, and fast convergence of modified genetic algorithms are demonstrated through the results of several examples. Moreover, genetic algorithms are more capable of locating the global optimum.
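One common strategy for this problem class, not necessarily the one adopted in the paper, is to encode discrete choices as indices alongside continuous genes and to fold constraint violations into the fitness through a penalty term; a hedged sketch on an invented design problem:

```python
import random

# Hypothetical design problem: choose a standard section size (discrete) and a
# length (continuous) to minimize cost subject to a strength constraint.
SECTIONS = [1.0, 1.5, 2.0, 2.5, 3.0]                 # available discrete sizes

def cost(section, length):
    return section * length                          # toy objective to minimize

def constraint_violation(section, length):
    strength = section ** 2 * 10.0 / length          # toy strength model
    return max(0.0, 5.0 - strength)                  # required strength: 5.0

def penalized_fitness(ind, penalty=100.0):
    """Constraint handling via a static penalty added to the objective."""
    section = SECTIONS[ind["section_idx"]]
    return cost(section, ind["length"]) + penalty * constraint_violation(section, ind["length"])

def mutate(ind):
    child = dict(ind)
    if random.random() < 0.3:                        # step to a neighboring discrete value
        child["section_idx"] = min(max(child["section_idx"] + random.choice((-1, 1)), 0),
                                   len(SECTIONS) - 1)
    child["length"] = min(max(child["length"] + random.gauss(0, 0.2), 0.5), 10.0)
    return child

if __name__ == "__main__":
    pop = [{"section_idx": random.randrange(len(SECTIONS)),
            "length": random.uniform(0.5, 10.0)} for _ in range(30)]
    for _ in range(200):
        pop = sorted(pop, key=penalized_fitness)[:15]          # truncation selection
        pop += [mutate(random.choice(pop)) for _ in range(15)]
    best = min(pop, key=penalized_fitness)
    print(best, round(penalized_fitness(best), 3))
```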