
Showing papers on "Evolutionary computation published in 1998"


Proceedings ArticleDOI
04 May 1998
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, an algorithm that resembles a school of flying birds in which each particle adjusts its flight according to its own flying experience and its companions' flying experience.
Abstract: Evolutionary computation techniques such as genetic algorithms, evolution strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions, is manipulated according to the rule of survival of the fittest through "genetic" operations such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm by simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, the individuals are "evolved" by cooperation and competition among themselves through the generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.
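
To make the inertia-weight idea concrete, here is a minimal Python sketch of a particle swarm with an inertia weight w scaling the previous velocity; the sphere test function, the search range and the values of w, c1 and c2 are illustrative assumptions, not the settings used in the paper.

```python
import random

def pso_sphere(dim=10, n_particles=20, iters=200, w=0.7, c1=2.0, c2=2.0):
    """Minimal particle swarm with an inertia weight (illustrative sketch)."""
    f = lambda x: sum(v * v for v in x)                      # sphere test function (minimize)
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                                # personal best positions
    gbest = min(pbest, key=f)                                # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia weight w scales the velocity carried over from the last step
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(pbest[i]):
                pbest[i] = X[i][:]
        gbest = min(pbest, key=f)
    return gbest, f(gbest)
```

A value of w close to 1 preserves momentum and favours exploration, while a smaller w pulls the swarm toward the best positions found so far and favours exploitation.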

9,373 citations


Book ChapterDOI
TL;DR: This paper compares two evolutionary computation paradigms: genetic algorithms and particle swarm optimization, and suggests ways in which performance might be improved by incorporating features from one paradigm into the other.
Abstract: This paper compares two evolutionary computation paradigms: genetic algorithms and particle swarm optimization. The operators of each paradigm are reviewed, focusing on how each affects search behavior in the problem space. The goals of the paper are to provide additional insights into how each paradigm works, and to suggest ways in which performance might be improved by incorporating features from one paradigm into the other.

1,661 citations


Journal ArticleDOI
TL;DR: Different models of genetic operators and some mechanisms available for studying the behaviour of this type of genetic algorithm are reviewed and compared.
Abstract: Genetic algorithms play a significant role as search techniques for handling complex spaces in many fields, such as artificial intelligence, engineering and robotics. Genetic algorithms are based on the underlying genetic process in biological organisms and on the natural evolution principles of populations. These algorithms process a population of chromosomes, which represent search space solutions, with three operations: selection, crossover and mutation. In their initial formulation, the search space solutions are coded using the binary alphabet. However, the good properties associated with these algorithms do not stem from the use of this alphabet; other coding types have been considered for the representation issue, such as real coding, which seems particularly natural when tackling optimization problems of parameters with variables in continuous domains. In this paper we review the features of real-coded genetic algorithms. Different models of genetic operators and some mechanisms available for studying the behaviour of this type of genetic algorithm are reviewed and compared.
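
As one concrete example of the kind of real-coded recombination operator such a review covers, the sketch below implements BLX-α (blend) crossover on real-valued chromosomes; the choice of operator, the α value and the variable bounds are illustrative assumptions rather than a summary of the paper's comparison.

```python
import random

def blx_alpha(parent1, parent2, alpha=0.5, bounds=(-5.0, 5.0)):
    """BLX-alpha crossover for real-coded GAs: each child gene is sampled
    uniformly from an interval that extends the parents' range by a factor alpha."""
    child = []
    lo, hi = bounds
    for x, y in zip(parent1, parent2):
        cmin, cmax = min(x, y), max(x, y)
        span = cmax - cmin
        gene = random.uniform(cmin - alpha * span, cmax + alpha * span)
        child.append(min(max(gene, lo), hi))   # clip to the variable's domain
    return child
```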

1,190 citations


Book ChapterDOI
TL;DR: This paper investigates the philosophical and performance differences of particle swarm and evolutionary optimization by comparison experiments involving four non-linear functions well studied in the evolutionary optimization literature.
Abstract: This paper investigates the philosophical and performance differences of particle swarm and evolutionary optimization. The methods of processing employed in each technique are first reviewed, followed by a summary of their philosophical differences. Comparison experiments involving four non-linear functions well studied in the evolutionary optimization literature are used to highlight some performance differences between the techniques.

1,163 citations


Proceedings ArticleDOI
04 May 1998
TL;DR: A hybrid based on the particle swarm algorithm, with the addition of a standard selection mechanism from evolutionary computation, is described; a comparison with the ordinary particle swarm shows selection to provide an advantage for some (but not all) complex functions.
Abstract: This paper describes an evolutionary optimization algorithm that is a hybrid based on the particle swarm algorithm but with the addition of a standard selection mechanism from evolutionary computations. A comparison is performed between the hybrid swarm and the ordinary particle swarm that shows selection to provide an advantage for some (but not all) complex functions.
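
A hedged sketch of how a standard selection step can be grafted onto a particle swarm is shown below; the tournament-style scoring and the "replace the worse half with the better half" policy are plausible illustrative choices and may differ in detail from the hybrid described in the paper.

```python
import random

def apply_selection(swarm, velocities, fitness, k=2):
    """Selection step grafted onto a particle swarm (sketch, minimization assumed):
    each particle is scored by how many of k random opponents it beats; the worse
    half of the swarm is replaced by the positions and velocities of the better
    half, while personal-best memories are left untouched by this step."""
    n = len(swarm)
    scores = []
    for i in range(n):
        opponents = random.sample(range(n), k)
        scores.append(sum(fitness[i] <= fitness[j] for j in opponents))
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    best_half, worst_half = order[: n // 2], order[n // 2:]
    for src, dst in zip(best_half, worst_half):
        swarm[dst] = swarm[src][:]             # copy position
        velocities[dst] = velocities[src][:]   # copy velocity
    return swarm, velocities
```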

897 citations


Journal ArticleDOI
TL;DR: In this article, the authors review various strategies of sharing, propose new recombination schemes to improve its efficiency, and compare the sharing method with other niching techniques for both high and limited numbers of fitness function evaluations.
Abstract: Interest in multimodal function optimization is expanding rapidly since real-world optimization problems often require the location of multiple optima in the search space. In this context, fitness sharing has been used widely to maintain population diversity and permit the investigation of many peaks in the feasible domain. This paper reviews various strategies of sharing and proposes new recombination schemes to improve its efficiency. Some empirical results are presented for both high and limited numbers of fitness function evaluations. Finally, the study compares the sharing method with other niching techniques.
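
For reference, the classic fitness-sharing computation works roughly as sketched below: each individual's raw fitness is divided by a niche count built from a triangular sharing function. The niche radius sigma_share and exponent alpha are illustrative parameters, and the paper's recombination schemes are not reproduced here.

```python
def shared_fitness(population, raw_fitness, distance, sigma_share=0.1, alpha=1.0):
    """Classic fitness sharing (maximization assumed): divide each raw fitness by
    the niche count sum_j sh(d_ij), where sh(d) = 1 - (d / sigma_share)**alpha
    for d < sigma_share and 0 otherwise."""
    shared = []
    for i, ind_i in enumerate(population):
        niche_count = 0.0
        for ind_j in population:
            d = distance(ind_i, ind_j)
            if d < sigma_share:
                niche_count += 1.0 - (d / sigma_share) ** alpha
        shared.append(raw_fitness[i] / niche_count)   # niche_count >= 1 (self-distance is 0)
    return shared
```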

493 citations


Journal ArticleDOI
R. Salomon1
TL;DR: To what extent a hybrid method, the evolutionary-gradient-search procedure, can be used beneficially in the field of continuous parameter optimization is explored.
Abstract: Classical gradient methods and evolutionary algorithms represent two very different classes of optimization techniques that seem to have very different properties. This paper discusses some aspects of some "obvious" differences and explores to what extent a hybrid method, the evolutionary-gradient-search procedure, can be used beneficially in the field of continuous parameter optimization. Simulation experiments show that on some test functions, the hybrid method yields faster convergence than pure evolution strategies, but that on other test functions, the procedure exhibits the same deficiencies as steepest-descent methods.
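
A rough sketch of an evolutionary-gradient-search-style step is given below: a descent direction is estimated from random trial points and the step length is adapted by testing a larger and a smaller value. This is a simplified reading for illustration only, not the exact procedure from the paper.

```python
import math
import random

def egs_step(x, f, sigma, n_trials=10):
    """One step of an evolutionary-gradient-search-style procedure (sketch,
    minimization): estimate a gradient direction from random Gaussian trial
    points around x, move against it, and test two step sizes."""
    dim = len(x)
    fx = f(x)
    grad_est = [0.0] * dim
    for _ in range(n_trials):
        z = [random.gauss(0.0, sigma) for _ in range(dim)]
        delta = f([xi + zi for xi, zi in zip(x, z)]) - fx
        for d in range(dim):
            grad_est[d] += delta * z[d]          # weight each direction by its fitness change
    norm = math.sqrt(sum(g * g for g in grad_est)) or 1.0
    unit = [-g / norm for g in grad_est]         # descend for minimization
    candidates = []
    for factor in (1.5, 1.0 / 1.5):              # crude self-adaptation: expand and shrink sigma
        s = sigma * factor
        y = [xi + s * ui for xi, ui in zip(x, unit)]
        candidates.append((f(y), y, s))
    fy, y, new_sigma = min(candidates, key=lambda c: c[0])
    return (y, new_sigma) if fy < fx else (x, sigma / 1.5)
```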

283 citations


Journal ArticleDOI
01 Feb 1998
TL;DR: In this paper, the authors provide an overview and a list of references on the use of evolutionary algorithms in power systems and related fields, and present two applications of EA for two different problems in Power Systems.
Abstract: This paper provides an overview and a list of references on the use of Evolutionary Algorithms (EA) in Power Systems and related fields. As didactic examples, the paper presents two applications of EA for two different problems in Power Systems.

206 citations


Journal ArticleDOI
TL;DR: This survey attempts to summarize the results regarding the limit and finite-time behavior of evolutionary algorithms with finite search spaces and discrete time scale.
Abstract: The theory of evolutionary computation has been enhanced rapidly during the last decade. This survey attempts to summarize the results regarding the limit and finite-time behavior of evolutionary algorithms with finite search spaces and discrete time scale. Results on evolutionary algorithms beyond finite spaces and discrete time are also presented, but with reduced elaboration.

193 citations


BookDOI
01 Nov 1998
TL;DR: Fusion of Neural Networks, Fuzzy Systems and Genetic Algorithms covers the spectrum of applications - comprehensively demonstrating the advantages of fusion techniques in industrial applications.
Abstract: From the Publisher: Fusion of Neural Networks, Fuzzy Systems and Genetic Algorithms integrates neural networks, fuzzy systems, and evolutionary computing in system design that enables its readers to handle complexity - offsetting the demerits of one paradigm by the merits of another. This book presents specific projects where fusion techniques have been applied. The chapters start with the design of a new fuzzy-neural controller. Remaining chapters discuss the application of expert systems, neural networks, fuzzy control, and evolutionary computing techniques in modern engineering systems. Fusion of Neural Networks, Fuzzy Systems and Genetic Algorithms covers the spectrum of applications - comprehensively demonstrating the advantages of fusion techniques in industrial applications.

184 citations


Book
04 Dec 1998
TL;DR: This work focuses on the development of systems for on-line Adaptive Decision Making and Control for Evolutionary Programming, as well as aspects of Evolutionary Design by Computers.
Abstract:
1: Keynote Papers.- The NIST Design Repository Project.- Evolving Connectionist and Fuzzy-Connectionist Systems for On-line Adaptive Decision Making and Control.- Recent New Development in Evolutionary Programming.- Emotional Image Retrieval with Interactive Evolutionary Computation.
2: Design Support Systems.- Using Genetic Algorithms to Encourage Engineering Design Creativity.- Abduction Problem in Probabilistic Constraint Logic Programming.- Aspects of Evolutionary Design by Computers.- Surface Optimisation within the CAD/CAM Environment using Genetic Algorithms.
3: Intelligent Control.- Adaptive Sugeno Fuzzy Control: A Case Study.- An Experimental and Comparative Study of Fuzzy PID Control Structures.- An Accurate COG Defuzzifier Design Using the Coadaptation of Learning and Evolution.- A Multiagent Intelligent Control System for Glass Industry.- Predictive Control Using Fuzzy Models.- Evolutionary Design of a Helicopter Autopilot.- Decomposition of a Fuzzy Controller Based on Inference Break-up Method.
4: Identification and Modelling.- Experimental Evaluation of Intelligent Identification Algorithms Applied to a Wind Tunnel Process.- Improvement of Membership Function Identification Method in Usability and Precision.- General Parameter Radial Basis Function Neural Network Based Adaptive Fuzzy Systems.- Uneven Division of Input Spaces for Hierarchical Fuzzy Modeling.- Ensembles of Evolutionary created Artificial Neural Networks and Nearest Neighbour Classifiers.
5: Data Mining.- Application of Multi-dimensional Fuzzy Analysis to Decision Making.- Information-Theoretic Fuzzy Approach to Knowledge Discovery in Databases.- Intelligent Electronic Catalogs for Sales Support - Introducing Case-Based Reasoning Techniques to on-line Product Selection Applications.- A Genetic Algorithm for Generalized Rule Induction.
6: Optimisation.- Multiobjective Optimization by Nessy Algorithm.- The Scout Algorithm applied to the Maximum Clique Problem.- Unconstrained Optimization Using Genetic Box Search.- Improvement of Simple Genetic Algorithm for Solving the Uncapacitated Warehouse Location Problem.- Optimizing Neural Networks for Time Series Prediction.
7: Optimisation for Industrial Applications.- Maximum Entropy Image Restoration by Evolutionary Algorithm.- The Finite Element Method and Soft Computing.- A Tabu Search Approach for the Tool Assignment and Machine Loading Problem in Flexible Manufacturing Systems.- Investigating Evolutionary Optimisation of Constrained Functions to Capture Shape Descriptions from Range Data.- Optimal Selection of Pressure Vessels.
8: New Topics in EA Basics.- Simulation of Baldwin Effect and Dawkins Memes by Genetic Algorithm.- Approach to Structure Synthesis by Genetic Algorithms.- A Study of Altruism by Genetic Algorithm.- The Bivariate Marginal Distribution Algorithm.
9: New Frontier for Soft Computing.- Granular Computing using Neighborhood Systems.- Toward Fuzziness in Natural Language Processing.- A New Approach to Acquisition of Comprehensible Fuzzy Rules.- Zero-Point Probability for Linear Source Separation.- Code Optimization for DNA Computing of Maximal Cliques.
10: Summary of Tutorials.- On Line Tutorials on Evolutionary Computing.- Fuzzy Control Tutorial.
11: Summary of Discussion.
Keyword Index.- List of Reviewers.

Proceedings ArticleDOI
04 May 1998
TL;DR: It is shown that results known from the theory of evolutionary algorithms in case of single-criterion optimization do not carry over to the multi-criteria case, and a theoretical analysis shows that a special version of an evolutionary algorithm with this step size rule converges with probability one to the Pareto set for the test problem under consideration.
Abstract: Although there are many versions of evolutionary algorithms that are tailored to multi-criteria optimization, theoretical results are apparently not yet available. In this paper, it is shown that results known from the theory of evolutionary algorithms in the case of single-criterion optimization do not carry over to the multi-criterion case. First, three different step size rules are investigated numerically for a selected problem with two conflicting objectives. The empirical results obtained by these experiments lead to the observation that only one of these step size rules may have the property to ensure convergence to the Pareto set. A theoretical analysis finally shows that a special version of an evolutionary algorithm with this step size rule converges with probability one to the Pareto set for the test problem under consideration.

Book ChapterDOI
01 Jan 1998
TL;DR: This chapter contains sections titled: References; An Introduction to Simulated Evolutionary Optimization; Evolutionary Computation: Comments on the History and Current State.
Abstract: This chapter contains sections titled: References; An Introduction to Simulated Evolutionary Optimization; Evolutionary Computation: Comments on the History and Current State.

Journal ArticleDOI
TL;DR: A new type of artificial neural network (GasNets) is introduced, and it is shown that evolutionary computing techniques can be used to find robot controllers based on them; GasNets consistently achieved evolutionary success in far fewer evaluations than were needed when using more conventional connectionist-style networks.
Abstract: This paper introduces a new type of artificial neural network (GasNets) and shows that it is possible to use evolutionary computing techniques to find robot controllers based on them. The controllers are built from networks inspired by the modulatory effects of freely diffusing gases, especially nitric oxide, in real neuronal networks. Evolutionary robotics techniques were used to develop control networks and visual morphologies to enable a robot to achieve a target discrimination task under very noisy lighting conditions. A series of evolutionary runs with and without the gas modulation active demonstrated that networks incorporating modulation by diffusing gases evolved to produce successful controllers considerably faster than networks without this mechanism. GasNets also consistently achieved evolutionary success in far fewer evaluations than were needed when using more conventional connectionist style networks.

Journal ArticleDOI
TL;DR: Experiments using evolutionary testing on a number of programs with up to 1511 LOC and 5000 input parameters have successfully identified new longer and shorter execution times than had been found using other testing techniques, and evolutionary testing seems to be a promising approach for the verification of timing constraints.
Abstract: Many industrial products are based on the use of embedded computer systems. Usually, these systems have to fulfil real-time requirements, and correct system functionality depends on their logical correctness as well as on their temporal correctness. In order to verify the temporal behavior of real-time systems, previous scientific work has, to a large extent, concentrated on static analysis techniques. Although these techniques offer the possibility of providing safe estimates of temporal behavior for certain cases, there are a number of cases in practice for which static analysis cannot be easily applied. Furthermore, no commercial tools for timing analysis of real-world programs are available. Therefore, the developed systems have to be thoroughly tested in order to detect existing deficiencies in temporal behavior, as well as to strengthen the confidence in temporal correctness. An investigation of existing test methods shows that they mostly concentrate on testing for logical correctness. They are not specialised in the examination of temporal correctness, which is also essential to real-time systems. For this reason, existing test procedures must be supplemented by new methods which concentrate on determining whether the system violates its specified timing constraints. Normally, a violation means that outputs are produced too early, or their computation takes too long. The task of the tester therefore is to find the input situations with the longest or shortest execution times, in order to check whether they produce a temporal error. If the search for such inputs is interpreted as a problem of optimization, evolutionary computation can be used to automatically find the inputs with the longest or shortest execution times. This automatic search for accurate test data by means of evolutionary computation is called evolutionary testing. Experiments using evolutionary testing on a number of programs with up to 1511 LOC and 5000 input parameters have successfully identified new longer and shorter execution times than had been found using other testing techniques. Evolutionary testing, therefore, seems to be a promising approach for the verification of timing constraints. A combination of evolutionary testing and systematic testing offers further opportunities to improve the test quality, and could lead to an effective test strategy for real-time systems.
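
In outline, evolutionary testing treats the search for long (or short) execution times as an optimization problem; the sketch below evolves input vectors to maximize a measured runtime. The function name, parameter ranges and use of time.perf_counter are illustrative assumptions; a real harness would use repeated, hardware-level timing measurements on the target system.

```python
import random
import time

def evolutionary_timing_test(func, n_params, lo, hi,
                             pop_size=30, generations=50, mut_rate=0.1):
    """Evolve input vectors that maximize the measured execution time of `func`
    (illustrative sketch of evolutionary testing; assumes n_params >= 2)."""
    def measure(ind):
        start = time.perf_counter()
        func(ind)
        return time.perf_counter() - start

    pop = [[random.uniform(lo, hi) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=measure, reverse=True)     # longest runtime first
        parents = scored[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)
            child = a[:cut] + b[cut:]                        # one-point crossover
            child = [g + random.gauss(0, (hi - lo) * 0.05) if random.random() < mut_rate else g
                     for g in child]
            children.append([min(max(g, lo), hi) for g in child])
        pop = parents + children
    return max(pop, key=measure)                             # worst-case input found
```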

Book ChapterDOI
27 Sep 1998
TL;DR: This paper presents a new approach to function optimisation using a new variant of GAs, called the Stud GA, in which, instead of stochastic selection, the fittest individual, the Stud, shares its genetic information with all others using simple GA operators.
Abstract: This paper presents a new approach to function optimisation using a new variant of GAs. This algorithm is called the Stud GA. Instead of stochastic selection, the fittest individual, the Stud, shares its genetic information with all others using simple GA operators. The standard Gray coding is maintained. Simple techniques are added to maintain diversity of the population and help achieve the global optimum in difficult multimodal search spaces. The benefits of this approach are improved performance in terms of accuracy, efficiency and reliability. This approach appears to be able to deal with a wide array of functions and to give consistent repeatability of optimisation performance. A variety of test functions is used to illustrate this approach. Results presented suggest a viable and attractive addition to the portfolio of evolutionary computing techniques.
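
A minimal sketch of one generation of a Stud-GA-style scheme on binary (e.g. Gray-coded) chromosomes is shown below; the one-point crossover, the bit-flip mutation rate and the elitist copy of the stud are illustrative choices and not necessarily the exact operators of the paper.

```python
import random

def stud_ga_generation(pop, fitness, mut_prob=0.01):
    """One generation of a Stud-GA-style scheme (sketch): instead of stochastic
    mate selection, the fittest individual (the 'stud') is crossed with every
    other member of the population; offspring are then mutated bit-wise."""
    stud = max(pop, key=fitness)
    new_pop = [stud[:]]                                   # keep the stud (elitism)
    for mate in pop:
        if mate is stud:
            continue
        cut = random.randrange(1, len(stud))
        child = stud[:cut] + mate[cut:]                   # one-point crossover with the stud
        child = [bit ^ 1 if random.random() < mut_prob else bit for bit in child]
        new_pop.append(child)
    return new_pop
```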

Proceedings ArticleDOI
04 May 1998
TL;DR: The results support the conclusions that the choice of a particular mechanism for self-adaptation can be critical in dynamic environments, and that the lognormal rule utilized in evolution strategies is well suited for this kind of problem.
Abstract: The capability of evolution strategies and evolutionary programming to track the optimum in simple dynamic environments is investigated for different types of dynamics, update frequencies, and displacement strengths. Experimental results are reported for a (15,100)-evolution strategy with lognormal self-adaptation, a standard evolutionary programming algorithm with a multiplicative self-adaptation rule, and an evolutionary programming algorithm with lognormal self-adaptation. The evolution strategy and lognormal evolutionary programming prove their capability to track the dynamic optimum, while evolutionary programming with the multiplicative self-adaptation rule fails in the dynamic environment. These results support the conclusions that the choice of a particular mechanism for self-adaptation can be critical in dynamic environments, and that the lognormal rule utilized in evolution strategies is well suited for this kind of problem.
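
For orientation, the two flavours of step-size self-adaptation being contrasted can be sketched as below; the exact multiplicative rule used in standard evolutionary programming may differ in detail, so treat this as an illustrative approximation rather than the paper's formulation.

```python
import math
import random

def lognormal_update(sigma, n):
    """Lognormal self-adaptation as used in evolution strategies:
    sigma' = sigma * exp(tau * N(0, 1)), with tau on the order of 1/sqrt(n)."""
    tau = 1.0 / math.sqrt(n)
    return sigma * math.exp(tau * random.gauss(0.0, 1.0))

def multiplicative_update(sigma, beta=0.2):
    """A multiplicative (meta-EP-style) rule, sketched as sigma' = sigma * (1 + beta * N(0, 1));
    the perturbation can push sigma toward zero, which is one plausible reason such
    a rule may struggle to track a moving optimum."""
    return max(1e-12, sigma * (1.0 + beta * random.gauss(0.0, 1.0)))
```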

Journal ArticleDOI
TL;DR: Observations support the conclusion that noncoding regions serve as scratch space in which VIV can explore alternative gene values, a positive step in understanding how GAs might exploit more of the power and flexibility of biological evolution while simultaneously providing better tools for understanding evolving biological systems.
Abstract: The majority of current genetic algorithms (GAs), while inspired by natural evolutionary systems, are seldom viewed as biologically plausible models. This is not a criticism of GAs, but rather a reflection of choices made regarding the level of abstraction at which biological mechanisms are modeled, and a reflection of the more engineering-oriented goals of the evolutionary computation community. Understanding better and reducing this gap between GAs and genetics has been a central issue in an interdisciplinary project whose goal is to build GA-based computational models of viral evolution. The result is a system called Virtual Virus (VIV). VIV incorporates a number of more biologically plausible mechanisms, including a more flexible genotype-to-phenotype mapping. In VIV the genes are independent of position, and genomes can vary in length and may contain noncoding regions, as well as duplicative or competing genes. Initial computational studies with VIV have already revealed several emergent phenomena of both biological and computational interest. In the absence of any penalty based on genome length, VIV develops individuals with long genomes and also performs more poorly (from a problem-solving viewpoint) than when a length penalty is used. With a fixed linear length penalty, genome length tends to increase dramatically in the early phases of evolution and then decrease to a level based on the mutation rate. The plateau genome length (i.e., the average length of individuals in the final population) generally increases in response to an increase in the base mutation rate. When VIV converges, there tend to be many copies of good alternative genes within the individuals. We observed many instances of switching between active and inactive genes during the entire evolutionary process. These observations support the conclusion that noncoding regions serve as scratch space in which VIV can explore alternative gene values. These results represent a positive step in understanding how GAs might exploit more of the power and flexibility of biological evolution while simultaneously providing better tools for understanding evolving biological systems.

Journal ArticleDOI
TL;DR: It was found that the use of a cultural framework to support self-adaptation in Evolutionary Programming can produce substantial performance improvements over population-only systems as expressed in terms of systems success ratio, execution CPU time, and mean best solution for a given set of 34 function minimization problems.
Abstract: Cultural Algorithms are computational self-adaptive models which consist of a population and a belief space. The problem-solving experience of individuals selected from the population space by the acceptance function is generalized and stored in the belief space. This knowledge can then control the evolution of the population component by means of the influence function. Here, we examine the role that different forms of knowledge can play in the self-adaptation process within cultural systems. In particular, we compare various approaches that use normative and situational knowledge in different ways to guide the function optimization process. The results in this study demonstrate that Cultural Algorithms are a naturally useful framework for self-adaptation and that the use of a cultural framework to support self-adaptation in Evolutionary Programming can produce substantial performance improvements over population-only systems as expressed in terms of (1) systems success ratio, (2) execution CPU time, and (3) convergence (mean best solution) for a given set of 34 function minimization problems. The nature of these improvements and the type of knowledge that is most effective in producing them depend on the problem's functional landscape. In addition, it was found that the same held true for the population-only self-adaptive EP systems. Each level of self-adaptation (component, individual, and population) outperformed the others for problems with particular landscape features.
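
A heavily simplified sketch of how normative knowledge in a belief space can influence variation is given below; the acceptance and influence functions of actual Cultural Algorithms are richer than this, and the data layout and step sizes here are purely illustrative.

```python
import random

def update_normative(belief, accepted):
    """Normative knowledge (sketch): record, per variable, the interval spanned
    by the accepted (best-performing) individuals."""
    for d in range(len(belief["lo"])):
        values = [ind[d] for ind in accepted]
        belief["lo"][d], belief["hi"][d] = min(values), max(values)
    return belief

def influence(ind, belief, step=0.1):
    """Influence function (sketch): mutate each variable toward, or within,
    the promising interval stored in the belief space."""
    child = []
    for d, x in enumerate(ind):
        lo, hi = belief["lo"][d], belief["hi"][d]
        width = max(hi - lo, 1e-9)
        if x < lo:
            x += abs(random.gauss(0.0, step)) * width   # push up toward the interval
        elif x > hi:
            x -= abs(random.gauss(0.0, step)) * width   # push down toward the interval
        else:
            x += random.gauss(0.0, step) * width        # explore inside the interval
        child.append(x)
    return child
```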

Posted Content
TL;DR: This paper summarises and draws out the implications of the Neo-Darwinian Synthesis for processes of social evolution and discusses the extent to which evolutionary algorithms capture the aspects of biological evolution which are relevant to social processes.
Abstract: This paper attempts to illustrate the importance of a coherent behavioural interpretation in applying evolutionary algorithms like Genetic Algorithms and Genetic Programming to the modelling of social processes. It summarises and draws out the implications of the Neo-Darwinian Synthesis for processes of social evolution and then discusses the extent to which evolutionary algorithms capture the aspects of biological evolution which are relevant to social processes. The paper uses several recent papers in the field as case studies, discussing more and less successful uses of evolutionary algorithms in social science. The key aspects of evolution discussed in the paper are that it is dependent on relative rather than absolute fitness, it does not require global knowledge or a system level teleology, it avoids the credit assignment problem, it does not exclude Lamarckian inheritance and it is both progressive and open ended.

Journal ArticleDOI
TL;DR: The evolutionary algorithm, based on a parallel diffusion model and extended for mixed-integer optimization, was able to compete with or even outperform traditional methods of robust MOC design, and the approach is easily adapted to other application domains.
Abstract: Robustness is an important requirement for almost all kinds of products. This article shows how evolutionary algorithms can be applied to robust design based on the approach of Taguchi. To achieve a better understanding of the consequences of this approach, we first present some analytical results gained from a toy problem. As a nontrivial industrial application we consider the design of multilayer optical coatings (MOCs), most frequently used for optical filters. An evolutionary algorithm based on a parallel diffusion model and extended for mixed-integer optimization was able to compete with or even outperform traditional methods of robust MOC design. With respect to chromaticity, the MOC designs found by the evolutionary algorithm are substantially more robust to parameter variations than a reference design and therefore perform much better in the average case. In most cases, however, this advantage has to be paid for by a reduction in the average reflectance. The robust design approach outlined in this paper should be easily adapted to other application domains.
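
One common way to fold the Taguchi-style robustness idea into an evolutionary algorithm is to evaluate each design under random parameter perturbations and optimize a statistic of the resulting samples, as sketched below; the noise model and the mean-plus-variance criterion are illustrative assumptions, not the measure used for the MOC designs in the paper.

```python
import random

def robust_fitness(design, f, noise_sd=0.05, n_samples=10):
    """Robustness estimate (sketch, minimization): evaluate the design under random
    relative perturbations of its parameters and penalize both a poor average and
    a high sensitivity to the perturbations."""
    samples = []
    for _ in range(n_samples):
        perturbed = [p * (1.0 + random.gauss(0.0, noise_sd)) for p in design]
        samples.append(f(perturbed))
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / n_samples
    return mean + var          # bad average performance or high variance both hurt
```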

Journal ArticleDOI
TL;DR: A GA-based algorithm for solving the single-floor facility layout problem; the results indicate that a GA may provide a better alternative in a realistic environment where the objective is to find a number of "reasonably good" layouts.

Book
01 Jan 1998
TL;DR: The Optimisation of Multi-variate Robust Design Criteria and Case Injected Genetic Algorithm Design of Combinational Logic Circuits and the Fuzzy Clustering Evolution Strategy.
Abstract:
Chapter 1 Plenary Papers: Pareto Solutions of Multipoint Design of Supersonic Wings Using Evolutionary Algorithms Notes on Design Through Artificial Evolution: Opportunities and Algorithms Testing, Evaluation and Performance of Optimisation and Learning Systems From Evolutionary Computation to Natural Computation
Chapter 2 Engineering Design Applications: Experiences with Hybrid Evolutionary/Local Optimisation for Process Design The Development of a Grid-based Engineering Design Problem Solving Environment A Hybrid Search Technique for Inverse Transient Analysis in Water Distribution Systems Optimisation of Thermal Power Plant Designs: A Graph-based Adaptive Search Approach Genetic Algorithm Search for Stent Design Improvements A Multi-objective Optimisation Approach for the Conceptual Design of Frame Structures Multi-Objective Evolutionary Topological Optimum Design Better Surface Intersections by Constrained Evolution Inverse Identification of Boundary Constants for Electronic Packages Using Modified Micro-genetic Algorithm and the Reduced-basis Method Extrinsic Evolution of Finite State Machines
Chapter 3 Manufacturing Processes: Neural Computing Approach to Shape Change Estimation in Hot Isostatic Pressing Multi-criterion Tackling Bottleneck Machines and Exceptional Parts in Cell Formation Using Genetic Algorithms A New Approach to Packing Non-Convex Polygons Using the No Fit Polygon and Meta-Heuristic and Evolutionary Algorithms
Chapter 4 System/Process Control: Evolutionary Multi-criteria Optimisation for Improved Design of Optimal Control Systems Explorations in Fuzzy Classifier System Architectures Evolving Temporal Rules with the Delayed Action Classifier System - Analysis and New Results Adaptive Image Segmentation Based on Visual Interactive Feedback Learning
Chapter 5 Strategy/Algorithm Development: Adapting Problem Specifications and Design Solutions Using Co-evolution Handling Constraints in Genetic Algorithms Using Dominance-based Tournaments The Optimisation of Multi-variate Robust Design Criteria Learning from Experience: Case Injected Genetic Algorithm Design of Combinational Logic Circuits Constrained Optimisation with the Fuzzy Clustering Evolution Strategy Constrained Optimisation Using an Evolutionary Programming-based Cultural Algorithm A Data Mining Tool Using an Intelligent Processing System with a Clustering Application
Chapter 6 Multiple Objectives, Preferences and Agent-Support: Full Elite Sets for Multi-objective Optimisation Agent-based Support within an Interactive Evolutionary Design System A Multi-agent Architecture for Business Process Management Adapts to Unreliable Performance Real-time Co-ordinated Scheduling Using a Genetic Algorithm

Journal ArticleDOI
TL;DR: A new approach to the construction of neural networks based on evolutionary computation is presented, where a linear chromosome combined with a graph representation of the network is used by the genetic operators, allowing the evolution of the architecture and the weights simultaneously without the need for local weight optimization.
Abstract: Evolutionary computation is a class of global search techniques, based on the learning process of a population of potential solutions to a given problem, that has been successfully applied to a variety of problems. In this paper a new approach to the construction of neural networks based on evolutionary computation is presented. A linear chromosome combined with a graph representation of the network is used by the genetic operators, which allow the evolution of the architecture and the weights simultaneously without the need for local weight optimization. This paper describes the approach and the operators, and reports results of the application of this technique to several binary classification problems.

Journal ArticleDOI
Thomas Bäck1
TL;DR: This survey paper provides an overview of the existing techniques for the self-adaptation of strategy parameters related to mutation and recombination operators, indicating that the principle works under a variety of conditions regarding the search space of the underlying optimization problem and the method used for the variation of strategy parameters.
Abstract: The principle of self-adaptation in evolutionary algorithms is an important mechanism for controlling the strategy parameters of such algorithms by evolving parameter values in analogy with the usual evolution of object variables. To facilitate evolution of strategy parameters, they are incorporated into the representation of individuals and are subject to the evolutionary variation operators in a similar way to the object variables. This survey paper provides an overview of the existing techniques for the self-adaptation of strategy parameters related to mutation and recombination operators, indicating that the principle works under a variety of conditions regarding the search space of the underlying optimization problem and the method used for the variation of strategy parameters. Although a number of open questions remain, self-adaptation is identified as a generally applicable, robust and efficient method for parameter control in evolutionary algorithms.
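
The core mechanism, strategy parameters carried inside the individual and varied before the object variables, can be sketched as follows; a single shared step size with a lognormal update is just one simple instance of the many variants such a survey covers.

```python
import math
import random

def mutate_individual(ind, tau=None):
    """Self-adaptation sketch: the step size sigma is part of the individual's
    representation; it is mutated first (lognormally) and the new value is then
    used to perturb the object variables."""
    x, sigma = ind["x"], ind["sigma"]
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(n)
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
    return {"x": new_x, "sigma": new_sigma}
```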

Book
01 Jan 1998
TL;DR: Artificial Evolution: How and Why?
Abstract:
Artificial Evolution: How and Why? (H.-P. Schwefel & T. Bäck)
Adaptive Niching via Coevolutionary Sharing (D. Goldberg & L. Wang)
Representation Issues in Neighbourhood Search and Evolutionary Algorithms (D. Whitley, et al.)
Gene Expression: The Missing Link in Evolutionary Computation (H. Kargupta)
Immunized Artificial Systems Concepts and Applications (K. Krishnakumar & J. Neidhoefer)
Designing Electronic Circuits Using Evolutionary Algorithms. Arithmetic Circuits: A Case Study (J. Miller, et al.)
Evolutionary Computing for Conceptual and Detailed Design (I. Parmee)
Cam Shape Optimization by Genetic Algorithms (J. Alander & J. Lampinen)
Evolutionary Algorithms: Applications at the Informatik Center Dortmund (T. Bäck, et al.)
Evolutionary Learning Processes for Data Analysis in Electrical Engineering Applications (O. Cordón, et al.)
GA Multiple Objective Optimization Strategies for Electromagnetic Backscattering (J. Périaux, et al.)
Pareto Genetic Algorithm for Aerodynamic Design Using the Navier-Stokes Equations (S. Obayashi)
GA Coupled with Computationally Expensive Simulations: Tools to Improve Efficiency (G. Poloni & V. Pediroda)
Coupling Genetic Algorithms and Gradient Based Optimization Techniques (D. Quagliarella & A. Vicini)
Evolutionary Synthesis of Control Policies for Manufacturing Systems (B. Porter)
Parametric and Non-Parametric Identification of Macro-Mechanical Models (M. Sebag, et al.)
Evolutionary Mobile Robotics (D. Floreano)
Nonlinear System Identification by Means of Evolutionary Optimised Neural Networks (I. De Falco)

Journal ArticleDOI
TL;DR: In this paper, the authors study a Markovian evolutionary process which encompasses the classical simple genetic algorithm and show how a delicate interaction between the perturbations and the selection pressure may force the convergence toward the global maxima of the fitness function.
Abstract: We study a Markovian evolutionary process which encompasses the classical simple genetic algorithm. This process is obtained by randomly perturbing a very simple selection scheme. Using the Freidlin-Wentzell theory, we carry out a precise study of the asymptotic dynamics of the process as the perturbations disappear. We show how a delicate interaction between the perturbations and the selection pressure may force the convergence toward the global maxima of the fitness function. We put forward the existence of a critical population size, above which this kind of convergence can be achieved. We compute upper bounds of this critical population size for several examples. We derive several conditions to ensure convergence in the homogeneous case, and these provide the first mathematically well-founded convergence results for genetic algorithms.

Proceedings ArticleDOI
04 May 1998
TL;DR: This paper introduces a new element to evolutionary algorithms for constrained parameter optimization problems: the parent matching mechanism and shows that the proposed technique works very well on selected test cases.
Abstract: During the last few years, several methods have been proposed for handling constraints by evolutionary algorithms for parameter optimisation problems. These methods include those based on penalty functions, preservation of feasibility, decoders and repair algorithms, as well as some hybrid techniques. Most of these techniques have serious drawbacks (some of them may return infeasible solutions, others require many additional parameters, etc.). Moreover, none of these techniques has utilized knowledge about which constraints are satisfied and which are not. In this paper, we introduce a new element to evolutionary algorithms for constrained parameter optimization problems: the parent matching mechanism. The preliminary results show that the proposed technique works very well on selected test cases.

Journal ArticleDOI
TL;DR: It is shown that the use of an evolutionary algorithm offers advantages over other approaches, including a high rate of global convergence and the ability to handle discrete variables.
Abstract: This paper describes the application of an evolutionary algorithm to the design of induction motors. It is shown that the use of an evolutionary algorithm offers advantages over other approaches. These include a high rate of global convergence and the ability to handle discrete variables.

Proceedings ArticleDOI
04 May 1998
TL;DR: A rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs is given, and it is found that the expected run time of this algorithm is Θ(n ln n) for linear functions with n variables.
Abstract: Evolutionary algorithms (EAs) are heuristic randomized algorithms which, by many impressive experiments, have been proven to behave quite well for optimization problems of various kinds. In this paper, a rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs is given. The analysis is carried out for different mutation rates. The main contribution of the paper is not the result that the expected run time of the (1+1) evolutionary algorithm is Θ(n ln n) for linear functions with n variables, but the presentation of methods showing how this result can be proven rigorously.
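
For concreteness, the (1+1) evolutionary algorithm analysed in such papers can be written in a few lines; the positive weights and the all-ones stopping test below are illustrative assumptions (with positive coefficients the optimum of a linear function is the all-ones string).

```python
import random

def one_plus_one_ea(weights, max_iters=100_000):
    """(1+1) EA on a linear pseudo-Boolean function f(x) = sum(w_i * x_i):
    flip each bit independently with probability 1/n and accept the offspring
    if it is at least as good. For positive weights the optimum (all ones) is
    reached after O(n log n) iterations in expectation."""
    n = len(weights)
    f = lambda x: sum(w * b for w, b in zip(weights, x))
    x = [random.randint(0, 1) for _ in range(n)]
    for t in range(max_iters):
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]   # standard bit mutation
        if f(y) >= f(x):
            x = y
        if all(x):                       # optimum found (assumes w_i > 0)
            return x, t + 1
    return x, max_iters
```

For example, `one_plus_one_ea([1.0] * 50)` optimizes OneMax with 50 bits, a special case of a linear function.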