
Showing papers on "Evolutionary computation published in 2005"


Journal ArticleDOI
TL;DR: This paper attempts to provide a comprehensive overview, within a unified framework, of the related work on addressing different uncertainties in evolutionary computation, which has been scattered across a variety of research areas.
Abstract: Evolutionary algorithms often have to solve optimization problems in the presence of a wide range of uncertainties. Generally, uncertainties in evolutionary computation can be divided into the following four categories. First, the fitness function is noisy. Second, the design variables and/or the environmental parameters may change after optimization, and the quality of the obtained optimal solution should be robust against environmental changes or deviations from the optimal point. Third, the fitness function is approximated, which means that the fitness function suffers from approximation errors. Fourth, the optimum of the problem to be solved changes over time and, thus, the optimizer should be able to track the optimum continuously. In all these cases, additional measures must be taken so that evolutionary algorithms are still able to work satisfactorily. This paper attempts to provide a comprehensive overview, within a unified framework, of the related work, which has been scattered across a variety of research areas. Existing approaches to addressing different uncertainties are presented and discussed, and the relationships between the different categories of uncertainties are investigated. Finally, topics for future research are suggested.
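One standard measure for the first category (noisy fitness) covered by such surveys is explicit averaging: re-evaluating each candidate several times and using the mean. A minimal sketch, assuming a user-supplied noisy_fitness function and a sample count chosen by the practitioner:

```python
import numpy as np

def averaged_fitness(noisy_fitness, x, n_samples=5):
    """Reduce evaluation noise by explicit averaging: re-evaluate the same
    candidate n_samples times and return the mean. The standard error of
    the noise shrinks by a factor of sqrt(n_samples)."""
    samples = [noisy_fitness(x) for _ in range(n_samples)]
    return float(np.mean(samples))

# Hypothetical usage on a noisy sphere function
rng = np.random.default_rng(0)
noisy_sphere = lambda x: float(np.sum(x**2) + rng.normal(scale=0.1))
print(averaged_fitness(noisy_sphere, np.array([0.5, -0.2])))
```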

1,528 citations


Journal ArticleDOI
TL;DR: The formulations and results of five recent evolutionary-based algorithms are compared: genetic algorithms, memetic algorithms, particle swarm, ant-colony systems, and shuffled frog leaping.

1,268 citations


Journal ArticleDOI
Yaochu Jin1
01 Jan 2005
TL;DR: A comprehensive survey of the research on fitness approximation in evolutionary computation is presented; main issues such as approximation levels, approximate model management schemes, and model construction techniques are reviewed, and open questions and interesting issues in the field are discussed.
Abstract: Evolutionary algorithms (EAs) have received increasing interest in both academia and industry. One main difficulty in applying EAs to real-world applications is that EAs usually need a large number of fitness evaluations before a satisfying result can be obtained. However, fitness evaluations are not always straightforward in many real-world applications. Either an explicit fitness function does not exist, or the evaluation of the fitness is computationally very expensive. In both cases, it is necessary to estimate the fitness function by constructing an approximate model. In this paper, a comprehensive survey of the research on fitness approximation in evolutionary computation is presented. Main issues such as approximation levels, approximate model management schemes, and model construction techniques are reviewed. To conclude, open questions and interesting issues in the field are discussed.
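One model-management scheme discussed in this line of work is individual-based evolution control: only a fraction of each generation is evaluated with the expensive true fitness, the rest keep their surrogate estimates, and the model is retrained on the exactly evaluated points. A minimal sketch under those assumptions (the 1-nearest-neighbour surrogate is chosen only for brevity, not taken from the survey):

```python
import numpy as np

def surrogate_predict(archive_x, archive_y, x):
    """Crude 1-nearest-neighbour surrogate of the true fitness."""
    d = np.linalg.norm(archive_x - x, axis=1)
    return archive_y[np.argmin(d)]

def evaluate_generation(pop, true_fitness, archive_x, archive_y, n_exact=3):
    """Evaluate the n_exact most promising individuals (according to the
    surrogate) with the true fitness; the rest keep their surrogate value."""
    approx = np.array([surrogate_predict(archive_x, archive_y, x) for x in pop])
    fitness = approx.copy()
    for i in np.argsort(approx)[:n_exact]:          # minimisation assumed
        fitness[i] = true_fitness(pop[i])
        archive_x = np.vstack([archive_x, pop[i]])  # enlarge training data
        archive_y = np.append(archive_y, fitness[i])
    return fitness, archive_x, archive_y
```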

1,228 citations


Proceedings ArticleDOI
12 Dec 2005
TL;DR: A novel self-adaptive differential evolution algorithm (SaDE) is proposed, in which the choice of learning strategy and the two control parameters F and CR are not required to be pre-specified.
Abstract: In this paper, we propose a novel self-adaptive differential evolution algorithm (SaDE), where the choice of learning strategy and the two control parameters F and CR are not required to be pre-specified. During evolution, the suitable learning strategy and parameter settings are gradually self-adapted according to the learning experience. The performance of SaDE is reported on the set of 25 benchmark functions provided by the CEC2005 special session on real parameter optimization.
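A rough sketch of the kind of self-adaptation the abstract describes, assuming that F is drawn from a normal distribution, CR is sampled around a learned mean, and each mutation strategy's selection probability is updated from its recent success rate; the constants below are illustrative, not the authors' exact settings:

```python
import numpy as np
rng = np.random.default_rng(1)

class SaDEParameters:
    """Illustrative self-adaptation of strategy choice, F and CR."""
    def __init__(self, n_strategies=2):
        self.p = np.full(n_strategies, 1.0 / n_strategies)  # strategy probabilities
        self.cr_mean = 0.5
        self.success = np.zeros(n_strategies)
        self.trials = np.zeros(n_strategies)

    def sample(self):
        k = rng.choice(len(self.p), p=self.p)                # pick a strategy
        F = rng.normal(0.5, 0.3)                             # F ~ N(0.5, 0.3)
        CR = float(np.clip(rng.normal(self.cr_mean, 0.1), 0.0, 1.0))
        return k, F, CR

    def report(self, k, improved):
        """Record whether the trial produced with strategy k replaced its target."""
        self.trials[k] += 1
        self.success[k] += improved
        if self.trials.sum() >= 50:                          # end of learning period
            rates = (self.success + 1e-9) / (self.trials + 1e-9)
            self.p = rates / rates.sum()
            self.success[:] = 0
            self.trials[:] = 0
```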

1,112 citations


Journal ArticleDOI
Bo Liu1, Ling Wang1, Yihui Jin1, Fang Tang2, Dexian Huang1 
TL;DR: Simulation results and comparisons with the standard PSO and several meta-heuristics show that the CPSO can effectively enhance the searching efficiency and greatly improve the searching quality.
Abstract: As a novel optimization technique, chaos has gained much attention and some applications during the past decade. For a given energy or cost function, by following chaotic ergodic orbits, a chaotic dynamic system may eventually reach the global optimum or its good approximation with high probability. To enhance the performance of particle swarm optimization (PSO), which is an evolutionary computation technique based on individual improvement plus population cooperation and competition, a hybrid particle swarm optimization algorithm is proposed by incorporating chaos. First, an adaptive inertia weight factor (AIWF) is introduced in PSO to efficiently balance the exploration and exploitation abilities. Second, PSO with AIWF and chaos are hybridized to form a chaotic PSO (CPSO), which reasonably combines the population-based evolutionary searching ability of PSO and chaotic searching behavior. Simulation results and comparisons with the standard PSO and several meta-heuristics show that the CPSO can effectively enhance the searching efficiency and greatly improve the searching quality.
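A hedged sketch of the two ingredients named above, with details the abstract does not specify (the exact AIWF formula and the chaotic map) filled in by common choices: particles better than the swarm average get a smaller inertia weight, and a logistic map drives the chaotic local search around a candidate.

```python
import numpy as np

def adaptive_inertia(f_i, f_avg, f_min, w_min=0.4, w_max=0.9):
    """AIWF-style rule (minimisation): particles better than the swarm
    average get a smaller inertia weight (exploitation), worse ones a
    larger one (exploration)."""
    if f_i <= f_avg:
        return w_min + (w_max - w_min) * (f_i - f_min) / (f_avg - f_min + 1e-12)
    return w_max

def chaotic_local_search(f, x, lo, hi, n_iters=20):
    """Search around x using a logistic-map chaotic sequence mapped into
    the box [lo, hi]; keep the best candidate found."""
    z = np.random.uniform(0.01, 0.99, size=x.shape)
    best, f_best = x.copy(), f(x)
    for _ in range(n_iters):
        z = 4.0 * z * (1.0 - z)        # logistic map, chaotic for r = 4
        cand = lo + z * (hi - lo)
        if f(cand) < f_best:
            best, f_best = cand.copy(), f(cand)
    return best
```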

879 citations


Journal ArticleDOI
TL;DR: In this article, a multiobjective formulation for the siting and sizing of DG resources into existing distribution networks is proposed, which permits the planner to decide the best compromise between cost of network upgrading, cost of power losses, and cost of energy not supplied.
Abstract: In the restructured electricity industry, the engineering aspects of planning need to be reformulated even though the goal to attain remains substantially the same, requiring various objectives to be simultaneously accomplished to achieve the optimality of the power system development and operation. In many cases, these objectives contradict each other and cannot be handled by conventional single optimization techniques. In this paper, a multiobjective formulation for the siting and sizing of DG resources into existing distribution networks is proposed. The methodology adopted permits the planner to decide the best compromise between cost of network upgrading, cost of power losses, cost of energy not supplied, and cost of energy required by the served customers. The implemented technique is based on a genetic algorithm and an ε-constrained method that allows obtaining a set of noninferior solutions. Application examples are presented to demonstrate the effectiveness of the proposed procedure.

767 citations


Journal ArticleDOI
TL;DR: This paper reviews some works on the application of MAs to well-known combinatorial optimization problems, and places them in a framework defined by a general syntactic model, which provides them with a classification scheme based on a computable index D, which facilitates algorithmic comparisons and suggests areas for future research.
Abstract: The combination of evolutionary algorithms with local search was named "memetic algorithms" (MAs) (Moscato, 1989). These methods are inspired by models of natural systems that combine the evolutionary adaptation of a population with individual learning within the lifetimes of its members. Additionally, MAs are inspired by Richard Dawkins' concept of a meme, which represents a unit of cultural evolution that can exhibit local refinement (Dawkins, 1976). In the case of MAs, "memes" refer to the strategies (e.g., local refinement, perturbation, or constructive methods, etc.) that are employed to improve individuals. In this paper, we review some works on the application of MAs to well-known combinatorial optimization problems, and place them in a framework defined by a general syntactic model. This model provides us with a classification scheme based on a computable index D, which facilitates algorithmic comparisons and suggests areas for future research. Also, by having an abstract model for this class of metaheuristics, it is possible to explore their design space and better understand their behavior from a theoretical standpoint. We illustrate the theoretical and practical relevance of this model and taxonomy for MAs in the context of a discussion of important design issues that must be addressed to produce effective and efficient MAs.
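The basic MA template the abstract describes, an evolutionary loop in which each offspring is improved by a local-search "meme" before selection, can be sketched as follows; the bit-flip hill climber and binary encoding are only illustrative, not taken from the paper.

```python
import random

def hill_climb(bits, fitness, tries=20):
    """A simple 'meme': repeated single-bit-flip local search."""
    best, f_best = bits[:], fitness(bits)
    for _ in range(tries):
        cand = best[:]
        cand[random.randrange(len(cand))] ^= 1
        if fitness(cand) > f_best:
            best, f_best = cand, fitness(cand)
    return best

def memetic_algorithm(fitness, n_bits=30, pop_size=20, generations=50):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)                              # parents
            cut = random.randrange(1, n_bits)
            child = a[:cut] + b[cut:]                                 # one-point crossover
            child = [g ^ (random.random() < 1.0 / n_bits) for g in child]  # mutation
            offspring.append(hill_climb(child, fitness))              # individual learning
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)

# Example: maximise the number of ones (OneMax)
print(sum(memetic_algorithm(lambda b: sum(b))))
```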

719 citations


Journal ArticleDOI
TL;DR: The results obtained from the computational study have shown that the proposed algorithm is a viable and effective approach for the multi-objective FJSP, especially for problems on a large scale.

639 citations


Proceedings ArticleDOI
12 Dec 2005
TL;DR: GDE3 improves on earlier GDE versions for multi-objective problems by giving a better distributed set of solutions; its performance is demonstrated with a set of test problems and compared with other methods.
Abstract: A developed version of generalized differential evolution, GDE3, is proposed. GDE3 is an extension of differential evolution (DE) for global optimization with an arbitrary number of objectives and constraints. In the case of a problem with a single objective and without constraints, GDE3 falls back to the original DE. GDE3 improves on earlier GDE versions in the case of multi-objective problems by giving a better distributed set of solutions. The performance of GDE3 is demonstrated with a set of test problems, and the results are compared with other methods.

589 citations


Journal ArticleDOI
TL;DR: The proposed approach to solving global nonlinear optimization problems uses a simple diversity mechanism based on allowing infeasible solutions to remain in the population, which helps the algorithm find the global optimum despite reaching the feasible region of the search space reasonably fast.
Abstract: This work presents a simple multimembered evolution strategy to solve global nonlinear optimization problems. The approach does not require the use of a penalty function. Instead, it uses a simple diversity mechanism based on allowing infeasible solutions to remain in the population. This technique helps the algorithm find the global optimum despite reaching the feasible region of the search space reasonably fast. A simple feasibility-based comparison mechanism is used to guide the process toward the feasible region of the search space. Also, the initial stepsize of the evolution strategy is reduced in order to perform a finer search, and a combined (discrete/intermediate) panmictic recombination technique improves its exploitation capabilities. The approach was tested with a well-known benchmark. The results obtained are very competitive when comparing the proposed approach against other state-of-the-art techniques, and its computational cost (measured by the number of fitness function evaluations) is lower than the cost required by the other techniques compared.
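A feasibility-based comparison mechanism of this kind is, in essence, a set of rules for deciding which of two individuals wins without any penalty weights: feasible beats infeasible, two feasible solutions compare by objective value, and two infeasible solutions compare by total constraint violation. A minimal sketch of such rules; the published algorithm's exact tie-breaking details may differ:

```python
def constraint_violation(g_values):
    """Sum of violations of inequality constraints g_i(x) <= 0."""
    return sum(max(0.0, g) for g in g_values)

def better(a, b):
    """Return True if individual a wins over b under feasibility rules.
    Each individual is a dict with keys 'f' (objective, minimised) and
    'g' (list of inequality-constraint values)."""
    va, vb = constraint_violation(a["g"]), constraint_violation(b["g"])
    if va == 0.0 and vb == 0.0:
        return a["f"] < b["f"]      # both feasible: compare objectives
    if va == 0.0 or vb == 0.0:
        return va == 0.0            # feasible beats infeasible
    return va < vb                  # both infeasible: less violation wins
```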

585 citations


Proceedings ArticleDOI
12 Dec 2005
TL;DR: This study reports how the differential evolution (DE) algorithm performed on the test bed developed for the CEC05 contest for real parameter optimization.
Abstract: This study reports how the differential evolution (DE) algorithm performed on the test bed developed for the CEC05 contest for real parameter optimization. The test bed includes 25 scalable functions, many of which are both non-separable and highly multi-modal. Results include DE's performance on the 10- and 30-dimensional versions of each function.
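For reference, the classic DE/rand/1/bin scheme underlying such studies can be written in a few lines; the control parameters below (F = 0.5, CR = 0.9) are typical textbook values, not the settings used in the CEC05 entry.

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=50, F=0.5, CR=0.9, generations=200, seed=0):
    """Minimise f over a box using classic differential evolution."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])          # rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                     # at least one gene crosses
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = f(trial)
            if f_trial <= fit[i]:                               # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)], fit.min()

# Example: 10-dimensional sphere function
lo, hi = np.full(10, -5.0), np.full(10, 5.0)
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x**2)), (lo, hi))
print(best_f)
```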

Journal ArticleDOI
01 May 2005
TL;DR: Why and when the multiobjective approach to constraint handling is expected to work or fail is analyzed and an improved evolutionary algorithm based on evolution strategies and differential variation is proposed.
Abstract: A common approach to constraint handling in evolutionary optimization is to apply a penalty function to bias the search toward a feasible solution. It has been proposed that the subjective setting of various penalty parameters can be avoided using a multiobjective formulation. This paper analyzes and explains in depth why and when the multiobjective approach to constraint handling is expected to work or fail. Furthermore, an improved evolutionary algorithm based on evolution strategies and differential variation is proposed. Extensive experimental studies have been carried out. Our results reveal that the unbiased multiobjective approach to constraint handling may not be as effective as one may have assumed.

Proceedings ArticleDOI
06 Nov 2005
TL;DR: A general formulation of MO optimization is given in this chapter, the Pareto optimality concepts introduced, and solution approaches with examples of MO problems in the power systems field are given.
Abstract: The goal of this chapter is to give fundamental knowledge on solving multi-objective optimization problems. The focus is on intelligent metaheuristic approaches (evolutionary algorithms or swarm-based techniques) and on techniques for efficient generation of the Pareto frontier. A general formulation of MO optimization is given in this chapter, the Pareto optimality concepts are introduced, and solution approaches with examples of MO problems in the power systems field are given.
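Since Pareto optimality is central here, a small utility makes the concept concrete: a solution dominates another if it is no worse in every objective and strictly better in at least one. A minimal sketch (minimisation assumed):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the Pareto frontier of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

print(nondominated([(1, 5), (2, 2), (3, 1), (4, 4)]))  # (4, 4) is dominated
```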

Proceedings ArticleDOI
08 Jun 2005
TL;DR: This paper identifies shortcomings associated with existing test functions; novel hybrid benchmark functions, whose complexity and properties can be controlled easily, are introduced, and several evolutionary algorithms are evaluated with the novel test functions.
Abstract: In the evolutionary optimization field, there exist some algorithms that take advantage of known properties of the benchmark functions, such as local optima lying along the coordinate axes, the global optimum having the same values for many variables, and so on. The multiagent genetic algorithm (MAGA) is an example of this class of algorithms. In this paper, we identify shortcomings associated with the existing test functions. Novel hybrid benchmark functions, whose complexity and properties can be controlled easily, are introduced, and several evolutionary algorithms are evaluated with the novel test functions.
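The general idea behind such hybrid benchmarks, combining several shifted base functions so the optimum no longer sits at the origin or on a coordinate axis, can be illustrated as follows; the distance-based weighting here is a simplification for illustration, not a reproduction of the published construction.

```python
import numpy as np

def sphere(z):     return float(np.sum(z**2))
def rastrigin(z):  return float(np.sum(z**2 - 10*np.cos(2*np.pi*z) + 10))

def make_hybrid(dim=10, seed=0):
    """Compose shifted base functions with distance-based weights so that
    no base function's optimum coincides with the origin or an axis."""
    rng = np.random.default_rng(seed)
    shifts = rng.uniform(-4, 4, size=(2, dim))           # hidden optima
    basics = [sphere, rastrigin]
    def hybrid(x):
        x = np.asarray(x, dtype=float)
        d = np.array([np.linalg.norm(x - s) for s in shifts])
        w = np.exp(-d**2 / (2.0 * dim))                   # closer shift -> heavier weight
        w = w / (w.sum() + 1e-12)
        return float(sum(wi * f(x - s) for wi, f, s in zip(w, basics, shifts)))
    return hybrid, shifts

f, shifts = make_hybrid()
print(f(shifts[0]))   # near a hidden optimum the value is small
```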

Journal ArticleDOI
TL;DR: An extensive study of evolutionary computation in the context of structural design has been conducted in the Information Technology and Engineering School at George Mason University and its results are reported here.

Book
23 May 2005
TL;DR: 1. Understanding natural selection 2. Underlying mathematics and philosophy 3. The Darwinian game 4. G-functions for the Darwiniangame 5. Darwinian dynamics 6. Evolutionary stable strategies 7. The ESS maximum principle.
Abstract: 1. Understanding natural selection 2. Underlying mathematics and philosophy 3. The Darwinian game 4. G-functions for the Darwinian game 5. Darwinian dynamics 6. Evolutionary stable strategies 7. The ESS maximum principle 8. Speciation and extinction 9. Matrix games 10. Evolutionary ecology 11. Managing evolving systems.

Proceedings ArticleDOI
25 Jun 2005
TL;DR: Two new, improved variants of differential evolution are presented, shown to be statistically significantly better on a seven-function test bed for the following performance measures: solution quality, time to find the solution, frequency of finding the solution, and scalability.
Abstract: Differential evolution (DE) is well known as a simple and efficient scheme for global optimization over continuous spaces. In this paper we present two new, improved variants of DE. Performance comparisons of the two proposed methods are provided against (a) the original DE, (b) the canonical particle swarm optimization (PSO), and (c) two PSO-variants. The new DE-variants are shown to be statistically significantly better on a seven-function test bed for the following performance measures: solution quality, time to find the solution, frequency of finding the solution, and scalability.

Book ChapterDOI
01 Jan 2005
TL;DR: In this introductory chapter, some fundamental concepts of multiobjective optimization are introduced, emphasizing the motivation and advantages of using evolutionary algorithms.
Abstract: Very often real-world applications have multiple conflicting objectives. Recently there has been a growing interest in evolutionary multiobjective optimization algorithms that combine two major disciplines: evolutionary computation and the theoretical frameworks of multicriteria decision making. In this introductory chapter, some fundamental concepts of multiobjective optimization are introduced, emphasizing the motivation and advantages of using evolutionary algorithms. We then lay out the important contributions of the remaining chapters of this volume.

Book
01 Jan 2005
TL;DR: A collection of chapters on evolutionary multiobjective optimization, including a simple approach to evolutionary multi-objective optimization, the particle swarm inspired evolutionary algorithm (PS-EA) for multi-criteria optimization problems, and multi-criteria optimization of finite state automata.
Abstract: Evolutionary Multiobjective Optimization; Recent Trends in Evolutionary Multiobjective Optimization; Self-adaptation and Convergence of Multiobjective Evolutionary Algorithms in Continuous Search Spaces; A simple approach to evolutionary multi-objective optimization; Quad-trees: A Data Structure for Storing Pareto-sets in Multi-objective Evolutionary Algorithms with Elitism; Scalable Test Problems for Evolutionary Multi-Objective Optimization; Particle Swarm Inspired Evolutionary Algorithm (PS-EA) for Multi-Criteria Optimization Problems; Evolving Continuous Pareto Regions; MOGADES: Multi-Objective Genetic Algorithm with Distributed Environment Scheme; Use of Multiobjective Optimization Concepts to Handle Constraints in Genetic Algorithms; Multi-Criteria Optimization of Finite State Automata: Maximizing Performance while Minimizing Description Length; Multi-objective Optimization of Space Structures under Static and Seismic Loading Conditions

Journal ArticleDOI
01 May 2005
TL;DR: The Gaussian process model is described and proposed as an inexpensive fitness function surrogate; the resulting optimization procedure clearly outperforms other evolutionary strategies on standard test functions as well as on a real-world problem: the optimization of stationary gas turbine compressor profiles.
Abstract: We present an overview of evolutionary algorithms that use empirical models of the fitness function to accelerate convergence, distinguishing between evolution control and the surrogate approach. We describe the Gaussian process model and propose using it as an inexpensive fitness function surrogate. Implementation issues such as efficient and numerically stable computation, exploration versus exploitation, local modeling, multiple objectives and constraints, and failed evaluations are addressed. Our resulting Gaussian process optimization procedure clearly outperforms other evolutionary strategies on standard test functions as well as on a real-world problem: the optimization of stationary gas turbine compressor profiles.
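A hedged sketch of the surrogate idea: fit a Gaussian process to the points evaluated so far and use its prediction, optionally lowered by the predictive uncertainty as in a lower-confidence-bound criterion, to pre-screen candidate offspring before sending the most promising one to the expensive fitness function. The code uses scikit-learn's GaussianProcessRegressor for brevity and is not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def prescreen(X_seen, y_seen, candidates, kappa=1.0):
    """Rank candidate offspring with a GP surrogate and return the index of
    the most promising one (lowest predicted mean minus kappa * std)."""
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  normalize_y=True).fit(X_seen, y_seen)
    mean, std = gp.predict(candidates, return_std=True)
    lcb = mean - kappa * std            # optimistic estimate (minimisation)
    return int(np.argmin(lcb))

# Hypothetical usage on a 1-D sphere function
X = np.linspace(-2, 2, 8).reshape(-1, 1)
y = (X**2).ravel()
cands = np.random.uniform(-2, 2, size=(20, 1))
print(cands[prescreen(X, y, cands)])
```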

Book
01 Jul 2005
TL;DR: Part I: Introduction Computational Intelligence: An Introduction Traditional Problem Definition Part II: Basic Intelligent Computational Technologies Neural Networks Approach Fuzzy Logic Approach Evolutionary Computation Part III: Hybrid Computational Technologies Neuro-fuzzy Approach Transparent Fuzzy/Neuro-fuzzy Modeling Evolving Neural and Fuzzy Systems Adaptive Genetic Algorithms Part IV: Recent Developments
Abstract: Part I: Introduction Computational Intelligence: An Introduction Traditional Problem Definition Part II: Basic Intelligent Computational Technologies Neural Networks Approach Fuzzy Logic Approach Evolutionary Computation Part III: Hybrid Computational Technologies Neuro-fuzzy Approach Transparent Fuzzy/Neuro-fuzzy Modeling Evolving Neural and Fuzzy Systems Adaptive Genetic Algorithms Part IV: Recent Developments The State of the Art and Development Trends

Journal ArticleDOI
TL;DR: Using a simplified but still realistic evolutionary algorithm, a thorough analysis of the effects of the offspring population size is presented and a simple way to dynamically adapt this parameter when necessary is suggested.
Abstract: Evolutionary algorithms (EAs) generally come with a large number of parameters that have to be set before the algorithm can be used. Finding appropriate settings is a difficult task. The influence of these parameters on the efficiency of the search performed by an evolutionary algorithm can be very high. But there is still a lack of theoretically justified guidelines to help the practitioner find good values for these parameters. One such parameter is the offspring population size. Using a simplified but still realistic evolutionary algorithm, a thorough analysis of the effects of the offspring population size is presented. The result is a much better understanding of the role of the offspring population size in an EA, and a simple way to dynamically adapt this parameter when necessary is suggested.

Journal Article
TL;DR: A parallel version of the particle swarm optimization algorithm (PPSO) is presented, together with three communication strategies that can be used according to the independence of the data; experimental results demonstrate the usefulness of the proposed PPSO algorithm.
Abstract: Particle swarm optimization (PSO) is an alternative population-based evolutionary computation technique. It has been shown to be capable of optimizing hard mathematical problems in continuous or binary space. We present here a parallel version of the particle swarm optimization (PPSO) algorithm together with three communication strategies which can be used according to the independence of the data. The first strategy is designed for solution parameters that are independent or are only loosely correlated, such as the Rosenbrock and Rastrigin functions. The second communication strategy can be applied to parameters that are more strongly correlated such as the Griewank function. In cases where the properties of the parameters are unknown, a third hybrid communication strategy can be used. Experimental results demonstrate the usefulness of the proposed PPSO algorithm.
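A sketch of the island-style communication idea: several swarms run independently, and at fixed intervals each swarm replaces its worst personal best with a copy of the best particle of a neighbouring swarm. The ring topology and migration interval below are illustrative; the paper's three concrete strategies differ in what is exchanged and how often.

```python
import numpy as np

def parallel_pso(f, dim=10, n_swarms=4, swarm_size=15, iters=200,
                 migrate_every=25, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n_swarms, swarm_size, dim))
    V = np.zeros_like(X)
    P = X.copy()                                              # personal bests
    Pf = np.array([[f(x) for x in s] for s in X])
    for t in range(iters):
        for s in range(n_swarms):
            g = P[s][np.argmin(Pf[s])]                        # this swarm's best
            r1, r2 = rng.random(X[s].shape), rng.random(X[s].shape)
            V[s] = 0.72 * V[s] + 1.49 * r1 * (P[s] - X[s]) + 1.49 * r2 * (g - X[s])
            X[s] = X[s] + V[s]
            fx = np.array([f(x) for x in X[s]])
            improved = fx < Pf[s]
            P[s][improved], Pf[s][improved] = X[s][improved], fx[improved]
        if (t + 1) % migrate_every == 0:                      # communication step
            for s in range(n_swarms):
                donor = (s + 1) % n_swarms                    # ring topology
                worst, best = np.argmax(Pf[s]), np.argmin(Pf[donor])
                P[s][worst], Pf[s][worst] = P[donor][best].copy(), Pf[donor][best]
    idx = np.unravel_index(np.argmin(Pf), Pf.shape)
    return P[idx], Pf[idx]

x_best, f_best = parallel_pso(lambda x: float(np.sum(x**2)))
print(f_best)
```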

Journal ArticleDOI
TL;DR: This paper proposes a general framework for designing neural network ensembles by means of cooperative coevolution, and applies the proposed model to ten real-world classification problems of a very different nature from the UCI machine learning repository and proben1 benchmark set.
Abstract: This paper presents a cooperative coevolutive approach for designing neural network ensembles. Cooperative coevolution is a recent paradigm in evolutionary computation that allows the effective modeling of cooperative environments. Although, in theory, a single neural network with a sufficient number of neurons in the hidden layer would suffice to solve any problem, in practice for many real-world problems it is too hard to construct the appropriate network that solves them. In such problems, neural network ensembles are a successful alternative. Nevertheless, the design of neural network ensembles is a complex task. In this paper, we propose a general framework for designing neural network ensembles by means of cooperative coevolution. The proposed model has two main objectives: first, the improvement of the combination of the trained individual networks; second, the cooperative evolution of such networks, encouraging collaboration among them, instead of a separate training of each network. In order to favor the cooperation of the networks, each network is evaluated throughout the evolutionary process using a multiobjective method. For each network, different objectives are defined, considering not only its performance in the given problem, but also its cooperation with the rest of the networks. In addition, a population of ensembles is evolved, improving the combination of networks and obtaining subsets of networks to form ensembles that perform better than the combination of all the evolved networks. The proposed model is applied to ten real-world classification problems of very different natures, taken from the UCI machine learning repository and the proben1 benchmark set. In all of them the performance of the model is better than the performance of standard ensembles in terms of generalization error. Moreover, the size of the obtained ensembles is also smaller.

Journal ArticleDOI
TL;DR: In this paper, a new particle swarm optimization (PSO) approach to identify the autoregressive moving average with exogenous variable (ARMAX) model for one-day to one-week ahead hourly load forecasts was proposed.
Abstract: In this paper, a new particle swarm optimization (PSO) approach to identifying the autoregressive moving average with exogenous variable (ARMAX) model for one-day to one-week ahead hourly load forecasts was proposed. Owing to the inherent nonlinear characteristics of power system loads, the surface of the forecasting error function possesses many local minimum points. Solutions of the gradient search-based stochastic time series (STS) technique may, therefore, stall at the local minimum points, which lead to an inadequate model. By simulating a simplified social system, the PSO algorithm offers the capability of converging toward the global minimum point of a complex error surface. The proposed PSO has been tested on the different types of Taiwan Power (Taipower) load data and compared with the evolutionary programming (EP) algorithm and the traditional STS method. Testing results indicate that the proposed PSO yields high-quality solutions, superior convergence characteristics, and shorter computation time.

Journal ArticleDOI
TL;DR: An evolutionary algorithm with guided mutation (EA/G) for the maximum clique problem is proposed in this paper and experimental results show that EA/G outperforms the heuristic genetic algorithm of Marchiori and a MIMIC algorithm on DIMACS benchmark graphs.
Abstract: Estimation of distribution algorithms sample new solutions (offspring) from a probability model which characterizes the distribution of promising solutions in the search space at each generation. The location information of solutions found so far (i.e., the actual positions of these solutions in the search space) is not directly used for generating offspring in most existing estimation of distribution algorithms. This paper introduces a new operator, called guided mutation. Guided mutation generates offspring through combination of global statistical information and the location information of solutions found so far. An evolutionary algorithm with guided mutation (EA/G) for the maximum clique problem is proposed in this paper. Besides guided mutation, EA/G adopts a strategy for searching different search areas in different search phases. Marchiori's heuristic is applied to each new solution to produce a maximal clique in EA/G. Experimental results show that EA/G outperforms the heuristic genetic algorithm of Marchiori (the best evolutionary algorithm reported so far) and a MIMIC algorithm on DIMACS benchmark graphs.
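The guided-mutation operator itself is simple to state: each gene of the offspring is sampled from the global probability model with probability β, and copied from the parent otherwise. A minimal sketch for binary strings, with β and the example vectors chosen arbitrarily for illustration:

```python
import random

def guided_mutation(parent, prob_vector, beta=0.3):
    """Generate an offspring that mixes global statistical information
    (prob_vector, the estimated distribution of promising solutions) with
    the location information of a good parent solution."""
    child = []
    for gene, p in zip(parent, prob_vector):
        if random.random() < beta:
            child.append(1 if random.random() < p else 0)    # sample from the model
        else:
            child.append(gene)                               # copy from the parent
    return child

parent = [1, 0, 1, 1, 0, 0, 1, 0]
model = [0.9, 0.1, 0.8, 0.7, 0.2, 0.3, 0.6, 0.4]
print(guided_mutation(parent, model))
```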

Journal ArticleDOI
TL;DR: This paper presents a general framework covering most optimization scenarios and shows that in self-play there are free lunches: in coevolution some algorithms have better performance than other algorithms, averaged across all possible problems.
Abstract: Recent work on the foundational underpinnings of black-box optimization has begun to uncover a rich mathematical structure. In particular, it is now known that an inner product between the optimization algorithm and the distribution of optimization problems likely to be encountered fixes the distribution over likely performances in running that algorithm. One ramification of this is the "No Free Lunch" (NFL) theorems, which state that any two algorithms are equivalent when their performance is averaged across all possible problems. This highlights the need for exploiting problem-specific knowledge to achieve better than random performance. In this paper, we present a general framework covering most optimization scenarios. In addition to the optimization scenarios addressed in the NFL results, this framework covers multiarmed bandit problems and evolution of multiple coevolving players. As a particular instance of the latter, it covers "self-play" problems. In these problems, the set of players work together to produce a champion, who then engages one or more antagonists in a subsequent multiplayer game. In contrast to the traditional optimization case where the NFL results hold, we show that in self-play there are free lunches: in coevolution some algorithms have better performance than other algorithms, averaged across all possible problems. However, in the typical coevolutionary scenarios encountered in biology, where there is no champion, the NFL theorems still hold.

Book
01 Jan 2005
TL;DR: In this book, the authors set the stage for Structured Populations by using island models, lattice cellular models, and Coevolutionary Structured Models (CSMs).
Abstract: Setting the Stage for Structured Populations.- Island Models.- Island Models: Empirical Properties.- Lattice Cellular Models.- Lattice Cellular Models: Empirical Properties.- Random and Irregular Cellular Populations.- Coevolutionary Structured Models.- Some Nonconventional Models.

Proceedings ArticleDOI
Xiaodong Li1
25 Jun 2005
TL;DR: In this paper, a species-based DE (SDE) is proposed to locate multiple global optima simultaneously through adaptive formation of multiple species (or subpopulations) in a DE population at each iteration step.
Abstract: In this paper differential evolution is extended by using the notion of speciation for solving multimodal optimization problems. The proposed species-based DE (SDE) is able to locate multiple global optima simultaneously through adaptive formation of multiple species (or subpopulations) in a DE population at each iteration step. Each species functions as a DE by itself. Successive local improvements through species formation can eventually transform into global improvements in identifying multiple global optima. In this study the performance of SDE is compared with another recently proposed DE variant, CrowdingDE. The computational complexity of SDE and the effects of population size and species radius on SDE are investigated. SDE is found to be more computationally efficient than CrowdingDE over a number of benchmark multimodal test functions.
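The species-formation step described above can be sketched as follows: sort the population by fitness, take the best unassigned individual as a new species seed, and assign every unassigned individual within the species radius to that species. The radius value and data layout below are illustrative.

```python
import numpy as np

def form_species(pop, fitness, radius):
    """Partition a population into species around the fittest seeds.
    pop: (n, dim) array; fitness: length-n array (lower is better);
    returns a list of index arrays, one per species."""
    order = np.argsort(fitness)          # best individuals first
    assigned = np.zeros(len(pop), dtype=bool)
    species = []
    for i in order:
        if assigned[i]:
            continue
        d = np.linalg.norm(pop - pop[i], axis=1)
        members = np.where((d <= radius) & ~assigned)[0]
        assigned[members] = True
        species.append(members)          # pop[i] is this species' seed
    return species

pop = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0], [5.2, 4.9]])
fit = np.array([0.0, 0.02, 0.1, 0.15])
print(form_species(pop, fit, radius=1.0))   # two species of two members each
```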

Journal ArticleDOI
TL;DR: The experimental results show that the evolutionary sequence design by NACST/Seq outperforms existing sequence design techniques such as conventional EAs, simulated annealing, and specialized heuristic methods in reliability.
Abstract: DNA computing relies on biochemical reactions of DNA molecules and may result in incorrect or undesirable computations. Therefore, much work has focused on designing the DNA sequences to make the molecular computation more reliable. Sequence design involves a number of heterogeneous and conflicting design criteria, and traditional optimization methods may face difficulties. In this paper, we formulate the DNA sequence design as a multiobjective optimization problem and solve it using a constrained multiobjective evolutionary algorithm (EA). The method is implemented in the DNA sequence design system, NACST/Seq, with a suite of sequence-analysis tools to help choose the best solutions among many alternatives. The performance of NACST/Seq is compared with other sequence design methods, and analyzed on a traveling salesman problem solved by bio-lab experiments. Our experimental results show that the evolutionary sequence design by NACST/Seq outperforms existing sequence design techniques such as conventional EAs, simulated annealing, and specialized heuristic methods in reliability.