
Showing papers on "Evolutionary computation published in 2000"


Journal ArticleDOI
TL;DR: A novel approach to balance objective and penalty functions stochastically, i.e., stochastic ranking, is introduced, and a new view on penalty function methods in terms of the dominance of penalty and objective functions is presented.
Abstract: Penalty functions are often used in constrained optimization. However, it is very difficult to strike the right balance between objective and penalty functions. This paper introduces a novel approach to balance objective and penalty functions stochastically, i.e., stochastic ranking, and presents a new view on penalty function methods in terms of the dominance of penalty and objective functions. Some of the pitfalls of naive penalty methods are discussed in these terms. The new ranking method is tested using a (μ, λ) evolution strategy on 13 benchmark problems. Our results show that suitable ranking alone (i.e., selection), without the introduction of complicated and specialized variation operators, is capable of improving the search performance significantly.

1,571 citations
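
For readers unfamiliar with the method: stochastic ranking can be sketched as a bubble-sort-like pass over the population in which adjacent individuals are compared on the objective with probability P_f (or whenever both are feasible) and on the penalty otherwise. The sketch below is a minimal illustration of that idea, not the authors' implementation; the parameter value and helper names are illustrative.

```python
import random

def stochastic_ranking(objective, penalty, p_f=0.45, sweeps=None):
    """Rank indices 0..n-1 by stochastic ranking (illustrative sketch).

    objective[i] : objective value of individual i (lower is better)
    penalty[i]   : constraint-violation measure of individual i (0 = feasible)
    p_f          : probability of comparing on the objective when at least
                   one of the two individuals is infeasible
    """
    n = len(objective)
    order = list(range(n))
    sweeps = sweeps if sweeps is not None else n
    for _ in range(sweeps):
        swapped = False
        for j in range(n - 1):
            a, b = order[j], order[j + 1]
            both_feasible = penalty[a] == 0 and penalty[b] == 0
            if both_feasible or random.random() < p_f:
                # compare on the objective function
                if objective[a] > objective[b]:
                    order[j], order[j + 1] = b, a
                    swapped = True
            else:
                # compare on the penalty (constraint violation)
                if penalty[a] > penalty[b]:
                    order[j], order[j + 1] = b, a
                    swapped = True
        if not swapped:
            break
    return order

# toy usage: three candidates, one infeasible
print(stochastic_ranking(objective=[1.0, 0.2, 0.5], penalty=[0.0, 3.0, 0.0]))
```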


Proceedings ArticleDOI
01 Jan 2000
TL;DR: This paper summarizes the research on population-based probabilistic search algorithms based on modeling promising solutions by estimating their probability distribution and using the constructed model to guide the exploration of the search space.
Abstract: Summarizes the research on population-based probabilistic search algorithms that model promising solutions by estimating their probability distribution and use the constructed model to guide the exploration of the search space. It places these algorithms in the field of genetic and evolutionary computation, where they originated. All methods are classified into a few classes according to the complexity of the class of models they use. Algorithms from each of these classes are briefly described, and their strengths and weaknesses are discussed.

824 citations
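
The simplest algorithms in this family use a univariate model of promising solutions. Below is a minimal sketch of the estimate-and-sample loop on bit strings, loosely in the spirit of PBIL/UMDA rather than any particular algorithm surveyed in the paper; all names and parameter values are illustrative.

```python
import numpy as np

def simple_eda(fitness, n_bits=20, pop_size=100, select=30, generations=50, seed=0):
    """Univariate estimation-of-distribution loop on bit strings (sketch)."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                      # marginal probability of a 1 per bit
    best, best_fit = None, -np.inf
    for _ in range(generations):
        pop = (rng.random((pop_size, n_bits)) < p).astype(int)   # sample the model
        fits = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(fits)[-select:]]                  # promising solutions
        p = elite.mean(axis=0)                                   # re-estimate the model
        p = np.clip(p, 0.05, 0.95)                               # keep some diversity
        if fits.max() > best_fit:
            best_fit, best = fits.max(), pop[fits.argmax()].copy()
    return best, best_fit

# toy usage: maximise the number of ones (OneMax)
print(simple_eda(lambda x: x.sum()))
```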



Proceedings ArticleDOI
Kuk-Hyun Han1, Jong-Hwan Kim1
16 Jul 2000
TL;DR: The results show that GQA, whose probabilistic qubit representation can encode a linear superposition of solutions, is superior to other genetic algorithms using penalty functions, repair methods, and decoders.
Abstract: This paper proposes a novel evolutionary computing method called a genetic quantum algorithm (GQA). GQA is based on concepts and principles of quantum computing such as qubits and the superposition of states. Instead of a binary, numeric, or symbolic representation, GQA adopts a qubit chromosome, whose probabilistic nature allows it to represent a linear superposition of solutions. As genetic operators, quantum gates are employed to search for the best solution. Rapid convergence and good global search capability characterize the performance of GQA. The effectiveness and applicability of GQA are demonstrated by experimental results on the knapsack problem, a well-known combinatorial optimization problem. The results show that GQA is superior to other genetic algorithms using penalty functions, repair methods, and decoders.

622 citations
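
A hedged sketch of the qubit-chromosome idea: each gene is kept as an angle θ with P(bit = 1) = sin²θ, observation collapses the chromosome to a binary string, and a rotation gate nudges each angle toward the best solution found so far. The fixed rotation angle and the OneMax toy objective below replace the paper's rotation-angle lookup table and knapsack repair procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def observe(theta):
    """Collapse qubit angles to a binary string: P(bit = 1) = sin(theta)^2."""
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

def rotate(theta, bits, best_bits, delta=0.02 * np.pi):
    """Rotate each qubit toward the corresponding bit of the current best."""
    direction = np.where(best_bits > bits, 1, np.where(best_bits < bits, -1, 0))
    return np.clip(theta + direction * delta, 0.0, np.pi / 2)

def gqa(fitness, n_bits=16, pop_size=10, generations=100):
    theta = np.full((pop_size, n_bits), np.pi / 4)      # uniform superposition
    best_bits, best_fit = None, -np.inf
    for _ in range(generations):
        pop = np.array([observe(t) for t in theta])     # measure each chromosome
        fits = np.array([fitness(ind) for ind in pop])
        if fits.max() > best_fit:
            best_fit, best_bits = fits.max(), pop[fits.argmax()].copy()
        theta = np.array([rotate(t, b, best_bits) for t, b in zip(theta, pop)])
    return best_bits, best_fit

# toy usage on OneMax instead of a knapsack instance
print(gqa(lambda x: x.sum()))
```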


Proceedings ArticleDOI
16 Jul 2000
TL;DR: In this article, the authors define and execute a quantitative MOEA performance comparison methodology and present selected results from its execution with four MOEAs.
Abstract: Solving optimization problems with multiple (often conflicting) objectives is generally a quite difficult goal. Evolutionary algorithms (EAs) were initially extended and applied during the mid-eighties in an attempt to stochastically solve problems of this generic class. During the past decade a multiplicity of multiobjective EA (MOEA) techniques have been proposed and applied to many scientific and engineering applications. Our discussion's intent is to rigorously define and execute a quantitative MOEA performance comparison methodology. Almost all comparisons cited in the current literature visually compare algorithmic results, resulting in only relative conclusions. Our methodology gives a basis for absolute conclusions regarding MOEA performance. Selected results from its execution with four MOEAs are presented and described.

481 citations
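
The methodology rests on quantitative metrics rather than visual inspection of the resulting fronts. Purely as an illustration (not necessarily part of the authors' metric suite), generational distance measures how far an obtained front lies from a known true Pareto front:

```python
import numpy as np

def generational_distance(obtained, true_front):
    """Average distance from each obtained point to its nearest true Pareto point."""
    obtained = np.asarray(obtained, dtype=float)
    true_front = np.asarray(true_front, dtype=float)
    dists = [np.min(np.linalg.norm(true_front - p, axis=1)) for p in obtained]
    return float(np.mean(dists))

# toy usage with two objective vectors per set
print(generational_distance([[1.0, 2.1], [2.0, 1.2]], [[1.0, 2.0], [2.0, 1.0]]))
```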


BookDOI
20 Nov 2000
TL;DR: The focus is on fitness evaluation, constraint-handling techniques, population structures, advanced techniques in evolutionary computation, and the implementation of evolutionary algorithms.
Abstract: Evolutionary Computation 2: Advanced Algorithms and Operators expands upon the basic ideas underlying evolutionary algorithms. The focus is on fitness evaluation, constraint-handling techniques, population structures, advanced techniques in evolutionary computation, and the implementation of evolutionary algorithms. It is intended to be used by individual researchers and students in the expanding field of evolutionary computation.

462 citations


Journal ArticleDOI
TL;DR: It is argued that by studying evolved designs of gradually increasing scale, one might be able to discern new, efficient, and generalisable principles of design, which explain how to build systems which are too large to evolve.
Abstract: In a previous work it was argued that by studying evolved designs of gradually increasing scale, one might be able to discern new, efficient, and generalisable principles of design. These ideas are tested in the context of designing digital circuits, particularly arithmetic circuits. This process of discovery is seen as a principle extraction loop in which the evolved data is analysed both phenotypically and genotypically by processes of data mining and landscape analysis. The information extracted is then fed back into the evolutionary algorithm to enhance its search capabilities and hence increase the likelihood of identifying new principles which explain how to build systems which are too large to evolve.

405 citations


Proceedings ArticleDOI
16 Jul 2000
TL;DR: A memetic algorithm for tackling multiobjective optimization problems is presented that employs the proven local search strategy used in the Pareto archived evolution strategy (PAES) and combines it with the use of a population and recombination.
Abstract: A memetic algorithm for tackling multiobjective optimization problems is presented. The algorithm employs the proven local search strategy used in the Pareto archived evolution strategy (PAES) and combines it with the use of a population and recombination. Verification of the new M-PAES (memetic PAES) algorithm is carried out by testing it on a set of multiobjective 0/1 knapsack problems. On each problem instance, a comparison is made between the new memetic algorithm, the (1+1)-PAES local searcher, and the strength Pareto evolutionary algorithm (SPEA) of E. Zitzler and L. Thiele (1998, 1999).

377 citations
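
The (1+1)-PAES local search at the core of M-PAES accepts or rejects a mutated solution by Pareto dominance against an archive of non-dominated solutions. Below is a hedged sketch of that acceptance logic; the grid-based crowding used by the real PAES is replaced here by a simple size cap.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def archive_accept(candidate, archive, max_size=100):
    """Insert candidate into the non-dominated archive; return (accepted, archive)."""
    if any(dominates(a, candidate) for a in archive):
        return False, archive                       # candidate is dominated: reject
    archive = [a for a in archive if not dominates(candidate, a)]
    archive.append(candidate)
    if len(archive) > max_size:                     # PAES uses grid crowding here;
        archive.pop(0)                              # this sketch simply drops the oldest
    return True, archive

# toy usage with two-objective vectors
ok, arc = archive_accept((1.0, 2.0), [(2.0, 2.0), (0.5, 3.0)])
print(ok, arc)
```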


Book ChapterDOI
TL;DR: A modified XCS classifier system is described that learns a non-linear real-vector classification task.
Abstract: Classifier systems have traditionally taken binary strings as inputs, yet in many real problems such as data inference, the inputs have real components. A modified XCS classifier system is described that learns a non-linear real-vector classification task.

350 citations
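
Real-input classifier systems of this kind (e.g., Wilson's XCSR) typically replace the ternary conditions with one interval predicate per input dimension; a classifier matches an input if every component falls inside its interval. A sketch assuming a centre-spread encoding, which is one common choice:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class IntervalCondition:
    centers: List[float]   # one center per input dimension
    spreads: List[float]   # one spread per input dimension

    def matches(self, x: List[float]) -> bool:
        """True if every input component lies within [center - spread, center + spread]."""
        return all(c - s <= xi <= c + s
                   for c, s, xi in zip(self.centers, self.spreads, x))

# toy usage: a 2-D condition and two inputs
cond = IntervalCondition(centers=[0.5, 0.3], spreads=[0.2, 0.1])
print(cond.matches([0.6, 0.35]))   # True
print(cond.matches([0.9, 0.35]))   # False
```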


Journal ArticleDOI
TL;DR: EP has better recognition performance than PCA (eigenfaces) and better generalization abilities than the Fisher linear discriminant (Fisherfaces).
Abstract: Introduces evolutionary pursuit (EP) as an adaptive representation method for image encoding and classification. In analogy to projection pursuit, EP seeks to learn an optimal basis for the dual purpose of data compression and pattern classification. It should increase the generalization ability of the learning machine as a result of seeking the trade-off between minimizing the empirical risk encountered during training and narrowing the confidence interval for reducing the guaranteed risk during testing. It therefore implements strategies characteristic of GA for searching the space of possible solutions to determine the optimal basis. It projects the original data into a lower dimensional whitened principal component analysis (PCA) space. Directed random rotations of the basis vectors in this space are searched by GA, where evolution is driven by a fitness function defined by performance accuracy (empirical risk) and class separation (confidence interval). Accuracy indicates the extent to which learning has been successful, while separation gives an indication of expected fitness. The method has been tested on face recognition using a greedy search algorithm. To assess both accuracy and generalization capability, the data includes for each subject images acquired at different times or under different illumination conditions. EP has better recognition performance than PCA (eigenfaces) and better generalization abilities than the Fisher linear discriminant (Fisherfaces).

343 citations


Journal ArticleDOI
TL;DR: An algorithm that uses an estimate of the joint distribution of promising solutions to generate new candidate solutions and is able to solve all but one of the tested problems in linear or close-to-linear time with respect to the problem size.
Abstract: This paper proposes an algorithm that uses an estimation of the joint distribution of promising solutions in order to generate new candidate solutions. The algorithm is settled into the context of genetic and evolutionary computation and the algorithms based on the estimation of distributions. The proposed algorithm is called the Bayesian Optimization Algorithm (BOA). To estimate the distribution of promising solutions, the techniques for modeling multivariate data by Bayesian networks are used. The BOA identifies, reproduces, and mixes building blocks up to a specified order. It is independent of the ordering of the variables in strings representing the solutions. Moreover, prior information about the problem can be incorporated into the algorithm, but it is not essential. First experiments were done with additively decomposable problems with both nonoverlapping as well as overlapping building blocks. The proposed algorithm is able to solve all but one of the tested problems in linear or close to linear time with respect to the problem size. Except for the maximal order of interactions to be covered, the algorithm does not use any prior knowledge about the problem. The BOA represents a step toward alleviating the problem of identifying and mixing building blocks correctly to obtain good solutions for problems with very limited domain information.

Proceedings ArticleDOI
22 Jan 2000
TL;DR: A novel immune network model is proposed with the main goals of clustering and filtering unlabelled numerical data sets, and its trade-offs against artificial neural networks used for unsupervised learning are discussed.
Abstract: This paper explores basic aspects of the immune system and proposes a novel immune network model with the main goals of clustering and filtering unlabelled numerical data sets. It is not our concern to reproduce with confidence any immune phenomenon, but to show that immune concepts can be used to develop powerful computational tools for data processing. As an important result of our model, the evolved network will be capable of reducing redundancy and describing the data structure, including the shape of the clusters. The network will be implemented in association with a statistical inference technique, and its performance will be illustrated using two benchmark problems. The paper concludes with a discussion of the trade-offs between the proposed network and artificial neural networks used to perform unsupervised learning.

Journal ArticleDOI
TL;DR: The system is based on a new encoding scheme that describes a dress in three parts (body and neck, sleeve, and skirt), and experiments with several human subjects show that the IGA approach to a dress design aid system is promising.

Proceedings ArticleDOI
16 Jul 2000
TL;DR: The most important preference handling approaches used with evolutionary algorithms, analyzing their advantages and disadvantages are reviewed, and then, some of the potential areas of future research in this discipline are proposed.
Abstract: Despite the relatively high volume of research conducted on evolutionary multiobjective optimization in the last few years, little attention has been paid to the decision making process that is required to select a final solution to the multiobjective optimization problem at hand. This paper reviews the most important preference handling approaches used with evolutionary algorithms, analyzing their advantages and disadvantages, and then proposes some potential areas of future research in this discipline.

Journal ArticleDOI
TL;DR: A new numerical search algorithm efficient enough to allow full circuit simulation of each circuit candidate, and robust enough to find good solutions for difficult circuits is developed.
Abstract: Analog synthesis tools have traditionally traded quality for speed, substituting simplified circuit evaluation methods for full simulation in order to accelerate the numerical search for solution candidates. As a result, these tools have failed to migrate into mainstream use primarily because of difficulties in reconciling the simplified models required for synthesis with the industrial-strength simulation environments required for validation. We argue that for synthesis to be practical, it is essential to synthesize a circuit using the same simulation environment created to validate the circuit. In this paper, we develop a new numerical search algorithm efficient enough to allow full circuit simulation of each circuit candidate, and robust enough to find good solutions for difficult circuits. The method combines the population-of-solutions ideas from evolutionary algorithms with a novel variant of pattern search, and supports transparent network parallelism. Comparison of several synthesized cell-level circuits against manual industrial designs demonstrates the utility of the approach.

Journal ArticleDOI
TL;DR: This paper examines recent developments in the field of evolutionary computation for manufacturing optimization with a wide range of problems, from job shop and flow shop scheduling, to process planning and assembly line balancing.
Abstract: The use of intelligent techniques in the manufacturing field has been growing over the last decades because most manufacturing optimization problems are combinatorial and NP-hard. This paper examines recent developments in the field of evolutionary computation for manufacturing optimization. Significant papers in various areas are highlighted, and comparisons of results are given wherever data are available. A wide range of problems is covered, from job shop and flow shop scheduling to process planning and assembly line balancing.

Journal ArticleDOI
TL;DR: A remarkable property of LEM is that it is capable of quantum leaps ("insight jumps") in the fitness function, unlike Darwinian-type evolution that typically proceeds through numerous slight improvements.
Abstract: A new class of evolutionary computation processes is presented, called Learnable Evolution Model or LEM. In contrast to Darwinian-type evolution that relies on mutation, recombination, and selection operators, LEM employs machine learning to generate new populations. Specifically, in Machine Learning mode, a learning system seeks reasons why certain individuals in a population (or a collection of past populations) are superior to others in performing a designated class of tasks. These reasons, expressed as inductive hypotheses, are used to generate new populations. A remarkable property of LEM is that it is capable of quantum leaps ("insight jumps") in the fitness function, unlike Darwinian-type evolution that typically proceeds through numerous slight improvements. In our early experimental studies, LEM significantly outperformed evolutionary computation methods used in the experiments, sometimes achieving speed-ups of two or more orders of magnitude in terms of the number of evolutionary steps. LEM has a potential for a wide range of applications, in particular, in such domains as complex optimization or search problems, engineering design, drug design, evolvable hardware, software engineering, economics, data mining, and automatic programming.
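
A hedged sketch of the Machine Learning mode: a learner is trained to separate high-fitness from low-fitness individuals, and new candidates are kept only if the learned hypothesis labels them "high". The decision tree from scikit-learn is purely a stand-in for the AQ-style rule learner used in LEM, and the rejection-sampling generation step is a simplification.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def lem_step(population, fitness, n_candidates=2000, top=0.3, rng=None):
    """One Machine-Learning-mode step of a LEM-style loop (illustrative sketch)."""
    rng = rng or np.random.default_rng(0)
    fits = np.array([fitness(x) for x in population])
    cut_hi, cut_lo = np.quantile(fits, 1 - top), np.quantile(fits, top)
    X = np.array(population)
    y = np.where(fits >= cut_hi, 1, np.where(fits <= cut_lo, 0, -1))
    mask = y >= 0                                  # keep only clearly high/low groups
    learner = DecisionTreeClassifier(max_depth=4).fit(X[mask], y[mask])
    # generate candidates and keep those the learned hypothesis labels "high"
    cand = rng.uniform(X.min(0), X.max(0), size=(n_candidates, X.shape[1]))
    keep = cand[learner.predict(cand) == 1]
    # fall back to the best current individuals if nothing is labelled "high"
    return keep[:len(population)] if len(keep) else X[np.argsort(fits)[-len(population):]]

# toy usage: maximise -sum(x^2) over a random initial population
rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(60, 4))
new_pop = lem_step(pop, lambda x: -np.sum(x ** 2), rng=rng)
print(new_pop.shape)
```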

Journal ArticleDOI
TL;DR: The field of evolutionary computation is one of the fastest growing areas of computer science and engineering for just this reason; it is addressing many problems that were previously beyond reach, such as rapid design of medicines, flexible solutions to supply-chain management problems, and rapid analysis of battlefield tactics for defense as mentioned in this paper.
Abstract: Taking a page from Darwin's 'On the origin of the species', computer scientists have found ways to evolve solutions to complex problems. Harnessing the evolutionary process within a computer provides a means for addressing complex engineering problems-ones involving chaotic disturbances, randomness, and complex nonlinear dynamics-that traditional algorithms have been unable to conquer. Indeed, the field of evolutionary computation is one of the fastest growing areas of computer science and engineering for just this reason; it is addressing many problems that were previously beyond reach, such as rapid design of medicines, flexible solutions to supply-chain management problems, and rapid analysis of battlefield tactics for defense. Potentially, the field may fulfil the dream of artificial intelligence: a computer that can learn on its own and become an expert in any chosen area.

Proceedings ArticleDOI
16 Jul 2000
TL;DR: Four abstract evolutionary algorithms for multi-objective optimization and theoretical results that characterize their convergence behavior are presented, and it is easy to verify whether or not a particular instantiation of these abstract evolutionary algorithms offers the desired limit behavior.
Abstract: We present four abstract evolutionary algorithms for multi-objective optimization and theoretical results that characterize their convergence behavior. Thanks to these results it is easy to verify whether or not a particular instantiation of these abstract evolutionary algorithms offers the desired limit behavior. Several examples are given.

Journal ArticleDOI
TL;DR: A new extension of EP/N computes a safe-optimum path of a ship in given static and dynamic environments, and a safe trajectory of the ship in a collision situation is determined on the basis of this algorithm.
Abstract: For a given circumstance (i.e., a collision situation at sea), a decision support system for navigation should help the operator to choose a proper manoeuvre, teach him good habits, and enhance his general intuition on how to behave in similar situations in the future. By taking into account certain boundaries of the maneuvering region along with information on navigation obstacles and other moving ships, the problem of avoiding collisions is reduced to a dynamic optimization task with static and dynamic constraints. This paper presents experiments with a modified version of the Evolutionary Planner/Navigator (EP/N). Its new version, ϑEP/N++, is a major component of such a decision support system. This new extension of EP/N computes a safe-optimum path of a ship in given static and dynamic environments. A safe trajectory of the ship in a collision situation is determined on the basis of this algorithm. The introduction of a time parameter, the variable speed of the ship, and time-varying constraints representing moving ships are the main features of the new system. Sample results of ship trajectories obtained for typical navigation situations are presented.

01 Jan 2000
TL;DR: This paper describes a new approach to music composition, more precisely the composition of rhythms, by means of IEC, to combine Genetic Algorithms and Genetic Programming, which can generate attractive musical rhythms.
Abstract: Interactive Evolutionary Computation (IEC), i.e., Evolutionary Computation whose fitness function is provided by the user, has been applied to esthetic areas such as art, design, and music. Fitness functions cannot necessarily be defined explicitly in these areas. With IEC, however, we can embed the user's implicit preference into the optimization system. This paper describes a new approach to music composition, more precisely the composition of rhythms, by means of IEC. The main feature of our method is to combine Genetic Algorithms (GA) and Genetic Programming (GP). In our system, GA individuals represent short pieces of rhythmic patterns, while GP individuals express how these patterns are arranged in terms of their functions. Both populations are evolved interactively through the user's evaluation. The integration of interactive GA and GP makes it possible to search for musical structures effectively in the vast search space. In this paper, we show how our proposed method can generate attractive musical rhythms. The effectiveness of our system is demonstrated by the evolved rhythm phrases, which are available from our web site as sound files.

Proceedings ArticleDOI
08 Oct 2000
TL;DR: A new evolutionary system for evolving artificial feedforward neural networks, which is based on the particle swarm optimisation (PSO) algorithm, which shows that ANNs evolved by PSONN have good accuracy and generalisation ability.
Abstract: The information processing capability of artificial neural networks (ANNs) is closely related to their architecture and weights. The paper describes a new evolutionary system for evolving artificial feedforward neural networks, which is based on the particle swarm optimisation (PSO) algorithm. Both the architecture and the weights of the ANNs are adaptively adjusted according to the quality of the neural network. This process is repeated until the best ANN is accepted or the maximum number of generations has been reached. A strategy of evolving added nodes and a partial training algorithm are used to maintain a close behavioural link between the parents and their offspring. This system has been tested on two real problems in the medical domain. The results show that ANNs evolved by PSONN have good accuracy and generalisation ability.
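
A hedged sketch of the weight-evolution part only: a plain global-best PSO over the flattened weight vector of a small fixed feedforward network. The paper's architecture adaptation and partial-training strategy are omitted, and XOR stands in for the medical data sets.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_in=2, n_hid=4):
    """Tiny 2-layer feedforward net; w is a flat weight vector of length 17."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid:n_in * n_hid + 2 * n_hid]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def pso_train(X, y, dim, particles=30, iters=200, inertia=0.7, c1=1.5, c2=1.5):
    """Evolve network weights with a plain global-best PSO (sketch)."""
    pos = rng.uniform(-1, 1, (particles, dim))
    vel = np.zeros((particles, dim))
    def loss(p): return np.mean((forward(p, X) - y) ** 2)
    pbest = pos.copy(); pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((particles, dim)), rng.random((particles, dim))
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# toy usage: learn XOR with a 2-4-1 network (2*4 + 4 + 4 + 1 = 17 weights)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
weights, err = pso_train(X, y, dim=17)
print("final MSE:", err)
```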

Proceedings ArticleDOI
16 Jul 2000
TL;DR: A unified model of multi-objective evolutionary algorithms, in which arbitrary variation and selection operators can be combined as building blocks, including archiving and re-insertion strategies is presented.
Abstract: Though it has been claimed that elitism could improve evolutionary multi-objective search significantly, a thorough and extensive evaluation of its effects is still missing. Guidelines on how elitism could successfully be incorporated have not yet been developed. This paper presents a unified model of multi-objective evolutionary algorithms, in which arbitrary variation and selection operators can be combined as building blocks, including archiving and re-insertion strategies. The presented model enables most specific multi-objective evolutionary algorithms to be formulated as instances of it, as will be demonstrated by simple examples. We further show how elitism can be quantified by the model's parameters and how this allows an easy evaluation of the effect of elitism on different algorithms.
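
A hedged sketch of how archiving and re-insertion act as the elitism "knobs" in such a unified loop: an archive keeps the non-dominated solutions found so far, and a tunable fraction of parents is drawn from it each generation. The operators and the elite fraction below are illustrative placeholders, not the paper's formal model.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimisation of objective tuples."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(solutions, evaluate):
    objs = [evaluate(s) for s in solutions]
    return [s for i, s in enumerate(solutions)
            if not any(dominates(objs[j], objs[i])
                       for j in range(len(solutions)) if j != i)]

def elitist_loop(evaluate, mutate, pop_size=20, generations=30, elite_fraction=0.5):
    """Skeleton of an elitist MOEA: the archive and the re-insertion of archive
    members into the mating pool are the tunable 'elitism' components."""
    population = [[random.random() for _ in range(2)] for _ in range(pop_size)]
    archive = []
    for _ in range(generations):
        archive = nondominated(archive + population, evaluate)   # archiving step
        n_elite = int(elite_fraction * pop_size)                 # "amount" of elitism
        parents = random.sample(archive, min(n_elite, len(archive)))
        parents += random.sample(population, pop_size - len(parents))
        population = [mutate(p) for p in parents]                # variation step
    return archive

# toy usage: a two-objective trade-off with a Gaussian mutation clamped to [0, 1]
f = lambda s: (s[0] ** 2 + s[1] ** 2, (s[0] - 1) ** 2 + s[1] ** 2)
m = lambda s: [min(1.0, max(0.0, v + random.gauss(0, 0.05))) for v in s]
print(len(elitist_loop(f, m)))
```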

Proceedings Article
10 Jul 2000
TL;DR: It is found that the evolutionary algorithm will converge incorrectly if the approximate model has false optima, so two strategies to control the evolution process are introduced and methods to eliminate false minima in neural network training are proposed.
Abstract: The evaluation of the quality of solutions is usually very time-consuming in design optimization. Therefore, time-efficient approximate models can be particularly beneficial for the evaluation when evolutionary algorithms are applied. In this paper, the convergence property of an evolution strategy (ES) with neural network based fitness evaluations is investigated. It is found that the evolutionary algorithm will converge incorrectly if the approximate model has false optima. To address this problem, two strategies to control the evolution process are introduced. In addition, methods to eliminate false minima in neural network training are proposed. The effectiveness of the methods is shown with simulation studies on the Ackley function and the Rosenbrock function.
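
A hedged sketch of evolution control with an approximate fitness model: each generation, only a controlled fraction of offspring is re-evaluated with the true (expensive) function, and those exact evaluations also refine the model. A nearest-neighbour interpolator stands in for the paper's neural network, and the Sphere function stands in for the real design problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fitness(x):                         # the expensive evaluation, Sphere as a stand-in
    return float(np.sum(x ** 2))

class NearestNeighbourSurrogate:
    """Crude stand-in for the neural-network fitness approximation."""
    def __init__(self):
        self.X, self.y = [], []
    def update(self, x, y):
        self.X.append(np.asarray(x)); self.y.append(y)
    def predict(self, x):
        d = [np.linalg.norm(x - xi) for xi in self.X]
        return self.y[int(np.argmin(d))]

def controlled_es(dim=5, mu=5, lam=20, generations=40, control_fraction=0.3, sigma=0.3):
    """(mu, lambda)-ES where only a controlled fraction of offspring gets true evaluations."""
    model = NearestNeighbourSurrogate()
    parents = [rng.normal(0, 1, dim) for _ in range(mu)]
    for p in parents:
        model.update(p, true_fitness(p))
    for _ in range(generations):
        offspring = [parents[rng.integers(mu)] + sigma * rng.normal(0, 1, dim)
                     for _ in range(lam)]
        fits = [model.predict(o) for o in offspring]            # cheap approximate fitness
        n_control = max(1, int(control_fraction * lam))
        for i in np.argsort(fits)[:n_control]:                  # re-evaluate the best ones
            fits[i] = true_fitness(offspring[i])                # with the true function and
            model.update(offspring[i], fits[i])                 # refine the model with them
        parents = [offspring[i] for i in np.argsort(fits)[:mu]]
    return min(true_fitness(p) for p in parents)

print("best true fitness:", controlled_es())
```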

Book
01 Aug 2000
TL;DR: This book provides a characterization of the roles that recombination and mutation play in evolutionary algorithms and introduces new theoretical techniques for studying evolutionary algorithms.
Abstract: Despite decades of work in evolutionary algorithms, there remains an uncertainty as to the relative benefits and detriments of using recombination or mutation. This book provides a characterization of the roles that recombination and mutation play in evolutionary algorithms. It integrates important prior work and introduces new theoretical techniques for studying evolutionary algorithms. Consequences of the theory are explored and a novel method for comparing search and optimization algorithms is introduced. The focus allows the book to bridge multiple communities, including evolutionary biologists and population geneticists.

Journal ArticleDOI
TL;DR: This paper generalizes the reference classes of fitness distance correlation and epistasis variance, and constructs a new predictive measure that is insensitive to nonlinear fitness scaling, and investigates the relations between the reference Classes of the measures and a number of intuitively easy classes.
Abstract: This paper studies a number of predictive measures of problem difficulty, among which epistasis variance and fitness distance correlation are the most widely known. Our approach is based on comparing the reference class of a measure to a number of known easy function classes. First, we generalize the reference classes of fitness distance correlation and epistasis variance, and construct a new predictive measure that is insensitive to nonlinear fitness scaling. We then investigate the relations between the reference classes of the measures and a number of intuitively easy classes. We also point out the need to further identify which functions are easy for a given class of evolutionary algorithms in order to design more efficient hardness indicators for them. We finally restrict attention to the genetic algorithm (GA), consider both GA-easy and GA-hard fitness functions, and give experimental evidence that the values of the measures, based on random samples, can be completely unreliable and entirely uncorrelated to the convergence quality and convergence speed of GA instances using either proportional or ranking selection.
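
As an illustration of one of these measures, fitness distance correlation is the sample correlation between the fitness of randomly drawn solutions and their distance to the nearest global optimum. A minimal sketch for bit strings with Hamming distance:

```python
import numpy as np

def fitness_distance_correlation(samples, fitnesses, optima):
    """Sample correlation between fitness and Hamming distance to the nearest optimum."""
    samples = np.asarray(samples)
    optima = np.asarray(optima)
    d = np.array([min(np.sum(s != o) for o in optima) for s in samples])
    f = np.asarray(fitnesses, dtype=float)
    return float(np.corrcoef(f, d)[0, 1])

# toy usage: OneMax, whose single optimum is the all-ones string
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 20))
fdc = fitness_distance_correlation(X, X.sum(axis=1), [np.ones(20, dtype=int)])
print(fdc)   # close to -1: fitness rises as distance to the optimum falls
```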

Proceedings ArticleDOI
16 Jul 2000
TL;DR: A novel commonality-based crossover operator is introduced and placed in the multiobjective evolutionary setting, which helps to preserve building blocks with promising performance in feature selection.
Abstract: Feature selection is a common and key problem in many classification and regression tasks. It can be viewed as a multiobjective optimisation problem, since, in the simplest case, it involves feature subset size minimisation and performance maximisation. This paper presents a multiobjective evolutionary approach for feature selection. A novel commonality-based crossover operator is introduced and placed in the multiobjective evolutionary setting. This specialised operator helps to preserve building blocks with promising performance. Selection bias reduction is achieved by resampling. We argue that this is a generic approach, which can be used in many modelling problems. It is applied to feature selection on different neural network architectures. Results from experiments with benchmarking data sets are given.
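
A hedged sketch of the commonality idea for a feature-subset mask: bits on which both parents agree are copied to the offspring, and only the disputed bits are filled in at random. The paper's operator may resolve the differing bits differently; this shows only the general principle.

```python
import random

def commonality_crossover(parent_a, parent_b, p_one=0.5):
    """Preserve feature-selection bits shared by both parents; randomise the rest."""
    child = []
    for a, b in zip(parent_a, parent_b):
        if a == b:
            child.append(a)                      # commonality: keep the shared decision
        else:
            child.append(1 if random.random() < p_one else 0)
    return child

# toy usage on 8-feature masks
random.seed(0)
print(commonality_crossover([1, 0, 1, 1, 0, 0, 1, 0],
                            [1, 1, 1, 0, 0, 0, 1, 1]))
```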

Book ChapterDOI
18 Sep 2000
TL;DR: The framework named IDEA is introduced to formalize the notion that the estimation of densities over selected samples and the sampling from the resulting distributions is a combination of the recombination and mutation steps used in evolutionary algorithms.
Abstract: The direct application of statistics to stochastic optimization based on iterated density estimation has become more important and more prevalent in evolutionary computation over the last few years. The estimation of densities over selected samples and the sampling from the resulting distributions is a combination of the recombination and mutation steps used in evolutionary algorithms. We introduce the framework named IDEA to formalize this notion. By combining continuous probability theory with techniques from existing algorithms, this framework allows us to define new continuous evolutionary optimization algorithms.
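
A hedged sketch of one iterated density-estimation instantiation for continuous variables, using a single full-covariance Gaussian as the density model (the IDEA framework itself admits much richer factorisations):

```python
import numpy as np

def gaussian_idea(fitness, dim=5, pop_size=100, select=30, generations=60, seed=0):
    """Iterated density estimation with a full-covariance Gaussian (sketch, minimisation)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (pop_size, dim))
    for _ in range(generations):
        fits = np.array([fitness(x) for x in pop])
        elite = pop[np.argsort(fits)[:select]]               # selected samples
        mean = elite.mean(axis=0)                            # estimate the density ...
        cov = np.cov(elite, rowvar=False) + 1e-6 * np.eye(dim)
        pop = rng.multivariate_normal(mean, cov, pop_size)   # ... and sample from it
    fits = np.array([fitness(x) for x in pop])
    return pop[fits.argmin()], fits.min()

# toy usage: minimise the Sphere function
x_best, f_best = gaussian_idea(lambda x: float(np.sum(x ** 2)))
print(f_best)
```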

Journal ArticleDOI
TL;DR: An overview of evolutionary computation as applied to problems in the medical domains is provided, outlining the basic workings of six types of evolutionary algorithms: genetic algorithms, genetic programming, evolution strategies, evolutionary programming, classifier systems, and hybrid systems.

Proceedings ArticleDOI
24 Apr 2000
TL;DR: An evolutionary algorithm is used to evolve gaits with the Sony entertainment robot, AIBO, and by sculpting the experimental environment, the robustness to different surface types and different AIBOs is increased.
Abstract: An evolutionary algorithm is used to evolve gaits with the Sony entertainment robot, AIBO. All processing is handled by the robot's on-board computer with individuals evaluated using the robot's hardware. By sculpting the experimental environment, we increase the robustness to different surface types and different AIBOs. Evolved gaits are faster than those created by hand. Using this technique we evolve a gait used in the consumer version of AIBO.