Showing papers on "Evolutionary programming" published in 1994


Journal ArticleDOI
TL;DR: The development of each of these procedures over the past 35 years is described and some recent efforts in these areas are reviewed.
Abstract: Natural evolution is a population-based optimization process. Simulating this process on a computer results in stochastic optimization techniques that can often outperform classical methods of optimization when applied to difficult real-world problems. There are currently three main avenues of research in simulated evolution: genetic algorithms, evolution strategies, and evolutionary programming. Each method emphasizes a different facet of natural evolution. Genetic algorithms stress chromosomal operators. Evolution strategies emphasize behavioral changes at the level of the individual. Evolutionary programming stresses behavioral change at the level of the species. The development of each of these procedures over the past 35 years is described. Some recent efforts in these areas are reviewed.
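To make the contrast concrete, a minimal evolutionary-programming step can be sketched as below: every parent is mutated (no crossover) and survivors are chosen from the combined parent-offspring pool. This is an illustrative toy in Python with a fixed mutation step and truncation survival, not any specific scheme from the paper.

```python
import random

def ep_generation(pop, fitness, sigma=0.1):
    """One simplified EP step: each parent produces one offspring by Gaussian
    mutation (no crossover); the best half of parents+offspring survives."""
    offspring = [[x + random.gauss(0.0, sigma) for x in ind] for ind in pop]
    combined = pop + offspring
    combined.sort(key=fitness)          # minimization
    return combined[:len(pop)]

# toy usage: minimize the sphere function
sphere = lambda v: sum(x * x for x in v)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for _ in range(200):
    pop = ep_generation(pop, sphere)
print(min(sphere(ind) for ind in pop))
```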

1,549 citations


Journal ArticleDOI
TL;DR: It is argued that genetic algorithms are inappropriate for network acquisition and an evolutionary program is described, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks.
Abstract: Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
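As a rough illustration of evolving structure and weights together, the sketch below applies both parametric and structural mutation to a network encoded as a node list plus a weighted edge dictionary. The encoding, probabilities, and temperature scaling are assumptions for illustration, not GNARL's actual representation or operators.

```python
import random

def mutate(net, temp=1.0):
    """Parametric + structural mutation on a network encoded as
    {'nodes': [ids], 'edges': {(src, dst): weight}}.  Severity of both kinds
    of change is scaled by a temperature-like parameter (illustrative only)."""
    new = {'nodes': list(net['nodes']), 'edges': dict(net['edges'])}
    # parametric mutation: perturb every weight
    for edge in new['edges']:
        new['edges'][edge] += random.gauss(0.0, 0.05 * temp)
    # structural mutation: occasionally add a node or a (possibly recurrent) edge
    if random.random() < 0.3 * temp:
        new['nodes'].append(max(new['nodes']) + 1)
    if random.random() < 0.5 * temp and len(new['nodes']) > 1:
        src, dst = random.choice(new['nodes']), random.choice(new['nodes'])
        new['edges'].setdefault((src, dst), random.gauss(0.0, 1.0))
    return new

net = {'nodes': [0, 1, 2], 'edges': {(0, 1): 0.5, (1, 2): -0.3, (2, 1): 0.8}}
print(mutate(net))
```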

1,092 citations


Book ChapterDOI
11 Apr 1994
TL;DR: The EP selection model is shown to be equivalent to an ES model in one form, and surprisingly similar to fitness proportionate selection in another; generational models are shown to be remarkably immune to evaluation noise, while models that retain parents are much less so.
Abstract: Selection methods in Evolutionary Algorithms, including Genetic Algorithms, Evolution Strategies (ES) and Evolutionary Programming (EP), are compared by observing the rate of convergence on three idealised problems. The first considers selection only, the second introduces mutation as a source of variation, the third also adds in evaluation noise. Fitness proportionate selection suffers from scaling problems: a number of techniques to reduce these are illustrated. The sampling errors caused by roulette wheel and tournament selection are demonstrated. The EP selection model is shown to be equivalent to an ES model in one form, and surprisingly similar to fitness proportionate selection in another. Generational models are shown to be remarkably immune to evaluation noise, models that retain parents much less so.
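For reference, the two selection schemes whose sampling errors the chapter demonstrates can be sketched as follows; this is a generic illustration rather than the chapter's experimental setup.

```python
import random

def roulette(pop, fitness):
    """Fitness-proportionate (roulette-wheel) selection: sampling probability
    is proportional to raw non-negative fitness, hence the scaling problems
    discussed in the chapter (maximization assumed)."""
    fits = [fitness(ind) for ind in pop]
    total = sum(fits)
    r = random.uniform(0.0, total)
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def tournament(pop, fitness, q=2):
    """q-way tournament selection: pick q individuals at random and return
    the best; insensitive to fitness scaling."""
    return max(random.sample(pop, q), key=fitness)
```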

149 citations


Book ChapterDOI
09 Oct 1994
TL;DR: This paper is devoted to the construction of a mutation distribution for unbounded integer search spaces using the principle of maximum entropy to select a specific distribution from numerous potential candidates.
Abstract: The mutation distribution of evolutionary algorithms usually is oriented at the type of the search space. Typical examples are binomial distributions for binary strings in genetic algorithms or normal distributions for real valued vectors in evolution strategies and evolutionary programming. This paper is devoted to the construction of a mutation distribution for unbounded integer search spaces. The principle of maximum entropy is used to select a specific distribution from numerous potential candidates. The resulting evolutionary algorithm is tested for five nonlinear integer problems.
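One standard way to realise such a mutation on unbounded integers is to add the difference of two i.i.d. geometric variables, which gives a symmetric distribution with geometrically decaying tails. The sketch below follows that construction with a fixed, illustrative parameter; the paper derives the distribution and its parameterisation from the maximum-entropy principle.

```python
import random

def geometric(p):
    """Number of failures before the first success; support {0, 1, 2, ...}."""
    k = 0
    while random.random() >= p:
        k += 1
    return k

def integer_mutation(x, p=0.3):
    """Mutate an integer vector by adding Z = G1 - G2, the difference of two
    i.i.d. geometric variables (symmetric over the unbounded integers).
    The fixed p is an illustrative choice, not the paper's derived setting."""
    return [xi + geometric(p) - geometric(p) for xi in x]

print(integer_mutation([0, 4, -2]))
```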

138 citations



Journal ArticleDOI
TL;DR: It is speculated that the stochastic training method implemented in this study for training recurrent perceptrons can be used to train perceptron networks that have radically recurrent architectures.
Abstract: Evolutionary programming, a systematic multi-agent stochastic search technique, is used to generate recurrent perceptrons (nonlinear IIR filters). A hybrid optimization scheme is proposed that embeds a single-agent stochastic search technique, the method of Solis and Wets, into the evolutionary programming paradigm. The proposed hybrid optimization approach is further augmented by "blending" randomly selected parent vectors to create additional offspring. The first part of this work investigates the performance of the suggested hybrid stochastic search method. After demonstration on the Bohachevsky and Rosenbrock response surfaces, the hybrid stochastic optimization approach is applied in determining both the model order and the coefficients of recurrent perceptron time-series models. An information criterion is used to evaluate each recurrent perceptron structure as a candidate solution. It is speculated that the stochastic training method implemented in this study for training recurrent perceptrons can be used to train perceptron networks that have radically recurrent architectures.
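Two ingredients of the hybrid can be sketched generically: creating extra offspring by "blending" two randomly selected parents, and an embedded single-agent local step that accepts only improving perturbations. The local step here is a much-simplified stand-in for the Solis and Wets method, which additionally adapts a bias vector and its step size.

```python
import random

def blend(parent_a, parent_b):
    """Create an extra offspring as a random convex combination of two parents."""
    w = random.random()
    return [w * a + (1.0 - w) * b for a, b in zip(parent_a, parent_b)]

def local_search(x, f, steps=20, sigma=0.1):
    """Greedy single-agent random search: keep a perturbation only if it
    improves f.  Simplified stand-in for the Solis and Wets method."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [xi + random.gauss(0.0, sigma) for xi in best]
        if f(cand) < best_f:
            best, best_f = cand, f(cand)
    return best
```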

130 citations


Journal ArticleDOI
TL;DR: The basic convergence properties of evolutionary optimization algorithms are investigated; the analysis indicates that the methods studied asymptotically converge to global optima, while genetic algorithms may prematurely stagnate at solutions that may not even be locally optimal.
Abstract: The basic convergence properties of evolutionary optimization algorithms are investigated. Analysis indicates that the methods studied will asymptotically converge to global optima. The results also indicate that genetic algorithms may prematurely stagnate at solutions that may not even be locally optimal. Function optimization experiments are conducted that illustrate the mathematical properties. Evolutionary programming is seen to outperform genetic algorithms in searching two response surfaces that do not possess local optima. The results are statistically significant.

117 citations


Journal ArticleDOI
TL;DR: Recent claims that genetic models using recombination operators, specifically crossover, are typically more efficient and effective at function optimization than behavioral models relying solely on mutation are assessed empirically on a broad range of response surfaces.
Abstract: There has been renewed interest in using simulated evolution to address difficult optimization problems. These simulations can be divided into two groups: (1) those that model chromosomes and emphasize genetic operators; and (2) those that model individuals or populations and emphasize the adaptation and diversity of behavior. Recent claims have suggested that genetic models using recombination operators, specifically crossover, are typically more efficient and effective at function optimization than behavioral models that rely solely on mutation. These claims are assessed empirically on a broad range of response surfaces.

103 citations


Book ChapterDOI
09 Oct 1994
TL;DR: A class of specialised mutation operators for use in conjunction with the commonly employed penalty-function-based EA approach to timetabling is presented, showing significant improvement in performance over a range of real and realistic problems.
Abstract: Researchers are turning more and more to evolutionary algorithms (EAs) as a flexible and effective technique for addressing timetabling problems in their institutions. We present a class of specialised mutation operators for use in conjunction with the commonly employed penalty function based EA approach to timetabling which shows significant improvement in performance over a range of real and realistic problems. We also discuss the use of delta evaluation, an obvious and recommended technique which speeds up the implementation of the approach, and leads to a more pertinent measure of speed than the commonly used ‘number of evaluations’. A suite of realistically difficult benchmark timetabling problems is described and made available for use in comparative research.
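Delta evaluation rests on the observation that moving a single event can only change the penalty terms involving that event, so the whole timetable need not be re-scored after each mutation. A minimal sketch under assumed data structures (events assigned to timeslots, with a set of clashing event pairs) is shown below.

```python
def full_penalty(assignment, clashes):
    """assignment: {event: slot}; clashes: set of frozenset({e1, e2}) pairs
    that must not share a slot.  Count violated pairs."""
    return sum(1 for pair in clashes
               if len({assignment[e] for e in pair}) == 1)

def delta_penalty(assignment, clashes, event, new_slot):
    """Change in penalty if `event` moves to `new_slot`: only pairs containing
    `event` can change, so only those are re-checked."""
    old_slot, delta = assignment[event], 0
    for pair in clashes:
        if event in pair:
            other = next(e for e in pair if e != event)
            delta -= (assignment[other] == old_slot)   # violation removed
            delta += (assignment[other] == new_slot)   # violation created
    return delta
```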

95 citations


01 Jan 1994
TL;DR: An analysis is given of a model of genetic programming dynamics that is supportive of the “Soft Brood Selection” conjecture, which was proposed as a means to counteract the emergence of highly conservative code, and instead favor highly evolvable code.
Abstract: Evolutionary computation systems exhibit various emergent phenomena, primary of which is adaptation. In genetic programming, because of the indeterminate nature of the representation, the evolution of both recombination distributions and representations can emerge from the population dynamics. A review of ideas on these phenomena is presented, including theory on the evolution of evolvability through differential proliferation of subexpressions within programs. An analysis is given of a model of genetic programming dynamics that is supportive of the “Soft Brood Selection” conjecture, which was proposed as a means to counteract the emergence of highly conservative code, and instead favor highly evolvable code.
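Soft brood selection, as usually described, has each mating produce a brood of offspring that compete on a cheap partial evaluation, with only the winner entering the population. The sketch below uses plain list genomes and a supplied crossover as a stand-in for GP parse trees; it illustrates the mechanism, not this paper's analytical model.

```python
import random

def soft_brood_select(parent_a, parent_b, crossover, fitness_cases,
                      brood_size=5, sample=3):
    """Produce a brood of offspring from one mating, score each on a small
    random sample of fitness cases, and keep only the best (lowest error)."""
    brood = [crossover(parent_a, parent_b) for _ in range(brood_size)]
    cases = random.sample(fitness_cases, min(sample, len(fitness_cases)))
    return min(brood, key=lambda child: sum(case(child) for case in cases))

# toy usage: genomes are lists of floats, fitness cases are per-case error terms
one_point = lambda a, b: a[:len(a) // 2] + b[len(b) // 2:]
cases = [lambda g, t=t: (sum(g) - t) ** 2 for t in (1.0, 2.0, 3.0)]
child = soft_brood_select([0.1, 0.2, 0.3, 0.4], [1.0, 1.0, 1.0, 1.0], one_point, cases)
print(child)
```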

79 citations


Journal ArticleDOI
TL;DR: The construction of a new hybrid optimization system, Genocop II, is discussed and experimental results on a few test cases (nonlinear programming problems) are presented.

Journal ArticleDOI
TL;DR: Experiments indicate that evolutionary programming generally outperforms genetic algorithms and compares well with the General Algebraic Modeling System (GAMS), a numerical optimization software package.
Abstract: Evolutionary programming is a stochastic optimization procedure that can be applied to difficult combinatorial problems. Experiments are conducted with three standard optimal control problems (linear-quadratic, harvest, and push-cart). The results are compared to those obtained with genetic algorithms and the General Algebraic Modeling System (GAMS), a numerical optimization software package. The results indicate that evolutionary programming generally outperforms genetic algorithms. Evolutionary programming also compares well with GAMS on certain problems for which GAMS is specifically designed and outperforms GAMS on other problems. The computational requirements for each procedure are briefly discussed.
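To illustrate how such a control problem maps onto EP, the sketch below evolves an open-loop control sequence for a generic discrete-time linear-quadratic problem. The dynamics, cost weights, and EP settings (fixed mutation step, truncation survival) are assumptions for illustration, not necessarily the benchmark instances or parameters used in the paper.

```python
import random

def lq_cost(u, x0=1.0, q=1.0, r=1.0):
    """Quadratic cost of an open-loop control sequence u for the scalar toy
    system x[k+1] = x[k] + u[k] (assumed instance, not the paper's benchmark)."""
    x, cost = x0, 0.0
    for uk in u:
        cost += q * x * x + r * uk * uk
        x = x + uk
    return cost + q * x * x  # terminal cost

def evolve(horizon=10, pop_size=30, gens=300, sigma=0.2):
    """Simplified EP: Gaussian mutation of each control sequence, then keep
    the best half of parents+offspring."""
    pop = [[random.uniform(-1, 1) for _ in range(horizon)] for _ in range(pop_size)]
    for _ in range(gens):
        offspring = [[u + random.gauss(0.0, sigma) for u in ind] for ind in pop]
        pool = pop + offspring
        pool.sort(key=lq_cost)
        pop = pool[:pop_size]
    return pop[0], lq_cost(pop[0])

best, cost = evolve()
print(round(cost, 4))
```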


Book ChapterDOI
09 Oct 1994
TL;DR: This paper extends the conventional representation of Genetic Algorithms by using automata to allow the adaptation of the crossover operator probability as the run progresses, in order to facilitate schema identification and reduce schema disruption.
Abstract: Genetic Algorithms (GAs) have traditionally required the specification of a number of parameters that control the evolutionary process. In the classical model, the mutation and crossover operator probabilities are specified before the start of a GA run and remain unchanged; a so-called static model. This paper extends the conventional representation by using automata in order to allow the adaptation of the crossover operator probability as the run progresses in order to facilitate schema identification and reduce schema disruption. Favourable results have been achieved for a wide range of function minimization problems and these are described.
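The paper's adaptation is driven by automata attached to the representation; those details are not reproduced here. As a generic illustration of the underlying idea of rewarding crossover when it produces improving offspring, a simple reinforcement-style update might look like the following (the rule and constants are assumptions, not the paper's scheme).

```python
def update_crossover_prob(p_c, child_fitness, parent_fitness,
                          reward=0.02, penalty=0.01,
                          lower=0.1, upper=0.95):
    """Nudge the crossover probability up when a crossover-produced child
    beats its better parent, down otherwise (maximization assumed).
    Illustrative rule only; the paper adapts p_c via automata."""
    if child_fitness > parent_fitness:
        p_c += reward
    else:
        p_c -= penalty
    return min(upper, max(lower, p_c))
```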

Journal ArticleDOI
TL;DR: The use of evolutionary programming for computer-aided design and testing of neural controllers applied to problems in which the system to be controlled is highly uncertain.
Abstract: This paper discusses the use of evolutionary programming (EP) for computer-aided design and testing of neural controllers applied to problems in which the system to be controlled is highly uncertain. Examples include closed-loop control of drug infusion and integrated control of HVAC/lighting/utility systems in large multi-use buildings. The method is described in detail and applied to a modified Cerebellar Model Arithmetic Computer (CMAC) neural network regulator for systems with unknown time delays. The design and testing problem is viewed as a game, in that the controller is chosen with a minimax criterion, i.e., minimizing the loss associated with its use on the worst possible plant. The technique permits analysis of neural strategies against a set of feasible plants. This yields both the best choice of control parameters and identification of the plant that is most difficult for the best controller to handle.
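The minimax criterion translates directly into a fitness function: a controller's score is its loss on the worst member of the feasible plant set. A hedged sketch, with the loss function and plant set left as placeholders, is shown below.

```python
def minimax_fitness(controller, plants, loss):
    """Score a controller by its worst-case loss over a finite set of feasible
    plants; EP then minimizes this score.  `loss(controller, plant)` is assumed
    to simulate the closed loop and return a scalar."""
    return max(loss(controller, plant) for plant in plants)

def worst_plant(controller, plants, loss):
    """Identify the plant that is hardest for a given controller."""
    return max(plants, key=lambda plant: loss(controller, plant))
```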

Journal ArticleDOI
TL;DR: Results presented show the success of evolutionary programming in solving an example of a fractal inverse problem, but indicate that a genetic algorithm is not as successful.
Abstract: Over the past 30 years, algorithms that model natural evolution have generated robust search methods. These so-called evolutionary algorithms have been successfully applied to a wide range of problems. This paper discusses two types of evolutionary algorithms and their application to a problem in shape representation. Genetic algorithms and evolutionary programming, although both based on evolutionary principles, each place different emphasis on what drives the evolutionary process. While genetic algorithms rely on mimicking specific genotypic transformations, evolutionary programming emphasizes phenotypic adaptation. Results presented show the success of evolutionary programming in solving an example of a fractal inverse problem, but indicate that a genetic algorithm is not as successful. Reasons for this disparity are discussed.
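The fractal inverse problem asks for a set of affine contractions whose attractor approximates a target shape. A candidate iterated function system (IFS) can be scored by rendering its attractor with the chaos game and comparing the occupied grid cells with those of the target; the grid resolution, rendering length, and error measure below are illustrative choices, not the paper's.

```python
import random

def render_attractor(maps, n_points=20000, grid=64):
    """Chaos game: repeatedly apply a randomly chosen affine map
    (a, b, c, d, e, f) meaning (x, y) -> (a*x + b*y + e, c*x + d*y + f),
    recording visited cells of a [0,1]^2 grid."""
    x = y = 0.5
    cells = set()
    for i in range(n_points):
        a, b, c, d, e, f = random.choice(maps)
        x, y = a * x + b * y + e, c * x + d * y + f
        if i > 100 and 0.0 <= x < 1.0 and 0.0 <= y < 1.0:   # skip transient
            cells.add((int(x * grid), int(y * grid)))
    return cells

def ifs_error(maps, target_cells, grid=64):
    """Symmetric-difference error between the rendered attractor and the
    target shape's occupied cells (smaller is better)."""
    return len(render_attractor(maps, grid=grid) ^ target_cells)

# toy usage: score the Sierpinski-triangle IFS against itself
sierpinski = [(0.5, 0, 0, 0.5, 0.0, 0.0),
              (0.5, 0, 0, 0.5, 0.5, 0.0),
              (0.5, 0, 0, 0.5, 0.25, 0.5)]
target = render_attractor(sierpinski)
print(ifs_error(sierpinski, target))
```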

Proceedings ArticleDOI
27 Jun 1994
TL;DR: The results indicate the suitability of using EP to evolve neurocontrollers for these two systems, in which the objective is to bring the systems into balance.
Abstract: Evolutionary programming (EP) is a stochastic optimization technique that can be used to train neural networks. Unlike many training algorithms, EP does not require gradient information, and this facet increases the applicability of the procedure. The current investigation focuses on evolving neurocontrollers for two difficult nonlinear unstable systems. In the first, two separate poles of varying length are mounted on a cart. In the second, two jointed poles of varying length are mounted on a cart. The objective is to bring the systems into balance. The results indicate the suitability of using EP to evolve neurocontrollers for these two systems.

Journal ArticleDOI
TL;DR: The learning algorithm machinery for finding optimal representations is independent of the definition of optimality, and thus provides a general tool useful in a wide variety of contexts.
Abstract: We have developed a procedure for finding optimal representations of experimental data. Criteria for optimality vary according to context; an optimal state space representation will be one that best suits one’s stated goal for reconstruction. We consider an ∞-dimensional set of possible reconstruction coordinate systems that include time delays, derivatives, and many other possible coordinates; and any optimality criterion is specified as a real valued functional on this space. We present a method for finding the optima using a learning algorithm based upon the genetic algorithm and evolutionary programming. The learning algorithm machinery for finding optimal representations is independent of the definition of optimality, and thus provides a general tool useful in a wide variety of contexts.

Journal ArticleDOI
TL;DR: This paper reviews some of the early development of the method and focuses on three current avenues of research: pattern discovery, system identification and automatic control.
Abstract: Evolutionary programming was originally proposed in 1962 as an alternative method for generating machine intelligence. This paper reviews some of the early development of the method and focuses on three current avenues of research: pattern discovery, system identification and automatic control. Recent efforts along these lines are described. In addition, the application of evolutionary algorithms to autonomous system design on parallel processing computers is briefly discussed.

Book ChapterDOI
09 Oct 1994
TL;DR: An appropriate GA representation scheme is developed, called EP-I for Evolutionary Programming with Introns, which demonstrates that the changes introduced by GP do not have any properties beyond those of a canonical GA for program induction.
Abstract: This paper studies Genetic Programming (GP) and its relation to the Genetic Algorithm (GA). GP uses a GA approach to breed successive populations of programs, represented in the chromosomes as parse trees, until a program that solves the problem emerges. However, parse trees are not naturally homologous, consequently changes had to be introduced into GP. To better understand these changes it would be instructive if a canonical GA could also be used to perform program induction. To this end an appropriate GA representation scheme is developed (called EP-I for Evolutionary Programming with Introns). EP-I has been tested on three problems and performed identically to GP, thus demonstrating that the changes introduced by GP do not have any properties beyond those of a canonical GA for program induction. EP-I is also able to simulate GP exactly thus gaining further insights into the nature of GP as a GA.

Proceedings Article
John R. Koza
01 Jan 1994
TL;DR: The genetically evolved program is an instance of an algorithm discovered by an automated learning paradigm that is superior to that written by human investigators.
Abstract: The recently-developed genetic programming paradigm is used to evolve a computer program to classify a given protein segment as being a transmembrane domain or non-transmembrane area of the protein. Genetic programming starts with a primordial ooze of randomly generated computer programs composed of available programmatic ingredients and then genetically breeds the population of programs using the Darwinian principle of survival of the fittest and an analog of the naturally occurring genetic operation of crossover (sexual recombination). Automatic function definition enables genetic programming to dynamically create subroutines during the run. Genetic programming is given a training set of differently-sized protein segments and their correct classification (but no biochemical knowledge, such as hydrophobicity values). Correlation is used as the fitness measure to drive the evolutionary process. The best genetically-evolved program achieves an out-of-sample correlation of 0.968 and an out-of-sample error rate of 1.6%. This error rate is better than those reported for four other algorithms presented at the First International Conference on Intelligent Systems for Molecular Biology. Our genetically evolved program is an instance of an algorithm discovered by an automated learning paradigm that is superior to those written by human investigators.
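Using correlation as the fitness measure amounts to scoring each program by the correlation between its binary predictions and the true labels, computed from the confusion matrix (the standard binary correlation formula). A minimal sketch, not Koza's implementation, is given below.

```python
import math

def correlation_fitness(predictions, labels):
    """Correlation between binary predictions and true labels, computed from
    the confusion matrix; +1 is perfect, 0 is chance, -1 is perfectly wrong."""
    tp = sum(p and l for p, l in zip(predictions, labels))
    tn = sum((not p) and (not l) for p, l in zip(predictions, labels))
    fp = sum(p and (not l) for p, l in zip(predictions, labels))
    fn = sum((not p) and l for p, l in zip(predictions, labels))
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(correlation_fitness([1, 0, 1, 1], [1, 0, 0, 1]))
```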

Proceedings ArticleDOI
27 Jun 1994
TL;DR: Modal mutation schemes for evolutionary algorithms as a generalization of the breeder genetic algorithm mutation scheme are introduced and analyzed for multimodal continuous parameter optimization problems.
Abstract: In this paper, modal mutation schemes for evolutionary algorithms, a generalization of the breeder genetic algorithm mutation scheme, are introduced and analyzed for multimodal continuous parameter optimization problems. A new scaling rule for multiple mutations is formalized and compared with a new step-size scaling for evolution strategies. A performance comparison of the multivalued evolutionary algorithm with modal mutations with recently published results concerning the performance of Bayesian/sampling and very fast simulated reannealing techniques for global optimization is given.
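The breeder genetic algorithm mutation being generalized here perturbs a selected parameter by a random sum of inverse powers of two, scaled by a per-parameter range and given a random sign. The sketch below implements the commonly published form of that operator; the modal-mutation schemes introduced in the paper are not reproduced.

```python
import random

def bga_mutate(x, mutation_range, k=16, p_mut=None):
    """Breeder-GA-style mutation (commonly published form): each selected
    parameter i is shifted by +/- mutation_range[i] * sum(alpha_j * 2**-j),
    with each alpha_j set to 1 with probability 1/k."""
    n = len(x)
    p_mut = p_mut if p_mut is not None else 1.0 / n
    out = list(x)
    for i in range(n):
        if random.random() < p_mut:
            delta = sum(2.0 ** -j for j in range(k) if random.random() < 1.0 / k)
            sign = 1.0 if random.random() < 0.5 else -1.0
            out[i] += sign * mutation_range[i] * delta
    return out

print(bga_mutate([0.0, 1.0], [0.1, 0.1]))
```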

Journal Article
TL;DR: It is proposed that evolutionary computation be understood more like computational evolution, i.e., to use the evolutionary process for construction, rather than for optimization, and to construct intelligence as adaptive behavior based on artificial neural networks.
Abstract: This paper proposes that evolutionary computation be understood more like computational evolution, i.e., to use the evolutionary process for construction, rather than for optimization. For this purpose simulation of the evolutionary process should include a non-linear developmental process from genotype to phenotype. In this developmental process the environment has an important role. In order to model the developmental process under the influence of the environment, a new modeling language is introduced. The focus of this language is on the interactions, which are considered to be the basic elements for environmental adaptation. The developed modeling method provides a complete simulation environment for the construction of organisms. The developed system aims to construct intelligence as adaptive behavior based on artificial neural networks.


Proceedings ArticleDOI
01 Jan 1994
TL;DR: Three distinct computational intelligence paradigms, namely neural-net computing, evolutionary programming and fuzzy logic, combine to support the task of process monitoring and optimization.
Abstract: Three distinct computational intelligence paradigms combine to support the task of process monitoring and optimization. These are neural-net computing, evolutionary programming and fuzzy logic. We describe briefly some of our contributions to these paradigms and outline how they function in process monitoring and optimization. Four different types of monitoring tasks are considered. This type of combined computational intelligence is being applied successfully to optimal process planning in electric power utilities. Examples of these include heat rate improvement and NOx minimization at some Western Pennsylvania and Western New York State utilities.

Journal ArticleDOI
01 Aug 1994
TL;DR: Using principles from Complexity Science, key features of system architecture for evolutionary system development are defined; these features may provide the adaptive flexibility observed in naturally occurring systems.
Abstract: Evolutionary system development is an alternative to the waterfall development model. An ongoing system architecture activity is a consistent feature of evolutionary development models. Using principles from Complexity Science, key features of system architecture for an evolutionary system development are defined. These principles may provide the adaptive flexibility observed in naturally occurring systems.


Proceedings ArticleDOI
21 Mar 1994
TL;DR: This paper presents a new approach that uses evolutionary programming (EP) to estimate synchronous generator parameters, in contrast to the extended Kalman filter currently used for parameter estimation.
Abstract: This paper presents a new approach that uses evolutionary programming (EP) to estimate synchronous generator parameters. EP is powerful in searching for the true parameter values from measurements of generator outputs that are highly contaminated by noise, in contrast to the extended Kalman filter currently used for parameter estimation. A comparison between the two methodologies is given to show the potential of applying EP to parameter estimation and system modelling.
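The estimation task reduces to minimizing the mismatch between measured generator outputs and outputs simulated with candidate parameter values, with EP searching the parameter space using only that mismatch. A generic sketch with a placeholder first-order model (not the paper's generator model) is shown below.

```python
def estimation_error(params, simulate, inputs, measured):
    """Sum of squared errors between noisy measurements and the model output
    produced by simulate(params, inputs); EP minimizes this over params."""
    predicted = simulate(params, inputs)
    return sum((m - p) ** 2 for m, p in zip(measured, predicted))

# toy usage with a first-order response y[k+1] = a*y[k] + b*u[k] (placeholder model)
def simulate(params, inputs, y0=0.0):
    a, b = params
    y, out = y0, []
    for u in inputs:
        y = a * y + b * u
        out.append(y)
    return out

inputs = [1.0] * 10
measured = simulate((0.8, 0.5), inputs)          # stand-in for noisy measurements
print(estimation_error((0.7, 0.4), simulate, inputs, measured))
```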

Jörn Hopf
01 Jan 1994
TL;DR: In this article, the response to selection equation and the concept of heritability can be applied to predict the behavior of the BGA under the assumption of additive gene effects, and advanced statistical techniques for estimating the heritability are used to analyze and control the BGA.
Abstract: The Breeder Genetic Algorithm BGA models artificial selection as performed by human breeders. We show how the response to selection equation and the concept of heritability can be applied to predict the behavior of the BGA. The theoretical results are obtained under the assumption of additive gene effects. For general fitness landscapes advanced statistical techniques for estimating the heritability are used to analyze and control the BGA.
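For reference, the response to selection equation referred to here is commonly written as follows, with $\bar{f}(t)$ the population mean fitness, $S(t)$ the selection differential of the chosen parents, and $b_t$ the realized heritability:

```latex
R(t) = \bar{f}(t+1) - \bar{f}(t), \qquad
S(t) = \bar{f}_{\mathrm{sel}}(t) - \bar{f}(t), \qquad
R(t) = b_t \, S(t)
```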

Dissertation
01 Jan 1994
TL;DR: This thesis examines a hybrid symbolic/subsymbolic approach and the application of evolutionary algorithms to a problem from each of the fields of shape representation, natural language dialogue, and speech recognition; EP appears to offer a particularly powerful means of finding solutions to the shape representation problem.
Abstract: For many years research in artificial intelligence followed a symbolic paradigm which required a level of knowledge described in terms of rules. More recently subsymbolic approaches have been adopted as a suitable means for studying many problems. There are many search mechanisms which can be used to manipulate subsymbolic components, and in recent years general search methods based on models of natural evolution have become increasingly popular. This thesis examines a hybrid symbolic/subsymbolic approach and the application of evolutionary algorithms to a problem from each of the fields of shape representation (finding an iterated function system for an arbitrary shape), natural language dialogue (tuning parameters so that a particular behaviour can be achieved) and speech recognition (selecting the penalties used by a dynamic programming algorithm in creating a word lattice). These problems were selected on the basis that each should involve fundamentally different interactions at the subsymbolic level. Results demonstrate that for the experiments conducted the evolutionary algorithms performed well in most cases. However, the type of subsymbolic interaction that may occur influences the relative performance of evolutionary algorithms which emphasise either top-down (evolutionary programming - EP) or bottom-up (genetic algorithm - GA) means of solution discovery. For the shape representation problem EP is seen to perform significantly better than a GA, and reasons for this disparity are discussed. Furthermore, EP appears to offer a powerful means of finding solutions to this problem, and so the background and details of the problem are discussed at length. Some novel constraints on the problem's search space are also presented which could be used in related work. For the dialogue and speech recognition problems a GA and EP produce good results with EP performing slightly better. Results achieved with EP have been used to improve the performance of a speech recognition system.