
Showing papers on "Genetic algorithm" published in 1999


Journal ArticleDOI
TL;DR: The problem features that may cause a multi-objective genetic algorithm (GA) difficulty in converging to the true Pareto-optimal front are studied to enable researchers to test their algorithms for specific aspects of multi-objective optimization.
Abstract: In this paper, we study the problem features that may cause a multi-objective genetic algorithm (GA) difficulty in converging to the true Pareto-optimal front. Identification of such features helps us develop difficult test problems for multi-objective optimization. Multi-objective test problems are constructed from single-objective optimization problems, thereby allowing known difficult features of single-objective problems (such as multi-modality, isolation, or deception) to be directly transferred to the corresponding multi-objective problem. In addition, test problems having features specific to multi-objective optimization are also constructed. More importantly, these difficult test problems will enable researchers to test their algorithms for specific aspects of multi-objective optimization.
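
The construction described above (two objectives built from single-objective components f1, g and h) can be sketched in a few lines; the particular f1, g and h chosen below form one ZDT1-style instance and are assumptions for illustration, not the paper's exact test suite.

```python
import numpy as np

def two_objective_problem(x, f1, g, h):
    """Generic construction: minimize (f1(x1), g(x2..xn) * h(f1, g))."""
    v1 = f1(x[0])
    gv = g(x[1:])
    return v1, gv * h(v1, gv)

# One concrete instance (ZDT1-style); the g-function controls convergence
# difficulty and h controls the shape of the Pareto-optimal front.
f1 = lambda x1: x1
g  = lambda xr: 1.0 + 9.0 * np.mean(xr)
h  = lambda v1, gv: 1.0 - np.sqrt(v1 / gv)

x = np.random.rand(30)            # decision vector in [0, 1]^30
print(two_objective_problem(x, f1, g, h))
```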

1,439 citations


Journal ArticleDOI
TL;DR: A taxonomy that classifies 27 scheduling algorithms and their functionalities into different categories is proposed, with each algorithm explained through an easy-to-understand description followed by an illustrative example to demonstrate its operation.
Abstract: Static scheduling of a program represented by a directed task graph on a multiprocessor system to minimize the program completion time is a well-known problem in parallel processing. Since finding an optimal schedule is an NP-complete problem in general, researchers have resorted to devising efficient heuristics. A plethora of heuristics have been proposed based on a wide spectrum of techniques, including branch-and-bound, integer-programming, searching, graph-theory, randomization, genetic algorithms, and evolutionary methods. The objective of this survey is to describe various scheduling algorithms and their functionalities in a contrasting fashion as well as examine their relative merits in terms of performance and time-complexity. Since these algorithms are based on diverse assumptions, they differ in their functionalities, and hence are difficult to describe in a unified context. We propose a taxonomy that classifies these algorithms into different categories. We consider 27 scheduling algorithms, with each algorithm explained through an easy-to-understand description followed by an illustrative example to demonstrate its operation. We also outline some of the novel and promising optimization approaches and current research trends in the area. Finally, we give an overview of the software tools that provide scheduling/mapping functionalities.

1,373 citations


Journal ArticleDOI
01 Jun 1999
TL;DR: A novel hybrid genetic algorithm that finds a globally optimal partition of given data into a specified number of clusters by hybridizing the GA with a classical gradient descent algorithm used in clustering, viz. the K-means algorithm.
Abstract: In this paper, we propose a novel hybrid genetic algorithm (GA) that finds a globally optimal partition of given data into a specified number of clusters. GAs used earlier in clustering employ either an expensive crossover operator to generate valid child chromosomes from parent chromosomes or a costly fitness function or both. To circumvent these expensive operations, we hybridize GA with a classical gradient descent algorithm used in clustering, viz. the K-means algorithm. Hence the name genetic K-means algorithm (GKA). We define the K-means operator, one step of the K-means algorithm, and use it in GKA as a search operator instead of crossover. We also define a biased mutation operator specific to clustering called distance-based mutation. Using finite Markov chain theory, we prove that the GKA converges to the global optimum. It is observed in the simulations that GKA converges to the best known optimum corresponding to the given data in concurrence with the convergence result. It is also observed that GKA searches faster than some of the other evolutionary algorithms used for clustering.
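
A rough sketch of the two operators the abstract names, assuming a label-per-point chromosome; the exact biasing used in GKA's distance-based mutation may differ from the simple weighting used here.

```python
import numpy as np

def centroids(data, labels, k):
    """Cluster centres implied by a label chromosome (empty clusters reuse a random point)."""
    cents = np.empty((k, data.shape[1]))
    for j in range(k):
        members = data[labels == j]
        cents[j] = members.mean(axis=0) if len(members) else data[np.random.randint(len(data))]
    return cents

def kmeans_operator(data, labels, k):
    """One K-means step used in place of crossover: reassign each point to its nearest centre."""
    cents = centroids(data, labels, k)
    d = np.linalg.norm(data[:, None, :] - cents[None, :, :], axis=2)
    return d.argmin(axis=1)

def distance_based_mutation(data, labels, k, pm=0.05):
    """Reassign a few points, favouring nearby clusters (one plausible biasing scheme)."""
    cents = centroids(data, labels, k)
    out = labels.copy()
    for i in np.where(np.random.rand(len(labels)) < pm)[0]:
        d = np.linalg.norm(cents - data[i], axis=1)
        w = (d.max() - d) + 1e-9          # closer clusters get larger weight
        out[i] = np.random.choice(k, p=w / w.sum())
    return out

# usage on toy data
data = np.random.rand(100, 2)
labels = np.random.randint(0, 3, size=100)
labels = distance_based_mutation(data, kmeans_operator(data, labels, 3), 3)
```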

1,326 citations


Proceedings Article
13 Jul 1999
TL;DR: Preliminary experiments show that the BOA outperforms the simple genetic algorithm even on decomposable functions with tight building blocks as the problem size grows.
Abstract: In this paper, an algorithm based on the concepts of genetic algorithms that uses an estimation of a probability distribution of promising solutions in order to generate new candidate solutions is proposed. To estimate the distribution, techniques for modeling multivariate data by Bayesian networks are used. The proposed algorithm identifies, reproduces and mixes building blocks up to a specified order. It is independent of the ordering of the variables in the strings representing the solutions. Moreover, prior information about the problem can be incorporated into the algorithm. However, prior information is not essential. Preliminary experiments show that the BOA outperforms the simple genetic algorithm even on decomposable functions with tight building blocks as a problem size grows.
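
The overall loop (select promising solutions, fit a probabilistic model, sample new candidates) is sketched below; for brevity the model is a univariate product of bit marginals (UMDA-style), a deliberate simplification of the Bayesian network that the BOA actually learns.

```python
import numpy as np

def eda(fitness, n_bits, pop_size=100, n_select=50, generations=60):
    """Minimal estimation-of-distribution loop (univariate model as a stand-in
    for the Bayesian network the BOA actually learns)."""
    pop = np.random.randint(0, 2, size=(pop_size, n_bits))
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(fit)[-n_select:]]                   # truncation selection
        p = parents.mean(axis=0).clip(1 / n_bits, 1 - 1 / n_bits)    # fit marginal model
        pop = (np.random.rand(pop_size, n_bits) < p).astype(int)     # sample new candidates
    return pop[np.argmax([fitness(ind) for ind in pop])]

onemax = lambda ind: ind.sum()
print(eda(onemax, n_bits=40))
```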

1,073 citations


Journal ArticleDOI
TL;DR: The compact genetic algorithm (cGA) is introduced which represents the population as a probability distribution over the set of solutions and is operationally equivalent to the order-one behavior of the simple GA with uniform crossover.
Abstract: Introduces the compact genetic algorithm (cGA) which represents the population as a probability distribution over the set of solutions and is operationally equivalent to the order-one behavior of the simple GA with uniform crossover. It processes each gene independently and requires less memory than the simple GA. The development of the compact GA is guided by a proper understanding of the role of the GA's parameters and operators. The paper clearly illustrates the mapping of the simple GA's parameters into those of an equivalent compact GA. Computer simulations compare both algorithms in terms of solution quality and speed. Finally, this work raises important questions about the use of information in a genetic algorithm, and its ramifications show us a direction that can lead to the design of more efficient GAs.
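
The cGA update rule is simple enough to sketch directly: hold a probability vector, sample two individuals, and nudge each probability by 1/n toward the winner wherever the two differ. The OneMax fitness and virtual population size below are illustrative choices.

```python
import random

def compact_ga(fitness, n_bits, virtual_pop=100):
    """Compact GA: the population is represented as a probability vector updated in 1/n steps."""
    p = [0.5] * n_bits
    while any(0.0 < pi < 1.0 for pi in p):
        a = [1 if random.random() < pi else 0 for pi in p]
        b = [1 if random.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        for i in range(n_bits):                     # shift p toward the winner
            if winner[i] != loser[i]:
                p[i] += (1.0 / virtual_pop) if winner[i] == 1 else -(1.0 / virtual_pop)
                p[i] = min(1.0, max(0.0, p[i]))
        # loop ends once every p[i] has saturated at 0 or 1
    return [int(round(pi)) for pi in p]

print(compact_ga(sum, n_bits=20))   # OneMax as a toy fitness
```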

1,049 citations


Book
18 Aug 1999
TL;DR: Although Michael D. Vose describes the SGA in terms of heuristic search, the book is not about search or optimization perse.
Abstract: From the Publisher: The Simple Genetic Algorithm (SGA) is a classical form of genetic search. Viewing the SGA as a mathematical object, Michael D. Vose provides an introduction to what is known (i.e., proven) about the theory of the SGA. He also makes available algorithms for the computation of mathematical objects related to the SGA.. "Although he describes the SGA in terms of heuristic search, the book is not about search or optimization perse. Rather, the focus is on the SGA as an evolutionary system.

828 citations


Journal ArticleDOI
TL;DR: The information in an ensemble generated by a Monte Carlo direct search method is used to guide a resampling of the parameter space, from which measures of resolution and trade-off in the model parameters can be obtained without further solving of the forward problem.
Abstract: Monte Carlo direct search methods, such as genetic algorithms, simulated annealing etc., are often used to explore a finite dimensional parameter space. They require the solving of the forward problem many times, that is, making predictions of observables from an earth model. The resulting ensemble of earth models represents all 'information' collected in the search process. Search techniques have been the subject of much study in geophysics; less attention is given to the appraisal of the ensemble. Often inferences are based on only a small subset of the ensemble, and sometimes a single member. This paper presents a new approach to the appraisal problem. To our knowledge this is the first time the general case has been addressed, that is, how to infer information from a complete ensemble, previously generated by any search method. The essence of the new approach is to use the information in the available ensemble to guide a resampling of the parameter space. This requires no further solving of the forward problem, but from the new 'resampled' ensemble we are able to obtain measures of resolution and trade-off in the model parameters, or any combinations of them. The new ensemble inference algorithm is illustrated on a highly non-linear waveform inversion problem. It is shown how the computation time and memory requirements scale with the dimension of the parameter space and size of the ensemble. The method is highly parallel, and may easily be distributed across several computers. Since little is assumed about the initial ensemble of earth models, the technique is applicable to a wide variety of situations. For example, it may be applied to perform 'error analysis' using the ensemble generated by a genetic algorithm, or any other direct search method. Key words: numerical techniques, receiver functions, waveform inversion.
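
A much-simplified illustration of the appraisal idea, i.e. extracting parameter estimates and trade-off measures from an already-computed ensemble without further forward modelling, is given below; the paper's resampling algorithm is considerably more sophisticated, so treat this only as a toy stand-in.

```python
import numpy as np

def appraise_ensemble(models, misfits, temperature=1.0):
    """Toy appraisal of a stored ensemble: weight each model by its data fit and
    compute weighted mean and covariance (a crude resolution/trade-off measure)
    without re-solving the forward problem."""
    w = np.exp(-(misfits - misfits.min()) / temperature)   # likelihood-like weights
    w /= w.sum()
    mean = w @ models
    centred = models - mean
    cov = centred.T @ (centred * w[:, None])               # weighted covariance
    return mean, cov

# ensemble as produced by any direct search (GA, simulated annealing, ...)
models = np.random.rand(500, 4)                 # 500 earth models, 4 parameters
misfits = ((models - 0.3) ** 2).sum(axis=1)     # stand-in misfit values
mean, cov = appraise_ensemble(models, misfits)
```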

817 citations


Proceedings ArticleDOI
06 Jul 1999
TL;DR: A new way to explore the benefits of a memory while minimizing its negative side effects is derived from a number of approaches that extend the evolutionary algorithm with implicit or explicit memory.
Abstract: Recently, there has been increased interest in evolutionary computation applied to changing optimization problems. The paper surveys a number of approaches that extend the evolutionary algorithm with implicit or explicit memory, suggests a new benchmark problem and examines under which circumstances a memory may be helpful. From these observations, we derive a new way to explore the benefits of a memory while minimizing its negative side effects.
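
One common form of explicit memory in this line of work can be sketched as follows: periodically store the best individual and reinsert the stored solutions over the worst of the population when the environment is assumed to have changed. Memory size, change period and the reproduction scheme below are illustrative assumptions.

```python
import random

def evolve_with_memory(evaluate, random_individual, mutate,
                       pop_size=50, memory_size=10, generations=200, period=25):
    """Toy EA with an explicit memory for a time-dependent fitness `evaluate(ind, t)`.
    Every `period` generations the environment is assumed to change; the stored
    best-so-far individuals are then reinserted over the worst of the population."""
    pop = [random_individual() for _ in range(pop_size)]
    memory = []
    for t in range(generations):
        if t % period == 0 and t > 0 and memory:          # environment just changed
            fit = [evaluate(ind, t) for ind in pop]
            worst = sorted(range(pop_size), key=lambda i: fit[i])[:len(memory)]
            for slot, ind in zip(worst, memory):
                pop[slot] = ind
        fit = [evaluate(ind, t) for ind in pop]
        best = max(range(pop_size), key=lambda i: fit[i])
        memory = (memory + [pop[best]])[-memory_size:]    # remember the current best
        def pick():                                       # naive 2-way tournament reproduction
            i, j = random.randrange(pop_size), random.randrange(pop_size)
            return pop[i] if fit[i] >= fit[j] else pop[j]
        pop = [mutate(pick()) for _ in range(pop_size)]
    return max(pop, key=lambda ind: evaluate(ind, generations - 1))

# usage on a target that jumps between 0.0 and 1.0 every `period` generations
target = lambda t: float((t // 25) % 2)
best = evolve_with_memory(lambda x, t: -abs(x - target(t)),
                          random_individual=random.random,
                          mutate=lambda x: min(1.0, max(0.0, x + random.gauss(0, 0.1))))
```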

750 citations


Journal ArticleDOI
TL;DR: In this article, a hybrid ant colony system coupled with a local search is applied to the quadratic assignment problem, which uses pheromone trail information to perform modifications on QAP solutions.
Abstract: This paper presents HAS–QAP, a hybrid ant colony system coupled with a local search, applied to the quadratic assignment problem. HAS–QAP uses pheromone trail information to perform modifications on QAP solutions, unlike more traditional ant systems that use pheromone trail information to construct complete solutions. HAS–QAP is analysed and compared with some of the best heuristics available for the QAP: two versions of tabu search, namely, robust and reactive tabu search, hybrid genetic algorithm, and a simulated annealing method. Experimental results show that HAS–QAP and the hybrid genetic algorithm perform best on real world, irregular and structured problems due to their ability to find the structure of good solutions, while HAS–QAP performance is less competitive on random, regular and unstructured problems.
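
The central idea, using pheromone trails to modify an existing QAP assignment rather than to build one from scratch, might be sketched as below; the swap rule, parameters and update schedule are simplified assumptions rather than the exact HAS-QAP procedure.

```python
import numpy as np

def qap_cost(perm, flow, dist):
    """QAP objective: facility i is placed at location perm[i]."""
    return (flow * dist[np.ix_(perm, perm)]).sum()

def pheromone_modification(perm, tau, n_swaps=3, q=0.9, rng=np.random):
    """Modify an existing assignment guided by pheromone: tau[f, l] estimates how
    desirable it is to put facility f at location l (simplified HAS-QAP-style move)."""
    perm = perm.copy()
    n = len(perm)
    for _ in range(n_swaps):
        r = rng.randint(n)
        others = [s for s in range(n) if s != r]
        if rng.rand() < q:   # exploit: pick the swap partner the trails like best
            s = max(others, key=lambda s: tau[r, perm[s]] + tau[s, perm[r]])
        else:                # explore: random partner
            s = others[rng.randint(n - 1)]
        perm[r], perm[s] = perm[s], perm[r]
    return perm

# toy instance
n = 8
flow = np.random.randint(0, 10, (n, n)); dist = np.random.randint(1, 10, (n, n))
tau = np.ones((n, n))
perm = np.random.permutation(n)
candidate = pheromone_modification(perm, tau)
print(qap_cost(perm, flow, dist), qap_cost(candidate, flow, dist))
```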

710 citations


Book
25 Feb 1999
TL;DR: A book on genetic algorithms and their applications, including a chapter on genetic algorithms in speech recognition systems and the background of such systems.
Abstract: 1. Introduction, Background and Biological Inspiration.- 1.1 Biological Background.- 1.1.1 Coding of DNA.- 1.1.2 Flow of Genetic Information.- 1.1.3 Recombination.- 1.1.4 Mutation.- 1.2 Conventional Genetic Algorithm.- 1.3 Theory and Hypothesis.- 1.3.1 Schema Theory.- 1.3.2 Building Block Hypothesis.- 1.4 A Simple Example.- 2. Modifications to Genetic Algorithms.- 2.1 Chromosome Representation.- 2.2 Objective and Fitness Functions.- 2.2.1 Linear Scaling.- 2.2.2 Sigma Truncation.- 2.2.3 Power Law Scaling.- 2.2.4 Ranking.- 2.3 Selection Methods.- 2.4 Genetic Operations.- 2.4.1 Crossover.- 2.4.2 Mutation.- 2.4.3 Operational Rates Settings.- 2.4.4 Reordering.- 2.5 Replacement Scheme.- 2.6 A Game of Genetic Creatures.- 2.7 Chromosome Representation.- 2.8 Fitness Function.- 2.9 Genetic Operation.- 2.9.1 Selection Window for Functions and Parameters.- 2.10 Demo and Run.- 3. Intrinsic Characteristics.- 3.1 Parallel Genetic Algorithm.- 3.1.1 Global GA.- 3.1.2 Migration GA.- 3.1.3 Diffusion GA.- 3.2 Multiple Objective.- 3.3 Robustness.- 3.4 Multimodal.- 3.5 Constraints.- 3.5.1 Searching Domain.- 3.5.2 Repair Mechanism.- 3.5.3 Penalty Scheme.- 3.5.4 Specialized Genetic Operations.- 4. Hierarchical Genetic Algorithm.- 4.1 Biological Inspiration.- 4.1.1 Regulatory Sequences and Structural Genes.- 4.1.2 Active and Inactive Genes.- 4.2 Hierarchical Chromosome Formulation.- 4.3 Genetic Operations.- 4.4 Multiple Objective Approach.- 4.4.1 Iterative Approach.- 4.4.2 Group Technique.- 4.4.3 Multiple-Objective Ranking.- 5. Genetic Algorithms in Filtering.- 5.1 Digital IIR Filter Design.- 5.1.1 Chromosome Coding.- 5.1.2 The Lowest Filter Order Criterion.- 5.2 Time Delay Estimation.- 5.2.1 Problem Formulation.- 5.2.2 Genetic Approach.- 5.2.3 Results.- 5.3 Active Noise Control.- 5.3.1 Problem Formulation.- 5.3.2 Simple Genetic Algorithm.- 5.3.3 Multiobjective Genetic Algorithm Approach.- 5.3.4 Parallel Genetic Algorithm Approach.- 5.3.5 Hardware GA Processor.- 6. Genetic Algorithms in H-infinity Control.- 6.1 A Mixed Optimization Design Approach.- 6.1.1 Hierarchical Genetic Algorithm.- 6.1.2 Application I: The Distillation Column Design.- 6.1.3 Application II: Benchmark Problem.- 6.1.4 Design Comments.- 7. Hierarchical Genetic Algorithms in Computational Intelligence.- 7.1 Neural Networks.- 7.1.1 Introduction of Neural Network.- 7.1.2 HGA Trained Neural Network (HGANN).- 7.1.3 Simulation Results.- 7.1.4 Application of HGANN on Classification.- 7.2 Fuzzy Logic.- 7.2.1 Basic Formulation of Fuzzy Logic Controller.- 7.2.2 Hierarchical Structure.- 7.2.3 Application I: Water Pump System.- 7.2.4 Application II: Solar Plant.- 8. Genetic Algorithms in Speech Recognition Systems.- 8.1 Background of Speech Recognition Systems.- 8.2 Block Diagram of a Speech Recognition System.- 8.3 Dynamic Time Warping.- 8.4 Genetic Time Warping Algorithm (GTW).- 8.4.1 Encoding mechanism.- 8.4.2 Fitness function.- 8.4.3 Selection.- 8.4.4 Crossover.- 8.4.5 Mutation.- 8.4.6 Genetic Time Warping with Relaxed Slope Weighting Function (GTW-RSW).- 8.4.7 Hybrid Genetic Algorithm.- 8.4.8 Performance Evaluation.- 8.5 Hidden Markov Model using Genetic Algorithms.- 8.5.1 Hidden Markov Model.- 8.5.2 Training Discrete HMMs using Genetic Algorithms.- 8.5.3 Genetic Algorithm for Continuous HMM Training.- 8.6 A Multiprocessor System for Parallel Genetic Algorithms.- 8.6.1 Implementation.- 8.7 Global GA for Parallel GA-DTW and PGA-HMM.- 8.7.1 Experimental Results of Nonlinear Time-Normalization by the Parallel GA-DTW.- 8.8 Summary.- 9. 
Genetic Algorithms in Production Planning and Scheduling Problems.- 9.1 Background of Manufacturing Systems.- 9.2 ETPSP Scheme.- 9.2.1 ETPSP Model.- 9.2.2 Bottleneck Analysis.- 9.2.3 Selection of Key-Processes.- 9.3 Chromosome Configuration.- 9.3.1 Operational Parameters for GA Cycles.- 9.4 GA Application for ETPSP.- 9.4.1 Case 1: Two-product ETPSP.- 9.4.2 Case 2: Multi-product ETPSP.- 9.4.3 Case 3: MOGA Approach.- 9.5 Concluding Remarks.- 10. Genetic Algorithms in Communication Systems.- 10.1 Virtual Path Design in ATM.- 10.1.1 Problem Formulation.- 10.1.2 Average packet delay.- 10.1.3 Constraints.- 10.1.4 Combination Approach.- 10.1.5 Implementation.- 10.1.6 Results.- 10.2 Mesh Communication Network Design.- 10.2.1 Design of Mesh Communication Networks.- 10.2.2 Network Optimization using GA.- 10.2.3 Implementation.- 10.2.4 Results.- 10.3 Wireless Local Area Network Design.- 10.3.1 Problem Formulation.- 10.3.2 Multiobjective HGA Approach.- 10.3.3 Implementation.- 10.3.4 Results.- Appendix A.- Appendix B.- Appendix C.- Appendix D.- Appendix E.- Appendix F.- References.

626 citations


Journal ArticleDOI
TL;DR: This paper presents a technique that uses a genetic algorithm for automatic test‐data generation, a heuristic that mimics the evolution of natural species in searching for the optimal solution to a problem.
Abstract: This paper presents a technique that uses a genetic algorithm for automatic test-data generation. A genetic algorithm is a heuristic that mimics the evolution of natural species in searching for the optimal solution to a problem. In the test-data generation application, the solution sought by the genetic algorithm is test data that causes execution of a given statement, branch, path, or definition–use pair in the program under test. The test-data-generation technique was implemented in a tool called TGen, in which parallel processing was used to improve the performance of the search. To experiment with TGen, a random test-data generator called Random was also implemented. Both TGen and Random were used to experiment with the generation of test data for statement and branch coverage of six programs. Copyright © 1999 John Wiley & Sons, Ltd.
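
A minimal sketch of GA-based test-data generation for a single target branch, using a branch-distance-style fitness (how far the inputs are from satisfying the branch predicate); the program, operators and parameters are invented for illustration and are much simpler than TGen.

```python
import random

def program_under_test(x, y):
    # target branch for coverage: we want inputs that make this condition true
    if x * 2 == y + 10:
        return "target reached"
    return "other path"

def branch_distance(x, y):
    """0 when the target branch is taken; otherwise how far the predicate is from true."""
    return abs(x * 2 - (y + 10))

def ga_test_data(pop_size=40, generations=100, lo=-100, hi=100):
    pop = [(random.randint(lo, hi), random.randint(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: branch_distance(*ind))
        if branch_distance(*pop[0]) == 0:
            return pop[0]                                  # covering test input found
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size:
            (x1, y1), (x2, y2) = random.sample(parents, 2)
            child = (x1, y2) if random.random() < 0.5 else (x2, y1)   # recombine coordinates
            if random.random() < 0.3:                                  # small integer mutation
                child = (child[0] + random.randint(-5, 5), child[1] + random.randint(-5, 5))
            children.append(child)
        pop = children
    return None

print(ga_test_data())
```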

Journal ArticleDOI
TL;DR: The results demonstrate that a genetic algorithm could be satisfactorily used in real time operations with stochastically generated inflows and the known global optimum for the four-reservoir problem can be achieved with real-value coding.
Abstract: Several alternative formulations of a genetic algorithm for reservoir systems are evaluated using the four-reservoir, deterministic, finite-horizon problem. This has been done with a view to presenting fundamental guidelines for implementation of the approach to practical problems. Alternative representation, selection, crossover, and mutation schemes are considered. It is concluded that the most promising genetic algorithm approach for the four-reservoir problem comprises real-value coding, tournament selection, uniform crossover, and modified uniform mutation. The real-value coding operates significantly faster than binary coding and produces better results. The known global optimum for the four-reservoir problem can be achieved with real-value coding. A nonlinear four-reservoir problem is considered also, along with one with extended time horizons. The results demonstrate that a genetic algorithm could be satisfactorily used in real time operations with stochastically generated inflows. A more complex ...
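
The recommended operator combination (real-value coding, tournament selection, uniform crossover, modified uniform mutation) is sketched below with a placeholder objective standing in for the reservoir model; the exact form of the 'modified uniform mutation' is an assumption.

```python
import random

def tournament(pop, fitness, k=2):
    return max(random.sample(pop, k), key=fitness)

def uniform_crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def modified_uniform_mutation(x, lo, hi, pm=0.05, shrink=0.1):
    """Perturb each gene, within bounds, by a fraction of the feasible range
    (one plausible reading of 'modified uniform mutation')."""
    out = []
    for xi, l, h in zip(x, lo, hi):
        if random.random() < pm:
            xi = min(h, max(l, xi + random.uniform(-shrink, shrink) * (h - l)))
        out.append(xi)
    return out

def real_coded_ga(objective, lo, hi, pop_size=60, generations=200):
    pop = [[random.uniform(l, h) for l, h in zip(lo, hi)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [modified_uniform_mutation(
                   uniform_crossover(tournament(pop, objective), tournament(pop, objective)),
                   lo, hi)
               for _ in range(pop_size)]
    return max(pop, key=objective)

# placeholder objective standing in for the reservoir-operation benefit function
releases_benefit = lambda x: -sum((xi - 2.5) ** 2 for xi in x)
best = real_coded_ga(releases_benefit, lo=[0.0] * 8, hi=[5.0] * 8)
```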

Book
01 Oct 1999
TL;DR: This important book addresses one of the most important optimization techniques in the industrial engineering/manufacturing area, the use of genetic algorithms to better design and produce reliable products of high quality.
Abstract: From the Publisher: Genetic algorithms are probabilistic search techniques based on the principles of biological evolution. As a biological organism evolves to more fully adapt to its environment, a genetic algorithm follows a path of analysis from which a design evolves, one that is optimal for the environmental constraints placed upon it. Written by two internationally-known experts on genetic algorithms and artificial intelligence, this important book addresses one of the most important optimization techniques in the industrial engineering/manufacturing area, the use of genetic algorithms to better design and produce reliable products of high quality. The book covers advanced optimization techniques as applied to manufacturing and industrial engineering processes, focusing on combinatorial and multiple-objective optimization problems that are most encountered in industry.

Journal ArticleDOI
TL;DR: In this paper, improvements are proposed to resource allocation and leveling heuristics, and the GA technique is used to search for a near-optimum solution, considering both aspects simultaneously.
Abstract: Resource allocation and leveling are among the top challenges in project management. Due to the complexity of projects, resource allocation and leveling have been dealt with as two distinct subproblems solved mainly using heuristic procedures that cannot guarantee optimum solutions. In this paper, improvements are proposed to resource allocation and leveling heuristics, and the Genetic Algorithms (GAs) technique is used to search for near-optimum solution, considering both aspects simultaneously. In the improved heuristics, random priorities are introduced into selected tasks and their impact on the schedule is monitored. The GA procedure then searches for an optimum set of tasks' priorities that produces shorter project duration and better-leveled resource profiles. One major advantage of the procedure is its simple applicability within commercial project management software systems to improve their performance. With a widely used system as an example, a macro program is written to automate the GA proced...

Proceedings Article
18 Jul 1999
TL;DR: This paper presents an ensemble feature selection approach that is based on genetic algorithms, shows improved performance over the popular and powerful ensemble approaches of AdaBoost and Bagging, and demonstrates the utility of ensemble feature selection.
Abstract: The traditional motivation behind feature selection algorithms is to find the best subset of features for a task using one particular learning algorithm. Given the recent success of ensembles, however, we investigate the notion of ensemble feature selection in this paper. This task is harder than traditional feature selection in that one not only needs to find features germane to the learning task and learning algorithm, but one also needs to find a set of feature subsets that will promote disagreement among the ensemble's classifiers. In this paper, we present an ensemble feature selection approach that is based on genetic algorithms. Our algorithm shows improved performance over the popular and powerful ensemble approaches of AdaBoost and Bagging and demonstrates the utility of ensemble feature selection.

Journal ArticleDOI
TL;DR: For the test functions considered, the performance of FDA—in number of generations till convergence—is similar to that of a genetic algorithm for the OneMax function.
Abstract: In this paper the optimization of additively decomposed discrete functions is investigated. For these functions genetic algorithms have exhibited a poor performance. First the schema theory of genetic algorithms is reformulated in probability theory terms. A schema defines the structure of a marginal distribution. Then the conceptual algorithm BEDA is introduced. BEDA uses a Boltzmann distribution to generate search points. From BEDA a new algorithm, FDA, is derived. FDA uses a factorization of the distribution. The factorization captures the structure of the given function. The factorization problem is closely connected to the theory of conditional independence graphs. For the test functions considered, the performance of FDA—in number of generations till convergence—is similar to that of a genetic algorithm for the OneMax function. This result is theoretically explained.

Journal ArticleDOI
01 Apr 1999
TL;DR: In this article, the authors give a tutorial survey of recent works on various hybrid approaches in genetic job-shop scheduling practices, which provide very rich experiences for the constrained combinatorial optimization problems.
Abstract: The job-shop scheduling problem is one of the best-known hard combinatorial optimization problems. During the last three decades, this problem has captured the interest of a significant number of researchers. A lot of literature has been published, but no efficient solution algorithm has yet been found for solving it to optimality in polynomial time. This has led to recent interest in using genetic algorithms to address the problem. Adapting genetic algorithms to the job-shop scheduling problem is challenging and often frustrating. Many efforts have been made to give an efficient implementation of genetic algorithms for the problem. During the past decade, two important issues have been extensively studied. One is how to encode a solution of the problem into a chromosome so as to ensure that a chromosome will correspond to a feasible solution. The other issue is how to enhance the performance of genetic search by incorporating traditional heuristic methods. Because genetic algorithms are not well suited for fine-tuning solutions around optima, various methods of hybridization have been suggested to compensate for this shortcoming. The purpose of the paper is to give a tutorial survey of recent work on various hybrid approaches in genetic job-shop scheduling practice. The research on how to adapt genetic algorithms to the job-shop scheduling problem provides very rich experience for constrained combinatorial optimization problems. All of the techniques developed for the problem are very useful for other scheduling problems in modern flexible manufacturing systems and other difficult-to-solve combinatorial optimization problems.
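
One encoding widely used in this literature, the operation-based chromosome (a permutation with repetition of job indices, where the k-th occurrence of job j denotes job j's k-th operation), always decodes to a feasible schedule, as the sketch below illustrates on an invented 3x3 instance; a random sample of chromosomes stands in for the GA population.

```python
import random

def decode(chromosome, jobs):
    """jobs[j] is a list of (machine, processing_time) operations in technological order.
    The k-th occurrence of j in `chromosome` schedules job j's k-th operation as early
    as its machine and its predecessor operation allow. Returns the makespan."""
    next_op = [0] * len(jobs)
    job_ready = [0] * len(jobs)
    machine_ready = {}
    for j in chromosome:
        machine, p = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + p
        next_op[j] += 1
    return max(job_ready)

# toy 3-job x 3-machine instance
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(0, 2), (2, 1), (1, 4)],
        [(1, 4), (2, 3), (0, 1)]]
base = [j for j, ops in enumerate(jobs) for _ in ops]   # each job appears once per operation
best = min((random.sample(base, len(base)) for _ in range(2000)), key=lambda c: decode(c, jobs))
print(decode(best, jobs))
```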

Journal ArticleDOI
TL;DR: This paper describes the application of an improved genetic algorithm (IGA) to deal with the solution of the transmission network expansion planning (TNEP) problem and reveals that GAs represent a promising approach for dealing with such a problem.
Abstract: This paper describes the application of an improved genetic algorithm (IGA) to deal with the solution of the transmission network expansion planning (TNEP) problem. Genetic algorithms (GAs) have demonstrated the ability to deal with nonconvex, nonlinear, integer-mixed optimization problems, like the TNEP problem, better than a number of mathematical methodologies. Some special features have been added to the basic genetic algorithm (GA) to improve its performance in solving the TNEP problem for three real-life, large-scale transmission systems. Results obtained reveal that GAs represent a promising approach for dealing with such a problem. In this paper, the theoretical issues of GA applied to this problem are emphasized.

Journal ArticleDOI
TL;DR: This work formalizes these algorithms, gives a timely and topical survey of their most important traditional and recent technical issues, and presents useful summaries of their main applications.
Abstract: In this work we review the most important existing developments and future trends in the class of Parallel Genetic Algorithms (PGAs). PGAs are mainly subdivided into coarse-grain and fine-grain PGAs, the coarse-grain models being the most popular ones. An exceptional characteristic of PGAs is that they are not just the parallel version of a sequential algorithm intended to provide speed gains. Instead, they represent a new kind of meta-heuristic of higher efficiency and efficacy thanks to their structured population and parallel execution. The good robustness of these algorithms on problems of high complexity has led to an increasing number of applications in the fields of artificial intelligence, numeric and combinatorial optimization, business, engineering, etc. We formalize these algorithms and present a timely and topical survey of their most important traditional and recent technical issues. Besides that, useful summaries of their main applications, plus Internet pointers to important web sites, are included to help new researchers access this growing area.
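
A coarse-grain (island) PGA of the kind surveyed can be sketched sequentially: several subpopulations evolve independently and periodically migrate their best individuals along a ring; real implementations distribute the islands across processors. The toy operators and parameters below are assumptions.

```python
import random

def evolve_island(pop, fitness, n_offspring):
    """One generation of a tiny GA on one island (tournament + blend crossover + mutation)."""
    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b
    children = []
    for _ in range(n_offspring):
        p1, p2 = tournament(), tournament()
        child = [(x + y) / 2 for x, y in zip(p1, p2)]          # blend crossover
        children.append([x + random.gauss(0, 0.1) for x in child])  # Gaussian mutation
    return children

def island_ga(fitness, dim, n_islands=4, island_size=30, generations=100, migrate_every=10):
    islands = [[[random.uniform(-5, 5) for _ in range(dim)] for _ in range(island_size)]
               for _ in range(n_islands)]
    for g in range(generations):
        islands = [evolve_island(pop, fitness, island_size) for pop in islands]
        if g % migrate_every == 0:          # ring migration of each island's best individual
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[random.randrange(island_size)] = bests[(i - 1) % n_islands]
    return max((ind for pop in islands for ind in pop), key=fitness)

sphere = lambda x: -sum(xi * xi for xi in x)
print(island_ga(sphere, dim=5))
```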

Proceedings Article
13 Jul 1999
TL;DR: This paper explores the development of a GA that fulfills this requirement, and takes into account several aspects of the theory of GAs, including previous research work on population sizing, the schema theorem, building block mixing, and genetic drift.
Abstract: From the user's point of view, setting the parameters of a genetic algorithm (GA) is far from a trivial task. Moreover, the user is typically not interested in population sizes, crossover probabilities, selection rates, and other GA technicalities. He is just interested in solving a problem, and what he would really like to do is to hand the problem to a black-box algorithm and simply press a start button. This paper explores the development of a GA that fulfills this requirement. It has no parameters whatsoever. The development of the algorithm takes into account several aspects of the theory of GAs, including previous research work on population sizing, the schema theorem, building block mixing, and genetic drift.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a procedure for fitting experimental and simulated X-ray reflectivity and diffraction data in order to automate and quantify the characterization of thin-film structures.
Abstract: We have developed a procedure for fitting experimental and simulated X-ray reflectivity and diffraction data in order to automate and to quantify the characterization of thin-film structures. The optimization method employed is a type of genetic algorithm called ‘differential evolution’. The method is capable of rapid convergence to the global minimum of an error function in parameter space even when there are many local minima in addition to the global minimum. We show how to estimate the pointwise errors of the optimized parameters, and how to determine whether the model adequately represents the structure. The procedure is capable of fitting some tens of adjustable parameters, given suitable data.
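
Differential evolution itself (the classic DE/rand/1/bin scheme) is easy to sketch; the reflectivity error function is replaced here by a placeholder quadratic misfit, so the bounds and parameters are assumptions for illustration.

```python
import numpy as np

def differential_evolution(error, bounds, pop_size=40, F=0.8, CR=0.9, generations=300):
    """DE/rand/1/bin: mutate with scaled difference vectors, binomial crossover,
    greedy replacement. `error` is the misfit to minimise; bounds is (lo, hi)."""
    lo, hi = map(np.asarray, bounds)
    dim = len(lo)
    pop = lo + np.random.rand(pop_size, dim) * (hi - lo)
    cost = np.array([error(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = pop[np.random.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            trial = np.where(np.random.rand(dim) < CR, a + F * (b - c), pop[i])
            trial = np.clip(trial, lo, hi)
            if (e := error(trial)) < cost[i]:       # keep the trial only if it fits better
                pop[i], cost[i] = trial, e
    return pop[cost.argmin()], cost.min()

# placeholder misfit standing in for the X-ray reflectivity error function
error = lambda p: float(((p - np.array([10.0, 2.5, 0.3])) ** 2).sum())
best, best_err = differential_evolution(error, bounds=([0, 0, 0], [50, 10, 1]))
```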

Journal ArticleDOI
TL;DR: In this article, the problem of selecting the parameters of power system stabilizers which simultaneously stabilize this set of plants is converted to a simple optimization problem which is solved by a genetic algorithm with an eigenvalue-based objective function.
Abstract: This paper demonstrates the use of genetic algorithms for the simultaneous stabilization of multimachine power systems over a wide range of operating conditions via single-setting power system stabilizers. The power system operating at various conditions is treated as a finite set of plants. The problem of selecting the parameters of power system stabilizers which simultaneously stabilize this set of plants is converted to a simple optimization problem which is solved by a genetic algorithm with an eigenvalue-based objective function. Two objective functions are presented, allowing the selection of the stabilizer parameters to shift some of the closed-loop eigenvalues to the left-hand side of a vertical line in the complex s-plane, or to a wedge-shape sector in the complex s-plane. The effectiveness of the suggested technique in damping local and inter-area modes of oscillations in multimachine power systems is verified through eigenvalue analysis and simulation results.
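
The eigenvalue-based objective can be illustrated on a toy set of plants: the fitness rewards parameter settings that push the real parts of all closed-loop eigenvalues, over every operating condition, left of a chosen vertical line Re(s) = sigma0. The second-order plant matrices and the state-feedback stand-in for the stabilizer below are invented for illustration.

```python
import numpy as np

def eigenvalue_objective(stabilizer_params, plants, sigma0=-1.0):
    """Fitness for simultaneous stabilization over a finite set of plants (A, B):
    the worst (rightmost) closed-loop eigenvalue across all plants should lie
    left of Re(s) = sigma0."""
    k1, k2 = stabilizer_params
    K = np.array([[k1, k2]])                            # state feedback as a toy stand-in
    worst = max(np.linalg.eigvals(A - B @ K).real.max() for A, B in plants)
    return sigma0 - worst                               # positive once the constraint holds

# two toy "operating conditions" (second-order plants), purely illustrative
plants = [(np.array([[0.0, 1.0], [-1.0, -0.10]]), np.array([[0.0], [1.0]])),
          (np.array([[0.0, 1.0], [-2.0, -0.05]]), np.array([[0.0], [1.5]]))]
print(eigenvalue_objective([1.0, 2.0], plants))
```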

Journal ArticleDOI
TL;DR: The results indicate that for practical problem sizes, the orthogonal genetic algorithm can find near optimal solutions within moderate numbers of generations.
Abstract: Many multimedia communication applications require a source to send multimedia information to multiple destinations through a communication network. To support these applications, it is necessary to determine a multicast tree of minimal cost to connect the source node to the destination nodes subject to delay constraints on multimedia communication. This problem is known as multimedia multicast routing and has been proved to be NP-complete. The paper proposes an orthogonal genetic algorithm for multimedia multicast routing. Its salient feature is to incorporate an experimental design method called orthogonal design into the crossover operation. As a result, it can search the solution space in a statistically sound manner and it is well suited for parallel implementation and execution. We execute the orthogonal genetic algorithm to solve two sets of benchmark test problems. The results indicate that for practical problem sizes, the orthogonal genetic algorithm can find near optimal solutions within moderate numbers of generations.
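
The flavour of using an orthogonal array inside crossover can be sketched with the smallest array, L4(2^3): split each parent into three gene segments, treat 'segment from parent A or parent B' as a two-level factor, generate the four offspring the array prescribes and keep the best. The paper's operator is richer (and works on multicast routing solutions rather than the bit strings used here), so this is only an illustration.

```python
import random

# L4(2^3) orthogonal array: 4 runs, 3 two-level factors (1 = parent A, 2 = parent B)
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

def orthogonal_crossover(a, b, fitness):
    """Combine three gene segments of two parents according to L4 and keep the best trial."""
    n = len(a)
    cuts = sorted(random.sample(range(1, n), 2))            # split into 3 segments
    segments = lambda x: [x[:cuts[0]], x[cuts[0]:cuts[1]], x[cuts[1]:]]
    sa, sb = segments(a), segments(b)
    trials = [sum(((sa if level == 1 else sb)[i] for i, level in enumerate(run)), [])
              for run in L4]
    return max(trials, key=fitness)

# toy usage: binary strings with a OneMax fitness standing in for the routing objective
a = [random.randint(0, 1) for _ in range(12)]
b = [random.randint(0, 1) for _ in range(12)]
child = orthogonal_crossover(a, b, fitness=sum)
```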

Proceedings ArticleDOI
12 Apr 1999
TL;DR: A collection of eleven heuristics from the literature has been selected, implemented, and analyzed under one set of common assumptions and provides one even basis for comparison and insights into circumstances where one technique will outperform another.
Abstract: Heterogeneous computing (HC) environments are well suited to meet the computational demands of large, diverse groups of tasks (i.e., a meta-task). The problem of mapping (defined as matching and scheduling) these tasks onto the machines of an HC environment has been shown, in general, to be NP-complete, requiring the development of heuristic techniques. Selecting the best heuristic to use in a given environment, however, remains a difficult problem, because comparisons are often clouded by different underlying assumptions in the original studies of each heuristic. Therefore, a collection of eleven heuristics from the literature has been selected, implemented, and analyzed under one set of common assumptions. The eleven heuristics examined are opportunistic load balancing, user-directed assignment, fast greedy, min-min, max-min, greedy, genetic algorithm, simulated annealing, genetic simulated annealing, tabu, and A*. This study provides one even basis for comparison and insights into circumstances where one technique will outperform another. The evaluation procedure is specified, the heuristics are defined, and then selected results are compared.
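
Of the eleven heuristics, min-min is compact enough to sketch: repeatedly pick, among the unmapped tasks, the one whose earliest possible completion time over all machines is smallest, and assign it to that machine. The expected-time-to-compute matrix below is invented for illustration.

```python
def min_min(etc):
    """etc[t][m] = expected execution time of task t on machine m.
    Returns (assignment task -> machine, makespan)."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines                 # machine availability times
    unmapped = set(range(n_tasks))
    assignment = {}
    while unmapped:
        # for each unmapped task, its best machine (earliest completion time)
        best = {t: min(range(n_machines), key=lambda m: ready[m] + etc[t][m]) for t in unmapped}
        # pick the task whose best completion time is the overall minimum
        t = min(unmapped, key=lambda t: ready[best[t]] + etc[t][best[t]])
        m = best[t]
        ready[m] += etc[t][m]
        assignment[t] = m
        unmapped.remove(t)
    return assignment, max(ready)

etc = [[4, 6, 9], [2, 8, 3], [7, 3, 5], [6, 2, 8]]   # 4 tasks, 3 machines (made up)
print(min_min(etc))
```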

Journal ArticleDOI
01 Apr 1999
TL;DR: This paper presents a neuro-fuzzy logic controller where all of its parameters can be tuned simultaneously by GA, and shows that the proposed controller offers encouraging advantages and has better performance.
Abstract: Due to their powerful optimization property, genetic algorithms (GAs) are currently being investigated for the development of adaptive or self-tuning fuzzy logic control systems. This paper presents a neuro-fuzzy logic controller (NFLC) where all of its parameters can be tuned simultaneously by GA. The structure of the controller is based on the radial basis function neural network (RBF) with Gaussian membership functions. The NFLC tuned by GA can somewhat eliminate laborious design steps such as manual tuning of the membership functions and selection of the fuzzy rules. The GA implementation incorporates dynamic crossover and mutation probabilistic rates for faster convergence. A flexible position coding strategy of the NFLC parameters is also implemented to obtain near optimal solutions. The performance of the proposed controller is compared with a conventional fuzzy controller and a PID controller tuned by GA. Simulation results show that the proposed controller offers encouraging advantages and has better performance.

Journal ArticleDOI
TL;DR: A modification of the initial iterative approach used in SLAVE is proposed to include more information in the process of learning one individual rule, together with a new fitness function and additional genetic operators that reduce the time needed for learning and improve the understanding of the rules obtained.
Abstract: SLAVE is an inductive learning algorithm that uses concepts based on fuzzy logic theory. This theory has been shown to be a useful representational tool for improving the understanding of the knowledge obtained from a human point of view. Furthermore, SLAVE uses an iterative approach for learning based on the use of a genetic algorithm (GA) as a search algorithm. We propose a modification of the initial iterative approach used in SLAVE. The main idea is to include more information in the process of learning one individual rule. This information is included in the iterative approach through a different way of calculating the positive and negative examples of a rule. Furthermore, we propose the use of a new fitness function and additional genetic operators that reduce the time needed for learning and improve the understanding of the rules obtained.

Journal ArticleDOI
TL;DR: A flexible approach using the genetic algorithm (GA) is proposed for array failure correction in digital beamforming of arbitrary arrays, and three mating schemes, adjacent-fitness-pairing, best-mate-worst, and emperor-selective, are proposed and their performances are studied.
Abstract: A flexible approach using the genetic algorithm (GA) is proposed for array failure correction in digital beamforming of arbitrary arrays. In this approach, beamforming weights of an array are represented directly by a vector of complex numbers. The decimal linear crossover is employed so that no binary coding and decoding is necessary. Three mating schemes, adjacent-fitness-pairing (AFP), best-mate-worst (BMW), and emperor-selective (EMS), are proposed and their performances are studied. Near-solutions from other analytic or heuristic techniques may be injected into the initial population to speed up convergence. Numerical examples of single- and multiple-element failure correction are presented to show the effectiveness of the approach.

Proceedings ArticleDOI
31 Aug 1999
TL;DR: The state of the art on PGAs is reviewed and a new taxonomy also including a new form of PGA (the dynamic deme model) which was recently developed is proposed.
Abstract: Genetic algorithms (GAs) are powerful search techniques that are used to solve difficult problems in many disciplines. Unfortunately, they can be very demanding in terms of computation load and memory. Parallel genetic algorithms (PGAs) are parallel implementations of GAs which can provide considerable gains in terms of performance and scalability. PGAs can easily be implemented on networks of heterogeneous computers or on parallel mainframes. We review the state of the art on PGAs and propose a new taxonomy also including a new form of PGA (the dynamic deme model) which was recently developed.

Journal ArticleDOI
TL;DR: An improved BL-algorithm for the genetic-algorithm-based orthogonal packing of rectangles is presented, with improvements to the fitness function; solutions of two numerical examples show the effectiveness of these improvements.

Journal ArticleDOI
TL;DR: This paper examines two well-known global search techniques, Simulated Annealing and the Genetic Algorithm, and compares their performance; a Monte Carlo study was conducted to test the appropriateness of these global search techniques for optimizing neural networks.