Journal ArticleDOI

Genetic algorithms and their applications

TL;DR: Introduces the genetic algorithm as an emerging optimization algorithm for signal processing and describes a number of successfully implemented applications, such as IIR adaptive filtering, time delay estimation, active noise control, and speech processing.
Abstract: This article introduces the genetic algorithm (GA) as an emerging optimization algorithm for signal processing. After a discussion of traditional optimization techniques, it reviews the fundamental operations of a simple GA and discusses procedures to improve its functionality. The properties of the GA that relate to signal processing are summarized, and a number of applications, such as IIR adaptive filtering, time delay estimation, active noise control, and speech processing, that are being successfully implemented are described.
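The fundamental GA operations the abstract reviews (selection, crossover, mutation) can be illustrated with a minimal sketch. Everything below is illustrative: the one-max fitness function and all parameter values are assumptions, not taken from the article.

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, crossover_rate=0.9,
                      mutation_rate=0.01, generations=100):
    """Minimal generational GA over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        selected = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        children = []
        for p1, p2 in zip(selected[::2], selected[1::2]):
            # Single-point crossover with probability crossover_rate.
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Bit-flip mutation on every position, with low probability each.
            for child in (c1, c2):
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        child[i] ^= 1
                children.append(child)
        pop = children
        best = max([best] + pop, key=fitness)
    return best

# Toy fitness: maximize the number of ones ("one-max").
best = genetic_algorithm(fitness=sum)
```

The same loop structure applies whether the chromosome encodes bits, filter coefficients, or delay estimates; only the fitness function changes.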
Citations
Book
01 Jan 1996
TL;DR: An Introduction to Genetic Algorithms focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues.
Abstract: From the Publisher: "This is the best general book on Genetic Algorithms written to date. It covers background, history, and motivation; it selects important, informative examples of applications and discusses the use of Genetic Algorithms in scientific models; and it gives a good account of the status of the theory of Genetic Algorithms. Best of all, the book presents its material in clear, straightforward, felicitous prose, accessible to anyone with a college-level scientific background. If you want a broad, solid understanding of Genetic Algorithms -- where they came from, what's being done with them, and where they are going -- this is the book." -- John H. Holland, Professor, Computer Science and Engineering, and Professor of Psychology, The University of Michigan; External Professor, the Santa Fe Institute. Genetic algorithms have been used in science and engineering as adaptive algorithms for solving practical problems and as computational models of natural evolutionary systems. This brief, accessible introduction describes some of the most interesting research in the field and also enables readers to implement and experiment with genetic algorithms on their own. It focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues. The descriptions of applications and modeling projects stretch beyond the strict boundaries of computer science to include dynamical systems theory, game theory, molecular biology, ecology, evolutionary biology, and population genetics, underscoring the exciting "general purpose" nature of genetic algorithms as search methods that can be employed across disciplines. An Introduction to Genetic Algorithms is accessible to students and researchers in any scientific discipline.
It includes many thought and computer exercises that build on and reinforce the reader's understanding of the text. The first chapter introduces genetic algorithms and their terminology and describes two provocative applications in detail. The second and third chapters look at the use of genetic algorithms in machine learning (computer programs, data analysis and prediction, neural networks) and in scientific models (interactions among learning, evolution, and culture; sexual selection; ecosystems; evolutionary activity). Several approaches to the theory of genetic algorithms are discussed in depth in the fourth chapter. The fifth chapter takes up implementation, and the last chapter poses some currently unanswered questions and surveys prospects for the future of evolutionary computation.

9,933 citations


Cites background or methods from "Genetic algorithms and their applications"

  • ...The problems encountered by our GA on R1 illustrate very clearly the kinds of "biased sampling" problems described by Grefenstette (1991b)....


  • ...Somewhat later, Grefenstette (1986) noted that, since the GA could be used as an optimization procedure, it could be used to optimize the parameters for another GA! (A similar study was done by Bramlette (1991).) In Grefenstette's experiments, the "meta-level GA" evolved a population of 50 GA parameter sets for the problems in De Jong's test suite. Each individual encoded six GA parameters: population size, crossover rate, mutation rate, generation gap, scaling window (a particular scaling technique that I won't discuss here), and selection strategy (elitist or nonelitist). The fitness of an individual was a function of the on-line or off-line performance of a GA using the parameters encoded by that individual. The meta-level GA itself used De Jong's parameter settings. The fittest individual for on-line performance set the population size to 30, the crossover rate to 0.95, the mutation rate to 0.01, and the generation gap to 1, and used elitist selection. These parameters gave a small but significant improvement in on-line performance over De Jong's settings. Notice that Grefenstette's results call for a smaller population and higher crossover and mutation rates than De Jong's. The meta-level GA was not able to find a parameter set that beat De Jong's for off-line performance. This was an interesting experiment, but again, in view of the specialized test suite, it is not clear how generally these recommendations hold. Others have shown that there are many fitness functions for which these parameter settings are not optimal. Schaffer, Caruana, Eshelman, and Das (1989) spent over a year of CPU time systematically testing a wide range of parameter combinations....



  • ...Grefenstette (1991b) gives some further caveats concerning this description of GA behavior....


  • ...Grefenstette and Baker (1989) illustrate this with the following fitness function:...

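The meta-level parameter-optimization experiment described in the excerpts above can be sketched in miniature. This is a deliberately reduced stand-in: the inner GA runs on a toy one-max problem, only three of Grefenstette's six parameters are tuned, and the outer loop is a simple perturbation search rather than a 50-individual meta-level GA. Only the on-line performance measure (average fitness of all evaluations during a run) follows the description above; everything else is an assumption.

```python
import random

def run_inner_ga(pop_size, crossover_rate, mutation_rate, n_bits=12, generations=30):
    """Run a tiny GA on one-max and return its on-line performance:
    the average fitness of every individual evaluated during the run."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    evaluations = []
    for _ in range(generations):
        evaluations.extend(sum(ind) for ind in pop)
        # Tournament selection, single-point crossover, bit-flip mutation.
        selected = [max(random.sample(pop, 2), key=sum) for _ in range(pop_size)]
        nxt = []
        for p1, p2 in zip(selected[::2], selected[1::2]):
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            nxt += [[b ^ (random.random() < mutation_rate) for b in p] for p in (p1, p2)]
        pop = nxt
    return sum(evaluations) / len(evaluations)

# Meta level: tune (pop_size, crossover_rate, mutation_rate) by perturbation,
# a much-reduced stand-in for the meta-level GA over six parameters.
best_params = (10, 0.6, 0.05)
best_score = run_inner_ga(*best_params)
for _ in range(5):
    cand = (max(4, best_params[0] + random.choice((-2, 2))),
            min(1.0, max(0.1, best_params[1] + random.uniform(-0.1, 0.1))),
            min(0.2, max(0.001, best_params[2] + random.uniform(-0.02, 0.02))))
    score = run_inner_ga(*cand)
    if score > best_score:
        best_params, best_score = cand, score
```

Each meta-level fitness evaluation costs a full inner-GA run, which is why the original experiment was expensive and why Schaffer et al. needed over a year of CPU time for a systematic sweep.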

Journal ArticleDOI
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.

5,501 citations


Cites background or methods from "Genetic algorithms and their applications"

  • ...Various heuristic approaches have been adopted by researchers so far, for example Genetic Algorithm [32], Simulated Annealing [21], Ant Colony Search Algorithm [5], Particle Swarm Optimization [17], etc....


  • ...Genetic Algorithms, GA, are inspired by Darwinian evolutionary theory [32], Simulated Annealing, SA, is designed by use of thermodynamic effects [21], Artificial Immune Systems, AIS, simulate biological immune systems [8], Ant Colony Optimization, ACO, mimics the behavior of ants foraging for food [5], Bacterial Foraging Algorithm, BFA, comes from the search and optimal foraging of bacteria [11,19], and Particle Swarm Optimization, PSO, simulates the behavior of a flock of birds [3,17]....


  • ...A comparison with Real Genetic Algorithm (RGA) and PSO, is given in Section 6.2....


  • ...Over the last decades, there has been a growing interest in algorithms inspired by the behaviors of natural phenomena [5,8,17,19,21,32]....


  • ...Some of the most famous of these algorithms are Genetic Algorithm, Simulated Annealing [21], Artificial Immune System [8], Ant Colony Optimization, Particle Swarm Optimization [17] and Bacterial Foraging Algorithm [11]....


Journal ArticleDOI
TL;DR: A Composite PSO, in which the heuristic parameters of PSO are controlled by a Differential Evolution algorithm during the optimization, is described, and results for many well-known and widely used test functions are given.
Abstract: This paper presents an overview of our most recent results concerning the Particle Swarm Optimization (PSO) method. Techniques for the alleviation of local minima, and for detecting multiple minimizers are described. Moreover, results on the ability of the PSO in tackling Multiobjective, Minimax, Integer Programming and ℓ1 errors-in-variables problems, as well as problems in noisy and continuously changing environments, are reported. Finally, a Composite PSO, in which the heuristic parameters of PSO are controlled by a Differential Evolution algorithm during the optimization, is described, and results for many well-known and widely used test functions are given.
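The PSO method the abstract surveys rests on a global-best velocity and position update. A minimal sketch of plain gbest PSO (not the paper's Composite PSO) is given below; the inertia and acceleration coefficients and the sphere test function are conventional choices, not values from the paper.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing f over [-5, 5]^dim."""
    xs = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]              # each particle's best-seen position
    gbest = min(pbest, key=f)               # swarm's best-seen position
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - x[d])
                            + c2 * r2 * (gbest[d] - x[d]))
                x[d] += vs[i][d]
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest, key=f)
    return gbest

# Sphere function: global minimum 0 at the origin.
sol = pso(lambda x: sum(v * v for v in x))
```

The local-minima and multiple-minimizer techniques the paper describes modify this basic loop (e.g. by repelling particles from already-found minimizers) rather than replacing it.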

1,436 citations

Journal ArticleDOI
01 Jan 1998
TL;DR: A decision-making framework based on goals and priorities is formulated in terms of a relational operator, characterized, and shown to encompass a number of simpler decision strategies, and a multiobjective genetic algorithm based on the proposed decision strategy is then formulated.
Abstract: In optimization, multiple objectives and constraints cannot be handled independently of the underlying optimizer. Requirements such as continuity and differentiability of the cost surface add yet another conflicting element to the decision process. While "better" solutions should be rated higher than "worse" ones, the resulting cost landscape must also comply with such requirements. Evolutionary algorithms (EAs), which have found application in many areas not amenable to optimization by other methods, possess many characteristics desirable in a multiobjective optimizer, most notably the concerted handling of multiple candidate solutions. However, EAs are essentially unconstrained search techniques which require the assignment of a scalar measure of quality, or fitness, to such candidate solutions. After reviewing current evolutionary approaches to multiobjective and constrained optimization, the paper proposes that fitness assignment be interpreted as, or at least related to, a multicriterion decision process. A suitable decision making framework based on goals and priorities is subsequently formulated in terms of a relational operator, characterized, and shown to encompass a number of simpler decision strategies. Finally, the ranking of an arbitrary number of candidates is considered. The effect of preference changes on the cost surface seen by an EA is illustrated graphically for a simple problem. The paper concludes with the formulation of a multiobjective genetic algorithm based on the proposed decision strategy. Niche formation techniques are used to promote diversity among preferable candidates, and progressive articulation of preferences is shown to be possible as long as the genetic algorithm can recover from abrupt changes in the cost landscape.

1,175 citations

Journal ArticleDOI
TL;DR: An improved artificial bee colony (ABC) algorithm, called gbest-guided ABC (GABC), is proposed by incorporating the information of the global best (gbest) solution into the solution search equation to improve the exploitation of the ABC algorithm.

1,105 citations


Cites background or methods from "Genetic algorithms and their applications"

  • ...The experimental results show that ABC algorithm is competitive with some conventional optimization algorithms, such as GA [1,2], DE [12], and PSO [3,4]....


  • ...By now, there have been several kinds of biological-inspired optimization algorithms, such as genetic algorithm (GA) inspired by the Darwinian law of survival of the fittest [1,2], particle swarm optimization (PSO) inspired by the social behavior of bird flocking or fish schooling [3,4], ant colony optimization (ACO) inspired by the foraging behavior of ant colonies [5], and Biogeography-Based Optimization (BBO) inspired by the migration behavior of island species [6]....


  • ...As is well known, both exploration and exploitation are necessary for population-based optimization algorithms, such as GA [1,2], PSO [3,4], DE [12], and so on....

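The gbest-guided search equation mentioned in the GABC TL;DR above can be sketched as follows. The form v_ij = x_ij + phi*(x_ij - x_kj) + psi*(gbest_j - x_ij), with phi uniform on [-1, 1] and psi uniform on [0, C], follows the GABC literature as we understand it; the value C = 1.5 should be treated as an assumption.

```python
import random

def gbest_guided_candidate(x_i, x_k, gbest, j, C=1.5):
    """One candidate from current solution x_i, perturbing only dimension j:
        v_ij = x_ij + phi * (x_ij - x_kj) + psi * (gbest_j - x_ij)
    with phi ~ U(-1, 1) (as in basic ABC) and psi ~ U(0, C) pulling the
    candidate toward the best-so-far solution gbest.
    x_k is a randomly chosen neighbour solution from the population."""
    phi = random.uniform(-1.0, 1.0)
    psi = random.uniform(0.0, C)
    v = list(x_i)
    v[j] = x_i[j] + phi * (x_i[j] - x_k[j]) + psi * (gbest[j] - x_i[j])
    return v

# When the population has converged (x_i == x_k == gbest), both difference
# terms vanish and the equation produces no movement in dimension j.
x = [1.0, 2.0, 3.0]
cand = gbest_guided_candidate(x, x, x, j=1)
```

The added psi term is what distinguishes GABC from basic ABC, whose search equation contains only the phi term and so has no pull toward the global best.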

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
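In code, the annealing analogy described in the abstract reduces to the Metropolis acceptance rule plus a cooling schedule. The sketch below is a generic simulated-annealing loop on a toy 1-D objective; the step size, starting temperature, and geometric cooling rate are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000):
    """Minimize f by the Metropolis rule with a geometric cooling schedule."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        # Propose a random neighbour of the current state.
        cand = x + random.uniform(-step, step)
        fc = f(cand)
        # Accept downhill moves always; uphill moves with probability
        # exp(-(fc - fx) / t), the statistical-mechanics analogy of the paper.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling   # lower the "temperature"
    return best

# Toy 1-D objective with its minimum at x = 2.
sol = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=10.0)
```

Accepting occasional uphill moves at high temperature is what lets the method escape local minima; as the temperature falls, the loop degenerates into greedy descent.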

41,772 citations

Journal ArticleDOI
H. Sakoe, S. Chiba
TL;DR: This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition, in which the slope of the warping function is restricted so as to improve discrimination between words in different categories.
Abstract: This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition. First, a general principle of time-normalization is given using a time-warping function. Then, two time-normalized distance definitions, called symmetric and asymmetric forms, are derived from the principle. These two forms are compared with each other through theoretical discussion and experimental study, and the superiority of the symmetric-form algorithm is established. A new technique, called slope constraint, is successfully introduced, in which the slope of the warping function is restricted so as to improve discrimination between words in different categories. The characteristics of the slope constraint are qualitatively analyzed, and the optimum slope constraint condition is determined through experiments. The optimized algorithm is then extensively compared experimentally with various DP algorithms previously applied to spoken word recognition by different research groups. The experiments show that the present algorithm gives no more than about two-thirds as many errors as even the best conventional algorithm.
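The DP-based time-normalization idea can be sketched as a dynamic-time-warping distance. Note this is the textbook unweighted recurrence on toy integer sequences; the paper's symmetric form uses weighted steps and adds slope constraints, which are omitted here.

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Dynamic-time-warping distance between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal accumulated distance aligning a[:i] with b[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            # The warping path may advance in a, in b, or in both at once.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# The same "word" spoken at two speeds: warping absorbs the rate difference,
# so the distance is zero even though the sequences have different lengths.
slow = [0, 0, 1, 1, 2, 2, 3, 3]
fast = [0, 1, 2, 3]
```

A slope constraint would forbid warping paths that advance too many steps in one sequence without advancing in the other, which is what sharpens discrimination between different words.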

5,906 citations

Journal ArticleDOI
TL;DR: A scheme for image compression that takes into account psychovisual features both in the space and frequency domains is proposed and it is shown that the wavelet transform is particularly well adapted to progressive transmission.
Abstract: A scheme for image compression that takes into account psychovisual features both in the space and frequency domains is proposed. This method involves two steps. First, a wavelet transform is used in order to obtain a set of biorthogonal subclasses of images: the original image is decomposed at different scales using a pyramidal algorithm architecture. The decomposition is along the vertical and horizontal directions and maintains constant the number of pixels required to describe the image. Second, according to Shannon's rate distortion theory, the wavelet coefficients are vector quantized using a multiresolution codebook. To encode the wavelet coefficients, a noise shaping bit allocation procedure which assumes that details at high resolution are less visible to the human eye is proposed. In order to allow the receiver to recognize a picture as quickly as possible at minimum cost, a progressive transmission scheme is presented. It is shown that the wavelet transform is particularly well adapted to progressive transmission.

3,925 citations

Journal ArticleDOI
15 Jul 1977-Science
TL;DR: First-order nonlinear differential-delay equations describing physiological control systems displaying a broad diversity of dynamical behavior including limit cycle oscillations, with a variety of wave forms, and apparently aperiodic or "chaotic" solutions are studied.
Abstract: First-order nonlinear differential-delay equations describing physiological control systems are studied. The equations display a broad diversity of dynamical behavior including limit cycle oscillations, with a variety of wave forms, and apparently aperiodic or "chaotic" solutions. These results are discussed in relation to dynamical respiratory and hematopoietic diseases.
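The differential-delay equations the abstract refers to include the now-standard Mackey-Glass equation, dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t). A simple Euler integration is sketched below; the parameter values are the commonly cited chaotic regime (beta=0.2, gamma=0.1, n=10, tau=17), an assumption rather than values quoted in this listing.

```python
def mackey_glass(beta=0.2, gamma=0.1, n=10, tau=17.0, dt=0.1, steps=5000, x0=1.2):
    """Euler integration of the Mackey-Glass differential-delay equation
        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t).
    The history before t = 0 is held constant at x0."""
    delay = int(tau / dt)
    xs = [x0] * (delay + 1)        # current value plus the delayed history
    for _ in range(steps):
        x_tau = xs[-delay - 1]     # x(t - tau), read from the stored history
        dx = beta * x_tau / (1.0 + x_tau ** n) - gamma * xs[-1]
        xs.append(xs[-1] + dt * dx)
    return xs[delay + 1:]

series = mackey_glass()
```

Because the derivative depends on the state a fixed delay in the past, the integrator must carry the whole recent trajectory, not just the current value; that delayed feedback is the source of the aperiodic behavior the abstract describes.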

3,839 citations

Journal ArticleDOI
Arnaud E. Jacquin
TL;DR: The author proposes an independent and novel approach to image coding, based on a fractal theory of iterated transformations, that relies on the assumption that image redundancy can be efficiently exploited through self-transformability on a block-wise basis and approximates an original image by a Fractal image.
Abstract: The author proposes an independent and novel approach to image coding, based on a fractal theory of iterated transformations. The main characteristics of this approach are that (i) it relies on the assumption that image redundancy can be efficiently exploited through self-transformability on a block-wise basis, and (ii) it approximates an original image by a fractal image. The author refers to the approach as fractal block coding. The coding-decoding system is based on the construction, for an original image to encode, of a specific image transformation-a fractal code-which, when iterated on any initial image, produces a sequence of images that converges to a fractal approximation of the original. It is shown how to design such a system for the coding of monochrome digital images at rates in the range of 0.5-1.0 b/pixel. The fractal block coder has performance comparable to state-of-the-art vector quantizers.

1,386 citations