Journal ArticleDOI

A Taxonomy of Hybrid Metaheuristics

01 Sep 2002-Journal of Heuristics (Kluwer Academic Publishers)-Vol. 8, Iss: 5, pp 541-564
TL;DR: A taxonomy of hybrid metaheuristics is presented in an attempt to provide a common terminology and classification mechanism; although stated in terms of metaheuristics, it is also applicable to most types of heuristics and exact optimization algorithms.
Abstract: Hybrid metaheuristics have received considerable interest in recent years in the field of combinatorial optimization. A wide variety of hybrid approaches have been proposed in the literature. In this paper, a taxonomy of hybrid metaheuristics is presented in an attempt to provide a common terminology and classification mechanisms. The taxonomy, while presented in terms of metaheuristics, is also applicable to most types of heuristics and exact optimization algorithms. As an illustration of the usefulness of the taxonomy, an annotated bibliography is given which classifies a large number of hybrid approaches according to the taxonomy.
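
To make the kind of design the taxonomy classifies concrete, here is a minimal, hypothetical sketch of one common scheme: a self-contained method (random multi-start sampling) runs to completion and its best solution is handed over to a second method (a greedy bit-flip hill climber). In the paper's hierarchical classification this sequential hand-off is, roughly, a high-level relay hybrid, as opposed to teamwork hybrids in which the methods cooperate in parallel. The toy problem, function names, and parameter values are illustrative assumptions, not material from the paper.

```python
import random

def objective(bits):
    """Toy objective: number of zeros left (one-max stated as minimization)."""
    return bits.count(0)

def random_multistart(n_bits, n_samples, rng):
    """Stage 1: coarse global exploration by pure random sampling."""
    samples = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_samples)]
    return min(samples, key=objective)

def hill_climb(bits):
    """Stage 2: greedy 1-flip local search that refines the handed-over solution."""
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            candidate = bits[:i] + [1 - bits[i]] + bits[i + 1:]
            if objective(candidate) < objective(bits):
                bits, improved = candidate, True
    return bits

rng = random.Random(0)
seed_solution = random_multistart(n_bits=32, n_samples=50, rng=rng)  # method A runs to completion...
refined = hill_climb(seed_solution)                                  # ...then method B refines its output
print(objective(seed_solution), objective(refined))
```
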
Citations
Journal ArticleDOI
TL;DR: Surveys today's most important metaheuristics from a conceptual point of view and introduces a framework, called the I&D frame, to put different intensification and diversification components into relation with each other.
Abstract: The field of metaheuristics for the application to combinatorial optimization problems is a rapidly growing field of research. This is due to the importance of combinatorial optimization problems for the scientific as well as the industrial world. We give a survey of today's most important metaheuristics from a conceptual point of view. We outline the different components and concepts that are used in the different metaheuristics in order to analyze their similarities and differences. Two very important concepts in metaheuristics are intensification and diversification. These are the two forces that largely determine the behavior of a metaheuristic. They are in some way contrary but also complementary to each other. We introduce a framework, which we call the I&D frame, in order to put different intensification and diversification components into relation with each other. Outlining the advantages and disadvantages of different metaheuristic approaches, we conclude by pointing out the importance of hybridization of metaheuristics as well as the integration of metaheuristics and other methods for optimization.
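
A small, self-contained way to see the two forces at work is an iterated-local-search style loop: the descent step intensifies around the incumbent, while the perturbation step diversifies by jumping toward another basin of attraction. This is not code from the survey; the toy objective and all parameter values are assumptions made for illustration.

```python
import random

def objective(x):
    # Toy rugged function: local minima near the integers, global minimum at x = 0.
    return x * x + 10 * abs(round(x) - x)

def local_search(x, step=0.01):
    """Intensification: greedy descent in a small neighbourhood of x."""
    while True:
        best = min((x - step, x + step), key=objective)
        if objective(best) >= objective(x):
            return x
        x = best

def perturb(x, strength, rng):
    """Diversification: jump away from the current basin of attraction."""
    return x + rng.uniform(-strength, strength)

rng = random.Random(1)
current = local_search(rng.uniform(-10, 10))
best = current
for _ in range(100):
    candidate = local_search(perturb(current, strength=2.0, rng=rng))
    if objective(candidate) < objective(best):
        best = candidate
    current = candidate          # accept every restart point: a deliberately diversification-heavy policy
print(best, objective(best))
```

Shrinking the perturbation strength biases the loop toward intensification; growing it biases it toward diversification, which is exactly the balance the I&D frame is meant to expose.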

3,287 citations

Book
22 Jun 2009
TL;DR: This book provides a complete background on metaheuristics and shows readers how to design and implement efficient algorithms to solve complex optimization problems across a diverse range of applications, from networking and bioinformatics to engineering design, routing, and scheduling.
Abstract: A unified view of metaheuristics. This book provides a complete background on metaheuristics and shows readers how to design and implement efficient algorithms to solve complex optimization problems across a diverse range of applications, from networking and bioinformatics to engineering design, routing, and scheduling. It presents the main design questions for all families of metaheuristics and clearly illustrates how to implement the algorithms under a software framework to reuse both the design and code. Throughout the book, the key search components of metaheuristics are considered as a toolbox for:
  • Designing efficient metaheuristics (e.g. local search, tabu search, simulated annealing, evolutionary algorithms, particle swarm optimization, scatter search, ant colonies, bee colonies, artificial immune systems) for optimization problems
  • Designing efficient metaheuristics for multi-objective optimization problems
  • Designing hybrid, parallel, and distributed metaheuristics
  • Implementing metaheuristics on sequential and parallel machines
Using many case studies and treating design and implementation independently, this book gives readers the skills necessary to solve large-scale optimization problems quickly and efficiently. It is a valuable reference for practicing engineers and researchers from diverse areas dealing with optimization or machine learning, and for graduate students in computer science, operations research, control, engineering, business and management, and applied mathematics.
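
As a flavour of one entry in that toolbox, below is a generic, textbook-style tabu search for a bit-string problem (not code from the book; the problem, tabu tenure, and iteration count are arbitrary illustrative choices). The short-term memory forbids re-flipping recently changed bits, which is what lets the search walk out of local optima, and the aspiration test re-admits a tabu move whenever it would beat the best solution found so far.

```python
import random
from collections import deque

def objective(bits):
    """Toy problem: maximize the number of ones."""
    return sum(bits)

def tabu_search(n_bits=32, iterations=200, tenure=7, seed=0):
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in range(n_bits)]
    best = current[:]
    tabu = deque(maxlen=tenure)                 # short-term memory of recently flipped positions
    for _ in range(iterations):
        candidates = []
        for i in range(n_bits):                 # evaluate the full 1-flip neighbourhood
            neighbour = current[:]
            neighbour[i] = 1 - neighbour[i]
            # Skip tabu moves unless they beat the best solution found (aspiration criterion).
            if i not in tabu or objective(neighbour) > objective(best):
                candidates.append((objective(neighbour), i, neighbour))
        score, move, current = max(candidates)  # take the best admissible move, even if it worsens
        tabu.append(move)
        if score > objective(best):
            best = current[:]
    return best

print(objective(tabu_search()))
```
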

2,735 citations


Cites background from "A Taxonomy of Hybrid Metaheuristics..."

  • ...The best results found for many real-life or classical optimization problems are obtained by hybrid algorithms [747]....

Journal ArticleDOI
TL;DR: A fresh treatment is introduced that classifies and discusses existing work within three rational aspects: what and how EA components contribute to exploration and exploitation; when and how exploration and exploitation are controlled; and how the balance between exploration and exploitation is achieved.
Abstract: “Exploration and exploitation are the two cornerstones of problem solving by search.” For more than a decade, Eiben and Schippers' [1998] advocacy for balancing between these two antagonistic cornerstones still greatly influences the research directions of evolutionary algorithms (EAs). This article revisits nearly 100 existing works and surveys how such works have answered the advocacy. The article introduces a fresh treatment that classifies and discusses existing work within three rational aspects: (1) what and how EA components contribute to exploration and exploitation; (2) when and how exploration and exploitation are controlled; and (3) how the balance between exploration and exploitation is achieved. With a more comprehensive and systematic understanding of exploration and exploitation, more research in this direction may be motivated and refined.
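
One classical, concrete answer to "when and how exploration and exploitation are controlled" is step-size adaptation in evolution strategies, e.g. the 1/5 success rule: if more than about a fifth of recent mutations succeed, the step size grows (more exploration); otherwise it shrinks (more exploitation). The (1+1)-ES sketch below is a simplified illustration of that idea; the objective, adaptation interval, and scaling factors are assumptions, not taken from the surveyed works.

```python
import random

def sphere(x):
    """Toy objective to minimize."""
    return sum(v * v for v in x)

def one_plus_one_es(dim=10, iterations=2000, seed=0):
    rng = random.Random(seed)
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    sigma = 1.0                            # mutation step size: the exploration knob
    successes = 0
    for t in range(1, iterations + 1):
        child = [v + rng.gauss(0, sigma) for v in parent]
        if sphere(child) <= sphere(parent):
            parent, successes = child, successes + 1
        if t % 20 == 0:                    # adapt every 20 evaluations
            rate = successes / 20
            # 1/5 success rule: too many successes -> explore more, too few -> exploit more.
            sigma *= 1.22 if rate > 0.2 else 0.82
            successes = 0
    return parent

best = one_plus_one_es()
print(sphere(best))
```
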

1,029 citations

Journal ArticleDOI
TL;DR: The experimental results confirm the efficiency of the proposed approaches in improving classification accuracy compared to other wrapper-based algorithms, which demonstrates the ability of the WOA algorithm to search the feature space and select the most informative attributes for classification tasks.
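
The wrapper setup referred to here can be summarized in a few lines: candidate feature subsets are encoded as binary masks, and a mask's fitness is the accuracy a classifier reaches using only the selected features. The sketch below shows only that evaluation step, on synthetic data with a plain 1-nearest-neighbour classifier; the search itself (WOA in the cited paper) would repeatedly call such a function. The data set, classifier, and split are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                       # synthetic data with 8 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)             # only the first two features carry signal
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

def wrapper_fitness(mask):
    """Accuracy of a 1-nearest-neighbour classifier restricted to the selected features."""
    if not mask.any():
        return 0.0
    tr, te = X_train[:, mask], X_test[:, mask]
    dists = ((te[:, None, :] - tr[None, :, :]) ** 2).sum(axis=2)
    predictions = y_train[dists.argmin(axis=1)]
    return float((predictions == y_test).mean())

full = np.ones(8, dtype=bool)
informative = np.array([1, 1, 0, 0, 0, 0, 0, 0], dtype=bool)
print(wrapper_fitness(full), wrapper_fitness(informative))   # the smaller subset usually scores higher
```
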

853 citations

Journal ArticleDOI
TL;DR: Presents a modern vision of the parallelization techniques used for evolutionary algorithms (EAs) and provides a highly structured background on parallel EAs (PEAs) to make researchers aware of the benefits of decentralizing and parallelizing an EA.
Abstract: This paper contains a modern vision of the parallelization techniques used for evolutionary algorithms (EAs). The work is motivated by two fundamental facts: 1) the different families of EAs have naturally converged in the last decade, while parallel EAs (PEAs) still lack unified studies; and 2) there is a large number of improvements in these algorithms and in their parallelization that raise the need for a comprehensive survey. We stress the differences between the EA model and its parallel implementation throughout the paper. We discuss the advantages and drawbacks of PEAs. Also, successful applications are mentioned and open problems are identified. We propose potential solutions to these problems and classify the different ways in which recent results in theory and practice are helping to solve them. Finally, we provide a highly structured background relating to PEAs in order to make researchers aware of the benefits of decentralizing and parallelizing an EA.
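
Among the parallel models such a survey covers, the coarse-grained (island) model is the easiest to sketch: several subpopulations evolve largely independently and periodically exchange their best individuals. The sketch below only imitates that communication pattern sequentially; population sizes, the migration interval, and the ring topology are illustrative assumptions.

```python
import random

def fitness(bits):                        # one-max: count the ones
    return sum(bits)

def evolve_one_generation(pop, rng, mutation_rate=0.02):
    """Binary tournament selection, uniform crossover, bit-flip mutation."""
    new_pop = []
    for _ in range(len(pop)):
        parent_a = max(rng.sample(pop, 2), key=fitness)
        parent_b = max(rng.sample(pop, 2), key=fitness)
        child = [a if rng.random() < 0.5 else b for a, b in zip(parent_a, parent_b)]
        child = [1 - g if rng.random() < mutation_rate else g for g in child]
        new_pop.append(child)
    return new_pop

rng = random.Random(0)
n_islands, n_bits = 4, 40
islands = [[[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(20)]
           for _ in range(n_islands)]

for generation in range(1, 101):
    islands = [evolve_one_generation(pop, rng) for pop in islands]
    if generation % 10 == 0:              # migration step on a ring topology
        migrants = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            worst_index = pop.index(min(pop, key=fitness))
            pop[worst_index] = migrants[(i - 1) % n_islands]   # receive the previous island's best

print(max(fitness(ind) for pop in islands for ind in pop))
```

In a genuinely parallel implementation each island would run in its own process, with migration as the only communication, which is what keeps this model's overhead low.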

810 citations


Cites methods from "A Taxonomy of Hybrid Metaheuristics..."

  • ...Several other heuristics such as simulated annealing [74], tabu search [48], and their combinations and variations [123] have been used with comparable results but will not be reviewed here....

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
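
The core mechanism described here is the Metropolis acceptance rule: a move that worsens the cost by ΔE is still accepted with probability exp(-ΔE/T), and the temperature T is gradually lowered so the search hardens from exploration into exploitation. A minimal sketch on a toy one-dimensional cost function follows; the cooling schedule and neighbourhood move are illustrative choices, not the paper's experimental setup.

```python
import math
import random

def energy(x):
    """Toy cost function with several local minima."""
    return x * x + 10 * math.sin(3 * x)

def simulated_annealing(t_start=10.0, t_end=1e-3, cooling=0.995, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    t = t_start
    while t > t_end:
        candidate = x + rng.gauss(0, 1)    # random neighbour of the current solution
        delta = energy(candidate) - energy(x)
        # Always accept improvements; accept worsening moves with probability exp(-delta / t).
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = candidate
        t *= cooling                       # geometric cooling schedule
    return x

x = simulated_annealing()
print(x, energy(x))
```
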

41,772 citations

Book
01 Jan 1975
TL;DR: The founding work in the area of adaptation: adaptation and modification as the root of intelligence, AI's aim to mimic biological optimization, and some (non-GA) branches of AI.
Abstract: Name of the founding work in the area. Adaptation is key to survival and evolution. Evolution implicitly optimizes organisms. AI wants to mimic biological optimization: survival of the fittest; exploration and exploitation; niche finding; robustness across changing environments (mammals vs. dinosaurs); self-regulation, self-repair and self-reproduction. Artificial Intelligence, some definitions: "Making computers do what they do in the movies"; "Making computers do what humans (currently) do best"; "Giving computers common sense; letting them make simple decisions" (do as I want, not what I say); "Anything too new to be pigeonholed". Adaptation and modification are the root of intelligence. Some (non-GA) branches of AI: expert systems (rule-based deduction).
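
A minimal genetic algorithm in the spirit of Holland's scheme: fitness-proportional (roulette-wheel) selection, one-point crossover, and bit-flip mutation on fixed-length bit strings. The one-max objective and all parameter values below are illustrative assumptions.

```python
import random

def fitness(bits):                         # one-max: maximize the number of ones
    return sum(bits)

def simple_ga(n_bits=20, pop_size=30, generations=60, p_mut=0.01, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(ind) + 1 for ind in pop]          # roulette-wheel selection weights
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = rng.choices(pop, weights=weights, k=2)  # fitness-proportional parents
            cut = rng.randrange(1, n_bits)                   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]   # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = simple_ga()
print(fitness(best), best)
```
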

32,573 citations


"A Taxonomy of Hybrid Metaheuristics..." refers background in this paper

  • ...Those metaheuristics include evolutionary algorithms (EA: genetic algorithms (GA) (Holland, 1975), evolution strategies (ES) (Rechenberg, 1973), genetic programming (Koza, 1992), etc.), ant colonies (AC) (Colorni, Dorigo, and Maniezzo, 1991), scatter search (SS) (Glover, 1977), and so on....

  • ...The different metaheuristics Ai treated in our examples are: SA (Simulated Annealing) (Kirkpatrick, Gelatt, and Vecchi, 1983), GA (Genetic Algorithms) (Holland, 1975), ES (Evolution Strategies) (Rechenberg, 1973), EP (Evolutionary Programming), GP (Genetic Programming) (Koza, 1992), NN (Neural Networks), LS (Descent Local Search) (Papadimitriou and Steiglitz, 1982), TS (Tabu Search) (Glover, 1989), GH (Greedy Heuristic) (Lawler, 1976), AC (Ant Colonies) (Colorni, Dorigo, and Maniezzo, 1991), SS (Scatter Search) (Glover, 1977), NM (Noisy Method) (Charon and Hudry, 1993), and CLP (Constraint Logic Programming) (Hentenryck, 1989)....

Book ChapterDOI
01 Jan 1988
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Abstract: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion
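
The "generalized delta rule" of this chapter is the backpropagation weight update: each weight is moved against the gradient of the squared output error, with a layer's error signal (delta) computed from the deltas of the layer above. A compact numpy sketch for a two-layer sigmoid network trained on XOR follows; the layer sizes, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two-layer network: 2 inputs -> 4 hidden units -> 1 output.
W1, b1 = rng.uniform(-1, 1, size=(2, 4)), np.zeros(4)
W2, b2 = rng.uniform(-1, 1, size=(4, 1)), np.zeros(1)
lr = 1.0

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: generalized delta rule, delta = error * derivative of the activation.
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_h
    b1 -= lr * delta_h.sum(axis=0)

print(np.round(out, 2).ravel())    # should approach [0, 1, 1, 0]
```
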

17,604 citations


"A Taxonomy of Hybrid Metaheuristics..." refers methods in this paper

  • ...Similarly, models of learning are often equated with techniques for local optimization (Rumelhart, Hinton, and Williams, 1986)....

Book
01 Jan 2002

17,039 citations