Journal ArticleDOI

Tabu Search—Part II

01 Aug 1989 · INFORMS Journal on Computing (INFORMS) · Vol. 2, Iss. 1, pp. 4-32
TL;DR: The elements of staged search and structured move sets are characterized, which bear on the issue of finiteness, and new dynamic strategies for managing tabu lists are introduced, allowing fuller exploitation of underlying evaluation functions.
Abstract: This is the second half of a two part series devoted to the tabu search metastrategy for optimization problems. Part I introduced the fundamental ideas of tabu search as an approach for guiding other heuristics to overcome the limitations of local optimality, both in a deterministic and a probabilistic framework. Part I also reported successful applications from a wide range of settings, in which tabu search frequently made it possible to obtain higher quality solutions than previously obtained with competing strategies, generally with less computational effort. Part II, in this issue, examines refinements and more advanced aspects of tabu search. Following a brief review of notation, Part II introduces new dynamic strategies for managing tabu lists, allowing fuller exploitation of underlying evaluation functions. In turn, the elements of staged search and structured move sets are characterized, which bear on the issue of finiteness. Three ways of applying tabu search to the solution of integer programmin...
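The abstract assumes familiarity with the basic tabu search loop introduced in Part I. As a point of reference, here is a minimal sketch of that loop with a simple fixed-tenure, recency-based tabu list; the toy objective, neighborhood function, and tenure value are illustrative assumptions and are not taken from the paper, which is concerned with dynamic (rather than fixed) tabu list management.

```python
import random

def tabu_search(objective, neighbors, start, tenure=7, max_iters=200):
    """Minimal tabu search: move to the best non-tabu neighbor, remember recent solutions."""
    current = best = start
    best_val = objective(start)
    tabu = []  # recency-based tabu list of recently visited solutions
    for _ in range(max_iters):
        candidates = [s for s in neighbors(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=objective)  # may be worse than 'best': this is how the search escapes local optima
        tabu.append(current)
        if len(tabu) > tenure:   # fixed tenure; Part II studies dynamic rules instead
            tabu.pop(0)
        if objective(current) < best_val:
            best, best_val = current, objective(current)
    return best, best_val

# Illustrative use: minimize a bumpy 1-D function over the integers.
if __name__ == "__main__":
    f = lambda x: (x - 13) ** 2 + 5 * (x % 3)
    nbrs = lambda x: [x - 2, x - 1, x + 1, x + 2]
    print(tabu_search(f, nbrs, start=random.randint(-50, 50)))
```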
Citations
Journal ArticleDOI
01 Feb 1996
TL;DR: It is shown how the ant system (AS) can be applied to other optimization problems such as the asymmetric traveling salesman problem, the quadratic assignment problem, and job-shop scheduling, and the salient characteristics of the AS are discussed: global data structure revision, distributed communication, and probabilistic transitions.
Abstract: An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the ant system (AS) can be applied to other optimization problems such as the asymmetric traveling salesman problem, the quadratic assignment problem, and job-shop scheduling. Finally, we discuss the salient characteristics of the AS: global data structure revision, distributed communication, and probabilistic transitions.
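To make the mechanism sketched in the abstract concrete (probabilistic transitions biased by pheromone and a greedy visibility heuristic, followed by evaporation and reinforcement), here is a compact toy Ant System for a symmetric TSP. The parameter values, the 1/distance "visibility" term, and the update rule are illustrative simplifications, not the exact formulation of the cited paper.

```python
import random

def ant_system_tsp(dist, n_ants=10, n_iters=50, alpha=1.0, beta=2.0, rho=0.5, Q=1.0):
    """Toy Ant System for a symmetric TSP given a full distance matrix `dist`."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]        # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = random.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                choices = list(unvisited)
                # probabilistic transition: pheromone^alpha * (1/distance)^beta
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in choices]
                j = random.choices(choices, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporation, then reinforcement proportional to tour quality (positive feedback)
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += Q / length
                tau[b][a] += Q / length
    return best_tour, best_len

# Illustrative use: a tiny 4-city instance.
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(ant_system_tsp(d))
```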

11,224 citations


Cites methods from "Tabu Search—Part II"

  • ...To run the comparisons, we implemented a Simulated Annealing (SA) [1], and a Tabu Search (TS) [17], [18]; we let each of them run 10 times on the Oliver30 data....


  • ...This process is iterated until the tour counter reaches the maximum (user-defined) number of cycles. Even though the name chosen recalls tabu search, proposed in [17, 18], there are substantial differences between our approach and tabu search algorithms....


Journal ArticleDOI
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.
Abstract: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class. These theorems result in a geometric interpretation of what it means for an algorithm to be well suited to an optimization problem. Applications of the NFL theorems to information-theoretic aspects of optimization and benchmark measures of performance are also presented. Other issues addressed include time-varying optimization problems and a priori "head-to-head" minimax distinctions between optimization algorithms, distinctions that result despite the NFL theorems' enforcing of a type of uniformity over all algorithms.
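The central NFL claim can be checked by brute force on a tiny finite search space. The sketch below (the domain size and the two fixed-visit-order strategies are arbitrary illustrative choices) enumerates every objective function on a four-point domain and confirms that two non-revisiting searchers produce identical performance distributions when aggregated over all functions.

```python
import itertools

X = [0, 1, 2, 3]          # search points
Y = [0, 1]                # possible objective values

def best_after(order, f, m):
    """Best (minimum) objective value observed after m distinct evaluations."""
    return min(f[x] for x in order[:m])

left_to_right = [0, 1, 2, 3]
right_to_left = [3, 2, 1, 0]

for m in range(1, len(X) + 1):
    # performance histograms over every possible function f: X -> Y
    hist_a = sorted(best_after(left_to_right, dict(zip(X, v)), m)
                    for v in itertools.product(Y, repeat=len(X)))
    hist_b = sorted(best_after(right_to_left, dict(zip(X, v)), m)
                    for v in itertools.product(Y, repeat=len(X)))
    print(f"m={m}: identical performance distribution? {hist_a == hist_b}")
```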

10,771 citations

Book
01 Jan 1996
TL;DR: An Introduction to Genetic Algorithms focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues.
Abstract: From the Publisher: "This is the best general book on Genetic Algorithms written to date. It covers background, history, and motivation; it selects important, informative examples of applications and discusses the use of Genetic Algorithms in scientific models; and it gives a good account of the status of the theory of Genetic Algorithms. Best of all the book presents its material in clear, straightforward, felicitous prose, accessible to anyone with a college-level scientific background. If you want a broad, solid understanding of Genetic Algorithms -- where they came from, what's being done with them, and where they are going -- this is the book. -- John H. Holland, Professor, Computer Science and Engineering, and Professor of Psychology, The University of Michigan; External Professor, the Santa Fe Institute. Genetic algorithms have been used in science and engineering as adaptive algorithms for solving practical problems and as computational models of natural evolutionary systems. This brief, accessible introduction describes some of the most interesting research in the field and also enables readers to implement and experiment with genetic algorithms on their own. It focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues. The descriptions of applications and modeling projects stretch beyond the strict boundaries of computer science to include dynamical systems theory, game theory, molecular biology, ecology, evolutionary biology, and population genetics, underscoring the exciting "general purpose" nature of genetic algorithms as search methods that can be employed across disciplines. An Introduction to Genetic Algorithms is accessible to students and researchers in any scientific discipline. It includes many thought and computer exercises that build on and reinforce the reader's understanding of the text. The first chapter introduces genetic algorithms and their terminology and describes two provocative applications in detail. The second and third chapters look at the use of genetic algorithms in machine learning (computer programs, data analysis and prediction, neural networks) and in scientific models (interactions among learning, evolution, and culture; sexual selection; ecosystems; evolutionary activity). Several approaches to the theory of genetic algorithms are discussed in depth in the fourth chapter. The fifth chapter takes up implementation, and the last chapter poses some currently unanswered questions and surveys prospects for the future of evolutionary computation.
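For readers who want the blurb's vocabulary (populations, selection, crossover, mutation) pinned down, here is a bare-bones genetic algorithm on bit strings with the classic one-max fitness function. The population size, mutation rate, and tournament selection scheme are illustrative assumptions, not details drawn from the book.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_mut=0.02, tournament=3):
    """Bare-bones GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            return max(random.sample(pop, tournament), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = select(), select()
            cut = random.randrange(1, n_bits)                 # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Illustrative use: maximize the number of ones in a bit string ("one-max").
print(genetic_algorithm(fitness=sum))
```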

9,933 citations

Journal ArticleDOI
TL;DR: Optimization results prove that the WOA algorithm is very competitive compared to state-of-the-art meta-heuristic algorithms as well as conventional methods.

7,090 citations

Book
01 Jan 2004
TL;DR: Ant colony optimization (ACO) is a relatively new approach to problem solving that takes inspiration from the social behaviors of insects and of other animals, as discussed by the authors. In particular, ants have inspired a number of methods and techniques, among which the most studied and most successful is the general-purpose optimization technique known as ant colony optimization.
Abstract: Swarm intelligence is a relatively new approach to problem solving that takes inspiration from the social behaviors of insects and of other animals. In particular, ants have inspired a number of methods and techniques, among which the most studied and the most successful is the general purpose optimization technique known as ant colony optimization. Ant colony optimization (ACO) takes inspiration from the foraging behavior of some ant species. These ants deposit pheromone on the ground in order to mark some favorable path that should be followed by other members of the colony. Ant colony optimization exploits a similar mechanism for solving optimization problems. From the early nineties, when the first ant colony optimization algorithm was proposed, ACO attracted the attention of increasing numbers of researchers and many successful applications are now available. Moreover, a substantial corpus of theoretical results is becoming available that provides useful guidelines to researchers and practitioners in further applications of ACO. The goal of this article is to introduce ant colony optimization and to survey its most notable applications.

6,861 citations

References
Journal ArticleDOI
13 May 1983 · Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
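To make the annealing analogy concrete, here is a minimal Metropolis-style simulated annealing loop: improvements are always accepted, uphill moves are accepted with probability exp(-delta/T), and the "temperature" T is lowered slowly. The geometric cooling schedule, starting temperature, and toy objective are illustrative assumptions rather than details from the article.

```python
import math
import random

def simulated_annealing(objective, neighbor, start, t0=10.0, cooling=0.995, t_min=1e-3):
    """Metropolis acceptance with a geometric cooling schedule."""
    current, current_val = start, objective(start)
    best, best_val = current, current_val
    t = t0
    while t > t_min:
        cand = neighbor(current)
        cand_val = objective(cand)
        delta = cand_val - current_val
        # always accept improvements; accept worse moves with probability exp(-delta/t)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, current_val = cand, cand_val
            if current_val < best_val:
                best, best_val = current, current_val
        t *= cooling   # slow "cooling": the analogue of gradually lowering a solid's temperature
    return best, best_val

# Illustrative use: minimize a bumpy 1-D function.
f = lambda x: x * x + 10 * math.sin(3 * x)
step = lambda x: x + random.uniform(-1, 1)
print(simulated_annealing(f, step, start=random.uniform(-10, 10)))
```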

41,772 citations

Book
01 Jan 1972
TL;DR: The principles of integer programming are directed toward finding solutions to problems from the fields of economic planning, engineering design, and combinatorial optimization; as mentioned in this paper, the text has been a standard of graduate-level courses since 1972.
Abstract: The principles of integer programming are directed toward finding solutions to problems from the fields of economic planning, engineering design, and combinatorial optimization. This highly respected and much-cited text, a standard of graduate-level courses since 1972, presents a comprehensive treatment of the first two decades of research on integer programming.

4,336 citations

Journal ArticleDOI
TL;DR: Four key areas of integer programming are examined from a framework that links the perspectives of artificial intelligence and operations research, and each has characteristics that appear usefully relevant to developments on the horizon.

3,985 citations

01 Jan 1985
TL;DR: This study tested human performance on a real and a virtual floor, as well as in a three-dimensional (3D) virtual space, and modeled these results by a graph pyramid algorithm, which suggests that deterioration of performance in the 3D space can be attributed to geometrical relations between hierarchical clustering in a 3D space and coarse-to-fine production of a tour.
Abstract: When a two-dimensional (2D) traveling salesman problem (TSP) is presented on a computer screen, human subjects can produce near-optimal tours in linear time. In this study we tested human performance on a real and a virtual floor, as well as in a three-dimensional (3D) virtual space. Human performance on the real floor is as good as that on a computer screen. Performance on a virtual floor is very similar, while that in a 3D space is slightly but systematically worse. We modeled these results by a graph pyramid algorithm. The same algorithm can account for the results with 2D and 3D problems, which suggests that deterioration of performance in the 3D space can be attributed to geometrical relations between hierarchical clustering in a 3D space and coarse-to-fine production of a tour.

1,790 citations

Journal ArticleDOI
TL;DR: In this paper, a class of surrogate constraint heuristics is proposed for integer programming problems. These heuristics are based on a simple framework that illuminates the character of several earlier heuristic proposals and provides a variety of new alternatives.
Abstract: This paper proposes a class of surrogate constraint heuristics for obtaining approximate, near optimal solutions to integer programming problems. These heuristics are based on a simple framework that illuminates the character of several earlier heuristic proposals and provides a variety of new alternatives. The paper also proposes additional heuristics that can be used either to supplement the surrogate constraint procedures or to provide independent solution strategies. Preliminary computational results are reported for applying one of these alternatives to a class of nonlinear generalized set covering problems involving approximately 100 constraints and 300–500 integer variables. The solutions obtained by the tested procedure had objective function values twice as good as values obtained by standard approaches (e.g., reducing the best objective function values of other methods from 85 to 40 on the average). Total solution time for the tested procedure ranged from ten to twenty seconds on the CDC 6600.
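As one way to make the notion of a surrogate constraint concrete (not the specific procedures tested in the paper), the sketch below combines the rows of a 0-1 multi-constraint knapsack problem into a single weighted surrogate constraint and applies a greedy value-to-surrogate-weight heuristic, checking feasibility against the original constraints. The uniform multipliers and the greedy rule are illustrative assumptions.

```python
def surrogate_constraint_heuristic(values, constraints, capacities, multipliers=None):
    """
    Greedy heuristic for a 0-1 multi-constraint knapsack:
    maximize sum(values[j]*x[j]) subject to, for each i, sum(constraints[i][j]*x[j]) <= capacities[i].

    The rows are combined into one surrogate constraint
        sum_j (sum_i u_i * a_ij) * x_j <= sum_i u_i * b_i
    and items are taken greedily by value / surrogate-weight ratio,
    while enforcing feasibility in the original constraints.
    """
    m, n = len(constraints), len(values)
    u = multipliers or [1.0] * m                     # simple uniform multipliers
    surrogate_w = [sum(u[i] * constraints[i][j] for i in range(m)) for j in range(n)]

    order = sorted(range(n), key=lambda j: values[j] / surrogate_w[j], reverse=True)
    used = [0.0] * m
    chosen = []
    for j in order:
        if all(used[i] + constraints[i][j] <= capacities[i] for i in range(m)):
            chosen.append(j)
            for i in range(m):
                used[i] += constraints[i][j]
    return chosen, sum(values[j] for j in chosen)

# Illustrative use: 4 items, 2 resource constraints.
values = [10, 7, 5, 9]
constraints = [[4, 3, 2, 5],    # resource 1 usage per item
               [3, 4, 1, 4]]    # resource 2 usage per item
capacities = [8, 7]
print(surrogate_constraint_heuristic(values, constraints, capacities))
```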

1,326 citations