Journal ArticleDOI

Variable neighborhood search

01 Nov 1997-Computers & Operations Research (Elsevier)-Vol. 24, Iss: 11, pp 1097-1100
TL;DR: This paper presents the basic schemes of VNS and some of its extensions, and describes five families of applications in which VNS has proven to be very successful.
About: This article was published in Computers & Operations Research on 1997-11-01 and has received 3,572 citations to date. It focuses on the topics: Variable neighborhood search & Local optimum.
Citations
Journal ArticleDOI
TL;DR: In this article, the authors present variable neighborhood search (VNS), a simple and effective metaheuristic for combinatorial and global optimization that can easily be implemented using any local search algorithm as a subroutine.

1,824 citations
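The scheme summarized in the TL;DR above lends itself to a compact implementation. The following is a minimal sketch of basic VNS in Python, not the authors' reference code; the `shake`, `local_search`, and `cost` callables are placeholders supplied by the caller, and the parameter defaults are illustrative.

```python
def vns(x0, shake, local_search, cost, k_max=3, max_iter=100):
    """Minimal basic-VNS skeleton (illustrative; not the paper's own code).

    shake(x, k)     -- returns a random point in the k-th neighborhood of x
    local_search(x) -- any local search algorithm, used as a subroutine
    cost(x)         -- objective function to minimize
    """
    x, best = x0, cost(x0)
    for _ in range(max_iter):
        k = 1
        while k <= k_max:
            x2 = local_search(shake(x, k))  # shaking step, then descent
            if cost(x2) < best:             # improvement: recenter, restart at k = 1
                x, best, k = x2, cost(x2), 1
            else:                           # no improvement: widen the neighborhood
                k += 1
    return x, best


# Toy usage (hypothetical problem): minimize (x - 7)^2 over the integers,
# with N_k(x) = {x - k, ..., x + k}; a one-step greedy move stands in for
# a full local search here, purely for brevity.
import random

x_best, f_best = vns(
    0,
    shake=lambda x, k: x + random.randint(-k, k),
    local_search=lambda x: min(x - 1, x, x + 1, key=lambda y: (y - 7) ** 2),
    cost=lambda x: (x - 7) ** 2,
)
```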

Journal ArticleDOI
TL;DR: This paper presents a heuristic for the pickup and delivery problem with time windows, based on an extension of the large neighborhood search heuristic previously suggested for the vehicle routing problem with time windows; the resulting heuristic is very robust and adapts to various instance characteristics.
Abstract: The pickup and delivery problem with time windows is the problem of serving a number of transportation requests using a limited number of vehicles. Each request involves moving a number of goods from a pickup location to a delivery location. Our task is to construct routes that visit all locations such that corresponding pickups and deliveries are placed on the same route, and such that a pickup is performed before the corresponding delivery. The routes must also satisfy time window and capacity constraints. This paper presents a heuristic for the problem based on an extension of the large neighborhood search heuristic previously suggested for solving the vehicle routing problem with time windows. The proposed heuristic is composed of a number of competing subheuristics that are used with a frequency corresponding to their historic performance. This general framework is denoted adaptive large neighborhood search. The heuristic is tested on more than 350 benchmark instances with up to 500 requests. It is able to improve the best known solutions from the literature for more than 50% of the problems. The computational experiments indicate that it is advantageous to use several competing subheuristics instead of just one. We believe that the proposed heuristic is very robust and is able to adapt to various instance characteristics.

1,685 citations
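The adaptive layer described in the abstract above, choosing among competing subheuristics with frequencies tied to their historic performance, can be sketched as follows. This is a hedged simplification, not Ropke and Pisinger's implementation: the reward tiers, the reaction factor, and the improvement-only acceptance rule (standing in for the paper's simulated-annealing acceptance criterion) are illustrative choices, and `destroy_ops`, `repair_ops`, and `cost` are caller-supplied placeholders.

```python
import random

def alns(x0, destroy_ops, repair_ops, cost, iters=1000, reaction=0.1,
         rewards=(3.0, 1.0, 0.1)):
    """Sketch of the adaptive layer only. Subheuristics are chosen by
    roulette wheel; their weights are updated from historic performance."""
    w_destroy = [1.0] * len(destroy_ops)
    w_repair = [1.0] * len(repair_ops)
    x = best = x0
    for _ in range(iters):
        i = random.choices(range(len(destroy_ops)), weights=w_destroy)[0]
        j = random.choices(range(len(repair_ops)), weights=w_repair)[0]
        cand = repair_ops[j](destroy_ops[i](x))   # destroy part of x, rebuild it
        if cost(cand) < cost(best):
            best = x = cand
            score = rewards[0]                    # found a new overall best
        elif cost(cand) < cost(x):
            x = cand
            score = rewards[1]                    # improved the current solution
        else:
            score = rewards[2]                    # no improvement
        # exponential smoothing of the weights toward the observed score
        w_destroy[i] = (1 - reaction) * w_destroy[i] + reaction * score
        w_repair[j] = (1 - reaction) * w_repair[j] + reaction * score
    return best
```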


Cites methods from "Variable neighborhood search"

  • ...The related Variable Neighborhood Search (VNS) was proposed by Mladenović and Hansen [14]....


Journal ArticleDOI
TL;DR: The components and concepts that are used in various metaheuristics are outlined in order to analyze their similarities and differences; the classification adopted in this paper differentiates between single-solution-based metaheuristics and population-based metaheuristics.

1,343 citations


Cites methods from "Variable neighborhood search"

  • ...Variable Neighborhood Search (VNS) is a metaheuristic proposed by Hansen and Mladenović [184, 186]....


Journal ArticleDOI
TL;DR: A critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas are presented.
Abstract: Hyper-heuristics comprise a set of approaches that are motivated (at least in part) by the goal of automating the design of heuristic methods to solve hard computational search problems. An underlying strategic research challenge is to develop more generally applicable search methodologies. The term hyper-heuristic is relatively new; it was first used in 2000 to describe heuristics to choose heuristics in the context of combinatorial optimisation. However, the idea of automating the design of heuristics is not new; it can be traced back to the 1960s. The definition of hyper-heuristics has been recently extended to refer to a search method or learning mechanism for selecting or generating heuristics to solve computational search problems. Two main hyper-heuristic categories can be considered: heuristic selection and heuristic generation. The distinguishing feature of hyper-heuristics is that they operate on a search space of heuristics (or heuristic components) rather than directly on the search space of solutions to the underlying problem that is being addressed. This paper presents a critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas. Current research trends and directions for future research are also discussed.

1,023 citations
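Since the distinguishing feature described above is a search over heuristics rather than over solutions, a minimal heuristic-selection loop may help make it concrete. The sketch below is illustrative only: the epsilon-greedy scoring is one simple selection mechanism among the many the survey covers, and `low_level_heuristics` and `cost` are caller-supplied placeholders.

```python
import random

def selection_hyper_heuristic(x0, low_level_heuristics, cost, iters=500,
                              epsilon=0.2):
    """Minimal heuristic-selection loop: the search operates on which
    heuristic to apply next, not directly on the solution space."""
    scores = [1.0] * len(low_level_heuristics)
    x = x0
    for _ in range(iters):
        if random.random() < epsilon:                 # explore a random heuristic
            h = random.randrange(len(low_level_heuristics))
        else:                                         # exploit the best-scoring one
            h = max(range(len(scores)), key=scores.__getitem__)
        cand = low_level_heuristics[h](x)
        if cost(cand) <= cost(x):                     # accept non-worsening moves
            scores[h] += 1.0
            x = cand
        else:
            scores[h] *= 0.9                          # penalize a failed application
    return x
```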


Cites background from "Variable neighborhood search"

  • ...VNS systematically switches neighbourhoods in a predefined sequence so that the search can explore increasingly distant neighbourhoods of the current solution....


  • ...Therefore, we can say that VNS is a high-level heuristic that coordinates the behaviour of several neighbourhood structures....


  • ...Variable neighbourhood search: Although generally not including an adaptive mechanism, Variable Neighbourhood Search (VNS) (Mladenović and Hansen, 1997) is related to heuristic selection based on perturbative heuristics in that such a method exploits the search power of multiple neighbourhoods....


Book ChapterDOI
01 Jan 2003
TL;DR: Iterated Local Search (ILS) as mentioned in this paper is a general purpose metaheuristic for finding good solutions of combinatorial optimization problems, which is based on building a sequence of (locally optimal) solutions by perturbing the current solution and applying local search to that modified solution.
Abstract: This is a survey of "Iterated Local Search", a general purpose metaheuristic for finding good solutions of combinatorial optimization problems. It is based on building a sequence of (locally optimal) solutions by: (1) perturbing the current solution; (2) applying local search to that modified solution. At a high level, the method is simple, yet it allows for a detailed use of problem-specific properties. After giving a general framework, we cover the uses of Iterated Local Search on a number of well studied problems.

969 citations
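The two-step loop in the abstract above, perturb and then re-apply local search, translates directly into code. Below is a minimal sketch, not the survey's own pseudocode: `perturb`, `local_search`, and `cost` are caller-supplied, and the better-of-two acceptance rule is just one of the acceptance criteria such methods can use.

```python
def iterated_local_search(x0, perturb, local_search, cost, iters=100):
    """Skeleton of the two-step ILS loop (illustrative sketch).

    perturb(x)      -- modifies the current locally optimal solution
    local_search(x) -- returns a local optimum reachable from x
    cost(x)         -- objective function to minimize
    """
    x = local_search(x0)                 # start from a first local optimum
    for _ in range(iters):
        x2 = local_search(perturb(x))    # (1) perturb, (2) re-apply local search
        if cost(x2) < cost(x):           # 'better' acceptance; one of several options
            x = x2
    return x
```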

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.

41,772 citations
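The annealing analogy maps onto a short optimization loop: always accept improving moves, accept worsening moves with probability exp(-delta/T), and lower the temperature over time. The sketch below is a textbook rendering in the spirit of this paper, not its original code; the geometric cooling schedule and the parameter defaults are illustrative assumptions, and `neighbor` and `cost` are caller-supplied.

```python
import math
import random

def simulated_annealing(x0, neighbor, cost, t0=1.0, alpha=0.995, iters=10000):
    """Textbook simulated-annealing loop (illustrative sketch)."""
    x = best = x0
    t = t0
    for _ in range(iters):
        cand = neighbor(x)
        delta = cost(cand) - cost(x)
        # Metropolis rule: accept improvements outright; accept worsening
        # moves with probability exp(-delta / t)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < cost(best):
                best = x
        t *= alpha                       # geometric cooling schedule
    return best
```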

Book
01 Jan 1979
TL;DR: This quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in their book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., San Francisco, 1979.
Abstract: This is the second edition of a quarterly column the purpose of which is to provide a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and myself in our book ‘‘Computers and Intractability: A Guide to the Theory of NP-Completeness,’’ W. H. Freeman & Co., San Francisco, 1979 (hereinafter referred to as ‘‘[G&J]’’; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed. Readers having results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time-solvability, etc.), or open problems they would like publicized, should send them to David S. Johnson, Room 2C355, Bell Laboratories, Murray Hill, NJ 07974, including details, or at least sketches, of any new proofs (full papers are preferred). In the case of unpublished results, please state explicitly that you would like the results mentioned in the column. Comments and corrections are also welcome. For more details on the nature of the column and the form of desired submissions, see the December 1981 issue of this journal.

40,020 citations

01 Jan 1967
TL;DR: The k-means procedure partitions an N-dimensional population into k sets on the basis of a sample; the k-means concept generalizes the ordinary sample mean, and the procedure is shown to give partitions that are reasonably efficient in the sense of within-class variance.
Abstract: The main purpose of this paper is to describe a process for partitioning an N-dimensional population into k sets on the basis of a sample. The process, which is called 'k-means,' appears to give partitions which are reasonably efficient in the sense of within-class variance. That is, if $p$ is the probability mass function for the population, $S = \{S_1, S_2, \ldots, S_k\}$ is a partition of $E^N$, and $u_i$, $i = 1, 2, \ldots, k$, is the conditional mean of $p$ over the set $S_i$, then $W^2(S) = \sum_{i=1}^{k} \int_{S_i} |z - u_i|^2 \, dp(z)$ tends to be low for the partitions S generated by the method. We say 'tends to be low,' primarily because of intuitive considerations, corroborated to some extent by mathematical analysis and practical computational experience. Also, the k-means procedure is easily programmed and is computationally economical, so that it is feasible to process very large samples on a digital computer. Possible applications include methods for similarity grouping, nonlinear prediction, approximating multivariate distributions, and nonparametric tests for independence among several variables. In addition to suggesting practical classification methods, the study of k-means has proved to be theoretically interesting. The k-means concept represents a generalization of the ordinary sample mean, and one is naturally led to study the pertinent asymptotic behavior, the object being to establish some sort of law of large numbers for the k-means. This problem is sufficiently interesting, in fact, for us to devote a good portion of this paper to it. The k-means are defined in section 2.1, and the main results which have been obtained on the asymptotic behavior are given there. The rest of section 2 is devoted to the proofs of these results. Section 3 describes several specific possible applications, and reports some preliminary results from computer experiments conducted to explore the possibilities inherent in the k-means idea. The extension to general metric spaces is indicated briefly in section 4. The original point of departure for the work described here was a series of problems in optimal classification (MacQueen [9]) which represented special

24,320 citations
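As a companion to the description above, here is a small pure-Python sketch of the partitioning idea. One hedge is needed: MacQueen's original procedure updates the means sequentially as samples arrive, whereas this sketch uses the batch (Lloyd-style) variant for brevity, so it is not the paper's exact procedure; `points` is assumed to be a list of equal-length tuples of floats with `len(points) >= k`.

```python
import random

def k_means(points, k, iters=100):
    """Batch (Lloyd-style) sketch of the k-means idea (illustrative only)."""
    means = random.sample(points, k)
    clusters = []
    for _ in range(iters):
        # assignment step: each point joins its nearest mean
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, means[j])))
            clusters[i].append(p)
        # update step: recompute each conditional mean (keep old mean if a
        # cluster comes up empty)
        new_means = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else means[j]
            for j, cl in enumerate(clusters)
        ]
        if new_means == means:           # stable partition: assignments converged
            break
        means = new_means
    return means, clusters
```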

Book
31 Jul 1981
TL;DR: This book, "Pattern Recognition with Fuzzy Objective Function Algorithms" by J. C. Bezdek, develops clustering methods based on fuzzy objective functions, including the fuzzy c-means algorithm.

15,662 citations


"Variable neighborhood search" refers methods in this paper

  • ...[23] R.L. Cannon, J.C. Bezdek and J.V. Dave (1986) Efficient implementation of the fuzzy c-means clustering algorithm....


  • ...[13] J.C. Bezdek (1981) Pattern Recognition with Fuzzy Objective Function Algorithms....


  • ...[42] J.C. Dunn (1974) A fuzzy relative of the ISODATA process and its use in detecting compact well-separated clusters....
