
Showing papers on "Local search (optimization) published in 1995"


Journal ArticleDOI
TL;DR: In this article, a Bayesian approach for learning Bayesian networks from a combination of prior knowledge and statistical data is presented, which is derived from a set of assumptions made previously as well as the assumption of likelihood equivalence, which says that data should not help to discriminate network structures that represent the same assertions of conditional independence.
Abstract: We describe a Bayesian approach for learning Bayesian networks from a combination of prior knowledge and statistical data. First and foremost, we develop a methodology for assessing informative priors needed for learning. Our approach is derived from a set of assumptions made previously as well as the assumption of likelihood equivalence, which says that data should not help to discriminate network structures that represent the same assertions of conditional independence. We show that likelihood equivalence when combined with previously made assumptions implies that the user's priors for network parameters can be encoded in a single Bayesian network for the next case to be seen—a prior network—and a single measure of confidence for that network. Second, using these priors, we show how to compute the relative posterior probabilities of network structures given data. Third, we describe search methods for identifying network structures with high posterior probabilities. We describe polynomial algorithms for finding the highest-scoring network structures in the special case where every node has at most k = 1 parent. For the general case (k > 1), which is NP-hard, we review heuristic search algorithms including local search, iterative local search, and simulated annealing. Finally, we describe a methodology for evaluating Bayesian-network learning algorithms, and apply this approach to a comparison of various approaches.

4,124 citations
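The heuristic structure search described above can be illustrated with a minimal hill-climbing sketch. The add/delete/reverse move set is the standard one for this kind of search, but the scoring function passed in below is a stand-in for the Bayesian score the paper derives, and the helper names are illustrative assumptions:

```python
from itertools import product

def hill_climb_structure(n, score, max_iters=100):
    """Greedy local search over DAG structures (one of the strategies the
    paper reviews): neighbors add, delete, or reverse a single edge; move
    to the best-scoring acyclic neighbor until no move improves."""
    def acyclic(es):
        # Kahn-style topological check
        indeg = {v: 0 for v in range(n)}
        for _, v in es:
            indeg[v] += 1
        stack = [v for v in range(n) if indeg[v] == 0]
        seen = 0
        while stack:
            u = stack.pop()
            seen += 1
            for a, b in es:
                if a == u:
                    indeg[b] -= 1
                    if indeg[b] == 0:
                        stack.append(b)
        return seen == n

    edges = set()
    for _ in range(max_iters):
        moves = []
        for u, v in product(range(n), repeat=2):
            if u == v:
                continue
            if (u, v) in edges:
                moves.append(edges - {(u, v)})                # delete edge
                moves.append(edges - {(u, v)} | {(v, u)})     # reverse edge
            elif (v, u) not in edges:
                moves.append(edges | {(u, v)})                # add edge
        best = max((m for m in moves if acyclic(m)), key=score, default=None)
        if best is None or score(best) <= score(edges):
            break                                             # local optimum
        edges = best
    return edges
```

With a real posterior score this is exactly the "local search" strategy named in the abstract; iterated local search and simulated annealing differ only in how they escape the local optimum this loop stops at.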


Journal ArticleDOI
TL;DR: This paper defines the various components comprising a GRASP and demonstrates, step by step, how to develop such heuristics for combinatorial optimization problems.
Abstract: Today, a variety of heuristic approaches are available to the operations research practitioner. One methodology that has a strong intuitive appeal, a prominent empirical track record, and is trivial to efficiently implement on parallel processors is GRASP (Greedy Randomized Adaptive Search Procedures). GRASP is an iterative randomized sampling technique in which each iteration provides a solution to the problem at hand. The incumbent solution over all GRASP iterations is kept as the final result. There are two phases within each GRASP iteration: the first intelligently constructs an initial solution via an adaptive randomized greedy function; the second applies a local search procedure to the constructed solution in hope of finding an improvement. In this paper, we define the various components comprising a GRASP and demonstrate, step by step, how to develop such heuristics for combinatorial optimization problems. Intuitive justifications for the observed empirical behavior of the methodology are discussed. The paper concludes with a brief literature review of GRASP implementations and mentions two industrial applications.

2,370 citations
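The two GRASP phases can be sketched on a toy problem. Minimum vertex cover, the `alpha` parameter controlling the restricted candidate list (RCL), and the tiny edge list below are illustrative assumptions, not from the paper:

```python
import random

# toy instance: a small graph given as an edge list (illustrative only)
EDGES = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5)]

def is_cover(s):
    return all(u in s or v in s for u, v in EDGES)

def construct(rng, alpha=0.5):
    """Phase 1: adaptive randomized greedy construction.  Each step picks
    a vertex at random from a restricted candidate list (RCL) of vertices
    covering many still-uncovered edges."""
    cover, uncovered = set(), list(EDGES)
    while uncovered:
        deg = {}
        for u, v in uncovered:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        hi, lo = max(deg.values()), min(deg.values())
        rcl = [v for v, d in deg.items() if d >= hi - alpha * (hi - lo)]
        pick = rng.choice(rcl)
        cover.add(pick)
        uncovered = [e for e in uncovered if pick not in e]
    return cover

def local_search(cover):
    """Phase 2: drop any vertex whose removal keeps all edges covered."""
    improved = True
    while improved:
        improved = False
        for v in sorted(cover):
            if is_cover(cover - {v}):
                cover, improved = cover - {v}, True
                break
    return cover

def grasp(iterations=30, seed=1):
    """Keep the incumbent (best) solution over all GRASP iterations."""
    rng = random.Random(seed)
    best = None
    for _ in range(iterations):
        sol = local_search(construct(rng))
        if best is None or len(sol) < len(best):
            best = sol
    return best
```

The outer loop is what makes GRASP trivial to parallelize: iterations are independent apart from tracking the incumbent.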


Book ChapterDOI
01 Jan 1995
TL;DR: Computational bounds for local search in combinatorial optimization, and local search algorithms and frameworks for combinatorial problems.
Abstract: [Index of related titles:] computational bounds for local search in combinatorial optimization; local search algorithms for combinatorial problems; a dual local search framework; the MAX-MIN Ant System with local search; metaheuristic search for combinatorial optimization; local search genetic algorithms for the job shop; dynamic and adaptive neighborhood search; stochastic and estimation-based local search for stochastic combinatorial optimization; set-based local search for multiobjective combinatorial optimization; hybrid metaheuristics in combinatorial optimization; model-based search for combinatorial optimization; LocalSolver: black-box local search for combinatorial problems; and the question "how easy is local search?"

1,055 citations


01 Jan 1995
TL;DR: The power of local search for satisfiability testing can be further enhanced by employing a new strategy, called mixed random walk, for escaping from local minima, which allows us to handle formulas that are substantially larger than those that can be solved with basic local search.

Abstract: It has recently been shown that local search is surprisingly good at finding satisfying assignments for certain classes of CNF formulas [24]. In this paper we demonstrate that the power of local search for satisfiability testing can be further enhanced by employing a new strategy, called "mixed random walk", for escaping from local minima. We present experimental results showing how this strategy allows us to handle formulas that are substantially larger than those that can be solved with basic local search. We also present a detailed comparison of our random walk strategy with simulated annealing. Our results show that mixed random walk is the superior strategy on several classes of computationally difficult problem instances. Finally, we present results demonstrating the effectiveness of local search with walk for solving circuit synthesis and diagnosis problems.

674 citations
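A minimal sketch of the mixed random walk strategy, in the spirit of the paper but with illustrative parameter choices: with probability p, flip a random variable from a random unsatisfied clause; otherwise make the greedy flip of basic local search:

```python
import random

def walk_sat(clauses, n_vars, p=0.5, max_flips=10_000, seed=0):
    """Local search with a mixed random walk (illustrative parameters):
    with probability p flip a random variable from a random unsatisfied
    clause, otherwise make the greedy flip that satisfies the most
    clauses.  Literals are nonzero ints: +v is x_v, -v is NOT x_v."""
    rng = random.Random(seed)
    assign = [rng.choice([True, False]) for _ in range(n_vars)]

    def sat(lit):
        val = assign[abs(lit) - 1]
        return val if lit > 0 else not val

    def unsat():
        return [c for c in clauses if not any(sat(l) for l in c)]

    def n_satisfied():
        return len(clauses) - len(unsat())

    for _ in range(max_flips):
        bad = unsat()
        if not bad:
            return assign                      # satisfying assignment found
        if rng.random() < p:                   # random-walk move
            v = abs(rng.choice(rng.choice(bad))) - 1
        else:                                  # greedy (basic local search) move
            def gain(v):
                assign[v] = not assign[v]
                g = n_satisfied()
                assign[v] = not assign[v]
                return g
            v = max(range(n_vars), key=gain)
        assign[v] = not assign[v]
    return None                                # give up after max_flips
```

Setting p = 0 recovers purely greedy local search, which is exactly the variant that gets stuck in the local minima the walk is designed to escape.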


Book ChapterDOI
09 Jul 1995
TL;DR: Ant-Q algorithms were inspired by work on the ant system (AS), a distributed algorithm for combinatorial optimization based on the metaphor of ant colonies and are applied to the solution of symmetric and asymmetric instances of the traveling salesman problem.
Abstract: In this paper we introduce Ant-Q, a family of algorithms which present many similarities with Q-learning (Watkins, 1989), and which we apply to the solution of symmetric and asymmetric instances of the traveling salesman problem (TSP). Ant-Q algorithms were inspired by work on the ant system (AS), a distributed algorithm for combinatorial optimization based on the metaphor of ant colonies which was recently proposed in (Dorigo, 1992; Dorigo, Maniezzo and Colorni, 1996). We show that AS is a particular instance of the Ant-Q family, and that there are instances of this family which perform better than AS. We experimentally investigate the functioning of Ant-Q and we show that the results obtained by Ant-Q on symmetric TSP's are competitive with those obtained by other heuristic approaches based on neural networks or local search. Finally, we apply Ant-Q to some difficult asymmetric TSP's obtaining very good results: Ant-Q was able to find solutions of a quality which usually can be found only by very specialized algorithms.

668 citations


Journal ArticleDOI
TL;DR: In the present study, genetic algorithms are proposed to automatically configure RBF networks, and the network configuration is formulated as a subset selection problem to find an optimal subset of nc terms from the Nt training data samples.

242 citations


Journal Article
TL;DR: In this article, the authors compare the syntactically defined class MAX SNP with the computationally defined class APX and show that every problem in APX can be "placed" (i.e., has an approximation-preserving reduction to a problem) in MAX SNP.
Abstract: We attempt to reconcile the two distinct views of approximation classes: syntactic and computational. Syntactic classes such as MAX SNP permit structural results and have natural complete problems, while computational classes such as APX allow us to work with classes of problems whose approximability is well understood. Our results provide a syntactic characterization of computational classes and give a computational framework for syntactic classes. We compare the syntactically defined class MAX SNP with the computationally defined class APX and show that every problem in APX can be "placed" (i.e., has an approximation-preserving reduction to a problem) in MAX SNP. Our methods introduce a simple, yet general, technique for creating approximation-preserving reductions which shows that any "well"-approximable problem can be reduced in an approximation-preserving manner to a problem which is hard to approximate to corresponding factors. The reduction then follows easily from the recent nonapproximability results for MAX SNP-hard problems. We demonstrate the generality of this technique by applying it to other classes such as RMAX(2) and MIN F$^{+}\Pi_2(1)$ which have the clique problem and the set cover problem, respectively, as complete problems. The syntactic nature of MAX SNP was used by Papadimitriou and Yannakakis [J. Comput. System Sci., 43 (1991), pp. 425--440] to provide approximation algorithms for every problem in the class. We provide an alternate approach to demonstrating this result using the syntactic nature of MAX SNP. We develop a general paradigm, nonoblivious local search, useful for developing simple yet efficient approximation algorithms. We show that such algorithms can find good approximations for all MAX SNP problems, yielding approximation ratios comparable to the best known for a variety of specific MAX SNP-hard problems.
Nonoblivious local search provably outperforms standard local search in both the degree of approximation achieved and the efficiency of the resulting algorithms.

232 citations


Journal ArticleDOI
TL;DR: This paper describes the development and testing of a Genetic Algorithm for the generation of multiple solutions to the assembly line balancing (ALB) problem, and results are achieved by combining the genetic approach with a simple local optimization procedure.

175 citations


Journal ArticleDOI
TL;DR: A survey of parallel local search algorithms is presented in which the concepts that can be used to incorporate parallelism into local search are reviewed and the concepts of hyper neighborhood structures and distributed neighborhood structures are introduced.
Abstract: We present a survey of parallel local search algorithms in which we review the concepts that can be used to incorporate parallelism into local search. For this purpose we distinguish between single-walk and multiple-walk parallel local search and between asynchronous and synchronous parallelism. Within the class of single-walk algorithms we differentiate between multiple-step and single-step parallelism. To describe parallel local search we introduce the concepts of hyper neighborhood structures and distributed neighborhood structures. Furthermore, we present templates that capture most of the parallel local search algorithms proposed in the literature. Finally, we discuss some complexity issues related to parallel local search.

161 citations


Book ChapterDOI
20 Nov 1995
TL;DR: This dissertation developed a distributed steady-state genetic algorithm in conjunction with a specialized local search heuristic for solving the set partitioning problem and found that performance improved as additional subpopulations were added to the computation.
Abstract: In this dissertation we report on our efforts to develop a parallel genetic algorithm and apply it to the solution of the set partitioning problem--a difficult combinatorial optimization problem used by many airlines as a mathematical model for flight crew scheduling. We developed a distributed steady-state genetic algorithm in conjunction with a specialized local search heuristic for solving the set partitioning problem. The genetic algorithm is based on an island model where multiple independent subpopulations each run a steady-state genetic algorithm on their own subpopulation and occasionally fit strings migrate between the subpopulations. Tests on forty real-world set partitioning problems were carried out on up to 128 nodes of an IBM SP1 parallel computer. We found that performance, as measured by the quality of the solution found and the iteration on which it was found, improved as additional subpopulations were added to the computation. With larger numbers of subpopulations the genetic algorithm was regularly able to find the optimal solution to problems having up to a few thousand integer variables. In two cases, high-quality integer feasible solutions were found for problems with 36,699 and 43,749 integer variables, respectively. A notable limitation we found was the difficulty solving problems with many constraints.

155 citations


Journal ArticleDOI
TL;DR: In this article, a local search heuristic (LSH) was proposed for large non-unicost set-covering problems (SCPs) based on the simulated annealing algorithm and uses an improvement routine designed to provide low-cost solutions within a reasonable amount of CPU time.
Abstract: In this note we describe a local-search heuristic (LSH) for large non-unicost set-covering problems (SCPs). The new heuristic is based on the simulated annealing algorithm and uses an improvement routine designed to provide low-cost solutions within a reasonable amount of CPU time. The solution costs associated with the LSH compared very favorably to the best previously published solution costs for 20 large SCPs taken from the literature. In particular, the LSH yielded new benchmark solutions for 17 of the 20 test problems. We also report that, for SCPs where column cost is correlated with column coverage, the new heuristic provides solution costs competitive with previously published results for comparable problems. © 1995 John Wiley & Sons, Inc.

Journal ArticleDOI
TL;DR: From the computational results, it can be concluded that the large-step optimization methods outperform the simulated annealing method and find an optimal schedule more frequently than the other studied methods.

Journal ArticleDOI
TL;DR: A cheap yet effective extension to the traditional local improvement algorithm that yields significant improvements over its plain local improvement counterpart without adversely affecting the algorithm's running time is proposed.

Journal ArticleDOI
TL;DR: The task of training subsymbolic systems is considered as a combinatorial optimization problem and solved with the heuristic scheme of the reactive tabu search (RTS), which is applicable to nondifferentiable functions, is robust with respect to the random initialization, and effective in continuing the search after local minima.
Abstract: In this paper the task of training subsymbolic systems is considered as a combinatorial optimization problem and solved with the heuristic scheme of the reactive tabu search (RTS). An iterative optimization process based on a "modified local search" component is complemented with a meta-strategy to realize a discrete dynamical system that discourages limit cycles and the confinement of the search trajectory in a limited portion of the search space. The possible cycles are discouraged by prohibiting (i.e., making tabu) the execution of moves that reverse the ones applied in the most recent part of the search. The prohibition period is adapted in an automated way. The confinement is avoided and a proper exploration is obtained by activating a diversification strategy when too many configurations are repeated excessively often. The RTS method is applicable to nondifferentiable functions, is robust with respect to the random initialization, and effective in continuing the search after local minima. Three tests of the technique on feedforward and feedback systems are presented.
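The core RTS ingredients, prohibition of recent moves with a prohibition period that adapts when configurations repeat, can be sketched for bit-string search. The tenure schedule, reaction rule, and toy objective below are simplified assumptions, not the paper's exact mechanism:

```python
import random

def reactive_tabu_search(n_bits, cost, max_iters=200, seed=0):
    """Sketch of reactive tabu search over bit strings.  Flipping bit i
    prohibits (makes tabu) flipping it back for `tenure` iterations; the
    tenure grows when a configuration is revisited, discouraging limit
    cycles.  An aspiration criterion always allows a flip that beats the
    best cost seen so far."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_cost = x[:], cost(x)
    last_flip = [-10**9] * n_bits              # iteration of each bit's last flip
    tenure, seen = 3, {}

    def flip_cost(i):
        x[i] ^= 1
        c = cost(x)
        x[i] ^= 1
        return c

    for it in range(max_iters):
        key = tuple(x)
        if key in seen and it - seen[key] < 2 * n_bits:
            tenure = min(n_bits - 1, tenure + 1)   # reaction: repetition detected
        seen[key] = it

        candidates = [(flip_cost(i), i) for i in range(n_bits)
                      if it - last_flip[i] >= tenure or flip_cost(i) < best_cost]
        c, i = min(candidates)                 # best admissible move
        x[i] ^= 1
        last_flip[i] = it
        if c < best_cost:
            best, best_cost = x[:], c
    return best, best_cost
```

For training subsymbolic systems the paper works with discretized weights; here the cost function is a black box, which is exactly why the method applies to nondifferentiable objectives.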

Journal ArticleDOI
TL;DR: An elitist simple genetic algorithm, the CHC algorithm and Genitor are compared using new test problems that are not readily solved using simple local search methods and a hybrid algorithm is examined that combines local and genetic search.
Abstract: Genetic algorithms have attracted a good deal of interest in the heuristic search community. Yet there are several different types of genetic algorithms with varying performance and search characteristics. In this article we look at three genetic algorithms: an elitist simple genetic algorithm, the CHC algorithm and Genitor. One problem in comparing algorithms is that most test problems in the genetic algorithm literature can be solved using simple local search methods. In this article, the three algorithms are compared using new test problems that are not readily solved using simple local search methods. We then compare a local search method to genetic algorithms for geometric matching and examine a hybrid algorithm that combines local and genetic search. The geometric matching problem matches a model (e.g., a line drawing) to a subset of lines contained in a field of line fragments. Local search is currently the best known method for solving general geometric matching problems.

01 Jan 1995
TL;DR: This paper explores the impact of focusing search in local search procedures on the unsatisfied variables, that is, those variables which appear in clauses which are not yet satisfied, and shows that for random problems such a focus reduces the sensitivity to input parameters.
Abstract: Several local search algorithms for propositional satisfiability have been proposed which can solve hard random problems beyond the range of conventional backtracking procedures. In this paper we explore the impact of focusing search in these procedures on the unsatisfied variables, that is, those variables which appear in clauses which are not yet satisfied. For random problems we show that such a focus reduces the sensitivity to input parameters. We also observe a simple scaling law in performance. For non-random problems we show that whilst this focus can improve performance, many problems remain difficult. We speculate that such problems will remain hard for local search unless constraint propagation techniques can be combined with hill climbing.

Proceedings ArticleDOI
29 Nov 1995
TL;DR: By combining a hierarchical crossover operator with two traditional single-point search algorithms (simulated annealing and stochastic iterated hill climbing), this work has solved some problems by processing fewer candidate solutions and with a greater probability of success than genetic programming.
Abstract: Addresses the problem of program discovery as defined by genetic programming. By combining a hierarchical crossover operator with two traditional single-point search algorithms (simulated annealing and stochastic iterated hill climbing), we have solved some problems by processing fewer candidate solutions and with a greater probability of success than genetic programming. We have also enhanced genetic programming by hybridizing it with the simple idea of hill climbing from a few individuals, at a fixed interval of generations.

Journal ArticleDOI
TL;DR: This paper attempts to determine, through computational testing, how these spaces can be successfully searched; an interesting result is the good performance of genetic algorithms in problem space.
Abstract: In a recent paper we discussed "problem" and "heuristic" spaces which serve as a basis for local search in job shop scheduling problems. By encoding schedules as (heuristic, problem) pairs (H,P), search spaces can be defined by perturbing problem data and/or heuristic parameters. In this paper we attempt to determine, through computational testing, how these spaces can be successfully searched. Well-known local search strategies are applied in problem and heuristic space and compared to Shifting Bottleneck heuristics and to probabilistic dispatching methods. An interesting result is the good performance of genetic algorithms in problem space. INFORMS Journal on Computing, ISSN 1091-9856, was published as ORSA Journal on Computing from 1989 to 1995 under ISSN 0899-1499.

Journal ArticleDOI
TL;DR: In this article, a version of the Reactive Tabu Search method (RTS) suitable for constrained problems is presented and tested on a series of constrained and unconstrained combinatorial optimization tasks.
Abstract: The purpose of this work is to present a version of the Reactive Tabu Search method (RTS) that is suitable for constrained problems, and to test RTS on a series of constrained and unconstrained Combinatorial Optimization tasks. The benchmark suite consists of many instances of the N-K model and of the Multiknapsack problem with various sizes and difficulties, defined with portable random number generators. The performance of RTS is compared with that of Repeated Local Minima Search, Simulated Annealing, Genetic Algorithms, and Neural Networks. In addition, the effects of different hashing schemes and of the presence of a simple "aspiration" criterion in the RTS algorithm are investigated.

Book ChapterDOI
01 Jan 1995
TL;DR: This work provides a framework that distinguishes between two developmental mechanisms — learning and maturation — while also showing several common effects on GA search, and identifies contexts in which maturation and local search can be distinguished from the fitness evaluation.
Abstract: The developmental mechanisms transforming genotypic to phenotypic forms are typically omitted in formulations of genetic algorithms (GAs) in which these two representational spaces are identical. We argue that a careful analysis of developmental mechanisms is useful when understanding the success of several standard GA techniques, and can clarify the relationships between more recently proposed enhancements. We provide a framework that distinguishes between two developmental mechanisms — learning and maturation — while also showing several common effects on GA search. This framework is used to analyze how maturation and local search can change the dynamics of the GA. We observe that in some contexts, maturation and local search can be incorporated into the fitness evaluation, but illustrate reasons for considering them separately. Further, we identify contexts in which maturation and local search can be distinguished from the fitness evaluation.

Journal ArticleDOI
01 Mar 1995
TL;DR: This work presents a scalable parallel local search algorithm based on data parallelism for the Traveling Salesman Problem that finds the same quality solutions as the classical 2-opt algorithm and has a good speed-up.
Abstract: We present a scalable parallel local search algorithm based on data parallelism. The concept of distributed neighborhood structures is introduced, and applied to the Traveling Salesman Problem (TSP). Our parallel local search algorithm finds the same quality solutions as the classical 2-opt algorithm and has a good speed-up. The algorithm is implemented on a Parsytec GCel, consisting of 512 transputers. Its performance is empirically analyzed for TSP instances with several thousands of cities.
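For reference, the sequential 2-opt baseline that the parallel algorithm is measured against can be sketched as follows. This is the classical algorithm on a toy instance, not the data-parallel transputer implementation from the paper:

```python
import math

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Classical sequential 2-opt: keep reversing a segment whose
    reversal shortens the tour until no improving move remains."""
    n, improved = len(tour), True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue                    # edges share a city, no-op
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # replace edges (a,b),(c,d) by (a,c),(b,d) if shorter
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

# toy usage: six cities on a unit circle; the optimal tour has length 6
pts = [(math.cos(k * math.pi / 3), math.sin(k * math.pi / 3)) for k in range(6)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
tour = two_opt([0, 2, 4, 1, 3, 5], dist)
```

The distributed-neighborhood idea in the paper partitions moves like the (i, j) pairs above across processors so that non-interacting improvements can be applied concurrently.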

Journal ArticleDOI
TL;DR: It is proved that, when restricted to cubic graphs, the FLIP local search becomes "easy" and finds a local max-cut in $O(n^2)$ steps; a class of integer linear programs associated with cubic graphs is introduced, and a combinatorial characterization of their feasibility is provided.
Abstract: The paper deals with the complexity of the local search, a topic introduced by Johnson, Papadimitriou, and Yannakakis. One consequence of their work, and a recent paper by Schaffer and Yannakakis, is that the local search does not provide a polynomial time algorithm to find locally optimum solutions for several hard combinatorial optimization problems. This motivates us to seek "easier" instances for which the local search is polynomial. In particular, it has been proved recently by Schaffer and Yannakakis that the max-cut problem with the FLIP neighborhood is PLS-complete, and hence belongs among the most difficult problems in the PLS-class (polynomial time local search). The FLIP neighborhood of a 2-partition is defined by moving a single vertex to the opposite class. We prove that, when restricted to cubic graphs, the FLIP local search becomes "easy" and finds a local max-cut in $O(n^2)$ steps. To prove the result, we introduce a class of integer linear programs associated with cubic graphs, and provide a combinatorial characterization of their feasibility.
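The FLIP neighborhood is small enough to sketch directly. Below is a minimal local search that moves one vertex across the partition whenever that increases the cut; the adjacency-list representation and the K4 example are illustrative:

```python
def flip_local_max_cut(adj, part):
    """FLIP local search for max-cut: while some vertex has more
    neighbors on its own side than on the other, move it across.
    Each move strictly increases the cut, so the search terminates;
    on cubic graphs the paper bounds the number of steps by O(n^2)."""
    while True:
        for v in range(len(adj)):
            same = sum(1 for u in adj[v] if part[u] == part[v])
            if same > len(adj[v]) - same:      # flipping v gains same-other edges
                part[v] ^= 1
                break
        else:
            return part                        # no improving flip: local max-cut

# toy usage: K4 is cubic; any 2-2 split is a local (and global) max-cut of size 4
adj = [[1, 2, 3], [0, 2, 3], [0, 1, 3], [0, 1, 2]]
part = flip_local_max_cut(adj, [0, 0, 0, 0])
```

The PLS-completeness result cited in the abstract says that on general graphs this same loop can take exponentially many improving steps; the paper's contribution is that degree 3 rules that out.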

Journal ArticleDOI
TL;DR: Informally, this work is interested in approximate solutions such that the performance ratio, that is, the ratio between the value of such a solution and the value of the optimum solution, is bounded by a constant, independently of the instance of the problem.

Journal ArticleDOI
TL;DR: Three specific algorithms are compared on robot landmark recognition problems and two hybrid algorithms successfully solve problems involving perspective, and in less time than required by the full-perspective algorithm.

Proceedings ArticleDOI
09 May 1995
TL;DR: Two lattice dynamic programming and lattice local search algorithms are shown to achieve comparable performance to the N-best search algorithm while running as much as 10 times faster on a 20 k word lexicon.
Abstract: The design of search algorithms is an important issue in large vocabulary speech recognition, especially as more complex models are developed for improving recognition accuracy. Multi-pass search strategies have been used as a means of applying simple models early on to prune the search space for subsequent passes using more expensive knowledge sources. The pruned search space is typically represented by an N-best sentence list or a word lattice. Here, we investigate three alternatives for lattice search: N-best rescoring, a lattice dynamic programming search algorithm and a lattice local search algorithm. Both the lattice dynamic programming and lattice local search algorithms are shown to achieve comparable performance to the N-best search algorithm while running as much as 10 times faster on a 20 k word lexicon; the local search algorithm has the additional advantage of accommodating sentence-level knowledge sources.

Journal ArticleDOI
TL;DR: The primary advantage of genetic algorithms, viz. the generalized search operators, enables easy combinations of these global search algorithms with local search heuristics to provide an efficient hybrid algorithm for the mapping problem without compromising the solution quality.

Journal ArticleDOI
TL;DR: This paper compares simulated annealing with a less frequently mentioned approach, threshold accepting, and shows that threshold accepting algorithms achieve performance comparable to or better than simulated annealing in both the standard and adaptive versions.
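The difference from simulated annealing is just the acceptance rule. Below is a generic threshold-accepting sketch with a toy objective and an illustrative threshold schedule, not the adaptive variants the paper evaluates:

```python
import random

def threshold_accepting(x0, neighbor, cost, thresholds, steps=400, seed=0):
    """Threshold accepting: like simulated annealing, but a candidate is
    accepted deterministically whenever its cost increase stays below the
    current threshold, instead of with a Metropolis probability."""
    rng = random.Random(seed)
    x, best = x0, x0
    for t in thresholds:                    # decreasing threshold schedule
        for _ in range(steps):
            y = neighbor(x, rng)
            if cost(y) - cost(x) < t:       # accept unless "too much worse"
                x = y
                if cost(x) < cost(best):
                    best = x
    return best

# toy usage: minimize (x - 17)^2 over the integers with +/-1 moves
best = threshold_accepting(
    0,
    lambda x, rng: x + rng.choice([-1, 1]),
    lambda x: (x - 17) ** 2,
    thresholds=[5, 2, 0],
)
```

Avoiding the exponential in the Metropolis test is one reason threshold accepting is cheap; the final threshold of 0 reduces the loop to plain improving-move local search.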

Proceedings ArticleDOI
20 Mar 1995
TL;DR: The experimental results show that the proposed genetic-based clustering algorithms have much higher probabilities of finding the global or near-global optimal solutions than the traditional algorithms.
Abstract: The traditional fuzzy objective-function-based clustering algorithms, the fuzzy c-means (FCM) algorithm and the FCM-type algorithms, are in essence local search techniques that search for the optimum by using a hill-climbing technique. Thus, they often fail in the search for global optimum. In this paper, we combine the genetic algorithms with traditional clustering algorithms to obtain a better clustering performance. Our experimental results show that the proposed genetic-based clustering algorithms have much higher probabilities of finding the global or near-global optimal solutions than the traditional algorithms.

Journal ArticleDOI
TL;DR: In this article, the problem of finding a planar graph (one that can be drawn in the plane without any edges intersecting) with the highest sum of edge weights is formulated as the weighted maximal planar graph problem.

Book ChapterDOI
03 Apr 1995
TL;DR: A set of recombination operators is defined for the new genetic representation, and experimental results are given to compare the performance of the operators with each other and with a system not using suggestion lists.
Abstract: This paper presents a new genetic representation for timetabling with evolutionary algorithms. The representation involves the use of suggestion lists for the placement of events into timeslots. A set of recombination operators is defined for the new representation, and experimental results are given to compare the performance of the operators with each other and with a system not using suggestion lists.