
Showing papers on "Simulated annealing published in 1999"


Journal ArticleDOI
TL;DR: A "fast EP" (FEP) is proposed which uses a Cauchy instead of a Gaussian mutation as the primary search operator; an improved FEP (IFEP) that mixes the two mutation operators is also proposed and tested empirically, showing that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested.
Abstract: Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, however, on some function optimization problems. In the paper, a "fast EP" (FEP) is proposed which uses a Cauchy instead of Gaussian mutation as the primary search operator. The relationship between FEP and classical EP (CEP) is similar to that between fast simulated annealing and the classical version. Both analytical and empirical studies have been carried out to evaluate the performance of FEP and CEP for different function optimization problems. The paper shows that FEP is very good at search in a large neighborhood while CEP is better at search in a small local neighborhood. For a suite of 23 benchmark problems, FEP performs much better than CEP for multimodal functions with many local minima while being comparable to CEP in performance for unimodal and multimodal functions with only a few local minima. The paper also shows the relationship between the search step size and the probability of finding a global optimum and thus explains why FEP performs better than CEP on some functions but not on others. In addition, the importance of the neighborhood size and its relationship to the probability of finding a near-optimum is investigated. Based on these analyses, an improved FEP (IFEP) is proposed and tested empirically. This technique mixes different search operators (mutations). The experimental results show that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested.
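The operator swap the abstract describes can be sketched in a few lines: FEP replaces classical EP's Gaussian mutation with a Cauchy mutation, whose heavy tails produce occasional long jumps. This is an illustrative sketch, not the authors' code; the self-adaptive update of the strategy parameters `eta` is omitted.

```python
import math
import random

def cep_mutate(x, eta):
    """Classical EP (CEP): Gaussian perturbation -- mostly short, local steps."""
    return [xi + ei * random.gauss(0.0, 1.0) for xi, ei in zip(x, eta)]

def fep_mutate(x, eta):
    """Fast EP (FEP): standard Cauchy perturbation -- heavy tails give
    occasional long jumps, which helps escape poor local minima."""
    # A standard Cauchy variate is the tangent of a uniform random angle.
    return [xi + ei * math.tan(math.pi * (random.random() - 0.5))
            for xi, ei in zip(x, eta)]
```

IFEP, as described in the abstract, simply generates one offspring with each operator and keeps the better of the two.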

3,412 citations


Journal ArticleDOI
TL;DR: Application of the method is illustrated by the restoration of a ribosome-like model structure and more realistically by the determination of the shape of several proteins from experimental x-ray scattering data.

2,105 citations


Book
01 Oct 1999
TL;DR: The techniques treated in this text, such as hill climbing, simulated annealing, and tabu search, represent research as elucidated by the leaders in the field and are applied to real problems.
Abstract: Optimization is a pivotal aspect of software design. The techniques treated in this text represent research as elucidated by the leaders in the field. Optimization methods such as hill climbing, simulated annealing, and tabu search are applied to real problems.

1,461 citations


Journal ArticleDOI
TL;DR: In this article, a derivative-free search method for finding models of acceptable data fit in a multidimensional parameter space is presented, which falls into the same class of method as simulated annealing and genetic algorithms, which are commonly used for global optimization problems.
Abstract: SUMMARY This paper presents a new derivative-free search method for finding models of acceptable data fit in a multidimensional parameter space. It falls into the same class of method as simulated annealing and genetic algorithms, which are commonly used for global optimization problems. The objective here is to find an ensemble of models that preferentially sample the good data-fitting regions of parameter space, rather than seeking a single optimal model. (A related paper deals with the quantitative appraisal of the ensemble.) The new search algorithm makes use of the geometrical constructs known as Voronoi cells to drive the search in parameter space. These are nearest neighbour regions defined under a suitable distance norm. The algorithm is conceptually simple, requires just two ‘tuning parameters’, and makes use of only the rank of a data fit criterion rather than the numerical value. In this way all difficulties associated with the scaling of a data misfit function are avoided, and any combination of data fit criteria can be used. It is also shown how Voronoi cells can be used to enhance any existing direct search algorithm, by intermittently replacing the forward modelling calculations with nearest neighbour calculations. The new direct search algorithm is illustrated with an application to a synthetic problem involving the inversion of receiver functions for crustal seismic structure. This is known to be a non-linear problem, where linearized inversion techniques suffer from a strong dependence on the starting solution. It is shown that the new algorithm produces a sophisticated type of ‘self-adaptive’ search behaviour, which to our knowledge has not been demonstrated in any previous technique of this kind.

1,336 citations


Journal ArticleDOI
TL;DR: In this paper, a Monte Carlo direct search method is used to estimate the information in the available ensemble to guide a resampling of the parameters of the model space, which can be used to obtain measures of resolution and trade-off in the model parameters.
Abstract: SUMMARY Monte Carlo direct search methods, such as genetic algorithms, simulated annealing etc., are often used to explore a finite dimensional parameter space. They require the solving of the forward problem many times, that is, making predictions of observables from an earth model. The resulting ensemble of earth models represents all ‘information’ collected in the search process. Search techniques have been the subject of much study in geophysics; less attention is given to the appraisal of the ensemble. Often inferences are based on only a small subset of the ensemble, and sometimes a single member. This paper presents a new approach to the appraisal problem. To our knowledge this is the first time the general case has been addressed, that is, how to infer information from a complete ensemble, previously generated by any search method. The essence of the new approach is to use the information in the available ensemble to guide a resampling of the parameter space. This requires no further solving of the forward problem, but from the new ‘resampled’ ensemble we are able to obtain measures of resolution and trade-off in the model parameters, or any combinations of them. The new ensemble inference algorithm is illustrated on a highly non-linear waveform inversion problem. It is shown how the computation time and memory requirements scale with the dimension of the parameter space and size of the ensemble. The method is highly parallel, and may easily be distributed across several computers. Since little is assumed about the initial ensemble of earth models, the technique is applicable to a wide variety of situations. For example, it may be applied to perform ‘error analysis’ using the ensemble generated by a genetic algorithm, or any other direct search method. Key words: numerical techniques, receiver functions, waveform inversion.

817 citations


Proceedings ArticleDOI
01 Jan 1999
TL;DR: An efficient shape-based object detection method based on Distance Transforms is presented, together with its use for real-time vision on-board vehicles; hardware-specific implementations of the method exploiting SIMD parallelism are also discussed.
Abstract: This paper presents an efficient shape-based object detection method based on Distance Transforms and describes its use for real-time vision on-board vehicles. The method uses a template hierarchy to capture the variety of object shapes; efficient hierarchies can be generated offline for given shape distributions using stochastic optimization techniques (i.e. simulated annealing). Online, matching involves a simultaneous coarse-to-fine approach over the shape hierarchy and over the transformation parameters. Very large speed-up factors are typically obtained when comparing this approach with the equivalent brute-force formulation; we have measured gains of several orders of magnitudes. We present experimental results on the real-time detection of traffic signs and pedestrians from a moving vehicle. Because of the highly time sensitive nature of these vision tasks, we also discuss some hardware-specific implementations of the proposed method as far as SIMD parallelism is concerned.
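The primitive underlying this kind of matching can be illustrated compactly: a distance transform (DT) of the image's edge map, and a matching score that averages DT values under the template's points. This is a hedged sketch in pure Python, using the exact two-pass city-block (Manhattan) DT rather than the chamfer metrics typically used in practice; the paper's template hierarchy and coarse-to-fine search sit on top of primitives like these.

```python
def distance_transform(feature, h, w):
    """Two-pass Manhattan distance transform on an h-by-w grid:
    dt[i][j] = city-block distance from (i, j) to the nearest feature pixel."""
    INF = h + w
    dt = [[0 if (i, j) in feature else INF for j in range(w)] for i in range(h)]
    for i in range(h):                      # forward pass (top-left to bottom-right)
        for j in range(w):
            if i > 0: dt[i][j] = min(dt[i][j], dt[i - 1][j] + 1)
            if j > 0: dt[i][j] = min(dt[i][j], dt[i][j - 1] + 1)
    for i in reversed(range(h)):            # backward pass (bottom-right to top-left)
        for j in reversed(range(w)):
            if i + 1 < h: dt[i][j] = min(dt[i][j], dt[i + 1][j] + 1)
            if j + 1 < w: dt[i][j] = min(dt[i][j], dt[i][j + 1] + 1)
    return dt

def match_score(dt, template, offset):
    """Average DT value under the template's points at the given offset:
    low scores mean the shape lies close to image edges."""
    oi, oj = offset
    return sum(dt[i + oi][j + oj] for i, j in template) / len(template)
```

Matching then reduces to minimizing `match_score` over offsets (and, in the paper, over a hierarchy of templates).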

758 citations


Journal ArticleDOI
TL;DR: In this article, a hybrid ant colony system coupled with a local search is applied to the quadratic assignment problem, which uses pheromone trail information to perform modifications on QAP solutions.
Abstract: This paper presents HAS–QAP, a hybrid ant colony system coupled with a local search, applied to the quadratic assignment problem. HAS–QAP uses pheromone trail information to perform modifications on QAP solutions, unlike more traditional ant systems that use pheromone trail information to construct complete solutions. HAS–QAP is analysed and compared with some of the best heuristics available for the QAP: two versions of tabu search, namely, robust and reactive tabu search, hybrid genetic algorithm, and a simulated annealing method. Experimental results show that HAS–QAP and the hybrid genetic algorithm perform best on real world, irregular and structured problems due to their ability to find the structure of good solutions, while HAS–QAP performance is less competitive on random, regular and unstructured problems.

710 citations


Journal ArticleDOI
30 Apr 1999-Science
TL;DR: Thermal and quantum annealing are compared in a model disordered magnet, where the effects of quantum mechanics can be tuned by varying an applied magnetic field, and the results indicate that quantum annealing hastens convergence to the optimum state.
Abstract: Traditional simulated annealing uses thermal fluctuations for convergence in optimization problems. Quantum tunneling provides a different mechanism for moving between states, with the potential for reduced time scales. Thermal and quantum annealing are compared in a model disordered magnet, where the effects of quantum mechanics can be tuned by varying an applied magnetic field. The results indicate that quantum annealing hastens convergence to the optimum state.

486 citations


Journal ArticleDOI
TL;DR: In this article, a simulated annealing-based heuristic was developed to obtain the least-cost design of a looped water distribution network, where a Newton search method was used to solve the hydraulic network equations.
Abstract: A simulated annealing-based heuristic has been developed to obtain the least-cost design of a looped water distribution network. A Newton search method was used to solve the hydraulic network equations. Simulated annealing is a stochastic optimization method that can work well for large-scale optimization problems that are cast in discrete or combinatorial form, as with the problem proposed. The results obtained with this approach for networks currently appearing in the literature as case studies in this field (whose solution by other optimization methods was known) have proved the ability of the heuristic to handle this kind of problem.

439 citations


Book ChapterDOI
01 Jan 1999
TL;DR: A recently proposed metaheuristic, the Ant System, is used to solve the Vehicle Routing Problem in its basic form, i.e., with capacity and distance restrictions, one central depot and identical vehicles.
Abstract: In this paper we use a recently proposed metaheuristic, the Ant System, to solve the Vehicle Routing Problem in its basic form, i.e., with capacity and distance restrictions, one central depot and identical vehicles. A “hybrid” Ant System algorithm is first presented and then improved using problem-specific information (savings, capacity utilization). Experiments on various aspects of the algorithm and computational results for fourteen benchmark problems are reported and compared to those of other metaheuristic approaches such as Tabu Search, Simulated Annealing and Neural Networks.

432 citations


Journal ArticleDOI
TL;DR: Comparisons are made with some of the best TSP heuristic algorithms and general optimization techniques which demonstrate the advantages of GLS over alternative heuristic approaches suggested for the problem.

Journal ArticleDOI
TL;DR: In this paper, the authors developed the so-called MOSA (Multiobjective Simulated Annealing) method to approximate the set of efficient solutions of a MOCO problem.
Abstract: The success of modern heuristics (Simulated Annealing (S.A.), Tabu Search, Genetic Algorithms, …) in solving classical combinatorial optimization problems has drawn the attention of the research community in multicriteria methods. In fact, for large-scale problems, the simultaneous difficulties of NP-hard complexity and of multiobjective framework make most Multiobjective Combinatorial Optimization (MOCO) problems intractable for exact methods. This paper develops the so-called MOSA (Multiobjective Simulated Annealing) method to approximate the set of efficient solutions of a MOCO problem. Different options for the implementation are illustrated and extensive experiments prove the efficiency of the approach. Its results are compared to exact methods on bi-objective knapsack problems. Copyright © 1999 John Wiley & Sons, Ltd.
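MOSA's goal of accumulating an approximation of the efficient set can be made concrete with the standard nondominated-archive update. This is a generic sketch (minimization of all objectives assumed); the annealing moves and weighted acceptance rules of the actual method are not shown.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, point):
    """Keep only potentially efficient points: discard the candidate if it
    is dominated, otherwise add it and drop anything it dominates."""
    if any(dominates(p, point) for p in archive):
        return archive
    return [p for p in archive if not dominates(point, p)] + [point]
```

Each solution visited during the annealing walk is passed through `update_archive`; the archive at termination is the approximation of the efficient set.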

Journal ArticleDOI
TL;DR: An efficient approach for solving capacitated single allocation hub location problems using a modified version of a previous mixed integer linear programming formulation developed by the authors for p-hub median problems, with fewer variables and constraints than those traditionally used in the literature.
Abstract: In this paper, we present an efficient approach for solving capacitated single allocation hub location problems. We use a modified version of a previous mixed integer linear programming formulation developed by us for p-hub median problems. This formulation requires fewer variables and constraints than those traditionally used in the literature. We develop good heuristic algorithms for its solution based on simulated annealing (SA) and random descent (RDH). We use the upper bound to develop an LP-based branch and bound solution method. The problem, as we define it, finds applications in the design of postal delivery networks, particularly in the location of capacitated mail sorting and distribution centres. We test our algorithms on data obtained from this application. To the best of our knowledge, this problem has not been solved in the literature. Computational results are presented indicating the usefulness of our approach.

Posted Content
TL;DR: A simulated annealing approach to the solution of a complex portfolio selection model that arises when Markowitz’ classical mean–variance model is enriched with additional realistic constraints is described.
Abstract: This paper describes the application of a simulated annealing approach to the solution of a complex portfolio selection model. The model is a mixed integer quadratic programming problem which arises when Markowitz' classical mean-variance model is enriched with additional realistic constraints. Exact optimization algorithms run into difficulties in this framework and this motivates the investigation of heuristic techniques. Computational experiments indicate that the approach is promising for this class of problems.

Journal ArticleDOI
TL;DR: In this paper, a new algorithm based on integrating genetic algorithms, tabu search and simulated annealing methods to solve the unit commitment problem is presented, which is coded as a mix between binary and decimal representation.
Abstract: This paper presents a new algorithm based on integrating genetic algorithms, tabu search and simulated annealing methods to solve the unit commitment problem. The core of the proposed algorithm is based on genetic algorithms. Tabu search is used to generate new population members in the reproduction phase of the genetic algorithm. A simulated annealing method is used to accelerate the convergence of the genetic algorithm by applying the simulated annealing test for all the population members. A new implementation of the genetic algorithm is introduced. The genetic algorithm solution is coded as a mix between binary and decimal representation. The fitness function is constructed from the total operating cost of the generating units without penalty terms. In the tabu search part of the proposed algorithm, a simple short-term memory procedure is used to counter the danger of entrapment at a local optimum, and the premature convergence of the genetic algorithm. A simple cooling schedule has been implemented to apply the simulated annealing test in the algorithm. Numerical results showed the superiority of the solutions obtained compared to genetic algorithms, tabu search and simulated annealing methods, and to two exact algorithms.

Proceedings ArticleDOI
12 Apr 1999
TL;DR: A collection of eleven heuristics from the literature has been selected, implemented, and analyzed under one set of common assumptions and provides one even basis for comparison and insights into circumstances where one technique will outperform another.
Abstract: Heterogeneous computing (HC) environments are well suited to meet the computational demands of large, diverse groups of tasks (i.e., a meta-task). The problem of mapping (defined as matching and scheduling) these tasks onto the machines of an HC environment has been shown, in general, to be NP-complete, requiring the development of heuristic techniques. Selecting the best heuristic to use in a given environment, however, remains a difficult problem, because comparisons are often clouded by different underlying assumptions in the original studies of each heuristic. Therefore, a collection of eleven heuristics from the literature has been selected, implemented, and analyzed under one set of common assumptions. The eleven heuristics examined are opportunistic load balancing, user-directed assignment, fast greedy, min-min, max-min, greedy, genetic algorithm, simulated annealing, genetic simulated annealing, tabu, and A*. This study provides one even basis for comparison and insights into circumstances where one technique will outperform another. The evaluation procedure is specified, the heuristics are defined, and then selected results are compared.
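Several of the listed heuristics are short to state but easy to get wrong; min-min, for instance, repeatedly maps the task whose earliest completion time is smallest. The sketch below is illustrative only (the `etc` matrix layout is a hypothetical convention, and the study's implementations include further details).

```python
def min_min(etc):
    """Min-min mapping sketch. etc[t][m] is the estimated time to compute
    task t on machine m. Repeatedly map the unmapped task with the smallest
    earliest completion time, then advance that machine's ready time."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines              # machine ready times
    mapping = [None] * n_tasks
    unmapped = set(range(n_tasks))
    while unmapped:
        # For each task, its best machine minimizes ready time + ETC.
        t, m = min(((t, min(range(n_machines),
                            key=lambda m: ready[m] + etc[t][m]))
                    for t in unmapped),
                   key=lambda tm: ready[tm[1]] + etc[tm[0]][tm[1]])
        mapping[t] = m
        ready[m] += etc[t][m]
        unmapped.remove(t)
    return mapping, max(ready)
```

Max-min differs only in the outer selection (take the task whose best completion time is largest), which is exactly why a common test harness, as in this study, matters for fair comparison.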

Journal ArticleDOI
TL;DR: This paper examines two well known global search techniques, Simulated Annealing and the Genetic Algorithm, and compares their performance, and a Monte Carlo study was conducted in order to test the appropriateness of these global search techniques for optimizing neural networks.

Journal ArticleDOI
TL;DR: A modification of the simulated annealing algorithm designed for solving discrete stochastic optimization problems that uses a constant (rather than decreasing) temperature; two approaches for estimating the optimal solution are considered, and both variants of the method are guaranteed to converge almost surely to the set of global optimal solutions.
Abstract: We present a modification of the simulated annealing algorithm designed for solving discrete stochastic optimization problems. Like the original simulated annealing algorithm, our method has the hill climbing feature, so it can find global optimal solutions to discrete stochastic optimization problems with many local solutions. However, our method differs from the original simulated annealing algorithm in that it uses a constant (rather than decreasing) temperature. We consider two approaches for estimating the optimal solution. The first approach uses the number of visits the algorithm makes to the different states (divided by a normalizer) to estimate the optimal solution. The second approach uses the state that has the best average estimated objective function value as estimate of the optimal solution. We show that both variants of our method are guaranteed to converge almost surely to the set of global optimal solutions, and discuss how our work applies in the discrete deterministic optimization setting. We also show how both variants can be applied for solving discrete optimization problems when the objective function values are estimated using either transient or steady-state simulation. Finally, we include some encouraging numerical results documenting the behavior of the two variants of our algorithm when applied for solving two versions of a particular discrete stochastic optimization problem, and compare their performance with that of other variants of the simulated annealing algorithm designed for solving discrete stochastic optimization problems.
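The first estimator the abstract mentions (the most-visited state) can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the paper's noisy-objective machinery is reduced to a deterministic `f` here.

```python
import math
import random

def constant_temp_sa(f, neighbors, x0, T, iters, seed=0):
    """Simulated annealing run at a constant temperature T (never cooled).
    Returns the most-visited state -- the paper's first estimator of the
    optimal solution; the second uses the best average objective estimate."""
    rng = random.Random(seed)
    visits = {}
    x, fx = x0, f(x0)
    for _ in range(iters):
        y = rng.choice(neighbors(x))
        fy = f(y)
        # Metropolis acceptance test at fixed T.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
        visits[x] = visits.get(x, 0) + 1
    return max(visits, key=visits.get)
```

Because T is constant, the chain keeps exploring forever; it is the visit-count (or average-value) estimator, not a frozen chain, that singles out the optimum.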

Journal ArticleDOI
TL;DR: A new approach is presented that uses a small population of SA runs in a genetic algorithm (GA) framework and yields excellent results on the classical test examples of the JSP.

Journal ArticleDOI
TL;DR: A synthesis method is proposed that is aimed at designing an aperiodic sparse two-dimensional array to be used with a conventional beam-former; it can design very large arrays, optimize both positions and weight coefficients, synthesize asymmetric arrays, and generate array configurations that are valid for every steering direction.
Abstract: Two-dimensional arrays offer the potential for producing three-dimensional acoustic imaging. The major problem is the complexity arising from the large number of elements in such arrays. In this paper, a synthesis method is proposed that is aimed at designing an aperiodic sparse two-dimensional array to be used with a conventional beam-former. The stochastic algorithm of simulated annealing has been utilized to minimize the number of elements necessary to produce a spatial response that meets given requirements. The proposed method is highly innovative, as it can design very large arrays, optimize both positions and weight coefficients, synthesize asymmetric arrays, and generate array configurations that are valid for every steering direction. Several results are presented, showing notable improvements in the array characteristics and performances over those reported in the literature.

Proceedings ArticleDOI
01 Jul 1999
TL;DR: Probabilistic crowding, a novel niching algorithm, maintains subpopulations reliably; the authors analyze and predict how this maintenance takes place, and identify the algorithm as a member of a family of tournament algorithms called integrated tournament algorithms, which also includes deterministic crowding, restricted tournament selection, elitist recombination, parallel recombinative simulated annealing, the Metropolis algorithm, and simulated annealing.
Abstract: This paper presents a novel niching algorithm, probabilistic crowding. Like its predecessor deterministic crowding, probabilistic crowding is fast, simple, and requires no parameters beyond that of the classical GA. In probabilistic crowding, subpopulations are maintained reliably, and we analyze and predict how this maintenance takes place. This paper also identifies probabilistic crowding as a member of a family of algorithms, which we call integrated tournament algorithms. Integrated tournament algorithms also include deterministic crowding, restricted tournament selection, elitist recombination, parallel recombinative simulated annealing, the Metropolis algorithm, and simulated annealing.
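The replacement rule at the heart of probabilistic crowding is a probabilistic tournament between the child and its most similar parent, with the child winning with probability proportional to its fitness. A minimal sketch (maximization assumed; the similarity-based pairing step is omitted):

```python
import random

def prob_crowding_replace(parent_fit, child_fit, rng):
    """Child replaces the parent with probability f_c / (f_c + f_p).
    Deterministic crowding is the limiting rule that always keeps
    the fitter of the two."""
    return rng.random() < child_fit / (parent_fit + child_fit)
```

The stochastic acceptance is what lets lower-fitness niches survive, which is how subpopulations are maintained rather than crowded out.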

Journal ArticleDOI
TL;DR: Simulated annealing has been utilized to synthesize the positions and the weight coefficients of the elements of a linear array in order to minimize the peak of the sidelobes and to obtain a beam pattern that meets given requirements.
Abstract: In conventional beamforming systems, the use of aperiodic arrays is a powerful way to obtain high resolution employing few elements and avoiding the presence of grating lobes. The optimized design of such arrays is a required task in order to control the side-lobe level and distribution. In this paper, an optimization method aimed at designing aperiodic linear sparse arrays with great flexibility is proposed. Simulated annealing, which is a stochastic optimization methodology, has been utilized to synthesize the positions and the weight coefficients of the elements of a linear array in order to minimize the peak of the sidelobes and to obtain a beam pattern that meets given requirements. An important novelty is the fact that the latter goal can be achieved in parallel to the minimization of both the number of elements and the spatial aperture, resulting in a "global" optimization of the array characteristics. The great freedom that simulated annealing allows in defining the energy function to be minimized is the main reason for the notable versatility and the good results of the proposed method. Such results show an improvement in the array characteristics and performances over those reported in the literature.

Journal ArticleDOI
TL;DR: In this paper, a stochastic technique for the global optimization of complex potential energy surfaces (PES) is proposed, which avoids the freezing problem of simulated annealing by allowing the dynamical process to tunnel energetically inaccessible regions of the PES by way of a dynamically adjusted nonlinear transformation of the original PES.
Abstract: We investigate a novel stochastic technique for the global optimization of complex potential energy surfaces (PES) that avoids the freezing problem of simulated annealing by allowing the dynamical process to tunnel energetically inaccessible regions of the PES by way of a dynamically adjusted nonlinear transformation of the original PES. We demonstrate the success of this approach, which is characterized by a single adjustable parameter, for three generic hard minimization problems.

Journal ArticleDOI
TL;DR: Comparing the performance of four different parameter-search methods on several single-neuron models demonstrates that genetic algorithms and simulated annealing are generally the most effective methods.
Abstract: One of the most difficult and time-consuming aspects of building compartmental models of single neurons is assigning values to free parameters to make models match experimental data. Automated parameter-search methods potentially represent a more rapid and less labor-intensive alternative to choosing parameters manually. Here we compare the performance of four different parameter-search methods on several single-neuron models. The methods compared are conjugate-gradient descent, genetic algorithms, simulated annealing, and stochastic search. Each method has been tested on five different neuronal models ranging from simple models with between 3 and 15 parameters to a realistic pyramidal cell model with 23 parameters. The results demonstrate that genetic algorithms and simulated annealing are generally the most effective methods. Simulated annealing was overwhelmingly the most effective method for simple models with small numbers of parameters, but the genetic algorithm method was equally effective for more complex models with larger numbers of parameters. The discussion considers possible explanations for these results and makes several specific recommendations for the use of parameter searches on neuronal models.

Proceedings Article
13 Jul 1999
TL;DR: The Extremal Optimization method as mentioned in this paper is a general-purpose method for finding high-quality solutions to hard optimization problems, inspired by self-organized critical models of co-evolution such as the Bak-Sneppen model.
Abstract: We describe a general-purpose method for finding high-quality solutions to hard optimization problems, inspired by self-organized critical models of co-evolution such as the Bak-Sneppen model. The method, called Extremal Optimization, successively eliminates extremely undesirable components of sub-optimal solutions, rather than "breeding" better components. In contrast to Genetic Algorithms which operate on an entire "gene-pool" of possible solutions, Extremal Optimization improves on a single candidate solution by treating each of its components as species co-evolving according to Darwinian principles. Unlike Simulated Annealing, its non-equilibrium approach effects an algorithm requiring few parameters to tune. With only one adjustable parameter, its performance proves competitive with, and often superior to, more elaborate stochastic optimization procedures. We demonstrate it here on two classic hard optimization problems: graph partitioning and the traveling salesman problem.
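The mechanism is easy to state: assign each component of a single solution a local fitness, then repeatedly force a random change on one of the worst components. The sketch below shows the power-law (tau-EO) selection on a deliberately trivial bit-maximization problem; it is illustrative only, since graph partitioning and the TSP require problem-specific component fitnesses.

```python
import random

def tau_eo_onemax(n, tau, iters, seed=1):
    """tau-EO sketch: maximize the number of 1-bits. Each bit is a
    'species' with local fitness 1 if set, else 0; the k-th worst
    component is selected with probability proportional to k**(-tau)."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    best = list(state)
    # Power-law weights over ranks (rank 1 = worst component).
    weights = [k ** -tau for k in range(1, n + 1)]
    for _ in range(iters):
        order = sorted(range(n), key=lambda i: state[i])  # worst first
        i = rng.choices(order, weights=weights)[0]
        state[i] = rng.randint(0, 1)        # unconditional random change
        if sum(state) > sum(best):
            best = list(state)
    return best
```

Note the contrast with simulated annealing that the abstract draws: there is no acceptance test and no temperature, only the single exponent `tau`, and the best solution seen is recorded on the side.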

Journal ArticleDOI
TL;DR: In this paper, the authors compared the performance of two probabilistic global optimization methods: the shuffled complex evolution algorithm SCE-UA, and the three-phase simulated annealing algorithm SA-SX.
Abstract: Automatic optimization algorithms are used routinely to calibrate conceptual rainfall-runoff (CRR) models. The goal of calibration is to estimate a feasible and unique (global) set of parameter estimates that best fit the observed runoff data. Most if not all optimization algorithms have difficulty in locating the global optimum because of response surfaces that contain multiple local optima with regions of attraction of differing size, discontinuities, and long ridges and valleys. Extensive research has been undertaken to develop efficient and robust global optimization algorithms over the last 10 years. This study compares the performance of two probabilistic global optimization methods: the shuffled complex evolution algorithm SCE-UA, and the three-phase simulated annealing algorithm SA-SX. Both algorithms are used to calibrate two parameter sets of a modified version of Boughton's [1984] SFB model using data from two Australian catchments that have low and high runoff yields. For the reduced, well-identified parameter set the algorithms have a similar efficiency for the low-yielding catchment, but SCE-UA is almost twice as robust. Although the robustness of the algorithms is similar for the high-yielding catchment, SCE-UA is six times more efficient than SA-SX. When fitting the full parameter set the performance of SA-SX deteriorated markedly for both catchments. These results indicated that SCE-UA's use of multiple complexes and shuffling provided a more effective search of the parameter space than SA-SX's single simplex with stochastic step acceptance criterion, especially when the level of parameterization is increased. Examination of the response surface for the low-yielding catchment revealed some reasons why SCE-UA outperformed SA-SX and why probabilistic optimization algorithms can experience difficulty in locating the global optimum.

Journal ArticleDOI
TL;DR: An Artificial Neural Network is used to construct an approximate model using a database containing Navier-Stokes solutions for all previous designs, which results in a considerable speed-up of the design process by reducing both the interventions of the operator and the computational effort.
Abstract: This paper describes a knowledge-based method for the automatic design of more efficient turbine blades. An Artificial Neural Network (ANN) is used to construct an approximate model (response surface) using a database containing Navier-Stokes solutions for all previous designs. This approximate model is used for the optimization, by means of Simulated Annealing (SA), of the blade geometry, which is then analyzed by a Navier-Stokes solver. This procedure results in a considerable speed-up of the design process by reducing both the interventions of the operator and the computational effort. It is also shown how such a method allows the design of more efficient blades while satisfying both the aerodynamic and mechanical constraints. The method has been applied to different types of two-dimensional turbine blades, of which three examples are presented in this paper.

Journal ArticleDOI
TL;DR: Three applications, maximum likelihood (ML) joint channel and data estimation, infinite-impulse-response (IIR) filter design and evaluation of minimum symbol-error-rate (MSER) decision feedback equalizer (DFE) are used to demonstrate the effectiveness of the ASA.


Journal ArticleDOI
TL;DR: A detailed analysis of various temperature schedules is attempted, and examples will be given of when it is both practically and theoretically justified to use boiling, fixed temperature, or even fast cooling schedules which have a small probability of reaching global minima.
Abstract: A sizable part of the theoretical literature on simulated annealing deals with a property called convergence, which asserts that the simulated annealing chain is in the set of global minimum states of the objective function with probability tending to 1. However, in practice, the convergent algorithms are considered too slow, whereas a number of nonconvergent ones are usually preferred. We attempt a detailed analysis of various temperature schedules. Examples will be given of when it is both practically and theoretically justified to use boiling, fixed temperature, or even fast cooling schedules which have a small probability of reaching global minima. Applications to traveling salesman problems of various sizes are also given.
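The schedules the paper compares differ only in the temperature function handed to the annealing loop, so a minimal sketch makes the comparison concrete (illustrative code, not from the paper):

```python
import math
import random

def anneal(cost, neighbor, x0, schedule, iters, seed=42):
    """Generic simulated annealing loop. `schedule(t)` returns the
    temperature at step t, so fixed-temperature, geometric-cooling, or
    'boiling' schedules are obtained by swapping one function."""
    rng = random.Random(seed)
    x, cx = x0, cost(x0)
    best, cbest = x, cx
    for t in range(iters):
        T = schedule(t)
        y = neighbor(x, rng)
        cy = cost(y)
        # Metropolis test at the schedule's current temperature.
        if cy <= cx or (T > 0 and rng.random() < math.exp(-(cy - cx) / T)):
            x, cx = y, cy
            if cx < cbest:
                best, cbest = x, cx
    return best, cbest

# Two points on the spectrum of schedules analysed:
fixed = lambda t: 0.5                   # constant temperature
geometric = lambda t: 2.0 * 0.999 ** t  # fast geometric cooling
```

Tracking the best state visited is what makes fast, theoretically nonconvergent schedules usable in practice: the chain itself need not end at a global minimum as long as it passes through one.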