
Showing papers on "Simulated annealing published in 1987"


Book
30 Jun 1987
TL;DR: The performance of the simulated annealing algorithm, its relation with statistical physics, and asymptotic convergence results are presented.
Abstract: 1 Introduction.- 2 Simulated annealing.- 3 Asymptotic convergence results.- 4 The relation with statistical physics.- 5 Towards implementing the algorithm.- 6 Performance of the simulated annealing algorithm.- 7 Applications.- 8 Some miscellaneous topics.- 9 Summary and conclusions.

3,645 citations
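
The algorithm this book analyzes can be summarized in a few lines: propose a random neighbour, always accept improvements, and accept deteriorations with probability exp(-delta/T) while the temperature T is slowly lowered. The sketch below is a generic illustration under assumed choices (geometric cooling, a fixed number of trials per temperature), not the book's own pseudocode.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, alpha=0.95,
                        moves_per_t=100, t_min=1e-3):
    """Generic simulated annealing with Metropolis acceptance (illustrative sketch)."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            y = neighbour(x)
            fy = cost(y)
            # Always accept improvements; accept uphill moves with prob exp(-delta/T).
            if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha          # geometric cooling, one common schedule
    return best, fbest

# Example: a one-dimensional function with several local minima.
f = lambda x: x * x + 10.0 * math.sin(5.0 * x)
step = lambda x: x + random.gauss(0.0, 0.5)
print(simulated_annealing(f, step, 3.0))
```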


Journal ArticleDOI
TL;DR: A new global optimization algorithm for functions of continuous variables is presented, derived from the “Simulated Annealing” algorithm recently introduced in combinatorial optimization, which is quite costly in terms of function evaluations, but its cost can be predicted in advance, depending only slightly on the starting point.
Abstract: A new global optimization algorithm for functions of continuous variables is presented, derived from the “Simulated Annealing” algorithm recently introduced in combinatorial optimization. The algorithm is essentially an iterative random search procedure with adaptive moves along the coordinate directions. It permits uphill moves under the control of a probabilistic criterion, thus tending to avoid the first local minima encountered. The algorithm has been tested against the Nelder and Mead simplex method and against a version of Adaptive Random Search. The test functions were Rosenbrock valleys and multiminima functions in 2, 4, and 10 dimensions. The new method proved to be more reliable than the others, being always able to find the optimum, or at least a point very close to it. It is quite costly in terms of function evaluations, but its cost can be predicted in advance, depending only slightly on the starting point.

1,598 citations
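
A minimal sketch of this kind of continuous-variable annealing: trial moves along each coordinate direction, Metropolis acceptance of uphill moves, and a step-length adaptation rule. The adaptation factors, cooling rate, step cap, and the Rosenbrock test call are illustrative assumptions rather than the authors' exact settings.

```python
import math
import random

def continuous_anneal(f, x0, step0=1.0, t0=10.0, alpha=0.9, sweeps_per_t=20, t_min=1e-4):
    """Coordinate-wise simulated annealing for continuous variables (illustrative)."""
    n = len(x0)
    x = list(x0)
    fx = f(x)
    best, fbest = list(x), fx
    step = [step0] * n              # one step length per coordinate direction
    t = t0
    while t > t_min:
        for _ in range(sweeps_per_t):
            for i in range(n):      # trial move along each coordinate direction
                y = list(x)
                y[i] += random.uniform(-step[i], step[i])
                fy = f(y)
                if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
                    x, fx = y, fy
                    step[i] = min(step[i] * 1.05, 10.0 * step0)   # widen after acceptance (assumed rule)
                    if fx < fbest:
                        best, fbest = list(x), fx
                else:
                    step[i] *= 0.95                               # shrink after rejection (assumed rule)
        t *= alpha
    return best, fbest

# Example: 2-D Rosenbrock valley, one of the test-function families mentioned above.
rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
print(continuous_anneal(rosenbrock, [-1.2, 1.0]))
```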


Book
01 Jan 1987
TL;DR: A detergent composition mainly for automatic laundering machines which comprises, on the basis of 100 parts by weight of total composition, at least 60 parts of soap and no more than 10 parts of a mixture of surfactants which impart an excellent detergent ability and foam control even in very soft waters and non-polluting properties.
Abstract: A detergent composition mainly for automatic laundering machines which comprises, on the basis of 100 parts by weight of total composition, at least 60 parts of soap and no more than 10 parts of a mixture of surfactants comprising 10 to 30% of at least one non-ionic polyoxyalkylated surfactant and 90 to 70% of an anionic surfactant selected essentially from alpha -sulfonated fatty acids derivatives, the remainder of the composition comprising at least one ingredient selected from alkaline detergent additives, bleaching agents, optical brighteners, fragrances, antiredeposition agents and enzymes. The non-ionic surfactants are preferably fatty acid amides derived from tallow, copra or palm-oil condensed with polyoxyethylene residues. The anionic surfactants are preferably alpha -sulfonated fatty esters or amides derived from tallow, copra or palm-oil. The proper combination of said non-ionic and anionic surfactants with soaps impart to the laundering compositions an excellent detergent ability and foam control even in very soft waters and non-polluting properties.

1,406 citations


Journal ArticleDOI
TL;DR: In this article, a fast simulated annealing (FSA) algorithm is proposed; it is a semi-local search consisting of occasional long jumps, and its cooling schedule is inversely linear in time.

983 citations
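
The two features highlighted in this summary are occasional long jumps and an inversely linear cooling schedule. The sketch below realizes them with Cauchy-distributed steps, which is one common reading of "semi-local search with occasional long jumps"; the step scale, acceptance rule, and test function are assumptions.

```python
import math
import random

def fast_simulated_annealing(f, x0, t0=1.0, n_steps=20000):
    """Illustrative FSA: Cauchy-distributed jumps and T(k) = T0 / (1 + k) cooling."""
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    for k in range(n_steps):
        t = t0 / (1.0 + k)                                   # inversely linear cooling
        # Cauchy-distributed jump: mostly local, with occasional long jumps.
        y = [xi + t * math.tan(math.pi * (random.random() - 0.5)) for xi in x]
        fy = f(y)
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Example: 2-D Rastrigin-like multiminima function.
f = lambda v: sum(vi * vi - 10.0 * math.cos(2.0 * math.pi * vi) + 10.0 for vi in v)
print(fast_simulated_annealing(f, [3.0, -2.0]))
```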


Journal ArticleDOI
TL;DR: It is shown that tabu search techniques provide almost optimal colorings of graphs having up to 1000 nodes, with efficiency significantly superior to the famous simulated annealing.
Abstract: Tabu search techniques are used for moving step by step towards the minimum value of a function. A tabu list of forbidden movements is updated during the iterations to avoid cycling and being trapped in local minima. Such techniques are adapted to graph coloring problems. We show that they provide almost optimal colorings of graphs having up to 1000 nodes and their efficiency is shown to be significantly superior to the famous simulated annealing.

654 citations
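
A compact illustration of the tabu-search idea applied to graph coloring: recolor a conflicting vertex, forbid the reverse move for a fixed tenure, and keep the best coloring seen. The tenure, move selection, and aspiration rule below are simplifications for illustration, not the exact settings used in the paper.

```python
import random
from collections import defaultdict

def conflicts(edges, colour):
    return sum(colour[u] == colour[v] for u, v in edges)

def tabu_colouring(edges, n_vertices, k, iters=20000, tenure=7):
    """Look for a k-colouring with few conflicting edges (illustrative tabu search)."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    colour = [random.randrange(k) for _ in range(n_vertices)]
    best, fbest = list(colour), conflicts(edges, colour)
    tabu = {}                                   # (vertex, colour) -> iteration until which it is forbidden
    for it in range(iters):
        f = conflicts(edges, colour)
        if min(f, fbest) == 0:
            break
        bad = [w for u, v in edges if colour[u] == colour[v] for w in (u, v)]
        u = random.choice(bad)
        candidates = []
        for c in range(k):
            if c == colour[u]:
                continue
            delta = sum((c == colour[w]) - (colour[u] == colour[w]) for w in adj[u])
            # Skip tabu moves unless they would beat the best colouring found (aspiration).
            if tabu.get((u, c), -1) < it or f + delta < fbest:
                candidates.append((delta, c))
        if not candidates:
            continue
        delta, c = min(candidates)
        tabu[(u, colour[u])] = it + tenure      # forbid moving u back to its old colour
        colour[u] = c
        if f + delta < fbest:
            best, fbest = list(colour), f + delta
    return best, fbest

# Example: 3-colour a 5-cycle plus a chord (chromatic number 3).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
print(tabu_colouring(edges, 5, 3))
```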


Journal ArticleDOI
TL;DR: An application of the simulated annealing method to solve the quadratic assignment problem (QAP) is presented; the method uses Monte Carlo sampling to occasionally accept moves that increase rather than decrease the objective function value.
Abstract: Recently, an interesting analogy between problems in combinatorial optimization and statistical mechanics has been developed and has proven useful in solving certain traditional optimization problems such as computer design, partitioning, component placement, wiring, and traveling salesman problems. The analogy has resulted in a methodology, termed “simulated annealing,” which, in the process of iterating to an optimum, uses Monte Carlo sampling to occasionally accept solutions to discrete optimization problems which increase rather than decrease the objective function value. This process is counter to the normal ‘steepest-descent’ algorithmic approach. However, it is argued in the analogy that by taking such controlled uphill steps, the optimizing algorithm need not get “stuck” on inferior solutions. This paper presents an application of the simulated annealing method to solve the quadratic assignment problem (QAP). Performance is tested on a set of “standard” problems, as well as some newly gen...

325 citations
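
A natural annealing move for the QAP is to swap the facilities assigned to two locations. The sketch below combines the standard sum-of-products QAP cost with the Metropolis acceptance rule described above; the cooling parameters and the tiny example instance are illustrative assumptions.

```python
import math
import random

def qap_cost(flow, dist, perm):
    """Quadratic assignment cost: sum of flow(i,j) * distance(location of i, location of j)."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def anneal_qap(flow, dist, t0=100.0, alpha=0.95, moves_per_t=200, t_min=0.01):
    n = len(flow)
    perm = list(range(n))
    random.shuffle(perm)
    f = qap_cost(flow, dist, perm)
    best, fbest = list(perm), f
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            i, j = random.sample(range(n), 2)
            perm[i], perm[j] = perm[j], perm[i]              # swap two assignments
            fnew = qap_cost(flow, dist, perm)
            if fnew <= f or random.random() < math.exp(-(fnew - f) / t):
                f = fnew                                      # accept (possibly uphill) move
                if f < fbest:
                    best, fbest = list(perm), f
            else:
                perm[i], perm[j] = perm[j], perm[i]           # undo rejected swap
        t *= alpha
    return best, fbest

# Tiny example: 4 facilities / 4 locations.
flow = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
dist = [[0, 2, 4, 3], [2, 0, 1, 2], [4, 1, 0, 1], [3, 2, 1, 0]]
print(anneal_qap(flow, dist))
```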


Book
01 Jan 1987
TL;DR: Facts, Conjectures, and Improvements for Simulated Annealing brings together for the first time many of the theoretical foundations for improvements to algorithms for global optimization that until now existed only in scattered research articles.
Abstract: From the Publisher: Simulated annealing has proved to be an easy and reliable method for finding optimal values of a problem in cases where there is no road map to possible solutions. Facts, Conjectures, and Improvements for Simulated Annealing offers an introduction to this topic for novices and provides an informative review of the area for the more expert reader. This book brings together for the first time many of the theoretical foundations for improvements to algorithms for global optimization that until now existed only in scattered research articles. The method described in this book operates by simulating the cooling of a (usually fictitious) physical system whose possible energies correspond to the values of the objective function being minimized. The analogy works because physical systems occupy only states with the lowest energy as the temperature is lowered to absolute zero. This book is suitable for advanced undergraduate and graduate students and for professionals in a wide variety of subject areas: bioinformatics, chemistry, computer science, engineering, finance, geology, mathematics, and physics.

265 citations


Journal ArticleDOI
TL;DR: The adaptation of some of these techniques to continuous-variable problems has given very promising results; that question is not discussed in detail in the paper, but useful references for further reading are given.
Abstract: We present a review of the main “global optimization” methods. The paper comprises an introduction and two parts. In the introduction, we recall some generalities about nonlinear unconstrained optimization and we list some classifications which have been proposed for global optimization methods. In the first part, we then describe various “classical” global optimization methods, most of which were available long before the appearance of Simulated Annealing (a key event in this field). There are plenty of papers and books dealing with these methods, studying in particular their convergence properties. The second part of the paper is devoted to more recent or atypical methods, mostly issued from combinatorial optimization. The three main methods are “metaheuristics”: Simulated Annealing (and derived techniques), Tabu Search, and Genetic Algorithms; we also describe three other less known methods. For these methods, theoretical studies of convergence are less abundant in the literature, and the use of convergence results is far more limited in practice. However, the adaptation of some of these techniques to continuous-variable problems has given very promising results; that question is not discussed in detail in the paper, but useful references for further reading are given.

249 citations


Journal ArticleDOI
TL;DR: Simulated annealing, a computational heuristic for obtaining approximate solutions to combinatorial optimization problems, is used to construct source codes, error-correcting codes, and spherical codes; for certain sets of parameters, codes better than any previously known in the literature are found.
Abstract: Simulated annealing is a computational heuristic for obtaining approximate solutions to combinatorial optimization problems. It is used to construct good source codes, error-correcting codes, and spherical codes. For certain sets of parameters, codes are found that are better than any others known in the literature.

230 citations
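
One way to cast code construction as annealing, in the spirit of this paper, is to flip bits of candidate codewords while penalizing pairs that fall below a target minimum Hamming distance. The energy function and parameters below are illustrative assumptions, not the paper's exact formulation.

```python
import math
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def anneal_code(n, m, d_target, t0=2.0, alpha=0.98, moves_per_t=500, t_min=0.01):
    """Search for m binary codewords of length n with pairwise distance >= d_target."""
    words = [[random.randint(0, 1) for _ in range(n)] for _ in range(m)]

    def energy(ws):
        # Penalize every pair of codewords whose distance falls below the target.
        return sum(max(0, d_target - hamming(ws[i], ws[j]))
                   for i in range(m) for j in range(i + 1, m))

    e = energy(words)
    best, ebest = [w[:] for w in words], e
    t = t0
    while t > t_min and ebest > 0:
        for _ in range(moves_per_t):
            i, k = random.randrange(m), random.randrange(n)
            words[i][k] ^= 1                     # flip one bit of one codeword
            enew = energy(words)
            if enew <= e or random.random() < math.exp(-(enew - e) / t):
                e = enew
                if e < ebest:
                    best, ebest = [w[:] for w in words], e
            else:
                words[i][k] ^= 1                 # undo rejected flip
        t *= alpha
    return best, ebest

# Example: 4 codewords of length 8 with minimum distance 4 (achievable).
print(anneal_code(n=8, m=4, d_target=4))
```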


Journal ArticleDOI
TL;DR: In this article, the authors seek a global minimum of $U:\mathbb{R}^n \to \mathbb{R}$; since the gradient flow $\frac{d}{dt}X(t) = -\nabla U(X(t))$ finds only local minima, a simulated-annealing modification of it is considered.
Abstract: We seek a global minimum of $U:\mathbb{R}^n \to \mathbb{R}$. The solution to $(*)\ \frac{d}{dt}X(t) = -\nabla U(X(t))$ will find local minima. Using the idea of simulated annealing, we consider th...

219 citations
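
Discretizing the gradient flow (*) and adding annealed Gaussian noise gives a Langevin-type recursion of the kind studied in this line of work. The step size, the logarithmic noise schedule, and the double-well example below are illustrative assumptions, not the paper's exact construction.

```python
import math
import random

def annealed_langevin(grad_u, x0, h=1e-3, c=1.0, n_steps=50000):
    """Noisy gradient descent X_{k+1} = X_k - h*grad U(X_k) + sqrt(2*h*T_k)*xi,
    with T_k = c / log(2 + k) decreasing slowly to zero (illustrative schedule)."""
    x = list(x0)
    for k in range(n_steps):
        t = c / math.log(2.0 + k)
        g = grad_u(x)
        x = [xi - h * gi + math.sqrt(2.0 * h * t) * random.gauss(0.0, 1.0)
             for xi, gi in zip(x, g)]
    return x

# Example: double-well potential U(x) = (x^2 - 1)^2 with gradient 4x(x^2 - 1).
grad = lambda v: [4.0 * v[0] * (v[0] ** 2 - 1.0)]
print(annealed_langevin(grad, [2.0]))
```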


Proceedings ArticleDOI
01 Dec 1987
TL;DR: A query optimization algorithm based on simulated annealing, which is a probabilistic hill climbing algorithm for optimizing complex non-recursive queries that arise in the study of linear recursion.
Abstract: Query optimizers of future database management systems are likely to face large access plan spaces in their task. Exhaustively searching such access plan spaces is unacceptable. We propose a query optimization algorithm based on simulated annealing, which is a probabilistic hill climbing algorithm. We show the specific formulation of the algorithm for the case of optimizing complex non-recursive queries that arise in the study of linear recursion. The query answer is explicitly represented and manipulated within the closed semiring of linear relational operators. The optimization algorithm is applied to a state space that is constructed from the equivalent algebraic forms of the query answer. A prototype of the simulated annealing algorithm has been built and a few experiments have been performed for a limited class of relational operators. Our initial experience is that, in general, the algorithm converges to processing strategies that are very close to the optimal. Moreover, the traditional processing strategies (e.g., the semi-naive evaluation) have been found to be, in general, suboptimal.

Journal ArticleDOI
TL;DR: A modification of the classical Simulated Annealing algorithm for the macro-cell placement problem is proposed for implementation on multiprocessor systems and experimental results show that the new algorithm obtains results comparable in quality to those of the single processor version.
Abstract: A modification of the classical Simulated Annealing algorithm for the macro-cell placement problem is proposed for implementation on multiprocessor systems. The algorithm has been implemented on the Sequent Balance 8000, a multiprocessor system with a shared-memory architecture. Experimental results show that the new algorithm obtains results comparable in quality to those of the single processor version; processor utilization is greater than 80 percent using up to eight processors.

Journal ArticleDOI
TL;DR: Probabilistic analyses of different designs of simulated annealing methods for combinatorial optimization problems based on local neighborhood searches are provided.
Abstract: Heuristic solution methods for combinatorial optimization problems are often based on local neighborhood searches. These tend to get trapped in a local optimum and the final result is often heavily dependent on the starting solution. Simulated annealing methods attempt to avoid these problems by randomizing the procedure so as to allow for occasional changes that worsen the solution. In this paper we provide probabilistic analyses of different designs of these methods.

Journal ArticleDOI
TL;DR: It is shown that an adaptive strategy which switches between two parallel decompositions at the optimal temperature yields speedup significantly better than any single strategy approach, and models are developed to account for the observed performance, and to predict the crossover points for switching strategies.
Abstract: Physical design tools based on simulated annealing algorithms have been shown to produce results of extremely high quality, but typically at a very high cost in execution time. This paper selects a representative annealing application--standard cell placement--and develops multiprocessor-based annealing algorithms for placement. A taxonomy of possible multiprocessor decompositions of annealing algorithms is presented which divides decomposition schemes into two broad classes: those which divide individual moves into subtasks and distribute them across cooperating processors, and those which perform complete moves in parallel. It is shown that the choice of multiprocessor annealing strategy is influenced by temperature; in particular, the paper introduces the idea of adaptive strategies that dynamically change the parallel decomposition scheme to achieve maximum speedup as the annealing task progresses through each temperature regime. Implementations of three parallel placement strategies are described for an experimental shared-memory multiprocessor. Practical speedups are achieved over a serial version of the algorithm, and it is shown that an adaptive strategy which switches between two parallel decompositions at the optimal temperature yields speedup significantly better than any single strategy approach. Models are developed to account for the observed performance, and to predict the crossover points for switching strategies.
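
Of the two broad decomposition classes described above, a simple variant to sketch is related to the "complete moves in parallel" class: evaluate several complete candidate moves concurrently and commit the first accepted one. The sketch below is a generic, hedged illustration (in CPython a process pool or native cost functions would be needed for real speedup), not the paper's placement-specific algorithms or its adaptive strategy.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def parallel_anneal(cost, neighbour, x0, t0=1.0, alpha=0.95, batches_per_t=50,
                    batch_size=8, t_min=1e-3):
    """Evaluate a batch of candidate moves concurrently, then apply the first
    acceptable one (speculative move evaluation); an illustrative sketch only."""
    x, fx = x0, cost(x0)
    t = t0
    with ThreadPoolExecutor(max_workers=batch_size) as pool:
        while t > t_min:
            for _ in range(batches_per_t):
                candidates = [neighbour(x) for _ in range(batch_size)]
                costs = list(pool.map(cost, candidates))     # evaluate moves in parallel
                for y, fy in zip(candidates, costs):
                    if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
                        x, fx = y, fy                         # commit the first accepted move
                        break
            t *= alpha
    return x, fx

# Example with a toy one-dimensional cost function.
f = lambda x: (x - 3.0) ** 2 + math.sin(8.0 * x)
step = lambda x: x + random.gauss(0.0, 0.3)
print(parallel_anneal(f, step, 0.0))
```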

Journal ArticleDOI
TL;DR: In this paper, a condition is given for Markov chain simulations of uniform distributions on complicated combinatorial sets to be accurate in polynomial time; the argument relates the accuracy of sample averages to the second-largest eigenvalue of the chain.
Abstract: Uniform distributions on complicated combinatorial sets can be simulated by the Markov chain method. A condition is given for the simulations to be accurate in polynomial time. Similar analysis of the simulated annealing algorithm remains an open problem. The argument relies on a recent eigenvalue estimate of Alon [4]; the only new mathematical ingredient is a careful analysis of how the accuracy of sample averages of a Markov chain is related to the second-largest eigenvalue.
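
The key quantity in the argument is the second-largest eigenvalue of the chain's transition matrix, which governs how quickly sample averages become accurate. The toy computation below, on an assumed small reversible chain (lazy random walk on a 4-cycle), shows the eigenvalue and a crude mixing estimate derived from it.

```python
import numpy as np

# Transition matrix of a small reversible Markov chain (random walk on a 4-cycle
# with holding probability 1/2); its stationary distribution is uniform.
P = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

eigenvalues = np.sort(np.real(np.linalg.eigvals(P)))[::-1]
lambda_2 = eigenvalues[1]                     # second-largest eigenvalue
print("second-largest eigenvalue:", lambda_2)

# A rough mixing-time proxy: number of steps until lambda_2**k drops below epsilon.
eps = 1e-3
k = int(np.ceil(np.log(eps) / np.log(lambda_2)))
print("steps until lambda_2**k < eps:", k)
```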

Journal ArticleDOI
TL;DR: The equilibrium geometries of selenium clusters Se3 to Se8 have been calculated using a parameter-free density functional method, and the most prominent low-energy structures are found using combined molecular dynamics and simulated annealing techniques as mentioned in this paper.

Journal ArticleDOI
01 Nov 1987
TL;DR: In this paper, the authors presented a simulated annealing technique, which is t/log (t) times faster than conventional simulated annealing, and applied it to a multisensor location and tracking problem.
Abstract: Recent advances in the solution of nonconvex optimization problems use simulated annealing techniques that are considerably faster than exhaustive global search techniques. This letter presents a simulated annealing technique, which is t/log (t) times faster than conventional simulated annealing, and applies it to a multisensor location and tracking problem.


Journal ArticleDOI
TL;DR: In this paper, the classical potential energy surfaces for clusters of up to 25 atoms, interacting under two-body Lennard-Jones forces, have been searched for global minima using the simulated annealing method.
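
A scaled-down version of that search: anneal the Cartesian coordinates of a small cluster under the two-body Lennard-Jones potential, displacing one atom at a time. The cluster size, step length, and temperature schedule below are illustrative choices, not the paper's protocol.

```python
import math
import random

def lj_energy(coords):
    """Total Lennard-Jones energy of a cluster, in reduced units (epsilon = sigma = 1)."""
    e, n = 0.0, len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            inv6 = 1.0 / r2 ** 3
            e += 4.0 * (inv6 * inv6 - inv6)
    return e

def anneal_cluster(n_atoms=7, t0=1.0, alpha=0.95, moves_per_t=500, t_min=1e-3, step=0.1):
    coords = [[random.uniform(-1.5, 1.5) for _ in range(3)] for _ in range(n_atoms)]
    e = lj_energy(coords)
    best, ebest = [c[:] for c in coords], e
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            i = random.randrange(n_atoms)
            old = coords[i][:]
            coords[i] = [x + random.uniform(-step, step) for x in old]
            enew = lj_energy(coords)
            if enew <= e or random.random() < math.exp(-(enew - e) / t):
                e = enew
                if e < ebest:
                    best, ebest = [c[:] for c in coords], e
            else:
                coords[i] = old                 # undo rejected displacement
        t *= alpha
    return best, ebest

print(anneal_cluster()[1])   # energy found for a 7-atom cluster (global minimum is about -16.5)
```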

Journal ArticleDOI
TL;DR: In this paper, architectures are presented for partitioning an optoelectronic analog of a neural net into distinct layers with a prescribed interconnectivity pattern, enabling stochastic learning by simulated annealing in the context of a Boltzmann machine.
Abstract: Self-organization and learning is a distinctive feature of neural nets and processors that sets them apart from conventional approaches to signal processing. It leads to self-programmability which alleviates the problem of programming complexity in artificial neural nets. In this paper architectures for partitioning an optoelectronic analog of a neural net into distinct layers with prescribed interconnectivity pattern to enable stochastic learning by simulated annealing in the context of a Boltzmann machine are presented. Stochastic learning is of interest because of its relevance to the role of noise in biological neural nets. Practical considerations and methodologies for appreciably accelerating stochastic learning in such a multilayered net are described. These include the use of parallel optical computing of the global energy of the net, the use of fast nonvolatile programmable spatial light modulators to realize fast plasticity, optical generation of random number arrays, and an adaptive noisy thresholding scheme that also makes stochastic learning more biologically plausible. The findings reported predict optoelectronic chips that can be used in the realization of optical learning machines.

Journal ArticleDOI
TL;DR: Modifications to the standard simulated annealing method for circuit placement are explored which make it more suitable for use on a shared-memory parallel computer and allow the parallel algorithms to deviate from the algorithm defined for a serial computer.
Abstract: We explore modifications to the standard simulated annealing method for circuit placement which make it more suitable for use on a shared-memory parallel computer. By employing chaotic approaches we allow the parallel algorithms to deviate from the algorithm defined for a serial computer and thus obtain good execution efficiencies for large numbers of processors. The qualitative behavior of the parallel algorithms is comparable to that of the serial algorithm.

Book ChapterDOI
01 Jan 1987
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems as mentioned in this paper.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.

Book ChapterDOI
01 Jan 1987
TL;DR: The performance analysis of an approximation algorithm concentrates on the quality of the final solution obtained by the algorithm and the running time required by the algorithm.
Abstract: The performance analysis of an approximation algorithm concentrates on the following two quantities: the quality of the final solution obtained by the algorithm, i.e. the difference in cost value between the final solution and a globally minimal configuration; the running time required by the algorithm.

Journal ArticleDOI
15 Aug 1987-EPL
TL;DR: A model that self-organizes to perform a task via a learning-by-example scheme: a network of Boolean operators that, in some of the computations, achieved an error-free design for addition of integer binary numbers when shown only a small subset of all possible additions.
Abstract: We realized a model which self-organizes to perform a task via a learning-by-example scheme. The system is a network of Boolean operators which, in some of our computations, has been able to achieve an error-free design for addition between integer binary numbers when shown only a small subset of all possible additions. The training procedure, based on optimizing the network on the given sampling using simulated annealing, is completely general and in principle allows any binary mapping to be treated. We recognize different regimes in learning, i.e. the system can both memorize patterns (with a capacity which is numerically estimated) and generalize information to construct rules and algorithms. Some scaling relations are conjectured and numerically tested for these different regimes.
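
A much-simplified stand-in for this kind of experiment: a feedforward network of two-input Boolean gates whose wiring and gate types are optimized by simulated annealing to reproduce 2-bit binary addition from a subset of the truth table, then tested on all cases. The network size, gate set, and schedule are assumptions, not the authors' setup.

```python
import math
import random

GATE_FUNCS = [lambda a, b: a & b, lambda a, b: a | b,
              lambda a, b: a ^ b, lambda a, b: 1 - (a & b)]   # AND, OR, XOR, NAND

def evaluate(net, bits):
    """Feed the input bits through a feedforward net of 2-input gates;
    the last three gate outputs are read as the 3-bit sum."""
    signals = list(bits)
    for f, i, j in net:
        signals.append(GATE_FUNCS[f](signals[i], signals[j]))
    return signals[-3:]

def random_gate(n_inputs, position):
    span = n_inputs + position            # a gate may read primary inputs or earlier gates
    return (random.randrange(len(GATE_FUNCS)),
            random.randrange(span), random.randrange(span))

def errors(net, samples):
    return sum(o != t for bits, target in samples
               for o, t in zip(evaluate(net, bits), target))

def full_table():
    """Truth table of 2-bit + 2-bit addition (4 input bits -> 3 output bits)."""
    table = []
    for a in range(4):
        for b in range(4):
            bits = [a >> 1 & 1, a & 1, b >> 1 & 1, b & 1]
            s = a + b
            table.append((bits, [s >> 2 & 1, s >> 1 & 1, s & 1]))
    return table

def train(samples, n_inputs=4, n_gates=12, t0=2.0, alpha=0.97,
          moves_per_t=300, t_min=0.01):
    net = [random_gate(n_inputs, g) for g in range(n_gates)]
    e = errors(net, samples)
    best, ebest = list(net), e
    t = t0
    while t > t_min and ebest > 0:
        for _ in range(moves_per_t):
            g = random.randrange(n_gates)
            old = net[g]
            net[g] = random_gate(n_inputs, g)        # rewire / change one gate
            enew = errors(net, samples)
            if enew <= e or random.random() < math.exp(-(enew - e) / t):
                e = enew
                if e < ebest:
                    best, ebest = list(net), e
            else:
                net[g] = old
        t *= alpha
    return best

table = full_table()
training_set = random.sample(table, 10)      # show the net only a subset of additions
net = train(training_set)
print("bit errors on the training subset:", errors(net, training_set))
print("bit errors on all 16 additions:", errors(net, table))
```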

Proceedings ArticleDOI
01 Jan 1987
TL;DR: The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm.
Abstract: The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments including a square plate and a 960 degrees-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
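
A generic way to phrase the placement problem for annealing: select k of n candidate sensor locations so that the chosen rows of a mode-shape matrix are as informative as possible, scored here by a log-determinant criterion. The objective, the swap move, and the random stand-in data below are illustrative assumptions, not the paper's formulation or structures.

```python
import math
import random
import numpy as np

def logdet_score(phi, selected):
    """log det(Phi_S^T Phi_S) for the selected sensor rows (larger = more informative)."""
    phi_s = phi[sorted(selected), :]
    sign, logdet = np.linalg.slogdet(phi_s.T @ phi_s)
    return logdet if sign > 0 else -1e9        # guard against singular selections

def anneal_sensors(phi, k, t0=1.0, alpha=0.95, moves_per_t=200, t_min=1e-3):
    n = phi.shape[0]
    selected = set(random.sample(range(n), k))
    score = logdet_score(phi, selected)
    best, sbest = set(selected), score
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            out_loc = random.choice(sorted(selected))
            in_loc = random.choice([i for i in range(n) if i not in selected])
            candidate = (selected - {out_loc}) | {in_loc}      # swap one sensor location
            cscore = logdet_score(phi, candidate)
            # Maximization: accept worse selections with probability exp(delta/T).
            if cscore >= score or random.random() < math.exp((cscore - score) / t):
                selected, score = candidate, cscore
                if score > sbest:
                    best, sbest = set(selected), score
        t *= alpha
    return sorted(best), sbest

# Toy data: 40 candidate locations, 6 structural modes (random stand-in for mode shapes).
phi = np.random.randn(40, 6)
print(anneal_sensors(phi, k=8))
```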

Proceedings ArticleDOI
01 Oct 1987
TL;DR: ESP (Evolution-based Standard cell Placement) is a new program package designed to perform standard cell placement and includes macro-block placement capabilities and uses the new heuristic method of simulating an evolutionary process in order to minimize the cell interconnection wire length.
Abstract: ESP (Evolution-based Standard cell Placement) is a new program package designed to perform standard cell placement and includes macro-block placement capabilities. It uses the new heuristic method of simulating an evolutionary process in order to minimize the cell interconnection wire length. While achieving results comparable to or better than the popular Simulated Annealing algorithm, ESP performs its task about ten times faster.

Journal ArticleDOI
TL;DR: The two ergodicity concepts are equivalent for finite chains under rather general (and widely verifiable) conditions and applications to probabilistic analyses of general search methods for combinatorial optimization problems (simulated annealing) are discussed.
Abstract: A nonstationary Markov chain is weakly ergodic if the dependence of the state distribution on the starting state vanishes as time tends to infinity. A chain is strongly ergodic if it is weakly ergodic and converges in distribution. In this paper we show that the two ergodicity concepts are equivalent for finite chains under rather general (and widely verifiable) conditions. We discuss applications to probabilistic analyses of general search methods for combinatorial optimization problems (simulated annealing).

Book ChapterDOI
01 Jan 1987
TL;DR: The basic theory of simulated annealing is reviewed and a number of applications of the method are recited, including combinatorial optimization problems related to VLSI design, image processing, code design and artificial intelligence.
Abstract: Simulated annealing is a combinatorial optimization method based on randomization techniques. The method originates from the analogy between the annealing of solids, as described by the theory of statistical physics, and the optimization of large combinatorial problems. Here we review the basic theory of simulated annealing and recite a number of applications of the method. The theoretical review includes concepts of the theory of homogeneous and inhomogeneous Markov chains, an analysis of the asymptotic convergence of the algorithm, and a discussion of the finite-time behaviour. The list of applications includes combinatorial optimization problems related to VLSI design, image processing, code design and artificial intelligence.

Book ChapterDOI
Emile H. L. Aarts1, Jan Korst1
15 Jun 1987
TL;DR: A formal model of the Boltzmann machine is presented, together with a discussion of two different applications of the model, viz. solving combinatorial optimization problems and carrying out learning tasks.
Abstract: In this paper we present a formal model of the Boltzmann machine and a discussion of two different applications of the model, viz. (i) solving combinatorial optimization problems and (ii) carrying out learning tasks. Numerical results of computer simulations are presented to demonstrate the characteristic features of the Boltzmann machine.
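
The optimization use of the model can be illustrated by letting stochastic binary units maximize a consensus function while a pseudo-temperature is lowered. The max-cut encoding below (the consensus of a state equals the cut size, with the change in consensus computed locally per unit) is an illustrative choice, not the authors' formulation.

```python
import math
import random
from collections import defaultdict

def boltzmann_maxcut(edges, n_units, t0=3.0, alpha=0.95, sweeps_per_t=50, t_min=0.05):
    """Boltzmann-machine style optimization of max cut (illustrative encoding).

    Consensus C(s) = sum over edges of (s_i + s_j - 2*s_i*s_j) equals the cut size
    for binary states s_i in {0, 1}; maximizing C maximizes the cut.
    """
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    s = [random.randint(0, 1) for _ in range(n_units)]
    t = t0
    while t > t_min:
        for _ in range(sweeps_per_t):
            i = random.randrange(n_units)
            # Consensus difference between s_i = 1 and s_i = 0, all other units fixed.
            delta = sum(1 - 2 * s[j] for j in adj[i])
            # Stochastic unit: switch on with probability 1 / (1 + exp(-delta / T)).
            x = delta / t
            p = 1.0 / (1.0 + math.exp(-x)) if x > -60 else 0.0   # avoid overflow in exp
            s[i] = 1 if random.random() < p else 0
        t *= alpha
    cut = sum(s[u] != s[v] for u, v in edges)
    return s, cut

# Example: a 6-cycle, whose maximum cut has size 6.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
print(boltzmann_maxcut(edges, 6))
```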

01 Jan 1987
TL;DR: A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer and is faster and gives better final placement results than the uniprocessor simulatedAnnealing algorithms.
Abstract: A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.