
Showing papers on "Simulated annealing published in 1985"


Journal ArticleDOI
21 Oct 1985
TL;DR: A natural class PLS is defined consisting essentially of those local search problems for which local optimality can be verified in polynomial time, and it is shown that there are complete problems for this class.
Abstract: We investigate the complexity of finding locally optimal solutions to NP-hard combinatorial optimization problems. Local optimality arises in the context of local search algorithms, which try to find improved solutions by considering perturbations of the current solution (“neighbors” of that solution). If no neighboring solution is better than the current solution, it is locally optimal. Finding locally optimal solutions is presumably easier than finding optimal solutions. Nevertheless, many popular local search algorithms are based on neighborhood structures for which locally optimal solutions are not known to be computable in polynomial time, either by using the local search algorithms themselves or by taking some indirect route. We define a natural class PLS consisting essentially of those local search problems for which local optimality can be verified in polynomial time, and show that there are complete problems for this class. In particular, finding a partition of a graph that is locally optimal with respect to the well-known Kernighan-Lin algorithm for graph partitioning is PLS-complete, and hence can be accomplished in polynomial time only if local optima can be found in polynomial time for all local search problems in PLS.

792 citations
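
For concreteness, a local search problem in the PLS sense needs a polynomial-time cost evaluation and a polynomial-time check of the neighborhood. The Python sketch below is purely illustrative (hypothetical names, a single-swap neighborhood rather than Kernighan-Lin) and shows why verifying local optimality is easy even when finding a local optimum may not be:

```python
# Minimal sketch of a local search problem in the spirit of PLS:
# cost and neighborhood evaluation are both polynomial-time.
# Graph partitioning with a single-swap neighborhood (not Kernighan-Lin);
# all names here are illustrative, not from the paper.

def cut_cost(partition, edges):
    """Number of edges crossing the two blocks of the partition."""
    return sum(1 for u, v in edges if partition[u] != partition[v])

def neighbors(partition):
    """All partitions obtained by swapping one vertex from each block."""
    left = [v for v, side in partition.items() if side == 0]
    right = [v for v, side in partition.items() if side == 1]
    for u in left:
        for v in right:
            nb = dict(partition)
            nb[u], nb[v] = 1, 0
            yield nb

def is_locally_optimal(partition, edges):
    """Local optimality can be *verified* in polynomial time by
    checking every neighbor -- the defining property of PLS."""
    c = cut_cost(partition, edges)
    return all(cut_cost(nb, edges) >= c for nb in neighbors(partition))
```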


Journal ArticleDOI
TL;DR: TimberWolf is an integrated set of placement and routing optimization programs for standard cell, macro/custom cell, and gate-array placement, as well as standard cell global routing.
Abstract: TimberWolf is an integrated set of placement and routing optimization programs. The general combinatorial optimization technique known as simulated annealing is used by each program. Programs for standard cell, macro/custom cell, and gate-array placement, as well as standard cell global routing, have been developed. Experimental results on industrial circuits show that area savings over existing layout programs ranging from 15 to 62% are possible.

482 citations


Proceedings ArticleDOI
01 Dec 1985
TL;DR: In this paper, a theoretical analysis of simulated annealing based on a time-inhomogeneous Markov chain is presented and a bound on the departure of the probability distribution of the state at finite time from the optimum is given.
Abstract: Simulated Annealing is a randomized algorithm which has been proposed for finding globally optimum least-cost configurations in large NP-complete problems with cost functions which may have many local minima. A theoretical analysis of Simulated Annealing based on its precise model, a time-inhomogeneous Markov chain, is presented. An annealing schedule is given for which the Markov chain is strongly ergodic and the algorithm converges to a global optimum. The finite-time behavior of Simulated Annealing is also analyzed and a bound obtained on the departure of the probability distribution of the state at finite time from the optimum. This bound gives an estimate of the rate of convergence and insights into the conditions on the annealing schedule which gives optimum performance.

307 citations
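
For reference, the acceptance rule and the kind of logarithmic cooling schedule under which such ergodicity and convergence results are typically established can be written as follows (a generic formulation; the constants are problem-dependent and not the paper's exact values):

```latex
% Metropolis acceptance probability at temperature T_k, and a
% logarithmic annealing schedule of the kind used in convergence proofs.
% The constant c must be sufficiently large (problem-dependent).
P(\text{accept } j \mid i) \;=\; \min\left\{1,\; \exp\!\left(-\frac{C(j)-C(i)}{T_k}\right)\right\},
\qquad
T_k \;=\; \frac{c}{\log(k+1)},
```

where C is the cost function and k the iteration index.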


Journal ArticleDOI
01 Oct 1985-Nature
TL;DR: Computer algorithms are used to investigate new strategies for the 64-city travelling salesman problem, which combine conventional optimization or ‘quenching’ with biological elements, namely having a population of trial solutions, helping weaker individuals to survive, and an analogue of sexual crossing-over of genes.
Abstract: Several problems, in particular the ‘travelling salesman’ problem1 wherein one seeks the shortest route encompassing a randomly distributed group of cities, have been optimized by repeated random alteration (mutation) of a trial solution followed by selection of the cheaper (fitter) solution. Most non-trivial problems have complicated fitness functions, and optimization tends to become stuck in local fitness maxima. A recently introduced strategy to escape (simulated annealing) involves accepting unfavourable mutations with finite probability1–3. Independently, there has been interest in genetic strategies which overcome the problem of fitness maxima in biological evolution4–6, and several authors have applied biological elements to optimization7,8. Here we use computer algorithms to investigate new strategies for the 64-city travelling salesman problem, which combine conventional optimization or ‘quenching’ with biological elements, namely having a population of trial solutions, helping weaker individuals to survive, and an analogue of sexual crossing-over of genes. The new strategies were faster and gave better results than simulated annealing.

195 citations
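
A minimal sketch of the ingredient shared by the annealing and genetic strategies for the travelling salesman problem: maintain a population of trial tours, mutate them by segment reversal, and accept unfavourable mutations with finite probability. All parameters (including the fixed temperature) are illustrative assumptions, and the crossover element described in the paper is omitted.

```python
# Illustrative mutation/selection loop over a population of TSP tours.
# Not the paper's exact algorithm; crossover and survival assistance
# for weak individuals are omitted.
import math
import random

def tour_length(tour, xy):
    return sum(math.dist(xy[tour[i]], xy[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def mutate(tour):
    """Reverse a random segment of the tour (a 2-opt style move)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def evolve(xy, pop_size=20, steps=20000, temperature=1.0):
    n = len(xy)
    population = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(steps):
        k = random.randrange(pop_size)
        candidate = mutate(population[k])
        delta = tour_length(candidate, xy) - tour_length(population[k], xy)
        # Accept improvements always, deteriorations with finite probability.
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            population[k] = candidate
    return min(population, key=lambda t: tour_length(t, xy))

# Example: 64 random cities in the unit square.
cities = [(random.random(), random.random()) for _ in range(64)]
best = evolve(cities)
```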


Dissertation
01 Sep 1985
TL;DR: The goal of this work is to demonstrate the generality and practical value of a probabilistic approach to this problem, particularly in the context of Computer Vision, with a Gibbsian probability distribution on the space of all possible functions.
Abstract: In this thesis we study the general problem of reconstructing a function, defined on a finite lattice, from a set of incomplete, noisy and/or ambiguous observations. The goal of this work is to demonstrate the generality and practical value of a probabilistic (in particular, Bayesian) approach to this problem, particularly in the context of Computer Vision. In this approach, the prior knowledge about the solution is expressed in the form of a Gibbsian probability distribution on the space of all possible functions, so that the reconstruction task is formulated as an estimation problem. Keywords: Inverse problems; Computer vision; Surface interpolation; Image restoration; Markov random fields; Optimal estimation; Simulated annealing.

166 citations


Proceedings ArticleDOI
01 Dec 1985
TL;DR: The basic theory of simulated annealing is reviewed, its recent applications are surveyed, and the theoretical approaches that have been used to study the technique are surveyed.
Abstract: Annealing is the process of slowly cooling a physical system in order to obtain states with globally minimum energy. By simulating such a process, near globally-minimum-cost solutions can be found for very large optimization problems. The purpose of this paper is to review the basic theory of simulated annealing, to survey its recent applications, and to survey the theoretical approaches that have been used to study the technique. The applications include image restoration, combinatorial optimization (e.g., VLSI routing and placement), code design for communication systems and certain aspects of artificial intelligence. The theoretical tools for analysis include the theory of nonstationary Markov chains, statistical physics analysis techniques, large deviation theory and singular perturbation theory.

151 citations
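
The basic procedure under review can be summarized in a few lines. The sketch below is a generic formulation; the function names, geometric cooling schedule and parameters are illustrative choices rather than anything prescribed by the paper.

```python
# Generic simulated annealing loop: propose a random perturbation,
# always accept improvements, accept deteriorations with probability
# exp(-delta/T), and slowly lower T. Illustrative sketch only.
import math
import random

def simulated_annealing(initial, cost, perturb,
                        t_start=1.0, t_end=1e-3, alpha=0.99, moves_per_t=100):
    state, best = initial, initial
    t = t_start
    while t > t_end:
        for _ in range(moves_per_t):
            candidate = perturb(state)
            delta = cost(candidate) - cost(state)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                state = candidate
                if cost(state) < cost(best):
                    best = state
        t *= alpha          # geometric cooling (one common schedule)
    return best
```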


Journal ArticleDOI
P. Carnevali1, L. Coletti1, S. Patarnello1
TL;DR: It is shown that simulated annealing, a statistical mechanics method recently proposed as a tool in solving complex optimization problems, can be used in problems arising in image processing, and some of these problems are formally equivalent to ground state problems for two-dimensional Ising spin systems.
Abstract: It is shown that simulated annealing, a statistical mechanics method recently proposed as a tool in solving complex optimization problems, can be used in problems arising in image processing. The problems examined are the estimation of the parameters necessary to describe a geometrical pattern corrupted by noise, the smoothing of bi-level images, and the process of halftoning a continuous-level image. The analogy between the system to be optimized and an equivalent physical system, whose ground state is sought, is put forward by showing that some of these problems are formally equivalent to ground state problems for two-dimensional Ising spin systems. In the case of low signal-to-noise ratios (particularly in image smoothing), the methods proposed here give better results than those obtained with standard techniques.

149 citations
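
The Ising analogy can be made concrete for bi-level image smoothing: each pixel is a spin in {-1, +1}, a coupling term rewards agreement between neighbouring pixels (smoothness), and a field term rewards agreement with the noisy observation. A rough sketch of annealing such an energy follows; the parameters and structure are illustrative, not the paper's.

```python
# Sketch of Ising-style smoothing of a noisy binary image by annealing:
# energy = -J * (neighbor agreement) - h * (agreement with the data).
# All parameters are illustrative.
import math
import random

def smooth(noisy, J=1.0, h=0.8, t_start=2.0, t_end=0.05, alpha=0.95):
    rows, cols = len(noisy), len(noisy[0])
    img = [row[:] for row in noisy]          # spins in {-1, +1}

    def local_energy(i, j, s):
        nb = 0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < rows and 0 <= j + dj < cols:
                nb += img[i + di][j + dj]
        return -J * s * nb - h * s * noisy[i][j]

    t = t_start
    while t > t_end:
        for _ in range(rows * cols):
            i, j = random.randrange(rows), random.randrange(cols)
            s = img[i][j]
            delta = local_energy(i, j, -s) - local_energy(i, j, s)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                img[i][j] = -s               # flip the spin
        t *= alpha
    return img
```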


Proceedings ArticleDOI
01 Sep 1985
TL;DR: This work analyzes a nonstationary finite-state Markov chain whose state space Ω is the domain of the cost function to be minimized, considering an arbitrary partition {I, J} of Ω and focusing on those issues most important for optimization.
Abstract: Simulated annealing is a popular Monte Carlo algorithm for combinatorial optimization. The annealing algorithm simulates a nonstationary finite state Markov chain whose state space Ω is the domain of the cost function to be minimized. We analyze this chain focusing on those issues most important for optimization. In all of our results we consider an arbitrary partition {I, J} of Ω; important special cases are when I is the set of minimum cost states or a set of all states with sufficiently small cost. We give a lower bound on the probability that the chain visits I at some time ≤ k, for k = 1, 2, .... This bound may be useful even when the algorithm does not converge. We give conditions under which the chain converges to I in probability and obtain an estimate of the rate of convergence as well. We also give conditions under which the chain visits I infinitely often, visits I almost always, or does not converge to I, with probability 1.

74 citations


Proceedings ArticleDOI
01 Jun 1985
TL;DR: The performance of simulated annealing is compared to that of other Monte Carlo methods for optimization, and it is shown that these other methods often perform better than simulated annealing.
Abstract: The performance of simulated annealing is compared to that of other Monte Carlo methods for optimization. Our experiments show that these other methods often perform better than simulated annealing.

51 citations


Journal ArticleDOI
TL;DR: In this article, a simulation of the dynamical approach to local or global minima of a system of interacting fine ferromagnetic particles is developed for two different schedules of the application of ac and dc magnetic fields.
Abstract: Using a model of a system of interacting fine ferromagnetic particles, a computer simulation of the dynamical approach to local or global minima of the system is developed for two different schedules of the application of ac and dc magnetic fields. The process of optimization, i.e., the achievement of a global minimum, depends on the rate of reduction of the ac field and on the symmetry of the ac field cycles. The calculations carried out to illustrate these effects include remanence curves and the zero-field remanence for both schedules under different conditions. The growth of the magnetization during these processes was studied, and the interaction energy was calculated to best illustrate the optimization.

49 citations



Proceedings ArticleDOI
01 Apr 1985
TL;DR: Simulated annealing is experimentally shown to locate what appears to be the global maximum with a higher probability than the forward-backward algorithm.
Abstract: Hidden Markov models (HMM) are the basis for some of the more successful systems for continuous and discrete utterance speech recognition. One of the reasons for the success of these models is their ability to train automatically from marked speech data. The currently known forward-backward and gradient training methods suffer from the problem that they converge to a local maximum rather than to the global maximum. Simulated annealing is a stochastic optimization procedure which can escape a local optimum in the hope of finding the global optimum when presented with a system which contains many local optima. This paper shows how simulated annealing may be used to train HMM systems. It is experimentally shown to locate what appears to be the global maximum with a higher probability than the forward-backward algorithm.
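
One way to wrap annealing around HMM parameter estimation, sketched below: perturb the transition or emission probabilities, renormalize, and accept or reject the move according to the change in the data log-likelihood computed with the standard forward algorithm. The perturbation size, schedule and helper names are illustrative assumptions, not the paper's procedure.

```python
# Sketch: annealing over discrete-observation HMM parameters.
# Assumes strictly positive probabilities; pi is held fixed for brevity.
import math
import random

def forward_loglik(pi, A, B, obs):
    """log P(obs | pi, A, B) via the scaled forward recursion."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                     for j in range(n)]
        scale = sum(alpha)
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return loglik

def perturb(rows, eps=0.05):
    """Perturb one random row of a stochastic matrix and renormalize."""
    new = [row[:] for row in rows]
    r = random.randrange(len(new))
    new[r] = [max(1e-6, p + random.uniform(-eps, eps)) for p in new[r]]
    z = sum(new[r])
    new[r] = [p / z for p in new[r]]
    return new

def anneal_hmm(pi, A, B, obs, t_start=1.0, t_end=0.01, alpha_cool=0.98):
    loglik = forward_loglik(pi, A, B, obs)
    t = t_start
    while t > t_end:
        for _ in range(50):
            A2, B2 = (perturb(A), B) if random.random() < 0.5 else (A, perturb(B))
            l2 = forward_loglik(pi, A2, B2, obs)
            # Maximizing likelihood: accept uphill moves always,
            # downhill moves with finite probability.
            if l2 >= loglik or random.random() < math.exp((l2 - loglik) / t):
                A, B, loglik = A2, B2, l2
        t *= alpha_cool
    return A, B
```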

Journal ArticleDOI
TL;DR: An algorithm is described which can determine a neighborhood of the global optimum of an objective function as well as an estimate of the global optimum; given this information, a local optimization procedure can be employed to locate the global optimum.
Abstract: This paper describes an algorithm which can determine a neighborhood of the global optimum of an objective function as well as an estimate of the global optimum. Given this information, a local optimization procedure can be employed to locate the global optimum. The utility of this algorithm is demonstrated by several examples.

Journal ArticleDOI
TL;DR: In this article, the minimum energy configuration of N point charges confined to the interior of a circle is determined by a technique based on the simulated-annealing method, which can lead to a better understanding of phenomena such as crystallisation, symmetry breaking, commensurate-incommensurate transition, etc.
Abstract: The authors have determined the minimum energy configuration of N point charges confined to the interior of a circle. The minimisation problem in the multi-dimensional configuration space is solved by a technique based on the simulated-annealing method. They observe striking effects, which could lead to a better understanding of phenomena such as crystallisation, symmetry breaking, commensurate-incommensurate transition, etc. Moreover, an experimental verification of their results appears to be possible.
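
The configuration-space search can be sketched directly: perturb one charge at a time, keep it inside the unit disc, and accept the move with the Metropolis probability on the change in Coulomb energy. The schedule and move sizes below are illustrative, not the authors' values.

```python
# Sketch: simulated annealing for N unit point charges confined to the
# unit disc, minimizing the total 1/r Coulomb energy.
import math
import random

def energy(pts):
    return sum(1.0 / math.dist(pts[i], pts[j])
               for i in range(len(pts)) for j in range(i + 1, len(pts)))

def random_point_in_disc():
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1.0:
            return (x, y)

def anneal_charges(n=20, t_start=1.0, t_end=1e-3, alpha=0.98, step=0.1):
    pts = [random_point_in_disc() for _ in range(n)]
    e = energy(pts)
    t = t_start
    while t > t_end:
        for _ in range(10 * n):
            k = random.randrange(n)
            x = pts[k][0] + random.uniform(-step, step)
            y = pts[k][1] + random.uniform(-step, step)
            if x * x + y * y > 1.0:          # stay inside the circle
                continue
            old = pts[k]
            pts[k] = (x, y)
            e_new = energy(pts)
            if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
                e = e_new                    # accept the move
            else:
                pts[k] = old                 # reject and restore
        t *= alpha
    return pts, e
```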

Journal ArticleDOI
TL;DR: A radically new approach to facility layout optimization involving nonconvex quadratic assignment problems is presented, using a simulated annealing technique originally developed to solve problems in statistical mechanics by Metropolis et al and recently applied to VLSI chip design problems.
Abstract: A radically new approach to facility layout optimization involving nonconvex quadratic assignment problems is presented. The approach uses a simulated annealing technique originally developed to solve problems in statistical mechanics by Metropolis et al., and recently applied to VLSI chip design problems. The Metropolis algorithm is relatively simple to apply, and a microcomputer model called TOPMET has been developed. TOPMET is shown to produce solutions superior to those of some of the more popular computer-based planning techniques and hand-generated methods. The algorithm also lends itself readily to user interaction and colour graphics display, and its application is illustrated by a practical building problem. Extensions into artificial intelligence are discussed.
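
At its core the approach treats a layout as an assignment of facilities to locations and anneals over pairwise swaps. The sketch below shows this quadratic-assignment formulation with a Metropolis acceptance rule; it is illustrative only and is not the TOPMET program described in the paper.

```python
# Sketch: simulated annealing for a quadratic-assignment-style layout
# problem. Cost = sum over facility pairs of flow * distance between
# their assigned locations; neighborhood = swap two facilities.
import math
import random

def layout_cost(assign, flow, dist):
    n = len(assign)
    return sum(flow[i][j] * dist[assign[i]][assign[j]]
               for i in range(n) for j in range(n))

def anneal_layout(flow, dist, t_start=100.0, t_end=0.1, alpha=0.95):
    n = len(flow)
    assign = list(range(n))              # facility i -> location assign[i]
    cost = layout_cost(assign, flow, dist)
    t = t_start
    while t > t_end:
        for _ in range(n * n):
            i, j = random.sample(range(n), 2)
            assign[i], assign[j] = assign[j], assign[i]
            new_cost = layout_cost(assign, flow, dist)
            delta = new_cost - cost
            if delta <= 0 or random.random() < math.exp(-delta / t):
                cost = new_cost
            else:                         # undo the swap
                assign[i], assign[j] = assign[j], assign[i]
        t *= alpha
    return assign, cost
```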

Journal ArticleDOI
TL;DR: In this article, an efficient perturbation method for system optimization is defined, and the role of simulated annealing in such an optimization is discussed, along with an experiment testing the effects of anneal.
Abstract: Reconstructions from coded-image data obtained with the simulated annealing algorithm are presented along with an experiment testing the effects of annealing. An efficient perturbation method for system optimization is defined, and the role of simulated annealing in such an optimization is discussed.

Journal ArticleDOI
TL;DR: A linear heuristic method for the two-layer via minimization problem is introduced; similar to a simulation of the crystallization process, it accepts rearrangements that lower the cost function but also allows controlled uphill steps.


Journal ArticleDOI
TL;DR: Worst-case and typical-case estimates for the rate of convergence of annealing algorithms are developed; the typical-case results help to explain why simulated annealing in practice converges much faster than the worst-case bounds suggest.
Abstract: We develop worst-case and typical-case estimates for the rate of convergence of annealing algorithms. The worst-case estimates extend results of Geman and Geman (1983) for pattern recognition. However, as Geman and Geman observe, and as empirical results imply, simulated annealing usually displays much faster convergence (and thus allows much faster cooling) than their worst-case results suggest. Our typical-case results help to explain this phenomenon.



