
Showing papers on "Simulated annealing published in 1990"


Journal ArticleDOI
TL;DR: In this article, a phase annealing method, related to the simulated annealing approach in other optimization problems, is proposed and it is shown that it can result in an improvement of up to an order of magnitude in the chances of solving large structures at atomic resolution.
Abstract: A number of extensions to the multisolution approach to the crystallographic phase problem are discussed in which the negative quartet relations play an important role. A phase annealing method, related to the simulated annealing approach in other optimization problems, is proposed and it is shown that it can result in an improvement of up to an order of magnitude in the chances of solving large structures at atomic resolution. The ideas presented here are incorporated in the program system SHELX-90; the philosophical and mathematical background to the direct-methods part (SHELXS) of this system is described.

14,787 citations


Journal ArticleDOI
TL;DR: A monograph covering simulated annealing (combinatorial optimization, asymptotic convergence, finite-time approximation, practical use, and parallel algorithms) and Boltzmann machines (neural computing, combinatorial optimization, classification, and learning).
Abstract: SIMULATED ANNEALING. Combinatorial Optimization. Simulated Annealing. Asymptotic Convergence. Finite-Time Approximation. Simulated Annealing in Practice. Parallel Simulated Annealing Algorithms. BOLTZMANN MACHINES. Neural Computing. Boltzmann Machines. Combinatorial Optimization and Boltzmann Machines. Classification and Boltzmann Machines. Learning and Boltzmann Machines. Appendix. Bibliography. Indices.

1,238 citations
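The Metropolis-style annealing loop that this book formalizes, and that recurs throughout the papers below, can be sketched in a few lines. The cost function, neighborhood, and geometric cooling schedule here are illustrative toys, not taken from any of the listed papers:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, alpha=0.95, steps=2000, seed=0):
    """Metropolis-style annealing: always accept improving moves, accept a
    worsening move with probability exp(-delta / t), and cool geometrically."""
    rng = random.Random(seed)
    x, t = x0, t0
    best, best_cost = x0, cost(x0)
    for _ in range(steps):
        y = neighbor(x, rng)
        delta = cost(y) - cost(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
            if cost(x) < best_cost:
                best, best_cost = x, cost(x)
        t *= alpha  # geometric cooling schedule
    return best, best_cost

# Toy problem: a quadratic bowl with small periodic bumps.
f = lambda x: (x - 7) ** 2 + 5 * (x % 3)
step = lambda x, rng: x + rng.choice([-2, -1, 1, 2])
best, best_cost = simulated_annealing(f, step, x0=50)
```

At high temperature nearly all moves are accepted (a random walk); as t falls, the acceptance rule degenerates into greedy descent, which is why the schedule matters so much in the papers that follow.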


Journal ArticleDOI
Gunter Dueck1, Tobias Scheuer1
TL;DR: In this article, a new general-purpose algorithm for the solution of combinatorial optimization problems is presented; its structure is even simpler than that of the well-known simulated annealing approach, and its performance is demonstrated by computational results on the traveling salesman problem and the construction of error-correcting codes.

1,152 citations


Journal ArticleDOI
TL;DR: The Simulated Annealing algorithm and the physical analogy on which it is based are described and suggestions are made for ways to improve the performance of the algorithm by modifying the ‘pure’ SA approach.

781 citations


Journal ArticleDOI
TL;DR: It is concluded that the channel-optimized vector quantizer design algorithm, if used carefully, can result in a fairly robust system with no additional delay.
Abstract: Several issues related to vector quantization for noisy channels are discussed. An algorithm based on simulated annealing is developed for assigning binary codewords to the vector quantizer code-vectors. It is shown that this algorithm could result in dramatic performance improvements as compared to randomly selected codewords. A modification of the simulated annealing algorithm for binary codeword assignment is developed for the case where the bits in the codeword are subjected to unequal error probabilities (resulting from unequal levels of error protection). An algorithm for the design of an optimal vector quantizer for a noisy channel is briefly discussed, and its robustness under channel mismatch conditions is studied. Numerical results for a stationary first-order Gauss-Markov source and a binary symmetric channel are provided. It is concluded that the channel-optimized vector quantizer design algorithm, if used carefully, can result in a fairly robust system with no additional delay. The case in which the communication channel is nonstationary (as in mobile radio channels) is studied, and some preliminary ideas for quantizer design are presented.

509 citations


Proceedings ArticleDOI
01 May 1990
TL;DR: A new Two Phase Optimization algorithm, a combination of Simulated Annealing and Iterative Improvement, is proposed and shown to outperform the original algorithms in terms of both output quality and running time.
Abstract: Query optimization for relational database systems is a combinatorial optimization problem, which makes exhaustive search unacceptable as the query size grows. Randomized algorithms, such as Simulated Annealing (SA) and Iterative Improvement (II), are viable alternatives to exhaustive search. We have adapted these algorithms to the optimization of project-select-join queries. We have tested them on large queries of various types with different databases, concluding that in most cases SA identifies a lower cost access plan than II. To explain this result, we have studied the shape of the cost function over the solution space associated with such queries and we have conjectured that it resembles a 'cup' with relatively small variations at the bottom. This has inspired a new Two Phase Optimization algorithm, which is a combination of Simulated Annealing and Iterative Improvement. Experimental results show that Two Phase Optimization outperforms the original algorithms in terms of both output quality and running time.

387 citations
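The two-phase idea above can be sketched as plain iterative improvement from several starts, followed by a short low-temperature annealing run from the best local optimum found. The landscape, neighborhood, and parameters below are a hypothetical toy (a "cup" with small bumps at the bottom), not the paper's query-optimizer setup:

```python
import math
import random

def iterative_improvement(cost, neighbors, x):
    """Phase 1: greedy local search, moving only to strictly better neighbors."""
    improved = True
    while improved:
        improved = False
        for y in neighbors(x):
            if cost(y) < cost(x):
                x, improved = y, True
                break
    return x

def two_phase(cost, neighbors, starts, t_low=0.5, steps=500, seed=1):
    """Phase 1 from several random starts, then a short low-temperature
    annealing run from the best local optimum (illustrative parameters;
    the paper's actual schedule is not given in the abstract)."""
    rng = random.Random(seed)
    best = min((iterative_improvement(cost, neighbors, s) for s in starts), key=cost)
    x, t = best, t_low
    for _ in range(steps):
        y = rng.choice(neighbors(x))
        delta = cost(y) - cost(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
        if cost(x) < cost(best):
            best = x
    return best

# Toy 'cup' landscape: a broad bowl with small periodic bumps at the bottom.
f = lambda x: abs(x) // 10 + x % 7
nbrs = lambda x: [x - 3, x - 1, x + 1, x + 3]
best = two_phase(f, nbrs, starts=[97, -60, 31])
```

The low temperature in phase 2 matches the conjectured landscape: only small uphill moves are needed to hop among the bumps at the bottom of the cup.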


Journal ArticleDOI
TL;DR: In this article, Simulated Annealing is applied to the Quadratic Assignment Problem (i.e., the assignment of inter-communicating objects to locations to minimize the total cost of communication between them).

385 citations


Journal ArticleDOI
TL;DR: A solution algorithm for the network reconfiguration problem, a constrained, multiobjective, nondifferentiable optimization problem, that allows the designer to obtain a desirable, global noninferior point in a reasonable computation time.
Abstract: Using a two-stage solution methodology and a modified simulated annealing technique, the authors develop a solution algorithm to the network reconfiguration problem, which is a constrained, multiobjective, nondifferentiable, optimization problem. This solution algorithm allows the designer to obtain a desirable, global noninferior point in a reasonable computation time. Also, given a desired number of switch-on/switch-off operations involved in the network configuration, the solution algorithm can identify the most effective operations. In order to reduce the computation time required, the idea of approximate calculations is explored and incorporated into the solution algorithm, where two efficient load-flow methods are employed; one for high temperature and the other for low temperature. The solution algorithm has been implemented in a software package and tested on a 69-bus system with very promising results.

379 citations


Journal ArticleDOI
TL;DR: In this article, a general optimization method known as simulated annealing is applied to generation unit commitment; by exploiting the resemblance between a minimization process and the cooling of a molten metal, the method generates feasible solutions randomly and moves among them using a strategy that leads to a global minimum with high probability.
Abstract: A general optimization method, known as simulated annealing, is applied to generation unit commitment. By exploiting the resemblance between a minimization process and the cooling of a molten metal, simulated annealing generates feasible solutions randomly and moves among these solutions using a strategy leading to a global minimum with high probability. The method assumes no specific problem structures and is highly flexible in handling unit commitment constraints. A concise introduction to the method is given. Numerical results on test systems of up to 100 units are reported.

371 citations


Journal ArticleDOI
TL;DR: In this paper, previously proposed heuristics for the Maximum Satisfiability and Maximum 2-Satisfiability problems are summarized and compared with adaptations of two local search schemes, simulated annealing and steepest ascent mildest descent.
Abstract: Old and new algorithms for the Maximum Satisfiability problem are studied. We first summarize the different heuristics previously proposed, i.e., the approximation algorithms of Johnson and of Lieberherr for the general Maximum Satisfiability problem, and the heuristics of Lieberherr and Specker, Poljak and Turzik for the Maximum 2-Satisfiability problem. We then consider two recent local search algorithmic schemes, the Simulated Annealing method of Kirkpatrick, Gelatt and Vecchi and the Steepest Ascent Mildest Descent method, and adapt them to the Maximum Satisfiability problem. The resulting algorithms, which avoid being blocked as soon as a local optimum has been found, are shown empirically to be more efficient than the heuristics previously proposed in the literature.

342 citations
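Annealing for MAX-SAT as described above reduces to flipping one variable at a time and accepting score-decreasing flips with Boltzmann probability. A minimal sketch on a tiny 2-SAT instance (the cooling schedule and move rule are assumptions; the paper's exact parameters are not in the abstract):

```python
import math
import random

def sat_count(clauses, assign):
    """Number of satisfied clauses; a literal is an int, negative = negated."""
    return sum(any((lit > 0) == assign[abs(lit)] for lit in c) for c in clauses)

def anneal_maxsat(clauses, n_vars, t0=2.0, alpha=0.99, steps=3000, seed=0):
    """Single-flip simulated annealing for MAX-SAT (an illustrative sketch)."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    best, t = dict(assign), t0
    for _ in range(steps):
        v = rng.randint(1, n_vars)
        old = sat_count(clauses, assign)
        assign[v] = not assign[v]           # propose a single-flip move
        delta = sat_count(clauses, assign) - old
        if delta < 0 and rng.random() >= math.exp(delta / t):
            assign[v] = not assign[v]       # reject: undo the flip
        elif sat_count(clauses, assign) > sat_count(clauses, best):
            best = dict(assign)
        t *= alpha
    return best

# Small instance: (x1 v x2) & (~x1 v x3) & (~x2 v ~x3) & (x1 v ~x3)
cls = [(1, 2), (-1, 3), (-2, -3), (1, -3)]
best = anneal_maxsat(cls, n_vars=3)
```

Because worse flips are sometimes accepted, the search is not blocked at the first local optimum, which is exactly the property the abstract credits for the empirical advantage over the earlier heuristics.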


Journal ArticleDOI
TL;DR: In this article, a two-stage solution methodology based on a modified simulated annealing technique and the epsilon-constraint method for general multiobjective optimization problems is developed.
Abstract: A new formulation of the network reconfiguration problem for both loss reduction and load balancing that takes into consideration load constraints and operational constraints is presented. The number of switch-on/switch-off operations involved in network reconfiguration is put into a constraint. The new formulation is a constrained, multiobjective and nondifferentiable optimization problem with both equality and inequality constraints. A two-stage solution methodology based on a modified simulated annealing technique and the epsilon-constraint method for general multiobjective optimization problems is developed. A salient feature of the solution methodology is that it allows designers to find a desirable, global noninferior solution for the problem. An effective scheme to speed up the solution methodology is presented and analyzed.

Journal ArticleDOI
TL;DR: It is found that the proposed simulated annealing algorithm provides better solutions than a repeated iterative improvement algorithm for a fixed total computational time.

Journal ArticleDOI
TL;DR: In this paper, a novel formulation of the general capacitor placement problem taking into consideration practical aspects of capacitors, the load constraints, and the operational constraints at different load levels is presented.
Abstract: A novel formulation of the general capacitor placement problem taking into consideration practical aspects of capacitors, the load constraints, and the operational constraints at different load levels is presented. This formulation is a combinatorial optimization problem with a nondifferentiable objective function. A solution methodology based on an optimization technique (simulated annealing) is proposed to determine the locations where capacitors are to be installed, the types and sizes of capacitors to be installed, and the control settings of these capacitors at different load levels. The solution methodology can offer the global optimal solution for the general capacitor placement problem.

Journal Article
TL;DR: In this note, the motivation, the theory of operation, some proof-of-principle computational experiments, and a Pascal implementation of the algorithm are presented.
Abstract: This note describes a selection procedure for genetic algorithms called Boltzmann tournament selection. As simulated annealing evolves a Boltzmann distribution in time using the Metropolis algorithm or a logistic acceptance mechanism, Boltzmann tournament selection evolves a Boltzmann distribution across a population and time using pairwise probabilistic acceptance and anti-acceptance mechanisms. In this note, the motivation, the theory of operation, some proof-of-principle computational experiments, and a Pascal implementation of the algorithm are presented. The efficient use of Boltzmann tournament selection on parallel hardware and its connection to other niching mechanisms are also considered.
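The pairwise probabilistic-acceptance idea can be sketched with a logistic win probability that sharpens as the temperature falls. This illustrates only the acceptance mechanism; the paper's anti-acceptance step and exact pairing rules are omitted, and the fitness function and schedule are hypothetical:

```python
import math
import random

def boltzmann_tournament(population, fitness, t, rng):
    """Pick two members at random; the first wins with logistic probability
    1 / (1 + exp((f_j - f_i) / t)), so fitter members win more often and the
    selection pressure sharpens as t falls."""
    i, j = rng.sample(range(len(population)), 2)
    gap = (fitness(population[j]) - fitness(population[i])) / t
    gap = max(-60.0, min(60.0, gap))      # clamp to avoid overflow in exp
    if rng.random() < 1.0 / (1.0 + math.exp(gap)):
        return population[i]
    return population[j]

rng = random.Random(0)
fit = lambda x: -(x - 20) ** 2            # maximize: best value is 20
pop = [rng.randint(0, 31) for _ in range(20)]
for t in (8.0, 4.0, 2.0, 1.0):            # cooling over generations
    pop = [boltzmann_tournament(pop, fit, t, rng) for _ in range(len(pop))]
```

At high t the tournament is nearly a coin flip (preserving diversity); at low t it approaches deterministic tournament selection, mirroring how annealing tightens the Boltzmann distribution over time.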

Journal ArticleDOI
TL;DR: The temperature behavior of MFA during bipartitioning is analyzed and shown to have an impact on the tuning of neural networks for improved performance, and a new modification to MFA is presented that supports partitioning of random or structured graphs into three or more bins, a problem that has previously shown resistance to solution by neural networks.
Abstract: A new algorithm, mean field annealing (MFA), is applied to the graph-partitioning problem. The MFA algorithm combines characteristics of the simulated-annealing algorithm and the Hopfield neural network. MFA exhibits the rapid convergence of the neural network while preserving the solution quality afforded by simulated annealing (SA). The rate of convergence of MFA on graph bipartitioning problems is 10-100 times that of SA, with nearly equal quality of solutions. A new modification to mean-field annealing is also presented which supports partitioning graphs into three or more bins, a problem which has previously shown resistance to solution by neural networks. The temperature-behavior of MFA during graph partitioning is analyzed approximately and shown to possess a critical temperature at which most of the optimization occurs. This temperature is analogous to the gain of the neurons in a neural network and can be used to tune such networks for better performance. The value of the repulsion penalty needed to force MFA (or a neural network) to divide a graph into equal-sized pieces is also estimated.

Journal ArticleDOI
TL;DR: The results indicate that a placement comparable in quality can be obtained in about the same execution time as TimberWolf, but the genetic algorithm needs to explore 20-50 times fewer configurations than does TimberWolf.
Abstract: The genetic algorithm applies transformations on the chromosomal representation of the physical layout. The algorithm works on a set of configurations constituting a constant-size population. The transformations are performed through crossover operators that generate a new configuration assimilating the characteristics of a pair of configurations existing in the current population. Mutation and inversion operators are also used to increase the diversity of the population, and to avoid premature convergence at local optima. Due to the simultaneous optimization of a large population of configurations, there is a logical concurrency in the search of the solution space which makes the genetic algorithm an extremely efficient optimizer. Three efficient crossover techniques are compared, and the algorithm parameters are optimized for the cell-placement problem by using a meta-genetic process. The resulting algorithm was tested against TimberWolf 3.3 on five industrial circuits consisting of 100-800 cells. The results indicate that a placement comparable in quality can be obtained in about the same execution time as TimberWolf, but the genetic algorithm needs to explore 20-50 times fewer configurations than does TimberWolf.

Book ChapterDOI
01 Oct 1990
TL;DR: In this article, an abstract stochastic algorithm for combinatorial optimization problems is proposed, which generalizes and unifies genetic algorithms and simulated annealing, such that any GA or SA algorithm at hand is an instance of the abstract algorithm.
Abstract: In this paper we are trying to make a step towards a concise theory of genetic algorithms (GAs) and simulated annealing (SA). First, we set up an abstract stochastic algorithm for treating combinatorial optimization problems. This algorithm generalizes and unifies genetic algorithms and simulated annealing, such that any GA or SA algorithm at hand is an instance of our abstract algorithm. Secondly, we define the evolution belonging to the abstract algorithm as a Markov chain and find conditions implying that the evolution finds an optimum with probability 1. The results obtained can be applied when designing the components of a genetic algorithm.

Journal ArticleDOI
TL;DR: It is demonstrated that the SAS approach is much more accurate and efficient than the FSA algorithm on the test examples tried, and convenient statistical criteria are proposed and used for algorithm performance evaluation, together with test examples of controlled properties.

Journal ArticleDOI
TL;DR: Results indicate that tabu search consistently outperforms simulated annealing with respect to computation time while giving comparable solutions to traveling salesman problems.
Abstract: This paper describes serial and parallel implementations of two different search techniques applied to the traveling salesman problem. A novel approach has been taken to parallelize simulated annealing and the results are compared with the traditional annealing algorithm. This approach uses an abbreviated cooling schedule and achieves a superlinear speedup. Also a new search technique, called tabu search, has been adapted to execute in a parallel computing environment. Comparisons between simulated annealing and tabu search indicate that tabu search consistently outperforms simulated annealing with respect to computation time while giving comparable solutions. Test cases include 25-, 33-, 42-, 50-, 57-, 75-, and 100-city problems.
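Annealing for the TSP, as benchmarked above, typically uses 2-opt reversal moves under a cooling schedule. A generic serial sketch on six cities placed on a circle (the schedule and parameters are illustrative; the paper's abbreviated parallel schedule is not specified in the abstract):

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i - 1]][tour[i]] for i in range(len(tour)))

def anneal_tsp(dist, t0=10.0, alpha=0.995, steps=5000, seed=0):
    """Simulated annealing for the TSP with 2-opt segment-reversal moves."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    best, t = tour[:], t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt move
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            tour = cand
            if tour_length(tour, dist) < tour_length(best, dist):
                best = tour[:]
        t *= alpha
    return best

# Six cities on a unit circle: the optimal tour visits them in angular order.
pts = [(math.cos(2 * math.pi * k / 6), math.sin(2 * math.pi * k / 6)) for k in range(6)]
d = [[math.dist(p, q) for q in pts] for p in pts]
best = anneal_tsp(d)
```

The parallel versions in the paper distribute such moves across processors; the serial loop above is only the baseline both techniques are measured against.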

Journal ArticleDOI
TL;DR: It is shown how the simulated annealing algorithm (a Monte-Carlo global minimisation technique) can be applied to the blind deconvolution problem (i.e. the problem of recovering two functions from their convolution).

Journal ArticleDOI
TL;DR: A general solution algorithm based on simulated annealing for optimal capacitor placements in distribution systems is proposed and analyzed and can provide the global optimal solution for the capacitor placement problem.
Abstract: For pt.I see ibid., vol.5, no.2, p.634-42 (1990). A general solution algorithm based on simulated annealing for optimal capacitor placements in distribution systems is proposed and analyzed. The solution algorithm can provide the global optimal solution for the capacitor placement problem. The solution algorithm has been implemented into a software package and tested on a 69-bus system with very promising results.

Journal ArticleDOI
TL;DR: A new conformation searching algorithm called simulated annealing is reported, which is a Metropolis Monte Carlo approach to conformation generation in which both the energy and temperature dependence of the Boltzmann distribution guides the search for the global minimum.
Abstract: We report the application of a new conformation searching algorithm called simulated annealing to the location of the global minimum energy conformation of peptides. Simulated annealing is a Metropolis Monte Carlo approach to conformation generation in which both the energy and temperature dependence of the Boltzmann distribution guides the search for the global minimum. Both uphill and downhill moves are possible, which allows the molecule to escape from local minima. Applications to the 20 natural amino acid "dipeptide models" as well as to polyalanines up to Ala80 are very successful in finding the lowest energy conformation. A history file of the simulated annealing process allows reconstruction and examination of the random walk around conformation space. A separate program, Conf-Gen, reads the history file and extracts all low-energy conformations visited during the run.

Journal ArticleDOI
TL;DR: This paper provides a comprehensive, taxonomic survey of parallel simulated annealing techniques, highlighting their performance and applicability.

Journal ArticleDOI
TL;DR: This paper examines some of the characteristics of AI-based heuristic procedures that have emerged as frameworks for solving difficult optimization problems and discusses briefly the relevance of a supplementary framework, called target analysis, which is a method for determining good decision rules to enable heuristics to perform more effectively.
Abstract: This paper examines some of the characteristics of AI-based heuristic procedures that have emerged as frameworks for solving difficult optimization problems. Consideration of attributes shared to some degree by human problem solvers leads to focusing in greater detail on one of the more successful procedures, tabu search, which employs a flexible memory system (in contrast to “memoryless” systems, as in simulated annealing and genetic algorithms, and rigid memory systems as in branch and bound and A* search). Specific attention is given to the short-term memory component of tabu search, which has provided solutions superior to the best obtained by other methods for a variety of problems. Our development emphasizes the principles underlying the interplay between restricting the search to avoid unproductive retracing of paths (by means of tabu conditions) and freeing the search to explore otherwise forbidden avenues (by aspiration criteria). Finally, we discuss briefly the relevance of a supplementary framework, called target analysis, which is a method for determining good decision rules to enable heuristics to perform more effectively.
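The short-term memory component described above can be sketched with a fixed-length tabu list and an aspiration criterion: recently visited solutions are forbidden unless revisiting one would beat the best cost seen so far. The landscape and tenure below are illustrative toys, not from the paper:

```python
from collections import deque

def tabu_search(cost, neighbors, x0, tenure=5, iters=200):
    """Short-term-memory tabu search: forbid recently visited solutions,
    but let a tabu move through if it improves on the best cost so far
    (the aspiration criterion). Moves to the best admissible neighbor,
    even uphill, so it can retrace out of local minima without cycling."""
    x, best = x0, x0
    tabu = deque(maxlen=tenure)           # fixed-length recency memory
    for _ in range(iters):
        candidates = [y for y in neighbors(x)
                      if y not in tabu or cost(y) < cost(best)]
        if not candidates:
            break
        x = min(candidates, key=cost)     # best admissible move
        tabu.append(x)
        if cost(x) < cost(best):
            best = x
    return best

# Toy landscape with periodic local minima; pure descent from 0 is stuck
# immediately, while the tabu memory forces the search up and over the ridges.
f = lambda x: x % 11 + abs(x - 30) // 5
best = tabu_search(f, lambda x: [x - 2, x - 1, x + 1, x + 2], x0=0)
```

Unlike the memoryless acceptance rule of simulated annealing, every step here is deterministic: the flexible memory, not randomness, provides the escape mechanism.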

Journal ArticleDOI
21 Sep 1990 - Science
TL;DR: A numerical method of accurately determining T*, the critical temperature at which a phase change occurs, has been developed and the a posteriori probability distribution of the solution has been constructed from trial solutions generated at T*.
Abstract: Knowledge of the critical temperature, T*, the temperature at which a phase change occurs, greatly improves the efficiency of simulated annealing when used for optimization or inversion. A numerical method of accurately determining T* in a relatively short computation time has been developed. This method is used to recover the seismic soundspeed profile from wavefield data, a problem in which cycle skipping causes many local minima of the energy function and the averaging of the medium by finite length waves results in many states with similar energies. Computations indicate that it is cost-effective to spend about 80 percent of the computing budget looking for T* instead of annealing, and that in the course of finding T* many states with energies near the global minimum will also be found. The a posteriori probability distribution of the solution has been constructed from trial solutions generated at T*.

Journal ArticleDOI
TL;DR: Although computationally intensive, when it is carefully implemented, simulated annealing is found to give superior results to more traditional methods of nonlinear optimization.
Abstract: The oceanographic experiment design problem is discussed in the context of several simple examples drawn from acoustic tomography. The optimization of an objective function, chosen to characterize the array design, is carried out using the technique of simulated annealing. A detailed description of this method and its implementation for the examples above is provided. Although computationally intensive, when it is carefully implemented, simulated annealing is found to give superior results to more traditional methods of nonlinear optimization.

Journal ArticleDOI
TL;DR: The recursive allocation scheme is shown to be effective on a number of large test task graphs: its solution quality is nearly as good as that produced by simulated annealing, and its computation time is several orders of magnitude less.

Journal ArticleDOI
Carsten Peterson1
TL;DR: The results from 50-, 100-, and 200-city TSP benchmarks presented at the 1989 Neural Information Processing Systems postconference workshop are presented and compared with a state-of-the-art hybrid approach consisting of greedy solutions, exhaustive search, and simulated annealing.
Abstract: We present and summarize the results from 50-, 100-, and 200-city TSP benchmarks presented at the 1989 Neural Information Processing Systems (NIPS) postconference workshop using neural network, elastic net, genetic algorithm, and simulated annealing approaches. These results are also compared with a state-of-the-art hybrid approach consisting of greedy solutions, exhaustive search, and simulated annealing.

Proceedings ArticleDOI
W. Swartz1, Carl Sechen1
01 Jan 1990
TL;DR: Novel algorithms are described for timing driven placement and routing of rectilinearly shaped macro cells and a negative feedback scheme is described that optimizes the relative weighting between the primary objective term and the penalty function terms in the cost function.
Abstract: Novel algorithms are described for timing driven placement and routing of rectilinearly shaped macro cells. Algorithms are also presented for the implementation of simulated annealing, based on a theoretically derived statistical annealing schedule. A negative feedback scheme is described that optimizes the relative weighting between the primary objective term and the penalty function terms in the cost function. A placement refinement method has been developed for rectilinear cells which spaces the cells at a density which avoids the need for post-routing compaction. In addition, a detailed routing method has been developed which avoids the classically difficult problem of defining channels for detailed routing. The result for the ami33 benchmark circuit is better than the previously published results.

Journal ArticleDOI
TL;DR: A discussion is presented of two ways of mapping the cells in a two-dimensional area of a chip onto processors in an n-dimensional hypercube such that both small and large cell moves can be applied.
Abstract: A discussion is presented of two ways of mapping the cells in a two-dimensional area of a chip onto processors in an n-dimensional hypercube such that both small and large cell moves can be applied. Two types of move are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described, along with a distributed data structure that needs to be stored in the hypercube to support such a parallel cost evaluation. A novel tree broadcasting strategy is presented for the hypercube that is used extensively in the algorithm for updating cell locations in the parallel environment. A dynamic parallel annealing schedule is proposed that estimates the errors due to interacting parallel moves and adapts the rate of synchronization automatically. Two novel approaches in controlling error in parallel algorithms are described: heuristic cell coloring and adaptive sequence control. The performance on an Intel iPSC-2/D4/MX hypercube is reported.