
Showing papers on "Simulated annealing" published in 1997


Journal ArticleDOI
TL;DR: The results show that the ACS outperforms other nature-inspired algorithms such as simulated annealing and evolutionary computation, and the paper concludes by comparing ACS-3-opt, a version of the ACS augmented with a local search procedure, to some of the best performing algorithms for symmetric and asymmetric TSPs.
Abstract: This paper introduces the ant colony system (ACS), a distributed algorithm that is applied to the traveling salesman problem (TSP). In the ACS, a set of agents called ants cooperate to find good solutions to TSPs. Ants cooperate using an indirect form of communication mediated by a pheromone they deposit on the edges of the TSP graph while building solutions. We study the ACS by running experiments to understand its operation. The results show that the ACS outperforms other nature-inspired algorithms such as simulated annealing and evolutionary computation, and we conclude by comparing ACS-3-opt, a version of the ACS augmented with a local search procedure, to some of the best performing algorithms for symmetric and asymmetric TSPs.
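
For illustration, a minimal Python sketch of the ACS loop described above: tours built with the pseudo-random proportional rule, a local pheromone update on edges as they are used, and a global update from the best-so-far tour. Parameter names and values (beta, q0, rho, tau0) are common ACS conventions rather than the paper's exact settings, and the random instance is a stand-in:

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def construct_tour(n, dist, tau, beta=2.0, q0=0.9):
    # Pseudo-random proportional rule: exploit the best edge with prob q0,
    # otherwise choose in proportion to pheromone * heuristic desirability.
    start = random.randrange(n)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        i = tour[-1]
        scores = {j: tau[i][j] / dist[i][j] ** beta for j in unvisited}
        if random.random() < q0:
            nxt = max(scores, key=scores.get)
        else:
            r, acc = random.random() * sum(scores.values()), 0.0
            for nxt, s in scores.items():
                acc += s
                if acc >= r:
                    break
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def acs(points, n_ants=10, iters=100, rho=0.1, tau0=0.01):
    n = len(points)
    dist = [[max(math.dist(p, q), 1e-9) for q in points] for p in points]
    tau = [[tau0] * n for _ in range(n)]
    best, best_len = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            tour = construct_tour(n, dist, tau)
            for a, b in zip(tour, tour[1:] + tour[:1]):
                # Local update: decay pheromone on the edges just used.
                tau[a][b] = tau[b][a] = (1 - rho) * tau[a][b] + rho * tau0
            length = tour_length(tour, dist)
            if length < best_len:
                best, best_len = tour, length
        for a, b in zip(best, best[1:] + best[:1]):
            # Global update: only the best-so-far tour deposits pheromone.
            tau[a][b] = tau[b][a] = (1 - rho) * tau[a][b] + rho / best_len
    return best, best_len

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
print(acs(pts)[1])
```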

7,596 citations


Journal ArticleDOI
TL;DR: The effects of multiple sequence information and different types of conformational constraints on the overall performance of the method are investigated, as is the ability of a variety of recently developed scoring functions to recognize the native-like conformations in the ensembles of simulated structures.

1,437 citations


01 Jan 1997
TL;DR: It turns out that the new rank-based ant system can compete with the other methods in terms of average behavior, and shows even better worst-case behavior.
Abstract: The ant system is a new meta-heuristic for hard combinatorial optimization problems. It is a population-based approach that uses exploitation of positive feedback as well as greedy search. It was first proposed for tackling the well-known Traveling Salesman Problem (TSP), but has also been successfully applied to problems such as quadratic assignment, job-shop scheduling, vehicle routing and graph coloring. In this paper we introduce a new rank-based version of the ant system and present results of a computational study, where we compare the ant system with simulated annealing and a genetic algorithm on several TSP instances. It turns out that our rank-based ant system can compete with the other methods in terms of average behavior, and shows even better worst-case behavior.
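
A sketch of the rank-based pheromone update that distinguishes this variant, under the usual presentation of rank-based ant systems: only the w-1 best ants of the iteration deposit pheromone, weighted by rank, plus an elitist deposit from the best-so-far tour. The names (w, rho) are conventions rather than necessarily the paper's notation, and the function is meant to plug into a construction loop like the ACS sketch earlier on this page:

```python
# Rank-based pheromone update (sketch). `ants` is the current iteration's list
# of (tour, length) pairs; only the w-1 best-ranked ants deposit pheromone,
# weighted by rank, plus an elitist deposit from the best-so-far tour.

def rank_based_update(tau, ants, best_tour, best_len, rho=0.5, w=6):
    n = len(tau)
    for i in range(n):
        for j in range(n):
            tau[i][j] *= 1.0 - rho                      # evaporation everywhere
    ranked = sorted(ants, key=lambda a: a[1])[: w - 1]  # the w-1 shortest tours
    for rank, (tour, length) in enumerate(ranked, start=1):
        for a, b in zip(tour, tour[1:] + tour[:1]):
            tau[a][b] += (w - rank) / length            # rank-weighted deposit
            tau[b][a] = tau[a][b]
    for a, b in zip(best_tour, best_tour[1:] + best_tour[:1]):
        tau[a][b] += w / best_len                       # elitist deposit
        tau[b][a] = tau[a][b]
```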

881 citations


Journal ArticleDOI
TL;DR: In this article, the combinatorial optimization simulated annealing algorithm is applied to the analysis of Rutherford backscattering data; the analysis is fully automatic and does not require time-consuming human intervention.
Abstract: The combinatorial optimization simulated annealing algorithm is applied to the analysis of Rutherford backscattering data. The analysis is fully automatic, i.e., it does not require time-consuming human intervention. The algorithm is tested on a complex iron-cobalt silicide spectrum, and all the relevant features are successfully determined. The total analysis time using a PC 486 processor running at 100 MHz is comparable to the data collection time, which opens the way for on-line automatic analysis.

587 citations


Journal ArticleDOI
TL;DR: A deterministic annealing approach to pairwise clustering is described which shares the robustness properties of maximum entropy inference and the resulting Gibbs probability distributions are estimated by mean-field approximation.
Abstract: Partitioning a data set and extracting hidden structure from the data arises in different application areas of pattern recognition, speech and image processing. Pairwise data clustering is a combinatorial optimization method for data grouping which extracts hidden structure from proximity data. We describe a deterministic annealing approach to pairwise clustering which shares the robustness properties of maximum entropy inference. The resulting Gibbs probability distributions are estimated by mean-field approximation. A new structure-preserving algorithm to cluster dissimilarity data and to simultaneously embed these data in a Euclidean vector space is discussed which can be used for dimensionality reduction and data visualization. The suggested embedding algorithm, which outperforms conventional approaches, has been implemented to analyze dissimilarity data from protein analysis and from linguistics. The algorithm for pairwise data clustering is used to segment textured images.
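
The shared mechanism, Gibbs assignment probabilities estimated by mean-field iteration while the temperature is slowly lowered, can be sketched compactly. For brevity this Python sketch anneals a central (squared-distance) clustering rather than the paper's pairwise dissimilarity formulation; the annealing schedule and all parameter values are illustrative assumptions:

```python
import numpy as np

def deterministic_annealing(X, k=3, T0=10.0, T_min=0.01, alpha=0.9, inner=20):
    """Anneal soft cluster assignments: Gibbs probabilities at temperature T,
    alternated with mean (center) updates, while T is slowly lowered."""
    rng = np.random.default_rng(0)
    Y = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    T = T0
    while T > T_min:
        for _ in range(inner):
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)   # numerical stability
            P = np.exp(logits)
            P /= P.sum(axis=1, keepdims=True)             # Gibbs assignment probs
            Y = (P.T @ X) / P.sum(axis=0)[:, None]        # mean-field center update
        T *= alpha                                        # cooling step
    return P.argmax(axis=1), Y

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ((0, 0), (3, 0), (0, 3))])
labels, centers = deterministic_annealing(X)
print(centers.round(2))
```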

524 citations


Journal ArticleDOI
TL;DR: The near-optimality, speed and simplicity of heuristic algorithms suggest that they are acceptable alternatives for many reserve selection problems, especially when dealing with large data sets or complicated analyses.

456 citations


Book
01 Oct 1997
TL;DR: A textbook on multisensor fusion, covering sensor models and mathematical tools, high-performance data structures, meta-heuristics (tabu search, genetic algorithms, simulated annealing) applied to image registration, the design of optimal sensor systems within dependability bounds, Kalman filtering, and distributed dynamic sensor fusion.
Abstract: I. INTRODUCTION TO SENSOR FUSION. 1. Introduction. Importance. Sensor Processes. Applications. Summary. Problem Set 1.
II. FOUNDATIONS OF SENSOR FUSION. 2. Sensors. Mathematical Description. Use of Multiple Sensors. Construction of Reliable Abstract Sensors From Simple Abstract Sensors. Static and Dynamic Networks. Conclusion. Problem Set 2. 3. Mathematical Tools Used. Algorithms. Linear Algebra. Coordinate Transformations. Rigid Body Motion. Probability. Dependability and Markov Chains. Gaussian Noise. Meta-Heuristics. Summary. Problem Set 3. 4. High-Performance Data Structures: CAD Based. Boundary Representations. Sweep Presentation. CSG - Constructive Solid Geometry. Wire-Frame Models and the Wing-Edge Data Structure. Surface Patches and Contours. Generalized Cylinders. Summary. Problem Set 4. 5. High-Performance Data Structures: Tessellated. Sparse Arrays. Simplex Grids of Non-Uniform Sizes. Grayscale and Color Arrays. Occupancy Grids and HIMM Histogram Maps. Summary. Problem Set 5. 6. High-Performance Data Structures: Trees and Graphs. 2^n Trees. Forest of Quadtrees. Translation Invariant Data Structure. Multi-Dimensional Trees. Graphs of Free Space. Description Trees of Polygons. Range and Interval Trees. Summary. Problem Set 6. 7. High-Performance Data Structures: Functions. Interpolation. Least Squares Estimation. Splines. Bezier Curves and Bi-Cubic Patches. Fourier Transform, Cepstrum and Wavelets. Modal Representation. Summary. Problem Set 7. 8. Representing Ranges and Uncertainty in Data Structures. Explicit Accuracy Bounds. Probability and Dempster-Shafer Methods. Statistics. Fuzzy Sets. Summary. Problem Set 8.
III. APPLICATIONS TO SENSOR FUSION. 9. Image Registration for Sensor Fusion. Image Registration Techniques. Problem Statement. Fitness Function. Tabu Search. Genetic Algorithms. Simulated Annealing. Results. Summary. 10. Designing Optimal Sensor Systems within Dependability Bounds. Applications. Dependability Measures. Optimization Model. Exhaustive Search on the Multidimensional Surface. Experimental Results of the Exhaustive Search Algorithm. Heuristic Methods. Summary. 11. Sensor Fusion and Approximate Agreement. Byzantine Generals Problem. Approximate Byzantine Matching. Fusion of Contradictory Sensor Information. Performance Comparison. Hybrid Algorithm. Example 1. Example 2. Summary. 12. Kalman Filtering Applied to a Sensor Fusion Problem. Background. A New Method. A New Technique for Cloud Removal. A Prototype System. Kalman Filter for Scenario 1. Discussion of Results. Summary. 13. Optimal Sensor Fusion Using Range Trees Recursively. Sensors. Redundancy and Associated Errors. Faulty Sensor Averaging Problem. Interval Trees. Algorithm to Find the Optimal Region. Algorithm Complexity. Comparison with Known Methods. Summary. 14. Distributed Dynamic Sensor Fusion. Problem Description. New Paradigm for Distributed Dynamic Sensor Fusion. Robust Agreement Using the Optimal Region. A Comparison with Existing Approaches. Experimental Results. Summary.
IV. CASE STUDIES AND CONCLUSION. 15. Sensor Fusion Case Studies. Levels of Sensor Fusion. Types of Sensors Available. Research Trends. Case Studies. Summary. 16. Conclusion. Review. Conclusion. Appendix A. Program Source Code. References. Index.

364 citations


Journal ArticleDOI
TL;DR: It is shown that the combination of maximum likelihood with cross-validation, which reduces overfitting, and simulated annealing by torsion angle molecular dynamics, which simplifies the conformational search problem, results in a major improvement of the radius of convergence of refinement and the accuracy of the refined structure.
Abstract: Recently, the target function for crystallographic refinement has been improved through a maximum likelihood analysis, which makes proper allowance for the effects of data quality, model errors, and incompleteness. The maximum likelihood target reduces the significance of false local minima during the refinement process, but it does not completely eliminate them, necessitating the use of stochastic optimization methods such as simulated annealing for poor initial models. It is shown that the combination of maximum likelihood with cross-validation, which reduces overfitting, and simulated annealing by torsion angle molecular dynamics, which simplifies the conformational search problem, results in a major improvement of the radius of convergence of refinement and the accuracy of the refined structure. Torsion angle molecular dynamics and the maximum likelihood target function interact synergistically, the combination of both methods being significantly more powerful than each method individually. This is demonstrated in realistic test cases at two typical minimum Bragg spacings (d_min = 2.0 and 2.8 Å, respectively), illustrating the broad applicability of the combined method. In an application to the refinement of a new crystal structure, the combined method automatically corrected a mistraced loop in a poor initial model, moving the backbone by 4 Å.

336 citations


Journal ArticleDOI
TL;DR: Two heuristics for hardware/software partitioning, formulated as a graph partitioning problem, are presented: one based on simulated annealing and the other on tabu search, and results show the clear superiority of the tabu search based algorithm.
Abstract: This paper presents two heuristics for automatic hardware/software partitioning of system level specifications. Partitioning is performed at the granularity of blocks, loops, subprograms, and processes with the objective of performance optimization with a limited hardware and software cost. We define the metric values for partitioning and develop a cost function that guides partitioning towards the desired objective. We consider minimization of communication cost and improvement of the overall parallelism as essential criteria during partitioning. Two heuristics for hardware/software partitioning, formulated as a graph partitioning problem, are presented: one based on simulated annealing and the other on tabu search. Results of extensive experiments, including real-life examples, show the clear superiority of the tabu search based algorithm.
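
A minimal sketch of the simulated annealing half of such a partitioner, on a task graph with one-bit hardware/software assignments. The cost model here (per-node execution times, a fixed penalty per edge crossing the boundary, and a soft hardware-area budget) is an invented stand-in for the paper's metric values and cost function:

```python
import math
import random

def cost(assign, times, hw_area, edges, comm_penalty=1.0, area_budget=10.0):
    # times[v] = (software_time, hardware_time); assign[v] is 0 (SW) or 1 (HW).
    exec_time = sum(times[v][assign[v]] for v in range(len(assign)))
    comm = sum(comm_penalty for a, b in edges if assign[a] != assign[b])
    area = sum(hw_area[v] for v in range(len(assign)) if assign[v] == 1)
    return exec_time + comm + 100.0 * max(0.0, area - area_budget)  # soft budget

def anneal_partition(n, times, hw_area, edges, T=10.0, alpha=0.995, steps=20000):
    assign = [0] * n                       # start with everything in software
    cur = cost(assign, times, hw_area, edges)
    best, best_cost = list(assign), cur
    for _ in range(steps):
        v = random.randrange(n)
        assign[v] ^= 1                     # move one node across the boundary
        new = cost(assign, times, hw_area, edges)
        if new < cur or random.random() < math.exp((cur - new) / T):
            cur = new
            if cur < best_cost:
                best, best_cost = list(assign), cur
        else:
            assign[v] ^= 1                 # undo the rejected move
        T *= alpha
    return best, best_cost

random.seed(2)
n = 20
times = [(random.uniform(5, 10), random.uniform(1, 3)) for _ in range(n)]
hw_area = [random.uniform(0.5, 2.0) for _ in range(n)]
edges = [(random.randrange(n), random.randrange(n)) for _ in range(30)]
print(anneal_partition(n, times, hw_area, edges)[1])
```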

288 citations


Journal ArticleDOI
TL;DR: A three-phase heuristic is presented for minimizing the sum of the weighted tardinesses, with a simulated annealing procedure applied in the third phase starting from a seed solution produced by the second phase.

273 citations


Journal ArticleDOI
01 Aug 1997
TL;DR: Concepts of distance between metamorphic robot configurations are defined and shown to satisfy the formal properties of a metric, and the technique of simulated annealing is used to drive the reconfiguration process with these configuration metrics as cost functions.
Abstract: In this paper the problem of dynamic self-reconfiguration of a class of modular robotic systems referred to as metamorphic systems is examined. A metamorphic robotic system is a collection of mechatronic modules, each of which has the ability to connect, disconnect, and climb over adjacent modules. We examine the near-optimal reconfiguration of a metamorphic robot from an arbitrary initial configuration to a desired final configuration. Concepts of distance between metamorphic robot configurations are defined, and shown to satisfy the formal properties of a metric. These metrics, called configuration metrics, are then applied to the automatic self-reconfiguration of metamorphic systems in the case when one module is allowed to move at a time. There is no simple method for computing the optimal sequence of moves required to reconfigure. As a result, heuristics which can give a near optimal solution must be used. We use the technique of simulated annealing to drive the reconfiguration process with configuration metrics as cost functions. The relative performance of simulated annealing with different cost functions is compared and the usefulness of the metrics developed in this paper is demonstrated.

Journal ArticleDOI
TL;DR: In this article, the authors present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method.
Abstract: Generating one realization of a random permeability field that is consistent with observed pressure data and a known variogram model is not a difficult problem. If, however, one wants to investigate the uncertainty of reservoir behavior, one must generate a large number of realizations and ensure that the distribution of realizations properly reflects the uncertainty in reservoir properties. The most widely used method for conditioning permeability fields to production data has been the method of simulated annealing, in which practitioners attempt to minimize the difference between the "true" and simulated production data, and the "true" and simulated variograms. Unfortunately, the meaning of the resulting realization is not clear and the method can be extremely slow. In this paper, we present an alternative approach to generating realizations that are conditional to pressure data, focusing on the distribution of realizations and on the efficiency of the method. Under certain conditions that can be verified easily, the Markov chain Monte Carlo method is known to produce states whose frequencies of appearance correspond to a given probability distribution, so we use this method to generate the realizations. To make the method more efficient, we perturb the states in such a way that the variogram is satisfied automatically and the pressure data are approximately matched at every step. These perturbations make use of sensitivity coefficients calculated from the reservoir simulator.
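
The engine of such an approach is a Markov chain Monte Carlo accept/reject step. Below is a generic random-walk Metropolis sketch in Python on a toy two-parameter posterior; the paper's chain uses specially designed perturbations that honor the variogram and approximately match the pressure data at every step, which this sketch does not attempt. The toy observation and noise level are hypothetical:

```python
import numpy as np

def metropolis_chain(log_post, x0, n_steps=5000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)   # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)

# Toy posterior: standard-normal prior on two parameters, plus a likelihood
# tying their sum to one synthetic "pressure" observation (all hypothetical).
d_obs, sigma = 1.5, 0.2
log_post = lambda x: -0.5 * (x @ x) - 0.5 * ((x.sum() - d_obs) / sigma) ** 2
chain = metropolis_chain(log_post, [0.0, 0.0])
print(chain.mean(axis=0))
```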

Journal ArticleDOI
TL;DR: Overall, tabu search tends to give the most robust results, closely followed by simulated annealing; genetic algorithms do not generally perform well for these types of problems, except when very few candidate solutions may be evaluated because of large computing requirements.

Journal ArticleDOI
TL;DR: A new global optimization algorithm for functions of many continuous variables is presented, derived from the basic simulated annealing method, and used to solve complex circuit design problems, for which the objective function evaluation can be exceedingly costly.
Abstract: A new global optimization algorithm for functions of many continuous variables is presented, derived from the basic simulated annealing method. Our main contribution lies in dealing with high-dimensionality minimization problems, which are often difficult to solve by all known minimization methods with or without gradient. In this article we take a special interest in the variables discretization issue. We also develop and implement several complementary stopping criteria. The original Metropolis iterative random search, which takes place in a Euclidean space R^n, is replaced by another similar exploration, performed within a succession of Euclidean spaces R^p, with p < n.
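
A schematic of the general idea, simulated annealing over continuous variables with a move step (discretization) that is refined as the temperature falls, is given below in Python. The refinement and cooling rules are illustrative assumptions, not the article's actual schedule or stopping criteria:

```python
import math
import random

def anneal_continuous(f, x0, T=1.0, alpha=0.999, step=1.0, steps=20000):
    x, fx = list(x0), f(x0)
    best, best_f = list(x), fx
    for _ in range(steps):
        i = random.randrange(len(x))              # perturb one coordinate
        cand = list(x)
        cand[i] += random.choice((-1, 1)) * step * random.random()
        fc = f(cand)
        if fc < fx or random.random() < math.exp((fx - fc) / T):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = list(x), fx
        T *= alpha                                # cooling
        step = max(1e-4, step * 0.9999)           # refine the discretization
    return best, best_f

# Toy high-dimensional test function (Rastrigin-like, many local minima).
f = lambda x: sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)
random.seed(3)
print(anneal_continuous(f, [random.uniform(-5, 5) for _ in range(10)])[1])
```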

Journal ArticleDOI
TL;DR: Four probabilistic search algorithms are compared: shuffled complex evolution (SCE), genetic algorithm using traditional crossover, and multiple random start using either simplex or quasi‐Newton local searches; the SCE algorithm was found to be robust and the most efficient.
Abstract: The estimation of catchment model parameters has proven to be a difficult task for several reasons, which include ill-posedness and the existence of multiple local optima. Recent work on global probabilistic search methods has developed robust techniques for locating the global optimum. However, these methods can be computationally intensive when the search is conducted over a large hypercube. Moreover, specification of the hypercube may be problematic, particularly if there is strong parameter interaction. This study seeks to reduce the computational effort by confining the search to a subspace within which the global optimum is likely to be found. The approach involves locating a local optimum using a local gradient-based search. It is assumed that the local optimum belongs to a set of optima which cluster about the global optimum. A probabilistic search is then conducted within a hyperellipsoid defined by the second-order approximation to the response surface around the local optimum. A case study involving a five-parameter conceptual rainfall-runoff model is presented. The response surface is shown to be riddled with local optima, yet the second-order approximation provides a not unreasonable description of parameter uncertainty. The subspace search strategy provides a rational means for defining the search space and is shown to be more efficient (typically twice, but up to 5 times more efficient) than a search over a hypercube. Four probabilistic search algorithms are compared: shuffled complex evolution (SCE), genetic algorithm using traditional crossover, and multiple random start using either simplex or quasi-Newton local searches. In the case study the SCE algorithm was found to be robust and the most efficient. The genetic algorithm, although displaying initial convergence rates superior to the SCE algorithm, tended to flounder near the optimum and could not be relied upon to locate the global optimum.

Journal ArticleDOI
TL;DR: The M-SIMPSA algorithm, which does not require feasible initial points or any problem decomposition, was tested with several functions published in the literature, and results were compared with those obtained with a robust adaptive random search method.

Journal ArticleDOI
TL;DR: The PSA algorithm proposed in the paper has shown significant improvements in solution quality for the largest of the test networks, and the conditions under which the parallel algorithm is most efficient are investigated.
Abstract: The simulated annealing optimization technique has been successfully applied to a number of electrical engineering problems, including transmission system expansion planning. The method is general in the sense that it does not assume any particular property of the problem being solved, such as linearity or convexity. Moreover, it has the ability to provide solutions arbitrarily close to an optimum (i.e. it is asymptotically convergent) as the cooling process slows down. The drawback of the approach is the computational burden: finding optimal solutions may be extremely expensive in some cases. This paper presents a parallel simulated annealing (PSA) algorithm for solving the long-term transmission network expansion planning problem. A strategy that does not affect the basic convergence properties of the sequential simulated annealing algorithm has been implemented and tested. The paper investigates the conditions under which the parallel algorithm is most efficient. The parallel implementations have been tested on three example networks: a small 6-bus network and two complex real-life networks. Excellent results are reported in the test section of the paper: in addition to reductions in computing times, the PSA algorithm proposed in the paper has shown significant improvements in solution quality for the largest of the test networks.
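
One simple way to parallelize simulated annealing is to run several independent chains and keep the best result, as in the Python sketch below; the paper's strategy for transmission expansion planning is more elaborate (it preserves the sequential algorithm's convergence properties), so this is only a schematic, with a toy energy in place of an expansion-plan cost:

```python
import math
import random
from multiprocessing import Pool

def energy(x):                      # toy stand-in for an expansion-plan cost
    return sum((xi - 1.0) ** 2 for xi in x)

def sa_chain(seed, n=20, T=1.0, alpha=0.999, steps=50000):
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(n)]
    e = energy(x)
    for _ in range(steps):
        i = rng.randrange(n)
        old = x[i]
        x[i] += rng.gauss(0, 0.5)
        de = energy(x) - e
        if de < 0 or rng.random() < math.exp(-de / T):
            e += de                 # accept the move
        else:
            x[i] = old              # reject and restore
        T *= alpha
    return e, x

if __name__ == "__main__":
    with Pool(4) as pool:
        results = pool.map(sa_chain, range(4))   # four independent chains
    print(min(results, key=lambda r: r[0])[0])   # best energy found
```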

Book
31 Oct 1997
TL;DR: This book surveys topics in applied computing, including algorithm complexity, program testing, simulated annealing and genetic algorithms for discrete optimization, extrapolation and neural networks, fuzzy logic and intelligent control, and randomness, chaos, and fractals.
Abstract: Preface. 1. Algorithm Complexity: Two Simple Examples. 2. Solving General Linear Functional Equations: An Application to Algorithm Complexity. 3. Program Testing: A Problem. 4. Optimal Program Testing. 5. Optimal Choice of a Penalty Function: Simplest Case of Algorithm Design. 6. Solving General Linear Differential Equations with Constant Coefficients: An Application to Constrained Optimization. 7. Simulated Annealing: 'Smooth' (Local) Discrete Optimization. 8. Genetic Algorithms: 'Non-Smooth' Discrete Optimization. 9. RISC Computer Architecture and Internet Growth: Two Applications of Extrapolation. 10. Systems of Differential Equations and Their Use in Computer-Related Extrapolation Problems. 11. Network Congestion: An Example of Non-Linear Extrapolation. 12. Neural Networks: A General Form of Non-Linear Extrapolation. 13. Expert Systems and the Basics of Fuzzy Logic. 14. Intelligent and Fuzzy Control. 15. Randomness, Chaos, and Fractals. A: Simulated Annealing Revisited. B: Software Cost Estimation. C: Electronic Engineering: How to Describe PN-Junctions. D: Log-Normal Distribution Justified: An Application to Computational Statistics. E: Optimal Robust Statistical Methods. F: How to Avoid Paralysis of Neural Networks. G: Estimating Computer Prices. H: Allocating Bandwidth on Computer Networks. I: Algorithm Complexity Revisited. J: How Can a Robot Avoid Obstacles: Case Study of Real-Time Optimization. K: Discounting in Robot Control: A Case Study of Dynamic Optimization. Index.

Journal ArticleDOI
TL;DR: This paper presents an effective objective function based on Fourier descriptors that evaluates only the shape differences between two curves; the resulting method discovers near-global and practical solutions consistently without requiring any initial guess.
Abstract: Generally, success in synthesis of mechanisms for path generation is limited to finding a reasonable local optimum at best, in spite of a very good initial guess. The most widely used structural error objective function is not effective in leading to practical solutions, as it misrepresents the nature of the design problem by requiring the shape, size, orientation and position of the coupler curve to be optimized all at once. In this paper, we present an effective objective function based on Fourier descriptors that evaluates only the shape differences between two curves. This function is first minimized using a stochastic global search method derived from simulated annealing, followed by Powell's method. The size, orientation and position of the desired curve are addressed in a later stage by determining analogous points on the desired and candidate curves. In spite of the highly non-linear mechanism design space, our method discovers near-global and practical solutions consistently without requiring any initial guess.
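
A sketch of the Fourier-descriptor idea behind such an objective function: descriptors built from the FFT of the boundary, treated as complex samples, are made invariant to position (drop the DC term), scale (normalize by the first harmonic), and rotation/starting point (keep magnitudes), so comparing them measures shape difference only. The exact descriptor set used in the paper may differ:

```python
import numpy as np

def fourier_descriptors(curve, n_coeffs=16):
    z = curve[:, 0] + 1j * curve[:, 1]      # closed curve as complex samples
    F = np.fft.fft(z)
    F[0] = 0.0                              # translation invariance
    F = F / np.abs(F[1])                    # scale invariance
    mags = np.abs(F)                        # rotation/start-point invariance
    return np.concatenate([mags[1:n_coeffs // 2 + 1], mags[-(n_coeffs // 2):]])

def shape_difference(curve_a, curve_b):
    return np.linalg.norm(fourier_descriptors(curve_a) - fourier_descriptors(curve_b))

t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
ellipse = np.c_[3 + 2 * np.cos(t), 1.5 * np.sin(t)]   # shifted and scaled
print(shape_difference(circle, ellipse))              # nonzero: shapes differ
```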

Journal ArticleDOI
TL;DR: This paper casts the optimisation process into a Bayesian framework by exploiting the recently reported global consistency measure of Wilson and Hancock as a fitness measure, and demonstrates empirically that the method possesses polynomial convergence time and that the convergence rate is more rapid than that of simulated annealing.

Journal ArticleDOI
TL;DR: In this article, an improved nonequilibrium simulated-annealing (I-NESA) technique was applied to find: (1) the global optimum of system cost of two kinds of complex systems subject to constraints on system reliability, and (2) the optimum number of redundancies which maximize the system reliability.
Abstract: This paper applies an improved nonequilibrium simulated-annealing (I-NESA) technique to find: (1) the global optimum of system cost of two kinds of complex systems subject to constraints on system reliability, and (2) the optimum number of redundancies which maximize the system reliability, subject to constraints on system cost, weight, and volume in a multistage mixed system. The efficacy of I-NESA in solving both varieties of problems is demonstrated by comparing its results with those of simulated annealing (SA). I-NESA, using the Glauber algorithm and an exponential cooling schedule, provides a stable global solution to all the problems considered. The essential features of I-NESA, (1) the nonequilibrium concept while coming out of an inner iteration, and (2) incorporation of the simplex-like heuristic, make it very fast and stable in obtaining the global solution when compared to the traditional SA. Fast convergence was observed in all the problems studied. I-NESA is a useful alternative to either indirect optimization methods or to some random search techniques, in solving problems like those in this paper.
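
For reference, the two acceptance rules at play, written out in Python: the classical Metropolis rule and the Glauber rule used by I-NESA, together with an exponential cooling schedule. The rest of I-NESA (the nonequilibrium inner-loop exit and the simplex-like heuristic) is not sketched here:

```python
import math

def metropolis_accept_prob(dE, T):
    # Classical Metropolis: always accept downhill, Boltzmann factor uphill.
    return 1.0 if dE <= 0 else math.exp(-dE / T)

def glauber_accept_prob(dE, T):
    # Glauber: smooth sigmoid acceptance; never exactly 1, even downhill.
    return 1.0 / (1.0 + math.exp(dE / T))

def exponential_cooling(T0, alpha, k):
    # T_k = T0 * alpha**k with 0 < alpha < 1.
    return T0 * alpha ** k
```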

Journal ArticleDOI
TL;DR: The generalized simulated annealing (GSA) algorithm is developed and tested, and it is believed that the GSA algorithm is a powerful method for finding the global minimum in more realistic problems, such as the equilibrium structure of large clusters.

Journal ArticleDOI
TL;DR: The proposed algorithm provides a basis for exploring the integration of the simulated annealing technique with artificial intelligence and interval algebra, and addresses the possibility of placing small rectangles or boxes of different sizes on a larger rectangle (pallet) or in a container.

Journal ArticleDOI
TL;DR: In this paper, a method of solving a large-scale long-term thermal generating unit maintenance scheduling problem is described, which combines GA, simulated annealing (SA), and tabu search (TS) algorithms.
Abstract: This paper describes a method of solving a large-scale long-term thermal generating unit maintenance scheduling problem. In the solution algorithm, the genetic algorithm (GA), simulated annealing (SA) and the tabu search (TS) method are used cooperatively. The solution algorithm keeps the advantages of the individual algorithms and shows a reasonable combination of local and global searches. The method takes the maintenance class and several consecutive years scheduling into consideration. Several real-scale numerical examples demonstrate the effectiveness of the proposed method.

Journal ArticleDOI
TL;DR: A fast local search (FLS) algorithm, which helps to improve the efficiency of hill climbing, and a guided local search algorithm, developed to help local search escape local optima and distribute search effort, are reported.

Journal ArticleDOI
TL;DR: A hybrid approach is proposed whose performance is far better than that obtained with any of these approaches individually, and an integrated view of these methodologies is presented.
Abstract: We have investigated and extensively tested three families of nonconvex optimization approaches for solving the transmission network expansion planning problem: simulated annealing (SA), genetic algorithms (GA), and tabu search algorithms (TS). The paper compares the main features of the three approaches and presents an integrated view of these methodologies. A hybrid approach is then proposed whose performance is far better than that obtained with any of these approaches individually. Results obtained in tests performed with large-scale real-life networks are summarized.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the performance of three probabilistic optimization techniques for calibrating the Tank model, a hydrologic model typical of conceptual rainfall-runoff (CRR) models, and found that the SCE method provided better estimates of the optimal solution than the GA and SA methods.

Book ChapterDOI
TL;DR: This chapter discusses crystallographic refinement, a technique aimed at optimizing the agreement of an atomic model with both observed diffraction data and chemical restraints, at the level of individual atoms in the crystal structure.
Abstract: Publisher Summary This chapter discusses crystallographic refinement, a technique aimed at optimizing the agreement of an atomic model with both observed diffraction data and chemical restraints. Optimization problems in macromolecular crystallography generally suffer from there being multiple minima, which arise largely from the high dimensionality of the parameter space. The many local minima of the target function tend to defeat gradient descent optimization techniques, such as conjugate gradient or least-squares methods. These methods are simply not capable of shifting the atomic coordinates enough to correct errors in the initial model. Simulated annealing has improved the efficiency of crystallographic refinement significantly. However, simulated annealing refinement alone is still insufficient to refine a crystal structure automatically without human intervention. With currently available computing power, tedious manual adjustments, using computer graphics to display and move positions of atoms of the model in the electron-density maps, can represent the rate-limiting step in the refinement process.

Journal ArticleDOI
TL;DR: An image segmentation technique in which an arbitrarily shaped contour was deformed stochastically until it fitted around an object of interest, settling into the global minimum of an image-derived "energy" function.
Abstract: This paper describes an image segmentation technique in which an arbitrarily shaped contour was deformed stochastically until it fitted around an object of interest. The evolution of the contour was controlled by a simulated annealing process which caused the contour to settle into the global minimum of an image-derived "energy" function. The nonparametric energy function was derived from the statistical properties of previously segmented images, thereby incorporating prior experience. Since the method was based on a state space search for the contour with the best global properties, it was stable in the presence of image errors which confound segmentation techniques based on local criteria, such as connectivity. Unlike "snakes" and other active contour approaches, the new method could handle arbitrarily irregular contours in which each interpixel crack represented an independent degree of freedom. Furthermore, since the contour evolved toward the global minimum of the energy, the method was more suitable for fully automatic applications than the snake algorithm, which frequently has to be reinitialized when the contour becomes trapped in local energy minima. High computational complexity was avoided by efficiently introducing a random local perturbation in a time independent of contour length, providing control over the size of the perturbation, and assuring that resulting shape changes were unbiased. The method was illustrated by using it to find the brain surface in magnetic resonance head images and to track blood vessels in angiograms.

Journal ArticleDOI
TL;DR: The most recent developments regarding simulated annealing and genetic algorithms for solving facility layout problems approximately are reviewed.
Abstract: The facility layout problem (FLP) has many practical applications and is known to be NP-hard. During recent decades exact and heuristic approaches have been proposed in the literature to solve FLPs. In this paper we review the most recent developments regarding simulated annealing and genetic algorithms for solving facility layout problems approximately.
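
As a pointer to how simulated annealing is typically applied to facility layout, a minimal Python sketch on a QAP-style formulation: a permutation assigns facilities to locations, the cost is the sum of flow times distance, and the move swaps two facilities. The formulation is a common textbook one and the matrices are random stand-ins, not drawn from any particular paper in this review:

```python
import math
import random

def layout_cost(perm, flow, dist):
    # perm[i] is the location of facility i; cost = sum of flow * distance.
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def anneal_layout(flow, dist, T=100.0, alpha=0.999, steps=20000):
    n = len(flow)
    perm = list(range(n))
    cost = layout_cost(perm, flow, dist)
    best, best_cost = list(perm), cost
    for _ in range(steps):
        i, j = random.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]      # swap two facilities
        new = layout_cost(perm, flow, dist)
        if new < cost or random.random() < math.exp((cost - new) / T):
            cost = new
            if cost < best_cost:
                best, best_cost = list(perm), cost
        else:
            perm[i], perm[j] = perm[j], perm[i]  # undo the rejected swap
        T *= alpha
    return best, best_cost

random.seed(4)
n = 12
flow = [[random.randint(0, 9) * (i != j) for j in range(n)] for i in range(n)]
dist = [[abs(i - j) for j in range(n)] for i in range(n)]
print(anneal_layout(flow, dist)[1])
```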