Topic

Extremal optimization

About: Extremal optimization is a research topic. Over the lifetime, 1,168 publications have been published within this topic, receiving 104,943 citations.


Papers
Journal ArticleDOI
TL;DR: Numerical results demonstrate that extremal optimization maintains consistent accuracy for increasing system sizes, with an approximation error decreasing over run time roughly as a power law t^(-0.4).
Abstract: Extremal optimization is a new general-purpose method for approximating solutions to hard optimization problems. We study the method in detail by way of the computationally hard (NP-hard) graph partitioning problem. We discuss the scaling behavior of extremal optimization, focusing on the convergence of the average run as a function of run time and system size. The method has a single free parameter, which we determine numerically and justify using a simple argument. On random graphs, our numerical results demonstrate that extremal optimization maintains consistent accuracy for increasing system sizes, with an approximation error decreasing over run time roughly as a power law t^(-0.4). On geometrically structured graphs, the scaling of results from the average run suggests that these are far from optimal, with large fluctuations between individual trials. But when only the best runs are considered, results consistent with theoretical arguments are recovered.
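To make the selection rule concrete, below is a minimal Python sketch of tau-EO for balanced graph bipartitioning, assuming the usual scheme: each vertex gets a local fitness, the k-th worst vertex is chosen with probability proportional to k^(-tau), the chosen move is accepted unconditionally, and only the best configuration seen so far is recorded. The fitness definition, the value tau = 1.4, and the function names are illustrative assumptions, not the authors' reference implementation.

import random

def cut_size(adj, side):
    # Number of edges crossing the partition (each undirected edge counted once).
    return sum(1 for u in adj for v in adj[u] if u < v and side[u] != side[v])

def extremal_optimization(adj, tau=1.4, steps=2000, seed=0):
    rng = random.Random(seed)
    nodes = list(adj)
    order = rng.sample(nodes, len(nodes))              # random balanced start
    side = {u: i % 2 for i, u in enumerate(order)}
    best_side, best_cut = dict(side), cut_size(adj, side)
    for _ in range(steps):
        # Local fitness: fraction of a vertex's neighbours on its own side.
        def fitness(u):
            return sum(side[v] == side[u] for v in adj[u]) / (len(adj[u]) or 1)
        ranked = sorted(nodes, key=fitness)             # worst fitness first
        weights = [(k + 1) ** (-tau) for k in range(len(ranked))]
        u = rng.choices(ranked, weights=weights, k=1)[0]   # rank k picked with prob ~ k^(-tau)
        # Force the move: swap u with a random vertex of the other side so the
        # partition stays balanced; the move is accepted unconditionally.
        v = rng.choice([w for w in nodes if side[w] != side[u]])
        side[u], side[v] = side[v], side[u]
        cut = cut_size(adj, side)
        if cut < best_cut:                              # remember only the best-so-far state
            best_cut, best_side = cut, dict(side)
    return best_side, best_cut

# Tiny usage example: a 6-vertex ring given as a dict of neighbour lists.
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(extremal_optimization(ring, steps=500))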

148 citations

Book
01 Jul 2004
TL;DR: An edited survey of optimization algorithms in physics, covering cluster Monte Carlo methods, heuristic and exact (branch-and-cut) ground-state computations for Ising spin glasses, computation of the Potts free energy via submodular function minimization, phase transitions in random 3-SAT and related combinatorial problems, and newer heuristics such as hysteretic optimization and extremal optimization.
Abstract: List of Contributors.
1 Introduction (A.K. Hartmann and H. Rieger).
Part I: Applications in Physics.
2 Cluster Monte Carlo Algorithms (W. Krauth). 2.1 Detailed Balance and a priori Probabilities. 2.2 The Wolff Cluster Algorithm for the Ising Model. 2.3 Cluster Algorithm for Hard Spheres and Related Systems. 2.4 Applications. 2.4.1 Phase Separation in Binary Mixtures. 2.4.2 Polydisperse Mixtures. 2.4.3 Monomer-Dimer Problem. 2.5 Limitations and Extensions. References.
3 Probing Spin Glasses with Heuristic Optimization Algorithms (O.C. Martin). 3.1 Spin Glasses. 3.1.1 Motivations. 3.1.2 The Ising Model. 3.1.3 Models of Spin Glasses. 3.1.4 Some Challenges. 3.2 Some Heuristic Algorithms. 3.2.1 General Issues. 3.2.2 Variable Depth Search. 3.2.3 Genetic Renormalization Algorithm. 3.3 A Survey of Physics Results. 3.3.1 Convergence of the Ground-state Energy Density. 3.3.2 Domain Walls. 3.3.3 Clustering of Ground States. 3.3.4 Low-energy Excitations. 3.3.5 Phase Diagram. 3.4 Outlook. References.
4 Computing Exact Ground States of Hard Ising Spin Glass Problems by Branch-and-cut (F. Liers, M. Junger, G. Reinelt, and G. Rinaldi). 4.1 Introduction. 4.2 Ground States and Maximum Cuts. 4.3 A General Scheme for Solving Hard Max-cut Problems. 4.4 Linear Programming Relaxations of Max-cut. 4.5 Branch-and-cut. 4.6 Results of Exact Ground-state Computations. 4.7 Advantages of Branch-and-cut. 4.8 Challenges for the Years to Come. References.
5 Counting States and Counting Operations (A. Alan Middleton). 5.1 Introduction. 5.2 Physical Questions about Ground States. 5.2.1 Homogeneous Models. 5.2.2 Magnets with Frozen Disorder. 5.3 Finding Low-energy Configurations. 5.3.1 Physically Motivated Approaches. 5.3.2 Combinatorial Optimization. 5.3.3 Ground-state Algorithm for the RFIM. 5.4 The Energy Landscape: Degeneracy and Barriers. 5.5 Counting States. 5.5.1 Ground-state Configuration Degeneracy. 5.5.2 Thermodynamic State. 5.5.3 Numerical Studies of Zero-temperature States. 5.6 Running Times for Optimization Algorithms. 5.6.1 Running Times and Evolution of the Heights. 5.6.2 Heuristic Derivation of Running Times. 5.7 Further Directions. References.
6 Computing the Potts Free Energy and Submodular Functions (J.-C. Angles d'Auriac). 6.1 Introduction. 6.2 The Potts Model. 6.2.1 Definition of the Potts Model. 6.2.2 Some Results for Non-random Models. 6.2.3 The Ferromagnetic Random Bond Potts Model. 6.2.4 High Temperature Development. 6.2.5 Limit of an Infinite Number of States. 6.3 Basics on the Minimization of Submodular Functions. 6.3.1 Definition of Submodular Functions. 6.3.2 A Simple Characterization. 6.3.3 Examples. 6.3.4 Minimization of Submodular Functions. 6.4 Free Energy of the Potts Model in the Infinite q-Limit. 6.4.1 The Method. 6.4.2 The Auxiliary Problem. 6.4.3 The Max-flow Problem: the Goldberg and Tarjan Algorithm. 6.4.4 About the Structure of the Optimal Sets. 6.5 Implementation and Evaluation. 6.5.1 Implementation. 6.5.2 Example of Application. 6.5.3 Evaluation of the CPU Time. 6.5.4 Memory Requirement. 6.5.5 Various Possible Improvements. 6.6 Conclusion. References.
Part II: Phase Transitions in Combinatorial Optimization Problems.
7 The Random 3-satisfiability Problem: From the Phase Transition to the Efficient Generation of Hard, but Satisfiable Problem Instances (M. Weigt). 7.1 Introduction. 7.2 Random 3-SAT and the SAT/UNSAT Transition. 7.2.1 Numerical Results. 7.2.2 Using Statistical Mechanics. 7.3 Satisfiable Random 3-SAT Instances. 7.3.1 The Naive Generator. 7.3.2 Unbiased Generators. 7.4 Conclusion. References.
8 Analysis of Backtracking Procedures for Random Decision Problems (S. Cocco, L. Ein-Dor, and R. Monasson). 8.1 Introduction. 8.2 Phase Diagram, Search Trajectories and the Easy SAT Phase. 8.2.1 Overview of Concepts Useful to DPLL Analysis. 8.2.2 Clause Populations: Flows, Averages and Fluctuations. 8.2.3 Average-case Analysis in the Absence of Backtracking. 8.2.4 Occurrence of Contradictions and Polynomial SAT Phase. 8.3 Analysis of the Search Tree Growth in the UNSAT Phase. 8.3.1 Numerical Experiments. 8.3.2 Parallel Growth Process and Markovian Evolution Matrix. 8.3.3 Generating Function and Large-size Scaling. 8.3.4 Interpretation in Terms of Growth Process. 8.4 Hard SAT Phase: Average Case and Fluctuations. 8.4.1 Mixed Branch and Tree Trajectories. 8.4.2 Distribution of Running Times. 8.4.3 Large Deviation Analysis of the First Branch in the Tree. 8.5 The Random Graph Coloring Problem. 8.5.1 Description of DPLL Algorithm for Coloring. 8.5.2 Coloring in the Absence of Backtracking. 8.5.3 Coloring in the Presence of Massive Backtracking. 8.6 Conclusions. References.
9 New Iterative Algorithms for Hard Combinatorial Problems (R. Zecchina). 9.1 Introduction. 9.2 Combinatorial Decision Problems, K-SAT and the Factor Graph Representation. 9.2.1 Random K-SAT. 9.3 Growth Process Algorithm: Probabilities, Messages and Their Statistics. 9.4 Traditional Message-passing Algorithm: Belief Propagation as Simple Cavity Equations. 9.5 Survey Propagation Equations. 9.6 Decimating Variables According to Their Statistical Bias. 9.7 Conclusions and Perspectives. References.
Part III: New Heuristics and Interdisciplinary Applications.
10 Hysteretic Optimization (K.F. Pal). 10.1 Hysteretic Optimization for Ising Spin Glasses. 10.2 Generalization to Other Optimization Problems. 10.3 Application to the Traveling Salesman Problem. 10.4 Outlook. References.
11 Extremal Optimization (S. Boettcher). 11.1 Emerging Optimality. 11.2 Extremal Optimization. 11.2.1 Basic Notions. 11.2.2 EO Algorithm. 11.2.3 Extremal Selection. 11.2.4 Rank Ordering. 11.2.5 Defining Fitness. 11.2.6 Distinguishing EO from Other Heuristics. 11.2.7 Implementing EO. 11.3 Numerical Results for EO. 11.3.1 Early Results. 11.3.2 Applications of EO by Others. 11.3.3 Large-scale Simulations of Spin Glasses. 11.4 Theoretical Investigations. References.
12 Sequence Alignments (A.K. Hartmann). 12.1 Molecular Biology. 12.2 Alignments and Alignment Algorithms. 12.3 Low-probability Tail of Alignment Scores. References.
13 Protein Folding in Silico - the Quest for Better Algorithms (U.H.E. Hansmann). 13.1 Introduction. 13.2 Energy Landscape Paving. 13.3 Beyond Global Optimization. 13.3.1 Parallel Tempering. 13.3.2 Multicanonical Sampling and Other Generalized-ensemble Techniques. 13.4 Results. 13.4.1 Helix Formation and Folding. 13.4.2 Structure Predictions of Small Proteins. 13.5 Conclusion. References.
Index.

147 citations

Proceedings ArticleDOI
14 Sep 2004
TL;DR: In this paper, a modified particle swarm optimization (PSO) algorithm is proposed to solve a typical combinatorial optimization problem, the traveling salesman problem (TSP), which is a well-known NP-hard problem.
Abstract: Particle swarm optimization, as an evolutionary computing technique, has succeeded in many continuous problems, but little research has been done on discrete problems, especially combinatorial optimization, according to Kennedy and Eberhart (1997) and Mohan and Al-kazemi (2001). In this paper, a modified particle swarm optimization (PSO) algorithm is proposed to solve a typical combinatorial optimization problem, the traveling salesman problem (TSP), which is a well-known NP-hard problem. Fuzzy matrices are used to represent the position and velocity of the particles, and the operators in the original PSO formulas are redefined accordingly. The algorithm was tested on concrete instances from TSPLIB; the experiments show that it achieves good results.
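For orientation, here is the canonical continuous PSO update, i.e. the position and velocity operators that the paper redefines in fuzzy-matrix form; this sketch is not the fuzzy discrete variant itself, and the parameter values (inertia w, acceleration coefficients c1 and c2) are common defaults rather than the paper's settings.

import random

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    # Random initial positions and zero velocities.
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                       # personal best positions
    pbest_val = [objective(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]                  # position update
            val = objective(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

# Usage example: minimize the sphere function in 5 dimensions.
best_x, best_f = pso(lambda x: sum(v * v for v in x), dim=5)
print(best_f)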

147 citations

Journal ArticleDOI
TL;DR: An ant colony optimization framework has been compared with other stochastic search algorithms and shown to be a viable alternative; it can be successfully used for large-scale process optimization.
Abstract: An ant colony optimization framework has been compared with other stochastic search algorithms and shown to be a viable alternative. The algorithm has been tested on a variety of benchmark functions involving constrained and unconstrained NLP, MILP, and MINLP optimization problems. This novel algorithm handles different types of continuous functions very well and can be successfully used for large-scale process optimization.
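As a rough illustration of how ant colony ideas can be carried over to continuous variables, the sketch below keeps an archive of good solutions and lets each ant sample new points from Gaussians centred on archive members, in the spirit of archive-based continuous ACO variants; it is a toy example under these stated assumptions, not the specific framework benchmarked in the paper, and it handles neither integer variables nor constraints.

import random

def aco_continuous(objective, dim, bounds=(-5.0, 5.0), archive=10, ants=20,
                   iters=200, xi=0.85, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    # Solution archive kept sorted by objective value (best first); it plays the
    # role of the pheromone trail, biasing later ants toward good regions.
    sols = sorted(([rng.uniform(lo, hi) for _ in range(dim)] for _ in range(archive)),
                  key=objective)
    for _ in range(iters):
        new_sols = []
        for _ in range(ants):
            # Pick an archive member, better-ranked ones with higher probability.
            weights = [1.0 / (rank + 1) for rank in range(archive)]
            k = rng.choices(range(archive), weights=weights, k=1)[0]
            point = []
            for d in range(dim):
                # Sampling spread proportional to the archive's scatter in dimension d.
                sigma = xi * sum(abs(s[d] - sols[k][d]) for s in sols) / (archive - 1)
                point.append(min(hi, max(lo, rng.gauss(sols[k][d], sigma + 1e-9))))
            new_sols.append(point)
        # Keep only the best `archive` solutions of old + new (the pheromone update).
        sols = sorted(sols + new_sols, key=objective)[:archive]
    return sols[0], objective(sols[0])

# Usage example: minimize the sphere function in 4 dimensions.
print(aco_continuous(lambda x: sum(v * v for v in x), dim=4))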

147 citations

01 Jan 1997
TL;DR: The computational results show that this algorithm can be used to efficiently find near-optimal solutions to hard combinatorial optimization problems and that it is one of the best methods for the solution of structured quadratic assignment problems.

141 citations


Network Information
Related Topics (5)
Genetic algorithm: 67.5K papers, 1.2M citations, 85% related
Optimization problem: 96.4K papers, 2.1M citations, 81% related
Artificial neural network: 207K papers, 4.5M citations, 80% related
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Fuzzy logic: 151.2K papers, 2.3M citations, 78% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    2
2022    13
2021    7
2020    9
2019    22
2018    15