
Showing papers on "Extremal optimization published in 2001"


Journal ArticleDOI
01 Feb 2001
TL;DR: A new heuristic algorithm, mimicking the improvisation of music players, has been developed and named Harmony Search (HS), which is illustrated with a traveling salesman problem (TSP), a specific academic optimization problem, and a least-cost pipe network design problem.
Abstract: Many optimization problems in various fields have been solved using diverse optimization algorithms. Traditional optimization techniques such as linear programming (LP), non-linear programming (NL...
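The abstract above is truncated and does not spell out the Harmony Search update rules. As rough orientation only, here is a minimal sketch of the commonly described HS loop (harmony memory, memory-considering rate HMCR, pitch-adjusting rate PAR); the parameter names and values are illustrative and not taken from the paper.

```python
import random

def harmony_search(objective, bounds, hm_size=20, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=5000):
    """Minimal Harmony Search sketch for continuous minimization.

    hmcr: probability of picking a value from harmony memory
    par : probability of pitch-adjusting a memorized value
    """
    # Initialize harmony memory with random solutions.
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hm_size)]
    scores = [objective(h) for h in memory]

    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                # Memory consideration: reuse a value played before.
                value = random.choice(memory)[d]
                if random.random() < par:
                    # Pitch adjustment: small random perturbation, clamped to bounds.
                    value = min(hi, max(lo, value + random.uniform(-1, 1) * bandwidth))
            else:
                # Random selection from the allowed range.
                value = random.uniform(lo, hi)
            new.append(value)
        new_score = objective(new)
        worst = max(range(hm_size), key=lambda i: scores[i])
        if new_score < scores[worst]:
            memory[worst], scores[worst] = new, new_score

    best = min(range(hm_size), key=lambda i: scores[i])
    return memory[best], scores[best]

# Example: minimize a simple sphere function in three dimensions.
best_x, best_f = harmony_search(lambda x: sum(v * v for v in x),
                                bounds=[(-5, 5)] * 3)
```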

5,136 citations


Journal ArticleDOI
TL;DR: This work uses extremal optimization to elucidate the phase transition in the 3-coloring problem, and provides independent confirmation of previously reported extrapolations for the ground-state energy of +/-J spin glasses in d = 3 and 4.
Abstract: We explore a new general-purpose heuristic for finding high-quality solutions to hard discrete optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. Extremal optimization successively updates extremely undesirable variables of a single suboptimal solution, assigning them new, random values. Large fluctuations ensue, efficiently exploring many local optima. We use extremal optimization to elucidate the phase transition in the 3-coloring problem, and we provide independent confirmation of previously reported extrapolations for the ground-state energy of ±J spin glasses in d = 3 and 4.
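To make the update rule described above concrete, here is a minimal sketch of extremal optimization applied to graph 3-coloring, one of the applications the abstract mentions. The per-vertex fitness (number of conflicting neighbors) and the bookkeeping are plausible choices, not reproduced from the paper.

```python
import random

def eo_coloring(adjacency, n_colors=3, steps=100000):
    """Basic extremal-optimization sketch for graph coloring.

    adjacency: list of neighbor lists. At every step the most conflicted
    vertex (the extremely undesirable variable) gets a new random color.
    """
    n = len(adjacency)
    colors = [random.randrange(n_colors) for _ in range(n)]

    def conflicts(v):
        return sum(colors[u] == colors[v] for u in adjacency[v])

    best_colors = list(colors)
    best_cost = sum(conflicts(v) for v in range(n)) // 2

    for _ in range(steps):
        worst = max(range(n), key=conflicts)   # most conflicted vertex
        if conflicts(worst) == 0:              # proper coloring found
            return colors, 0
        colors[worst] = random.randrange(n_colors)   # unconditional random update
        cost = sum(conflicts(v) for v in range(n)) // 2
        if cost < best_cost:                   # keep the best solution seen
            best_colors, best_cost = list(colors), cost

    return best_colors, best_cost
```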

300 citations


Journal ArticleDOI
TL;DR: Numerical results demonstrate that extremal optimization maintains consistent accuracy for increasing system sizes, with an approximation error decreasing over run time roughly as a power law t^(-0.4).
Abstract: Extremal optimization is a new general-purpose method for approximating solutions to hard optimization problems. We study the method in detail by way of the computationally hard (NP-hard) graph partitioning problem. We discuss the scaling behavior of extremal optimization, focusing on the convergence of the average run as a function of run time and system size. The method has a single free parameter, which we determine numerically and justify using a simple argument. On random graphs, our numerical results demonstrate that extremal optimization maintains consistent accuracy for increasing system sizes, with an approximation error decreasing over run time roughly as a power law t^(-0.4). On geometrically structured graphs, the scaling of results from the average run suggests that these are far from optimal with large fluctuations between individual trials. But when only the best runs are considered, results consistent with theoretical arguments are recovered.
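The single free parameter mentioned in the abstract is, in the extremal optimization literature, a rank-selection exponent (commonly written tau): variables are ranked from worst to best by local fitness, and the k-th ranked variable is updated with probability proportional to k^(-tau). A hedged sketch of that selection step, with an illustrative tau value:

```python
import random

def pick_rank(n, tau=1.4):
    """Sample a rank k in 1..n with probability proportional to k**(-tau).

    In tau-EO, variables are ranked from worst (k = 1) to best (k = n)
    by their local fitness, and the variable at the sampled rank is updated.
    """
    weights = [k ** (-tau) for k in range(1, n + 1)]
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for k, w in enumerate(weights, start=1):
        acc += w
        if r <= acc:
            return k
    return n
```

The value of tau used for graph partitioning and its justification are what the paper itself determines; the default above is only a placeholder.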

148 citations


Journal ArticleDOI
TL;DR: In this paper, a rejectionless global optimization technique for the low-temperature relaxation of complex systems such as glasses and spin glasses is discussed; while technically similar to extremal optimization, it still relies on a physical analogy with a thermalizing system.

65 citations


01 Jan 2001
TL;DR: This paper introduces model-based search as a unifying framework accommodating some recently proposed heuristics for combinatorial optimization such as ant colony optimization, stochastic gradient ascent, cross-entropy and estimation of distribution methods.
Abstract: In this paper we introduce model-based search as a unifying framework accommodating some recently proposed heuristics for combinatorial optimization such as ant colony optimization, stochastic gradient ascent, cross-entropy and estimation of distribution methods. We discuss similarities as well as distinctive features of each method and we propose some extensions.
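The abstract describes the framework only at a high level. As a generic illustration of model-based search (closest in spirit to the cross-entropy and estimation-of-distribution members of the family named above), here is a minimal sketch in which a per-bit probability model generates candidates and is then shifted toward the best samples; all names and parameter values are illustrative, not from the paper.

```python
import random

def model_based_search(objective, n_bits, samples=50, elite=10,
                       iterations=100, smoothing=0.7):
    """Generic model-based search sketch (cross-entropy flavour), maximizing.

    A probabilistic model (independent per-bit probabilities) generates
    candidate solutions; the model is then moved toward the elite samples.
    """
    probs = [0.5] * n_bits                       # initial model
    best, best_val = None, float('-inf')

    for _ in range(iterations):
        population = [[int(random.random() < p) for p in probs]
                      for _ in range(samples)]
        population.sort(key=objective, reverse=True)
        if objective(population[0]) > best_val:
            best, best_val = population[0], objective(population[0])
        # Update the model from the elite samples.
        for i in range(n_bits):
            freq = sum(s[i] for s in population[:elite]) / elite
            probs[i] = smoothing * probs[i] + (1 - smoothing) * freq

    return best, best_val

# Example: maximize the number of ones in a 30-bit string.
print(model_based_search(sum, n_bits=30))
```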

32 citations


01 Jan 2001
TL;DR: This work presents a new distributed algorithm based on Ant System concepts, called the General Ant System, to solve combinatorial optimization problems, and shows that this approach performs comparably to previous versions of Ant Systems.
Abstract: An Ant System is an artificial system based on the behavior of real ant colonies, used to solve combinatorial problems. It is a distributed algorithm composed of a set of agents called ants, which cooperate to find good solutions to combinatorial optimization problems. The cooperation follows the behavior of real ants, using an indirect form of communication mediated by pheromone. In this work, we present a new distributed algorithm based on Ant System concepts, called the General Ant System, to solve combinatorial optimization problems. Our approach consists of mapping the solution space of the combinatorial optimization problem onto the space where the ants walk, and of defining the transition probability of the Ant System according to the objective function of the combinatorial optimization problem. We test our approach on the graph partitioning and travelling salesman problems. The results show that our approach performs comparably to previous versions of Ant Systems.
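The transition probability the abstract refers to is, in standard Ant System formulations, proportional to a pheromone term raised to a power alpha times a heuristic-desirability term raised to a power beta. A minimal sketch of one such transition step follows; how the General Ant System derives the heuristic term from the objective function is not reproduced here, and the alpha/beta values are illustrative.

```python
import random

def ant_step(current, allowed, pheromone, heuristic, alpha=1.0, beta=2.0):
    """One Ant System transition: choose the next node j from `allowed` with
    probability proportional to pheromone[i][j]**alpha * heuristic[i][j]**beta.

    `heuristic` would typically be derived from the objective, e.g. 1/distance
    for the TSP; `pheromone` and `heuristic` are indexable as [i][j].
    """
    weights = [pheromone[current][j] ** alpha * heuristic[current][j] ** beta
               for j in allowed]
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for j, w in zip(allowed, weights):
        acc += w
        if r <= acc:
            return j
    return allowed[-1]
```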

32 citations


Proceedings ArticleDOI
27 May 2001
TL;DR: It turns out that the proposed method can compete with the original ACS in terms of solution quality and computation speed for this problem.
Abstract: The ant colony system (ACS) algorithm is a new metaheuristic for hard combinatorial optimization problems. It is a population-based approach that exploits positive feedback as well as greedy search. It was first proposed for tackling the well-known traveling salesman problem (TSP). We introduce a new version of the ACS based on a dynamic weighted updating rule. An implementation for the TSP and performance results under various conditions are presented, and a comparison between the original ACS and the proposed method is shown. It turns out that our proposed method can compete with the original ACS in terms of solution quality and computation speed for this problem.
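The paper's dynamic weighted updating rule is not detailed in the abstract and is not reproduced here. For orientation, here is a sketch of the standard ACS pheromone updates that such a rule would modify: a local update along each traversed edge and a global update along the best tour found so far. The rho, tau0 and alpha values are the usual illustrative defaults, not the paper's.

```python
def acs_local_update(pheromone, i, j, rho=0.1, tau0=0.001):
    """Standard ACS local update, applied when an ant traverses edge (i, j)."""
    pheromone[i][j] = (1 - rho) * pheromone[i][j] + rho * tau0

def acs_global_update(pheromone, best_tour, best_length, alpha=0.1):
    """Standard ACS global update: only edges of the best tour are reinforced."""
    deposit = 1.0 / best_length
    for i, j in zip(best_tour, best_tour[1:] + best_tour[:1]):
        pheromone[i][j] = (1 - alpha) * pheromone[i][j] + alpha * deposit
```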

21 citations


Journal ArticleDOI
TL;DR: A broad overview of several metaphor-based algorithms, including the widely-used genetic and simulated annealing algorithms, which are based on metaphors borrowed from other areas of science.
Abstract: Combinatorial optimization problems typically require every possible solution to be evaluated to ensure finding the optimal solution. Since such exhaustive searches are often impractical, there is now a vast body of heuristic algorithms for them. Among the algorithms are those based on metaphors borrowed from other areas of science. The idea is that key elements of physical processes can be used abstractly to form the basis of an optimization algorithm. This article presents a broad overview of several metaphor-based algorithms, including the widely-used genetic and simulated annealing algorithms.
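As a concrete instance of the physical metaphor this overview describes, here is a minimal simulated annealing sketch: worse moves are accepted with probability exp(-dE/T) and the temperature T is lowered geometrically. The cooling schedule and parameter values are illustrative, not taken from the article.

```python
import math
import random

def simulated_annealing(energy, neighbour, state, t_start=10.0, t_end=1e-3,
                        cooling=0.995):
    """Minimal simulated annealing for minimization.

    energy(state)    -> cost of a solution
    neighbour(state) -> a randomly perturbed copy of the solution
    """
    t = t_start
    e = energy(state)
    best, best_e = state, e
    while t > t_end:
        candidate = neighbour(state)
        de = energy(candidate) - e
        # Metropolis criterion: always accept improvements,
        # accept deteriorations with probability exp(-de / t).
        if de <= 0 or random.random() < math.exp(-de / t):
            state, e = candidate, e + de
            if e < best_e:
                best, best_e = state, e
        t *= cooling        # geometric cooling schedule
    return best, best_e
```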

19 citations


Proceedings ArticleDOI
27 May 2001
TL;DR: Through several experiments, it is confirmed that GSA works adaptively and shows higher performance than existing methods, making it a promising method for function optimization.
Abstract: The paper applies a method, Genetic Algorithm with Search Area Adaptation (GSA), to function optimization. In a previous study (H. Someya and M. Yamamura, 1999), GSA was proposed for the floorplan design problem and showed better performance than several existing methods. We believe that investigating the searching behavior of the algorithm is important; however, since the floorplan design problem is a combinatorial optimization problem, we do not know in detail why GSA works well. Thus, we apply GSA to function optimization in order to study its searching behavior in detail. In function optimization, several benchmark problems are available whose optima and landscapes are known. There is another reason to apply GSA to function optimization: we would like to propose a superior method for function optimization. Through several experiments, we have confirmed that GSA works adaptively and shows higher performance than existing methods.

15 citations


Proceedings ArticleDOI
02 Dec 2001
TL;DR: This paper introduces a new version of the ACS based on a dynamic weighted updating method and a dynamic ant number decision method using a curve-fitting algorithm that can compete with the original ACS in terms of solution quality and computation speed.
Abstract: The ant colony system (ACS) algorithm is a new meta-heuristic for hard combinatorial optimization problems. It is a population-based approach that uses the exploitation of positive feedback as well as a greedy search. It was first proposed for tackling the well-known traveling salesman problem (TSP). In this paper, we introduce a new version of the ACS based on a dynamic weighted updating method and a dynamic ant number decision method using a curve-fitting algorithm. An implementation to solve the TSP and performance results under various conditions are presented, and a comparison between the original ACS and the proposed method is shown. It turns out that our proposed method can compete with the original ACS in terms of solution quality and computation speed for these problems.

11 citations


Book ChapterDOI
01 Jun 2001
TL;DR: The presented approach applied to NN optimization can be regarded as a solution to the so-called competing conventions problem [18] and can be used to save fitness evaluations when used in combination with a graph-database.
Abstract: In this article we deal with a quite general topic in evolutionary structure optimization, namely, redundancy in the encoding due to isomorphic structures. This problem is well known in topology optimization of neural networks (NNs) and we study it in this framework. Choosing a good NN structure for a given problem is still a difficult task for which we do not know any successful analytical means. But problem specific structures can lead to significantly improved results; evidence for this is given in this contribution by an NN that was evolutionarily adapted for a benchmark problem. The degree to which isomorphic structures, i.e., classes of equivalent NN topologies, enlarge the search space depends on the restrictions of the allowed structures and on the representation of the search space. In the context of structure optimization of NNs we observe similar phenomena of rare and frequent structures as are known from molecular biology. To cope with isomorphisms we make use of the relation between NN topologies and graphs. Exploiting methods from graph theory we demonstrate a general way to deal with isomorphic structures. The presented approach applied to NN optimization can be regarded as a solution to the so-called competing conventions problem [18]. Further, it can be used to save fitness evaluations when used in combination with a graph-database.
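The chapter's own graph-theoretic treatment is not reproduced here. As a minimal illustration of the underlying idea (two network structures are redundant encodings if their connection graphs are isomorphic), here is a sketch using networkx; the node labels and example topologies are made up, and a full treatment would also have to respect input/output roles and weights, as the article discusses.

```python
import networkx as nx

def same_topology(edges_a, edges_b):
    """Check whether two network structures are isomorphic, i.e. identical
    up to a relabelling of their (hidden) nodes.

    Edges are (source, target) pairs over arbitrary node labels.
    """
    return nx.is_isomorphic(nx.DiGraph(edges_a), nx.DiGraph(edges_b))

# Two 2-1-1 networks that differ only in the hidden node's label:
print(same_topology([('i1', 'h1'), ('i2', 'h1'), ('h1', 'o')],
                    [('i1', 'h7'), ('i2', 'h7'), ('h7', 'o')]))   # True
```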

Journal Article
TL;DR: This paper further extends the idea of this new biological optimization strategy to some other hard combinatorial optimization problems, including multi-attribute situations that lack efficient solution methods.
Abstract: The ant algorithm is a stochastic search optimization algorithm that has emerged in recent years. It has received much attention since its successful application to the famous travelling salesman problem. This paper further extends this biologically inspired optimization strategy to other hard combinatorial optimization problems, including multi-attribute situations that lack efficient solution methods. The optimization ability of the algorithm is tested experimentally, with encouraging results.

Journal Article
TL;DR: In this paper, the time constraints are converted into objective penalties, and genetic algorithms with mutations by 2-exchange and by 3-exchange are designed using sequence coding to deal with the soft and hard time constraints.
Abstract: As an extension of the travelling salesman problem, the travelling salesman problem with time windows is a very difficult problem of great theoretical and practical significance. In this paper, the time constraints are converted into objective penalties, and genetic algorithms with mutations by 2-exchange and by 3-exchange, respectively, are designed using sequence coding to deal with the soft and hard time constraints. The simulation indicates that the latter algorithm is superior to the former, and that both of the proposed algorithms are superior to the simple genetic algorithm.
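The abstract names the operators but not their details. As a hedged illustration, here is a sketch of a 2-exchange (segment-reversal) mutation on a sequence-coded tour, together with one plausible way of folding soft time windows into the objective as a penalty; the distance/window data structures and the penalty weight are assumptions, not the paper's.

```python
import random

def two_exchange(tour):
    """2-exchange mutation: reverse a random sub-segment of a sequence-coded tour."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def penalized_cost(tour, dist, windows, service=0.0, penalty=1000.0):
    """Travel cost plus a penalty for arriving after a soft time window closes.

    dist[a][b] is the travel time/cost from a to b; windows[c] is an
    (earliest, latest) pair; arriving before `earliest` means waiting.
    """
    t, cost, prev = 0.0, 0.0, tour[0]
    for c in tour[1:]:
        t += dist[prev][c]
        cost += dist[prev][c]
        early, late = windows[c]
        t = max(t, early)                 # wait if early
        if t > late:
            cost += penalty * (t - late)  # soft-window violation
        t += service
        prev = c
    return cost
```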

Journal Article
TL;DR: A novel method called Ant Colony Optimization (ACO) is presented to solve the problem of transmission network expansion planning, and the results show that this method is feasible and efficient.
Abstract: Transmission network expansion planning is a very complicated, nonlinear, large scale combinatorial optimization problem. In this paper, a novel method called Ant Colony Optimization (ACO) is presented to solve the problem of transmission network expansion planning. The ACO algorithm, derived from the study of the behavior of ant colonies, is a new general purpose heuristic algorithm for hard combinatorial optimization problems. The main characteristics of this method are positive feedback, distributed computation, and the use of a constructive greedy heuristic. In this paper, the application of ACO to transmission network expansion planning is investigated, the corresponding mathematical model is established and solution algorithms are developed. The presented method has been tested on the Garver 6 system, and the results show that this method is feasible and efficient.

Journal ArticleDOI
TL;DR: Application of the heuristic self-organization method (or the so-called Group Method of Data Handling) to the solution of discrete optimization problems is considered, and convergence in probability of the self-organization process to a global optimum is proved.
Abstract: Application of the heuristic self-organization method (or the so-called Group Method of Data Handling) to the solution of discrete optimization problems is considered. Convergence in probability of the self-organization process to a global optimum is proved. Some examples of numerical experiments are presented.

Proceedings ArticleDOI
01 Jul 2001
TL;DR: The mathematical model of the given method allows the travelling salesman problem to be solved by using a minimum memory size of ~O(n^2 + 3n) cells and with a small run time.
Abstract: In this paper, the travelling salesman problem and its solution by a method of correlative regression analysis are considered. The mathematical model of the given method is considered. The method allows the problem to be solved by using a minimum memory size of ~O(n^2 + 3n) cells and with a small run time ~O(n).

Journal Article
TL;DR: In this article, the authors studied the dynamics of the Monte Carlo simulation algorithm in several very different optimization problems, including the traveling salesman problem (TSP), a lattice version of the protein folding problem (PFP), x-ray data analysis (XDA), and a multi-variable function.

Journal ArticleDOI
TL;DR: In this article, a rejectionless global optimization technique called the waiting time method (WTM) was proposed for the low-temperature relaxation of complex systems such as glasses and spin glasses.
Abstract: We discuss a rejectionless global optimization technique which, while being technically similar to the recently introduced method of Extremal Optimization, still relies on a physical analogy with a thermalizing system. Our waiting time method (WTM) is mathematically equivalent to the usual Metropolis algorithm, but considerably more efficient at low temperatures. The WTM can be used at constant temperature or it can be combined with annealing techniques. It is especially well suited for studying the low-temperature relaxation of complex systems such as glasses and spin glasses. In the paper we describe the method and test it on a spin glass example by comparing its performance to Extremal Optimization.
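The abstract does not spell out how the rejection-free step works. In waiting-time style, rejection-free formulations of Metropolis dynamics, every candidate move is assigned an exponentially distributed waiting time whose rate is its acceptance probability, and the move with the shortest waiting time is carried out while the clock advances by that time. A rough sketch of one such step follows; it recomputes all waiting times on each call, which the actual method avoids, and the names and signature are illustrative rather than the paper's.

```python
import math
import random

def waiting_time_step(energies_if_flipped, current_energy, temperature):
    """One rejection-free step in the spirit of a waiting-time method.

    energies_if_flipped[m] is the energy of the configuration obtained by
    applying candidate move m. Each move gets an exponential waiting time
    with rate equal to its Metropolis acceptance probability; the move with
    the shortest waiting time wins, and that time is the elapsed clock time.
    """
    best_move, best_wait = None, float('inf')
    for move, new_energy in enumerate(energies_if_flipped):
        de = new_energy - current_energy
        rate = 1.0 if de <= 0 else math.exp(-de / temperature)
        wait = random.expovariate(rate)
        if wait < best_wait:
            best_move, best_wait = move, wait
    return best_move, best_wait
```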

Journal ArticleDOI
TL;DR: Extremal Optimization, a recently introduced meta-heuristic for hard optimization problems, is analyzed on a simple model of jamming in this article, motivated first by the problem of finding lowest energy configurations for a disordered spin system on a fixed-valence graph.
Abstract: Extremal Optimization, a recently introduced meta-heuristic for hard optimization problems, is analyzed on a simple model of jamming. The model is motivated first by the problem of finding lowest energy configurations for a disordered spin system on a fixed-valence graph. The numerical results for the spin system exhibit the same phenomena found in all earlier studies of extremal optimization, and our analytical results for the model reproduce many of these features.