
Showing papers on "Metaheuristic published in 2010"


Posted Content
TL;DR: The Bat Algorithm as mentioned in this paper is based on the echolocation behavior of bats and combines the advantages of existing algorithms into the new bat algorithm to solve many tough optimization problems.
Abstract: Metaheuristic algorithms such as particle swarm optimization, firefly algorithm and harmony search are now becoming powerful methods for solving many tough optimization problems. In this paper, we propose a new metaheuristic method, the Bat Algorithm, based on the echolocation behaviour of bats. We also intend to combine the advantages of existing algorithms into the new bat algorithm. After a detailed formulation and explanation of its implementation, we will then compare the proposed algorithm with other existing algorithms, including genetic algorithms and particle swarm optimization. Simulations show that the proposed algorithm seems much superior to other algorithms, and further studies are also discussed.

3,528 citations
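
The entry above only summarises the method, so here is a rough, hedged sketch of the bat-algorithm update rules as they are commonly described (frequency tuning, velocity pulled toward the current best, a local walk gated by the pulse rate, and loudness/pulse-rate adaptation). Parameter names and default values are illustrative assumptions, not the paper's reference implementation:

```python
import numpy as np

def bat_algorithm(obj, dim, n_bats=20, n_iter=1000, f_min=0.0, f_max=2.0,
                  alpha=0.9, gamma=0.9, lower=-5.0, upper=5.0, seed=0):
    """Minimal sketch of the bat algorithm (echolocation-inspired search),
    assuming box constraints [lower, upper]^dim and minimisation."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_bats, dim))   # bat positions
    v = np.zeros((n_bats, dim))                    # velocities
    A = np.ones(n_bats)                            # loudness
    r0 = rng.uniform(0.0, 1.0, n_bats)             # initial pulse-emission rates
    r = r0.copy()
    fit = np.apply_along_axis(obj, 1, x)
    best, best_fit = x[fit.argmin()].copy(), fit.min()

    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()  # frequency tuning
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lower, upper)
            if rng.random() > r[i]:                        # local walk around the best bat
                cand = np.clip(best + 0.01 * A.mean() * rng.standard_normal(dim),
                               lower, upper)
            cand_fit = obj(cand)
            if cand_fit <= fit[i] and rng.random() < A[i]: # accept if improved and still "loud"
                x[i], fit[i] = cand, cand_fit
                A[i] *= alpha                              # loudness decreases
                r[i] = r0[i] * (1.0 - np.exp(-gamma * t))  # pulse rate increases
            if fit[i] < best_fit:
                best, best_fit = x[i].copy(), fit[i]
    return best, best_fit

if __name__ == "__main__":
    # Example: minimise the sphere function in 5 dimensions
    sol, val = bat_algorithm(lambda z: float(np.sum(z ** 2)), dim=5)
    print(sol, val)
```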


Book ChapterDOI
23 Apr 2010
TL;DR: The Bat Algorithm as mentioned in this paper is based on the echolocation behavior of bats and combines the advantages of existing algorithms into the new bat algorithm to solve many tough optimization problems.
Abstract: Metaheuristic algorithms such as particle swarm optimization, firefly algorithm and harmony search are now becoming powerful methods for solving many tough optimization problems. In this paper, we propose a new metaheuristic method, the Bat Algorithm, based on the echolocation behaviour of bats. We also intend to combine the advantages of existing algorithms into the new bat algorithm. After a detailed formulation and explanation of its implementation, we will then compare the proposed algorithm with other existing algorithms, including genetic algorithms and particle swarm optimization. Simulations show that the proposed algorithm seems much superior to other algorithms, and further studies are also discussed.

3,162 citations


Journal ArticleDOI
TL;DR: This paper shows how to use the recently developed firefly algorithm to solve non-linear design problems and proposes a few new test functions with either singularity or stochastic components but with known global optimality and thus they can be used to validate new optimisation algorithms.
Abstract: Modern optimisation algorithms are often metaheuristic, and they are very promising in solving NP-hard optimisation problems. In this paper, we show how to use the recently developed firefly algorithm to solve non-linear design problems. For the standard pressure vessel design optimisation, the optimal solution found by FA is far better than the best solution obtained previously in the literature. In addition, we also propose a few new test functions with either singularity or stochastic components but with known global optimality and thus they can be used to validate new optimisation algorithms. Possible topics for further research are also discussed.

1,911 citations
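
As a hedged illustration of the firefly mechanics referred to above (attractiveness decaying with distance plus a small random perturbation), a minimal move step could look like this; beta0, gamma and alpha are illustrative names for the usual attractiveness, absorption and randomisation parameters, not values taken from the paper:

```python
import numpy as np

def firefly_step(x, fitness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One iteration of a minimal firefly-algorithm sketch (minimisation).

    x       : (n, d) array of firefly positions
    fitness : length-n array of objective values (lower is brighter)
    """
    rng = rng or np.random.default_rng()
    n, d = x.shape
    x_new = x.copy()
    for i in range(n):
        for j in range(n):
            if fitness[j] < fitness[i]:                 # j is brighter, so i moves toward j
                r2 = np.sum((x[i] - x[j]) ** 2)          # squared distance
                beta = beta0 * np.exp(-gamma * r2)       # attractiveness decays with distance
                x_new[i] += beta * (x[j] - x[i]) + alpha * (rng.random(d) - 0.5)
    return x_new
```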


Posted Content
TL;DR: In this article, the authors used the Firefly Algorithm to solve nonlinear design problems and showed that the optimal solution found by FA is far better than the best solution obtained previously in literature.
Abstract: Modern optimisation algorithms are often metaheuristic, and they are very promising in solving NP-hard optimization problems. In this paper, we show how to use the recently developed Firefly Algorithm to solve nonlinear design problems. For the standard pressure vessel design optimisation, the optimal solution found by FA is far better than the best solution obtained previously in literature. In addition, we also propose a few new test functions with either singularity or stochastic components but with known global optimality, and thus they can be used to validate new optimisation algorithms. Possible topics for further research are also discussed.

1,864 citations


Journal ArticleDOI
TL;DR: This paper presents a more extensive comparison study using some standard test functions and newly designed stochastic test functions to apply the CS algorithm to solve engineering design optimisation problems, including the design of springs and welded beam structures.
Abstract: A new metaheuristic optimisation algorithm, called cuckoo search (CS), was developed recently by Yang and Deb (2009). This paper presents a more extensive comparison study using some standard test functions and newly designed stochastic test functions. We then apply the CS algorithm to solve engineering design optimisation problems, including the design of springs and welded beam structures. The optimal solutions obtained by CS are far better than the best solutions obtained by an efficient particle swarm optimiser. We will discuss the unique search features used in CS and the implications for further research.

1,339 citations
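
A hedged sketch of the cuckoo-search loop described above (Lévy flights to generate new solutions, and abandonment of a fraction pa of the worst nests) follows; the Lévy steps use Mantegna's algorithm, and re-initialising abandoned nests uniformly is a simplification of the biased random walk used in some implementations:

```python
import numpy as np
from math import gamma as gamma_fn

def levy_step(size, beta=1.5, rng=None):
    """Draw Levy-distributed steps via Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    sigma_u = (gamma_fn(1 + beta) * np.sin(np.pi * beta / 2) /
               (gamma_fn((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(obj, dim, n_nests=25, n_iter=1000, pa=0.25,
                  lower=-5.0, upper=5.0, seed=0):
    """Minimal cuckoo-search sketch (not the authors' reference code)."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lower, upper, (n_nests, dim))
    fit = np.apply_along_axis(obj, 1, nests)
    best = nests[fit.argmin()].copy()

    for _ in range(n_iter):
        # 1) Get a cuckoo by a Levy flight biased toward the current best nest
        i = rng.integers(n_nests)
        cand = np.clip(nests[i] + 0.01 * levy_step(dim, rng=rng) * (nests[i] - best),
                       lower, upper)
        cand_fit = obj(cand)
        j = rng.integers(n_nests)                  # compare against a random nest
        if cand_fit < fit[j]:
            nests[j], fit[j] = cand, cand_fit
        # 2) Abandon a fraction pa of the worst nests and build new ones
        n_drop = max(1, int(round(pa * n_nests)))
        worst = fit.argsort()[-n_drop:]
        nests[worst] = rng.uniform(lower, upper, (n_drop, dim))
        fit[worst] = np.apply_along_axis(obj, 1, nests[worst])
        best = nests[fit.argmin()].copy()
    return best, fit.min()
```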


Book
06 Jul 2010
TL;DR: The author highlights key concepts and techniques for the successful application of commonly used metaheuristic algorithms, including simulated annealing, particle swarm optimization, harmony search, and genetic algorithms.
Abstract: An accessible introduction to metaheuristics and optimization, featuring powerful and modern algorithms for application across engineering and the sciences. From engineering and computer science to economics and management science, optimization is a core component of problem solving. Highlighting the latest developments that have evolved in recent years, Engineering Optimization: An Introduction with Metaheuristic Applications outlines popular metaheuristic algorithms and equips readers with the skills needed to apply these techniques to their own optimization problems. With insightful examples from various fields of study, the author highlights key concepts and techniques for the successful application of commonly used metaheuristic algorithms, including simulated annealing, particle swarm optimization, harmony search, and genetic algorithms. The author introduces all major metaheuristic algorithms and their applications in optimization through a presentation that is organized into three succinct parts. Foundations of Optimization and Algorithms provides a brief introduction to the underlying nature of optimization and the common approaches to optimization problems, random number generation, the Monte Carlo method, and the Markov chain Monte Carlo method. Metaheuristic Algorithms presents common metaheuristic algorithms in detail, including genetic algorithms, simulated annealing, ant algorithms, bee algorithms, particle swarm optimization, firefly algorithms, and harmony search. Applications outlines a wide range of applications that use metaheuristic algorithms to solve challenging optimization problems with detailed implementation, while also introducing various modifications used for multi-objective optimization. Throughout the book, the author presents worked-out examples and real-world applications that illustrate the modern relevance of the topic. A detailed appendix features important and popular algorithms using the MATLAB and Octave software packages, and a related FTP site houses MATLAB code and programs for easy implementation of the discussed techniques. In addition, references to the current literature enable readers to investigate individual algorithms and methods in greater detail. Engineering Optimization: An Introduction with Metaheuristic Applications is an excellent book for courses on optimization and computer simulation at the upper-undergraduate and graduate levels. It is also a valuable reference for researchers and practitioners working in the fields of mathematics, engineering, computer science, operations research, and management science who use metaheuristic algorithms to solve problems in their everyday work.

1,286 citations


BookDOI
TL;DR: The Handbook now includes updated chapters on the best known metaheuristics, including simulated annealing, tabu search, variable neighborhood search, scatter search and path relinking, genetic algorithms, memetic algorithms, genetic programming, ant colony optimization, and multi-start methods.
Abstract: The first edition of the Handbook of Metaheuristics was published in 2003 under the editorship of Fred Glover and Gary A. Kochenberger. Given the numerous developments observed in the field of metaheuristics in recent years, it appeared that the time was ripe for a second edition of the Handbook. When Glover and Kochenberger were unable to prepare this second edition, they suggested that Michel Gendreau and Jean-Yves Potvin should take over the editorship, and so this important new edition is now available. Through its 21 chapters, this second edition is designed to provide a broad coverage of the concepts, implementations and applications in this important field of optimization. Original contributors either revised or updated their work, or provided entirely new chapters. The Handbook now includes updated chapters on the best known metaheuristics, including simulated annealing, tabu search, variable neighborhood search, scatter search and path relinking, genetic algorithms, memetic algorithms, genetic programming, ant colony optimization, multi-start methods, greedy randomized adaptive search procedure, guided local search, hyper-heuristics and parallel metaheuristics. It also contains three new chapters on large neighborhood search, artificial immune systems and hybrid metaheuristics. The last four chapters are devoted to more general issues related to the field of metaheuristics, namely reactive search, stochastic search, fitness landscape analysis and performance comparison.

1,208 citations


Journal ArticleDOI
TL;DR: An improved ABC algorithm, called the gbest-guided ABC (GABC) algorithm, is proposed by incorporating the information of the global best (gbest) solution into the solution search equation to improve the exploitation of the ABC algorithm.

1,105 citations
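
The TL;DR describes adding a global-best term to the ABC solution search equation; a hedged sketch of such a candidate-generation step is shown below. The coefficient C and the sampling ranges for phi and psi are illustrative assumptions, not values quoted from the paper:

```python
import numpy as np

def gbest_guided_candidate(x, i, gbest, C=1.5, rng=None):
    """Sketch of a gbest-guided ABC candidate: perturb one dimension of food
    source i using a random neighbour k and the global best solution.

    x     : (n, d) array of food-source positions
    gbest : length-d array, best solution found so far
    """
    rng = rng or np.random.default_rng()
    n, d = x.shape
    k = rng.choice([s for s in range(n) if s != i])   # random neighbour, k != i
    j = rng.integers(d)                               # single dimension to modify
    phi = rng.uniform(-1.0, 1.0)                      # standard ABC perturbation weight
    psi = rng.uniform(0.0, C)                         # weight of the added gbest term
    cand = x[i].copy()
    cand[j] = x[i, j] + phi * (x[i, j] - x[k, j]) + psi * (gbest[j] - x[i, j])
    return cand
```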


Book ChapterDOI
07 Mar 2010
TL;DR: Numerical studies and results suggest that the proposed Levy-flight firefly algorithm is superior to existing metaheuristic algorithms.
Abstract: Nature-inspired algorithms such as Particle Swarm Optimization and Firefly Algorithm are among the most powerful algorithms for optimization. In this paper, we intend to formulate a new metaheuristic algorithm by combining Levy flights with the search strategy via the Firefly Algorithm. Numerical studies and results suggest that the proposed Levy-flight firefly algorithm is superior to existing metaheuristic algorithms. Finally implications for further research and wider applications will be discussed.

928 citations
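
A hedged sketch of the idea above (the firefly move with a heavy-tailed step in place of the uniform perturbation) follows; a Cauchy draw is used here as a simple stand-in for a Lévy-stable step, so the exact distribution and parameter values differ from the paper's:

```python
import numpy as np

def levy_firefly_step(x, fitness, beta0=1.0, gamma=1.0, alpha=0.1, rng=None):
    """One iteration of a Levy-flight firefly sketch (minimisation).

    The heavy-tailed perturbation uses a Cauchy draw as a stand-in for a
    Levy-stable step; parameter names and defaults are illustrative.
    """
    rng = rng or np.random.default_rng()
    n, d = x.shape
    x_new = x.copy()
    for i in range(n):
        for j in range(n):
            if fitness[j] < fitness[i]:                         # move i toward brighter j
                beta = beta0 * np.exp(-gamma * np.sum((x[i] - x[j]) ** 2))
                levy = rng.standard_cauchy(d)                   # heavy-tailed random step
                x_new[i] += beta * (x[j] - x[i]) + alpha * levy
    return x_new
```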


Book ChapterDOI
12 Jun 2010
TL;DR: It turns out that the proposed Fireworks Algorithm clearly outperforms the two variants of the PSOs in both convergence speed and global solution accuracy.
Abstract: Inspired by observing fireworks explosion, a novel swarm intelligence algorithm, called Fireworks Algorithm (FA), is proposed for global optimization of complex functions. In the proposed FA, two types of explosion (search) processes are employed, and the mechanisms for keeping diversity of sparks are also well designed. In order to demonstrate the validation of the FA, a number of experiments were conducted on nine benchmark test functions to compare the FA with two variants of particle swarm optimization (PSO) algorithms, namely Standard PSO and Clonal PSO. It turns out from the results that the proposed FA clearly outperforms the two variants of the PSOs in both convergence speed and global solution accuracy.

857 citations
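
As a hedged sketch of the allocation idea described above (better fireworks get more sparks with smaller explosion amplitudes), the following helper mirrors the commonly cited fireworks-algorithm formulas; the totals and the amplitude cap are illustrative, and spark-count clamping and the Gaussian-spark operator are omitted:

```python
import numpy as np

def spark_allocation(fitness, total_sparks=50, max_amp=40.0, eps=1e-12):
    """Hedged sketch of fireworks-style spark allocation (minimisation).

    Better fireworks (lower fitness) get more sparks but smaller explosion
    amplitudes; worse fireworks get few, wide-ranging sparks.
    """
    fitness = np.asarray(fitness, dtype=float)
    y_max, y_min = fitness.max(), fitness.min()
    # number of sparks: proportional to how much better than the worst firework
    n_sparks = total_sparks * (y_max - fitness + eps) / (np.sum(y_max - fitness) + eps)
    # explosion amplitude: proportional to how much worse than the best firework
    amps = max_amp * (fitness - y_min + eps) / (np.sum(fitness - y_min) + eps)
    return np.round(n_sparks).astype(int), amps

# e.g. three fireworks with fitness 1.0 (best), 5.0 and 9.0 (worst)
counts, amps = spark_allocation([1.0, 5.0, 9.0])
```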


Journal ArticleDOI
TL;DR: Variable neighbourhood search (VNS) is a metaheuristic, or a framework for building heuristics, based upon systematic changes of neighbourhoods both in descent phase, to find a local minimum, and in perturbation phase to emerge from the corresponding valley as mentioned in this paper.
Abstract: Variable neighbourhood search (VNS) is a metaheuristic, or a framework for building heuristics, based upon systematic changes of neighbourhoods both in descent phase, to find a local minimum, and in perturbation phase to emerge from the corresponding valley. It was first proposed in 1997 and has since then rapidly developed both in its methods and its applications. In the present paper, these two aspects are thoroughly reviewed and an extensive bibliography is provided. Moreover, one section is devoted to newcomers. It consists of steps for developing a heuristic for any particular problem. Those steps are common to the implementation of other metaheuristics.
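
A minimal sketch of the VNS skeleton described above (shaking in the k-th neighbourhood, local descent, and neighbourhood change) might look like this; the function and parameter names are illustrative:

```python
import random

def vns(obj, x0, neighborhoods, local_search, max_iter=100, rng=random):
    """Minimal variable neighbourhood search skeleton (minimisation).

    neighborhoods : list of functions; neighborhoods[k](x, rng) returns a random
                    point in the k-th neighbourhood of x (the "shaking" step).
    local_search  : function mapping a point to a nearby local minimum.
    """
    k_max = len(neighborhoods)
    x, fx = x0, obj(x0)
    for _ in range(max_iter):
        k = 0
        while k < k_max:
            x_shake = neighborhoods[k](x, rng)   # perturbation in neighbourhood k
            x_loc = local_search(x_shake)        # descent to a local minimum
            f_loc = obj(x_loc)
            if f_loc < fx:                       # improvement: recentre, restart at k = 0
                x, fx = x_loc, f_loc
                k = 0
            else:                                # otherwise try a larger neighbourhood
                k += 1
    return x, fx
```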

Book ChapterDOI
TL;DR: This chapter reviews developments in ACO and gives an overview of recent research trends, including the development of high-performing algorithmic variants and theoretical understanding of properties of ACO algorithms.
Abstract: Ant Colony Optimization (ACO) is a metaheuristic that is inspired by the pheromone trail laying and following behavior of some ant species. Artificial ants in ACO are stochastic solution construction procedures that build candidate solutions for the problem instance under concern by exploiting (artificial) pheromone information that is adapted based on the ants’ search experience and possibly available heuristic information. Since the proposal of the Ant System, the first ACO algorithm, many significant research results have been obtained. These contributions focused on the development of high-performing algorithmic variants, the development of a generic algorithmic framework for ACO algorithms, successful applications of ACO algorithms to a wide range of computationally hard problems, and the theoretical understanding of properties of ACO algorithms. This chapter reviews these developments and gives an overview of recent research trends in ACO.
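
As a hedged sketch of the construction and pheromone-update mechanics described above, the following helpers show a single biased component choice and a simple evaporation/deposit step; alpha, beta and rho are the usual illustrative parameter names, not values from any specific ACO variant:

```python
import numpy as np

def choose_next(candidates, tau, eta, alpha=1.0, beta=2.0, rng=None):
    """One ACO construction step: pick the next solution component at random,
    biased by pheromone (tau) and heuristic desirability (eta).

    candidates : indices of the components still allowed
    tau, eta   : NumPy arrays indexed by component
    """
    rng = rng or np.random.default_rng()
    candidates = np.asarray(candidates)
    weights = (tau[candidates] ** alpha) * (eta[candidates] ** beta)
    probs = weights / weights.sum()
    return rng.choice(candidates, p=probs)

def evaporate_and_deposit(tau, used_components, quality, rho=0.1):
    """Global pheromone update: evaporation everywhere, deposit on used components."""
    tau *= (1.0 - rho)
    tau[used_components] += quality
    return tau
```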

Journal ArticleDOI
TL;DR: The main idea and principle of PSO are presented; its advantages and shortcomings are summarized; and several improved versions of PSO and the current state of research are presented.
Abstract: Particle swarm optimization is a heuristic global optimization method based on swarm intelligence. It originates from research on the movement behaviour of bird and fish flocks. The algorithm is widely used and has developed rapidly because it is easy to implement and has few parameters to tune. The main idea and principle of PSO are presented, and its advantages and shortcomings are summarized. Finally, this paper presents several improved versions of PSO, surveys the current state of research, and outlines future research issues.
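
For reference, the canonical PSO update that this review is built around can be sketched as follows (w, c1 and c2 are the usual inertia and acceleration parameters; the default values are illustrative):

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One iteration of the canonical PSO update (minimisation).

    x, v  : (n, d) arrays of particle positions and velocities
    pbest : each particle's best position so far; gbest : swarm-best position
    """
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # inertia + cognitive + social
    return x + v, v
```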

Journal ArticleDOI
TL;DR: A literature review on exact, heuristic and metaheuristic methods that have been proposed for the solution of the hybrid flow shop problem is presented.

Journal ArticleDOI
01 Mar 2010
TL;DR: A novel hybrid algorithm named PSO-DE is proposed, which integrates particle swarm optimization (PSO) with differential evolution (DE) to solve constrained numerical and engineering optimization problems.
Abstract: We propose a novel hybrid algorithm named PSO-DE, which integrates particle swarm optimization (PSO) with differential evolution (DE) to solve constrained numerical and engineering optimization problems. Traditional PSO easily falls into stagnation when no particle discovers a position better than its previous best position for several generations. DE, with its strong searching ability, is incorporated to update the previous best positions of the particles and force PSO to jump out of stagnation. The hybrid algorithm speeds up convergence and improves the algorithm's performance. We test the presented method on 11 well-known benchmark test functions and five engineering optimization functions. Comparisons show that PSO-DE outperforms or performs similarly to seven state-of-the-art approaches in terms of the quality of the resulting solutions.
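
The exact PSO-DE coupling in the paper is more detailed, but a hedged sketch of the core idea (DE mutation and crossover used to refresh the particles' personal bests so that stagnating particles are pulled to new regions) could look like this; F and CR are standard DE parameter names with illustrative defaults:

```python
import numpy as np

def de_refresh_pbests(pbest, pbest_fit, obj, F=0.5, CR=0.9, rng=None):
    """Hedged sketch: apply DE/rand/1 mutation and binomial crossover to the
    personal-best positions, keeping a trial only if it improves.

    pbest     : (n, d) array of personal-best positions (requires n >= 4)
    pbest_fit : length-n array of their objective values (minimisation)
    """
    rng = rng or np.random.default_rng()
    n, d = pbest.shape
    for i in range(n):
        a, b, c = rng.choice([s for s in range(n) if s != i], 3, replace=False)
        mutant = pbest[a] + F * (pbest[b] - pbest[c])      # DE/rand/1 mutation
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True                      # ensure at least one gene crosses
        trial = np.where(cross, mutant, pbest[i])
        f_trial = obj(trial)
        if f_trial < pbest_fit[i]:                         # greedy selection
            pbest[i], pbest_fit[i] = trial, f_trial
    return pbest, pbest_fit
```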

Journal ArticleDOI
TL;DR: This work proposes a new metaheuristic, called chemical reaction optimization (CRO), which mimics the interactions of molecules in a chemical reaction to reach a low energy stable state and can outperform all other metaheuristics when matched to the right problem type.
Abstract: We encounter optimization problems in our daily lives and in various research domains. Some of them are so hard that we can, at best, approximate the best solutions with (meta-) heuristic methods. However, the huge number of optimization problems and the small number of generally acknowledged methods mean that more metaheuristics are needed to fill the gap. We propose a new metaheuristic, called chemical reaction optimization (CRO), to solve optimization problems. It mimics the interactions of molecules in a chemical reaction to reach a low energy stable state. We tested the performance of CRO with three nondeterministic polynomial-time hard combinatorial optimization problems. Two of them were traditional benchmark problems and the other was a real-world problem. Simulation results showed that CRO is very competitive with the few existing successful metaheuristics, having outperformed them in some cases, and CRO achieved the best performance in the real-world problem. Moreover, with the No-Free-Lunch theorem, CRO must have equal performance as the others on average, but it can outperform all other metaheuristics when matched to the right problem type. Therefore, it provides a new approach for solving optimization problems. CRO may potentially solve those problems which may not be solvable with the few generally acknowledged approaches.
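
CRO defines several elementary reactions; as a hedged sketch of just one of them, the following shows an on-wall ineffective collision with its energy-conservation acceptance rule. The neighbour operator and the kinetic-energy loss rate are assumptions for illustration, and the central energy buffer used in the full algorithm is omitted:

```python
import numpy as np

def on_wall_collision(x, pe, ke, obj, neighbour, ke_loss_rate=0.2, rng=None):
    """Sketch of a CRO-style on-wall ineffective collision for one molecule.

    x, pe, ke : the molecule's structure, potential energy f(x), kinetic energy.
    A move is accepted only if the total energy pe + ke can pay for the new
    potential energy; part of the surplus is retained as new kinetic energy
    (the rest would go to the central buffer, omitted here).
    """
    rng = rng or np.random.default_rng()
    x_new = neighbour(x, rng)                  # small structural perturbation
    pe_new = obj(x_new)
    if pe + ke >= pe_new:                      # energy conservation allows the move
        keep = rng.uniform(ke_loss_rate, 1.0)  # fraction of surplus energy retained
        ke_new = (pe + ke - pe_new) * keep
        return x_new, pe_new, ke_new
    return x, pe, ke                           # otherwise the molecule is unchanged
```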

Journal ArticleDOI
Bilal Alatas
TL;DR: It is found that coupling emergent results from different areas, such as ABC and complex dynamics, can improve the quality of results in some optimization problems.
Abstract: The artificial bee colony (ABC) algorithm is one of the newest nature-inspired heuristics for optimization problems. Motivated by the chaos observed in real bee colony behaviour, this paper proposes new ABC algorithms that use chaotic maps for parameter adaptation, in order to improve the convergence characteristics and to prevent the ABC from getting stuck on local solutions. This is done by using chaotic number generators each time a random number is needed by the classical ABC algorithm. Seven new chaotic ABC algorithms have been proposed and different chaotic maps have been analysed on the benchmark functions. It is found that coupling emergent results from different areas, such as ABC and complex dynamics, can improve the quality of results in some optimization problems. It is also shown that the proposed methods somewhat increase solution quality; that is, in some cases they improve the global searching capability by escaping local solutions.
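
A minimal sketch of the substitution the paper describes, with the logistic map standing in for one of the chaotic maps studied, is shown below; the seed and the way the draw is mapped into [-1, 1] are illustrative assumptions:

```python
class LogisticMap:
    """Chaotic number generator: replaces uniform draws in (0, 1) with
    iterates of the logistic map, one of the maps such a scheme can use."""

    def __init__(self, seed=0.7, r=4.0):
        # seed should avoid the map's fixed points (e.g. 0 and 0.75 for r = 4)
        self.z = seed
        self.r = r

    def random(self):
        self.z = self.r * self.z * (1.0 - self.z)   # chaotic iteration
        return self.z

# Drop-in use inside an ABC-style update: phi in [-1, 1] from the chaotic map
chaos = LogisticMap()
phi = 2.0 * chaos.random() - 1.0
```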

Journal ArticleDOI
01 Mar 2010
TL;DR: It is found that not only do the PSO method and its simplified variant have comparable performance when optimizing a number of Artificial Neural Network problems, but the simplified variant also appears to offer a small improvement in some cases.
Abstract: The general purpose optimization method known as Particle Swarm Optimization (PSO) has received much attention in past years, with many attempts to find the variant that performs best on a wide variety of optimization problems. The focus of past research has been on making the PSO method more complex, as this is frequently believed to increase its adaptability to other optimization problems. This study takes the opposite approach and simplifies the PSO method. To compare the efficacy of the original PSO and the simplified variant, an easy technique is presented for efficiently tuning their behavioural parameters. The technique works by employing an overlaid meta-optimizer, which is capable of simultaneously tuning parameters with regard to multiple optimization problems, whereas previous approaches to meta-optimization have tuned behavioural parameters to work well on just a single optimization problem. It is then found that not only do the PSO method and its simplified variant have comparable performance when optimizing a number of Artificial Neural Network problems, but the simplified variant also appears to offer a small improvement in some cases.
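
A hedged sketch of the overlaid meta-optimisation idea follows: an outer search (plain random search here, standing in for the meta-optimiser actually used) rates each candidate parameter set by its average performance over a suite of problems. The run_pso signature and the parameter ranges are assumptions for illustration:

```python
import numpy as np

def meta_optimise(run_pso, problems, n_trials=50, rng=None):
    """Hedged sketch of overlaid meta-optimisation: tune PSO behavioural
    parameters against several problems simultaneously.

    run_pso  : assumed callable run_pso(problem, w, c1, c2) -> best objective value
    problems : list of problem instances the tuned parameters should work on
    """
    rng = rng or np.random.default_rng()
    best_params, best_score = None, np.inf
    for _ in range(n_trials):
        w, c1, c2 = rng.uniform(0.0, 1.0), rng.uniform(0.0, 3.0), rng.uniform(0.0, 3.0)
        # score = average performance over the whole problem suite, not a single problem
        score = np.mean([run_pso(p, w, c1, c2) for p in problems])
        if score < best_score:
            best_params, best_score = (w, c1, c2), score
    return best_params, best_score
```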

Book
25 Jul 2010
TL;DR: This book reviews and introduces the state-of-the-art nature-inspired metaheuristic algorithms for global optimization, including ant and bee algorithms, bat algorithm, cuckoo search, differential evolution, firefly algorithm, genetic algorithms, harmony search, particle swarm optimization, simulated annealing and support vector machines.
Abstract: Modern metaheuristic algorithms such as particle swarm optimization and cuckoo search start to demonstrate their power in dealing with tough optimization problems and even NP-hard problems. This book reviews and introduces the state-of-the-art nature-inspired metaheuristic algorithms for global optimization, including ant and bee algorithms, bat algorithm, cuckoo search, differential evolution, firefly algorithm, genetic algorithms, harmony search, particle swarm optimization, simulated annealing and support vector machines. In this revised edition, we also include how to deal with nonlinear constraints. Worked examples with implementation have been used to show how each algorithm works. This book is thus an ideal textbook for an undergraduate and/or graduate course as well as for self study. As some of the algorithms such as the cuckoo search and firefly algorithms are at the forefront of current research, this book can also serve as a reference for researchers.

Journal ArticleDOI
TL;DR: A modified discrete particle swarm optimization (PSO) algorithm is developed which dynamically accounts for the relevance and dependence of the features included in the feature subset in an adaptive feature selection procedure.

Journal ArticleDOI
TL;DR: The OptFlux software is freely available, together with documentation and other resources, bridging the gap between research in strain optimization algorithms and the final users and providing a user-friendly computational tool for Metabolic Engineering applications.
Abstract: Over the last few years a number of methods have been proposed for the phenotype simulation of microorganisms under different environmental and genetic conditions. These have been used as the basis to support the discovery of successful genetic modifications of the microbial metabolism to address industrial goals. However, the use of these methods has been restricted to bioinformaticians or other expert researchers. The main aim of this work is, therefore, to provide a user-friendly computational tool for Metabolic Engineering applications. OptFlux is an open-source and modular software application aimed at being the reference computational tool in the field. It is the first tool to incorporate strain optimization tasks, i.e., the identification of Metabolic Engineering targets, using Evolutionary Algorithms/Simulated Annealing metaheuristics or the previously proposed OptKnock algorithm. It also allows the use of stoichiometric metabolic models for (i) phenotype simulation of both wild-type and mutant organisms, using the methods of Flux Balance Analysis, Minimization of Metabolic Adjustment or Regulatory on/off Minimization of Metabolic flux changes, (ii) Metabolic Flux Analysis, computing the admissible flux space given a set of measured fluxes, and (iii) pathway analysis through the calculation of Elementary Flux Modes. OptFlux also contemplates several methods for model simplification and other pre-processing operations aimed at reducing the search space for optimization algorithms. The software supports importing/exporting to several flat file formats and it is compatible with the SBML standard. OptFlux has a visualization module that allows the analysis of the model structure and is compatible with the layout information of Cell Designer, allowing the superimposition of simulation results with the model graph. The OptFlux software is freely available, together with documentation and other resources, thus bridging the gap between research in strain optimization algorithms and the final users. It is a valuable platform for researchers in the field, offering them a number of useful tools. Its open-source nature invites contributions by all those interested in making their methods available for the community. Given its plug-in based architecture it can be extended with new functionalities. Currently, several plug-ins are being developed, including network topology analysis tools and the integration with Boolean network based regulatory models.

Journal ArticleDOI
TL;DR: The paper reveals the complexity of the scheduling problem in Computational Grids when compared to scheduling in classical parallel and distributed systems and shows the usefulness of heuristic and meta-heuristic approaches for the design of efficient Grid schedulers.

Journal ArticleDOI
TL;DR: In the proposed SGHS algorithm, a new improvisation scheme is developed so that the good information captured in the current global best solution can be well utilized to generate new harmonies.
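
The TL;DR describes an improvisation scheme that exploits the current global best harmony; a hedged sketch of such a global-best-guided improvisation step (not the paper's exact SGHS scheme, which also self-adapts its parameters) could be:

```python
import numpy as np

def improvise(memory, best, hmcr=0.9, par=0.3, bw=0.05,
              lower=-5.0, upper=5.0, rng=None):
    """Hedged sketch of a global-best-guided harmony-search improvisation.

    memory : (hms, d) harmony memory; best : best harmony found so far.
    With probability hmcr a value comes from memory; with probability par it is
    then pitch-adjusted around the global best's value for that variable (one
    way to exploit "the good information captured in the current global best").
    """
    rng = rng or np.random.default_rng()
    hms, d = memory.shape
    new = np.empty(d)
    for j in range(d):
        if rng.random() < hmcr:
            new[j] = memory[rng.integers(hms), j]             # memory consideration
            if rng.random() < par:
                new[j] = best[j] + bw * rng.uniform(-1, 1)    # pitch adjust near global best
        else:
            new[j] = rng.uniform(lower, upper)                # random selection
    return new
```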

Journal Article
TL;DR: This paper proposes a new metaheuristic method, the Bat Algorithm, based on the echolocation behaviour of bats, and compares it with other existing algorithms, including genetic algorithms and particle swarm optimization.
Abstract: Metaheuristic algorithms such as particle swarm optimization, firefly algorithm and harmony search are now becoming powerful methods for solving many tough optimization problems. In this paper, we propose a new metaheuristic method, the Bat Algorithm, based on the echolocation behaviour of bats. We also intend to combine the advantages of existing algorithms into the new bat algorithm. After a detailed formulation and explanation of its implementation, we will then compare the proposed algorithm with other existing algorithms, including genetic algorithms and particle swarm optimization. Simulations show that the proposed algorithm seems much superior to other algorithms, and further studies are also discussed.

Book
04 Nov 2010
TL;DR: The presenters show how runtime behavior can be analyzed in a rigorous way, in particular for combinatorial optimization, and show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems.
Abstract: Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial explains the most important results achieved in this area. The presenters show how runtime behavior can be analyzed in a rigorous way, in particular for combinatorial optimization. They present well-known problems such as minimum spanning trees, shortest paths, maximum matching, and covering and scheduling problems. Classical single-objective optimization is examined first. They then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.

Journal ArticleDOI
TL;DR: This paper presents an improved ant colony optimization (IACO) for constrained engineering design problems that has the capacity to handle continuous and discrete problems by using sub‐optimization mechanism (SOM), based on the principles of finite element method working as a search‐space updating technique.
Abstract: Purpose – The computational drawbacks of existing numerical methods have forced researchers to rely on heuristic algorithms. Heuristic methods are powerful in obtaining the solution of optimization problems. Although they are approximate methods (i.e. their solutions are good, but not provably optimal), they do not require the derivatives of the objective function and constraints. Also, they use probabilistic transition rules instead of deterministic rules. The purpose of this paper is to present an improved ant colony optimization (IACO) for constrained engineering design problems. Design/methodology/approach – IACO has the capacity to handle continuous and discrete problems by using a sub-optimization mechanism (SOM). SOM is based on the principles of the finite element method, working as a search-space updating technique. Also, SOM can reduce the size of pheromone matrices, decision vectors and the number of evaluations. Though IACO decreases pheromone updating operations as well as optimization time, the proba...

Journal ArticleDOI
TL;DR: In this article, a quantum-inspired particle swarm optimization (QPSO) is proposed, which has stronger search ability and quicker convergence speed, not only because of the introduction of quantum computing theory, but also due to two special implementations: self-adaptive probability selection and chaotic sequences mutation.
Abstract: Economic load dispatch (ELD) is an important topic in the operation of power plants which can help to build up effective generating management plans. The ELD problem has a nonsmooth cost function with equality and inequality constraints, which makes it difficult to solve effectively. Different heuristic optimization methods have been proposed to solve this problem in previous studies. In this paper, quantum-inspired particle swarm optimization (QPSO) is proposed, which has stronger search ability and quicker convergence speed, not only because of the introduction of quantum computing theory, but also due to two special implementations: self-adaptive probability selection and chaotic sequences mutation. The proposed approach is tested with five standard benchmark functions and three power system cases consisting of 3, 13, and 40 thermal units. Comparisons with similar approaches, including evolutionary programming (EP), the genetic algorithm (GA), the immune algorithm (IA), and other versions of particle swarm optimization (PSO), are given. The promising results illustrate the efficiency of the proposed method and show that it could be used as a reliable tool for solving ELD problems.
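
The paper's QPSO adds self-adaptive probability selection and chaotic sequence mutation; as a hedged sketch of only the underlying quantum-behaved position update commonly attributed to QPSO, one iteration could be written as:

```python
import numpy as np

def qpso_step(x, pbest, gbest, beta=0.75, rng=None):
    """Hedged sketch of the basic quantum-behaved PSO position update.

    x, pbest : (n, d) arrays of positions and personal bests; gbest : length-d
    array. The paper's self-adaptive selection and chaotic mutation are omitted;
    beta is the usual contraction-expansion coefficient (illustrative value).
    """
    rng = rng or np.random.default_rng()
    n, d = x.shape
    mbest = pbest.mean(axis=0)                          # mean of personal bests
    phi = rng.random((n, d))
    p = phi * pbest + (1.0 - phi) * gbest               # local attractor per particle
    u = rng.uniform(1e-12, 1.0, (n, d))                 # avoid log(1/0)
    sign = np.where(rng.random((n, d)) < 0.5, 1.0, -1.0)
    return p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
```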

Proceedings ArticleDOI
18 Jul 2010
TL;DR: The design issues underlying jMetal are described, focusing mainly on its internal architecture, with the aim of offering a comprehensive view of its main features to interested researchers.
Abstract: jMetal is a Java-based framework for multi-objective optimization using metaheuristics. It is a flexible, extensible, and easy-to-use software package that has been used in a wide range of applications. In this paper, we describe the design issues underlying jMetal, focusing mainly on its internal architecture, with the aim of offering a comprehensive view of its main features to interested researchers. Among the covered topics, we detail the basic components facilitating the implementation of multi-objective metaheuristics (solution representations, operators, problems, density estimators, archives), the included quality indicators to assess the performance of the algorithms, and jMetal's support to carry out full experimental studies.

Journal ArticleDOI
01 Jun 2010
TL;DR: A Knowledge-Based Ant Colony Optimization (KBACO) algorithm is proposed in this paper for the Flexible Job Shop Scheduling Problem (FJSSP) and results indicate that the proposed KBACO algorithm outperforms some current approaches in the quality of schedules.
Abstract: A Knowledge-Based Ant Colony Optimization (KBACO) algorithm is proposed in this paper for the Flexible Job Shop Scheduling Problem (FJSSP). The KBACO algorithm provides an effective integration between an Ant Colony Optimization (ACO) model and a knowledge model. In the KBACO algorithm, the knowledge model learns some available knowledge from the optimization of ACO, and then applies the existing knowledge to guide the current heuristic searching. The performance of KBACO was evaluated on a large range of benchmark instances taken from the literature and some generated by ourselves. Final experimental results indicate that the proposed KBACO algorithm outperforms some current approaches in the quality of schedules.

Journal ArticleDOI
TL;DR: A multi-layer framework that combines stochastic optimization, filtering, and local optimization is introduced and quantitative 3D pose tracking results for the complete HumanEva-II dataset are provided.
Abstract: Local optimization and filtering have been widely applied to model-based 3D human motion capture. Global stochastic optimization has recently been proposed as a promising alternative for tracking and initialization. In order to benefit from optimization and filtering, we introduce a multi-layer framework that combines stochastic optimization, filtering, and local optimization. While the first layer relies on interacting simulated annealing and some weak prior information on physical constraints, the second layer refines the estimates by filtering and local optimization such that the accuracy is increased and ambiguities are resolved over time without imposing restrictions on the dynamics. In our experimental evaluation, we demonstrate the significant improvements of the multi-layer framework and provide quantitative 3D pose tracking results for the complete HumanEva-II dataset. The paper further comprises a comparison of global stochastic optimization with particle filtering, annealed particle filtering, and local optimization.