
Showing papers on "Multi-swarm optimization published in 2008"


Journal ArticleDOI
TL;DR: This paper discusses natural biogeography and its mathematics, shows how they can be applied to optimization problems, and observes that BBO shares features with other biology-based optimization methods such as GAs and particle swarm optimization (PSO).
Abstract: Biogeography is the study of the geographical distribution of biological organisms. Mathematical equations that govern the distribution of organisms were first discovered and developed during the 1960s. The mindset of the engineer is that we can learn from nature. This motivates the application of biogeography to optimization problems. Just as the mathematics of biological genetics inspired the development of genetic algorithms (GAs), and the mathematics of biological neurons inspired the development of artificial neural networks, this paper considers the mathematics of biogeography as the basis for the development of a new field: biogeography-based optimization (BBO). We discuss natural biogeography and its mathematics, and then discuss how it can be used to solve optimization problems. We see that BBO has features in common with other biology-based optimization methods, such as GAs and particle swarm optimization (PSO). This makes BBO applicable to many of the same types of problems that GAs and PSO are used for, namely, high-dimension problems with multiple local optima. However, BBO also has some features that are unique among biology-based optimization methods. We demonstrate the performance of BBO on a set of 14 standard benchmarks and compare it with seven other biology-based optimization algorithms. We also demonstrate BBO on a real-world sensor selection problem for aircraft engine health estimation.

3,418 citations
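
The migration idea at the heart of BBO can be illustrated in a few lines. The sketch below is a minimal, illustrative reading of the abstract (better habitats tend to donate solution features, worse habitats tend to receive them, assuming at least two habitats); the function name, the linear migration model and all parameter values are assumptions for illustration, not taken from the paper.

```python
import random

def bbo_migration(population, fitness, max_immigration=1.0, max_emigration=1.0):
    """One illustrative BBO migration step for maximisation: low-fitness
    habitats (solutions) are likely to immigrate features, high-fitness
    habitats are likely to emigrate (donate) them."""
    n = len(population)
    # Rank habitats from worst (rank 0) to best (rank n-1) by fitness.
    order = sorted(range(n), key=lambda i: fitness[i])
    rank = {hab: r for r, hab in enumerate(order)}
    denom = max(n - 1, 1)
    lam = [max_immigration * (1 - rank[i] / denom) for i in range(n)]  # immigration rates
    mu = [max_emigration * rank[i] / denom for i in range(n)]          # emigration rates

    new_population = [list(h) for h in population]
    for i in range(n):
        for d in range(len(population[i])):
            if random.random() < lam[i]:
                # Pick a donor habitat with probability proportional to its emigration rate.
                donor = random.choices(range(n), weights=mu, k=1)[0]
                new_population[i][d] = population[donor][d]
    return new_population
```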


Journal ArticleDOI
01 Jan 2008
TL;DR: The simulation results show that the performance of the ABC algorithm is comparable to that of differential evolution, particle swarm optimization and an evolutionary algorithm, and that it can be efficiently employed to solve engineering problems with high dimensionality.
Abstract: The artificial bee colony (ABC) algorithm is an optimization algorithm based on a particular intelligent behaviour of honeybee swarms. This work compares the performance of the ABC algorithm with that of differential evolution (DE), particle swarm optimization (PSO) and an evolutionary algorithm (EA) on multi-dimensional numeric problems. The simulation results show that the performance of the ABC algorithm is comparable to that of the other algorithms and that it can be efficiently employed to solve engineering problems with high dimensionality.

3,242 citations
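
As a rough illustration of the neighbourhood search ABC performs, the sketch below implements the widely cited "employed bees" update: perturb one dimension of a food source towards or away from a random neighbour and keep the candidate only if it improves. Function and variable names are mine, and the full ABC cycle (onlooker and scout phases, trial counters) is omitted.

```python
import random

def abc_employed_phase(foods, cost, objective):
    """One illustrative 'employed bees' pass (minimisation): each food source
    is perturbed along a single random dimension towards/away from a random
    neighbour, and replaced only if the candidate is better."""
    n, dim = len(foods), len(foods[0])
    for i in range(n):
        k = random.choice([idx for idx in range(n) if idx != i])  # a different food source
        j = random.randrange(dim)                                 # one random dimension
        phi = random.uniform(-1.0, 1.0)
        candidate = list(foods[i])
        candidate[j] = foods[i][j] + phi * (foods[i][j] - foods[k][j])
        c = objective(candidate)
        if c < cost[i]:                                           # greedy replacement
            foods[i], cost[i] = candidate, c
    return foods, cost

# Example: one pass over three random 5-D food sources for the sphere function.
sphere = lambda x: sum(v * v for v in x)
foods = [[random.uniform(-5, 5) for _ in range(5)] for _ in range(3)]
cost = [sphere(f) for f in foods]
foods, cost = abc_employed_phase(foods, cost, sphere)
```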


Journal ArticleDOI
TL;DR: This paper presents a detailed overview of the basic concepts of PSO and its variants, and provides a comprehensive survey on the power system applications that have benefited from the powerful nature ofPSO as an optimization technique.
Abstract: Many areas in power systems require solving one or more nonlinear optimization problems. While analytical methods might suffer from slow convergence and the curse of dimensionality, heuristics-based swarm intelligence can be an efficient alternative. Particle swarm optimization (PSO), part of the swarm intelligence family, is known to effectively solve large-scale nonlinear optimization problems. This paper presents a detailed overview of the basic concepts of PSO and its variants. Also, it provides a comprehensive survey on the power system applications that have benefited from the powerful nature of PSO as an optimization technique. For each application, technical details that are required for applying PSO, such as its type, particle formulation (solution representation), and the most efficient fitness functions are also discussed.

2,147 citations
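
For readers new to the method, the canonical global-best PSO that such surveys build on fits in a short routine. The sketch below is a generic, minimal implementation for minimisation with commonly used inertia/acceleration values; it is not tied to any particular variant or power-system formulation surveyed in the paper.

```python
import random

def pso(objective, dim, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49,
        bounds=(-5.0, 5.0)):
    """Minimal global-best PSO (minimisation) using the standard
    inertia-weight velocity and position update."""
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in x]
    pbest_f = [objective(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))  # clamp to the search bounds
            f = objective(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = list(x[i]), f
                if f < gbest_f:
                    gbest, gbest_f = list(x[i]), f
    return gbest, gbest_f

# Example: minimise the 10-dimensional sphere function.
best, best_value = pso(lambda p: sum(xi * xi for xi in p), dim=10)
```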


Journal ArticleDOI
01 Sep 2008
TL;DR: Experimental results showed the proposed PSO-SVM model can correctly select the discriminating input features and also achieve high classification accuracy.
Abstract: This study proposed a novel PSO-SVM model that hybridized the particle swarm optimization (PSO) and support vector machines (SVM) to improve the classification accuracy with a small and appropriate feature subset. This optimization mechanism combined the discrete PSO with the continuous-valued PSO to simultaneously optimize the input feature subset selection and the SVM kernel parameter setting. The hybrid PSO-SVM data mining system was implemented via a distributed architecture using the web service technology to reduce the computational time. In a heterogeneous computing environment, the PSO optimization was performed on the application server and the SVM model was trained on the client (agent) computer. The experimental results showed the proposed approach can correctly select the discriminating input features and also achieve high classification accuracy.

499 citations


Journal ArticleDOI
01 Mar 2008
TL;DR: A hybrid method combining two heuristic optimization techniques, genetic algorithms (GA) and particle swarm optimization (PSO), for the global optimization of multimodal functions; experiments demonstrate the superiority of the hybrid GA-PSO approach over the other four search techniques in terms of solution quality and convergence rates.
Abstract: Heuristic optimization provides a robust and efficient approach for solving complex real-world problems. The focus of this research is on a hybrid method combining two heuristic optimization techniques, genetic algorithms (GA) and particle swarm optimization (PSO), for the global optimization of multimodal functions. Denoted as GA-PSO, this hybrid technique incorporates concepts from GA and PSO and creates individuals in a new generation not only by crossover and mutation operations as found in GA but also by mechanisms of PSO. The results of various experimental studies using a suite of 17 multimodal test functions taken from the literature have demonstrated the superiority of the hybrid GA-PSO approach over the other four search techniques in terms of solution quality and convergence rates.

491 citations
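
The abstract describes offspring being created both by GA crossover/mutation and by PSO mechanisms. The sketch below shows one plausible way to combine the two in a single generation; the split of the population, the arithmetic crossover, and all parameter values are my own illustrative choices, not the paper's exact GA-PSO scheme.

```python
import random

def ga_pso_generation(pop, vel, pbest, gbest, objective, ga_fraction=0.5,
                      w=0.7, c1=1.5, c2=1.5, mut_rate=0.05, mut_scale=0.1):
    """Illustrative hybrid generation (minimisation): the better part of the
    population is moved with PSO velocity updates, the rest is recreated with
    GA crossover and mutation."""
    n, dim = len(pop), len(pop[0])
    ranked = sorted(range(n), key=lambda i: objective(pop[i]))   # best first
    n_pso = max(2, int(n * (1 - ga_fraction)))
    pso_idx, ga_idx = ranked[:n_pso], ranked[n_pso:]

    new_pop = [list(p) for p in pop]
    for i in pso_idx:                      # PSO update on the better individuals
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d] + c1 * r1 * (pbest[i][d] - pop[i][d])
                         + c2 * r2 * (gbest[d] - pop[i][d]))
            new_pop[i][d] = pop[i][d] + vel[i][d]
    for i in ga_idx:                       # GA recreation of the worse individuals
        p1, p2 = random.sample(pso_idx, 2)
        alpha = random.random()            # arithmetic crossover of two good parents
        child = [alpha * pop[p1][d] + (1 - alpha) * pop[p2][d] for d in range(dim)]
        new_pop[i] = [g + random.gauss(0.0, mut_scale) if random.random() < mut_rate else g
                      for g in child]      # Gaussian mutation
    return new_pop, vel
```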


Journal ArticleDOI
TL;DR: This paper proposes to apply a novel self-organizing hierarchical particle swarm optimization (SOH_PSO) for the nonconvex economic dispatch (NCED) and shows that the proposed approach outperforms previous methods for NCED.
Abstract: The economic dispatch has the objective of generation allocation to the power generators in such a manner that the total fuel cost is minimized while all operating constraints are satisfied. Conventional optimization methods assume generator cost curves to be continuous and monotonically increasing, but modern generators have a variety of nonlinearities in their cost curves making this assumption inaccurate, and the resulting approximate dispatches cause a lot of revenue loss. Evolutionary methods like particle swarm optimization perform better for such problems as no convexity assumptions are imposed, but these methods converge to sub-optimum solutions prematurely, particularly for multimodal problems. To handle the problem of premature convergence, this paper proposes to apply a novel self-organizing hierarchical particle swarm optimization (SOH_PSO) for the nonconvex economic dispatch (NCED). The performance further improves when time-varying acceleration coefficients are included. The results show that the proposed approach outperforms previous methods for NCED.

484 citations
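
The time-varying acceleration coefficients mentioned in the abstract are usually implemented as simple linear ramps over the run. The sketch below shows a commonly used schedule (cognitive coefficient decreasing, social coefficient increasing); the start and end values are typical literature settings, not necessarily those used in the SOH_PSO paper.

```python
def tvac_coefficients(t, t_max, c1_start=2.5, c1_end=0.5, c2_start=0.5, c2_end=2.5):
    """A common time-varying acceleration coefficient schedule: the cognitive
    weight c1 is ramped down and the social weight c2 ramped up, favouring
    exploration early and convergence towards the global best later."""
    frac = t / t_max
    c1 = c1_start + (c1_end - c1_start) * frac
    c2 = c2_start + (c2_end - c2_start) * frac
    return c1, c2

# Example: coefficients at the start, middle and end of a 1000-iteration run.
for t in (0, 500, 1000):
    print(t, tvac_coefficients(t, 1000))
```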


Journal ArticleDOI
TL;DR: This paper aims to offer a compendious and timely review of the field and the challenges and opportunities offered by this welcome addition to the optimization toolbox.
Abstract: Particle Swarm Optimization (PSO), in its present form, has been in existence for roughly a decade, with formative research in related domains (such as social modelling, computer graphics, simulation and animation of natural swarms or flocks) for some years before that; a relatively short time compared with some of the other natural computing paradigms such as artificial neural networks and evolutionary computation. However, in that short period, PSO has gained widespread appeal amongst researchers and has been shown to offer good performance in a variety of application domains, with potential for hybridisation and specialisation, and demonstration of some interesting emergent behaviour. This paper aims to offer a compendious and timely review of the field and the challenges and opportunities offered by this welcome addition to the optimization toolbox. Part I discusses the location of PSO within the broader domain of natural computing, considers the development of the algorithm, and refinements introduced to prevent swarm stagnation and tackle dynamic environments. Part II considers current research in hybridisation, combinatorial problems, multicriteria and constrained optimization, and a range of indicative application areas.

475 citations


Book ChapterDOI
22 Sep 2008
TL;DR: An adaptive particle swarm optimization with adaptive parameters and elitist learning strategy (ELS) based on the evolutionary state estimation (ESE) approach is proposed, resulting in substantially improved quality of global solutions.
Abstract: This paper proposes an adaptive particle swarm optimization (APSO) with adaptive parameters and elitist learning strategy (ELS) based on the evolutionary state estimation (ESE) approach. The ESE approach develops an "evolutionary factor" by using the population distribution information and relative particle fitness information in each generation, and estimates the evolutionary state through a fuzzy classification method. According to the identified state and taking into account various effects of the algorithm-controlling parameters, adaptive control strategies are developed for the inertia weight and acceleration coefficients for faster convergence speed. Further, an adaptive "elitist learning strategy" (ELS) is designed for the best particle to jump out of possible local optima and/or to refine its accuracy, resulting in substantially improved quality of global solutions. The APSO algorithm is tested on 6 unimodal and multimodal functions, and the experimental results demonstrate that the APSO generally outperforms the compared PSOs in terms of solution accuracy, convergence speed and algorithm reliability.

442 citations
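
To make the "evolutionary factor" idea concrete, the sketch below computes one commonly described form of it from the population distribution: the mean distance of the globally best particle to the rest, normalised by the swarm's minimum and maximum mean distances. The fuzzy classification of evolutionary states and the parameter adaptation rules are omitted, and the exact formula here should be treated as an assumption rather than the paper's definition.

```python
import math

def evolutionary_factor(positions, best_index):
    """Evolutionary factor in [0, 1] built from the swarm's spatial distribution:
    small values suggest convergence/exploitation, large values exploration."""
    n = len(positions)

    def mean_dist(i):
        # Mean Euclidean distance from particle i to every other particle.
        return sum(math.dist(positions[i], positions[j]) for j in range(n) if j != i) / (n - 1)

    d = [mean_dist(i) for i in range(n)]
    d_min, d_max = min(d), max(d)
    if d_max == d_min:
        return 0.0
    return (d[best_index] - d_min) / (d_max - d_min)
```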


Book ChapterDOI
01 Jan 2008
TL;DR: This chapter presents two recent algorithms for evolutionary optimization – particle swarm optimization (PSO) and differential evolution (DE) – which are inspired by biological and sociological motivations and can handle optimization on rough, discontinuous and multimodal surfaces.
Abstract: Since the beginning of the nineteenth century, a significant evolution in optimization theory has been noticed. Classical linear programming and traditional non-linear optimization techniques such as Lagrange's multiplier, Bellman's principle and Pontryagin's principle were prevalent until this century. Unfortunately, these derivative-based optimization techniques can no longer be used to determine the optima on rough non-linear surfaces. One solution to this problem has already been put forward by the evolutionary algorithms research community. The genetic algorithm (GA), enunciated by Holland, is one such popular algorithm. This chapter presents two recent algorithms for evolutionary optimization – particle swarm optimization (PSO) and differential evolution (DE). The algorithms are inspired by biological and sociological motivations and can take care of optimality on rough, discontinuous and multimodal surfaces. The chapter explores several schemes for controlling the convergence behaviors of PSO and DE by a judicious selection of their parameters. Special emphasis is given to the hybridization of PSO and DE with other soft computing tools. The article finally discusses the mutual synergy of PSO with DE, leading to a more powerful global search algorithm, and its practical applications.

426 citations
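
As a companion to the chapter's discussion of DE, the sketch below shows one generation of the classic DE/rand/1/bin variant (scaled-difference mutation, binomial crossover, greedy selection) for a population of at least four vectors. The parameter values F and CR are typical defaults, not recommendations from the chapter.

```python
import random

def de_rand_1_bin(pop, cost, objective, F=0.5, CR=0.9):
    """One generation of DE/rand/1/bin (minimisation): mutate with a scaled
    difference of two random vectors, binomially cross over with the target,
    and keep whichever of target/trial is better."""
    n, dim = len(pop), len(pop[0])
    for i in range(n):
        r1, r2, r3 = random.sample([j for j in range(n) if j != i], 3)
        mutant = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
        j_rand = random.randrange(dim)        # guarantees at least one mutant gene
        trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                 for d in range(dim)]
        c = objective(trial)
        if c <= cost[i]:
            pop[i], cost[i] = trial, c
    return pop, cost
```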


Book ChapterDOI
TL;DR: This chapter focuses on two of the most successful examples of optimization techniques inspired by swarm intelligence: ant colony optimization and particle swarm optimization.
Abstract: Optimization techniques inspired by swarm intelligence have become increasingly popular during the last decade. They are characterized by a decentralized way of working that mimics the behavior of swarms of social insects, flocks of birds, or schools of fish. The advantage of these approaches over traditional techniques is their robustness and flexibility. These properties make swarm intelligence a successful design paradigm for algorithms that deal with increasingly complex problems. In this chapter we focus on two of the most successful examples of optimization techniques inspired by swarm intelligence: ant colony optimization and particle swarm optimization. Ant colony optimization was introduced as a technique for combinatorial optimization in the early 1990s. The inspiring source of ant colony optimization is the foraging behavior of real ant colonies. In addition, particle swarm optimization was introduced for continuous optimization in the mid-1990s, inspired by bird flocking.

389 citations


Journal ArticleDOI
01 Sep 2008
TL;DR: In this article, the authors presented the application and performance comparison of particle swarm optimization (PSO) and genetic algorithms (GA) for flexible ac transmission system (FACTS)-based controller design.
Abstract: Recently, genetic algorithm (GA) and particle swarm optimization (PSO) techniques have attracted considerable attention among various modern heuristic optimization techniques. The GA has been popular in academia and the industry mainly because of its intuitiveness, ease of implementation, and the ability to effectively solve highly non-linear, mixed-integer optimization problems that are typical of complex engineering systems. The PSO technique is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations. Since the two approaches are supposed to find a solution to a given objective function but employ different strategies and computational effort, it is appropriate to compare their performance. This paper presents the application and performance comparison of PSO and GA optimization techniques for flexible ac transmission system (FACTS)-based controller design. The design objective is to enhance power system stability. The design problem of the FACTS-based controller is formulated as an optimization problem, and both PSO and GA optimization techniques are employed to search for optimal controller parameters. The performance of both optimization techniques in terms of computational effort, computational time and convergence rate is compared. Further, the optimized controllers are tested on a weakly connected power system subjected to different disturbances over a wide range of loading conditions and parameter variations, and their performance is compared with the conventional power system stabilizer (CPSS). The eigenvalue analysis and non-linear simulation results are presented and compared to show the effectiveness of both techniques in designing a FACTS-based controller to enhance power system stability.

Journal ArticleDOI
TL;DR: The original version uses fixed population size but a method for gradually reducing population size is proposed, which improves the efficiency and robustness of the algorithm and can be applied to any variant of a Differential Evolution algorithm.
Abstract: This paper studies the efficiency of a recently defined population-based direct global optimization method called Differential Evolution with self-adaptive control parameters. The original version uses fixed population size, but a method for gradually reducing population size is proposed in this paper. It improves the efficiency and robustness of the algorithm and can be applied to any variant of a Differential Evolution algorithm. The proposed modification is tested on commonly used benchmark problems for unconstrained optimization and compared with other optimization methods such as Evolutionary Algorithms and Evolution Strategies.
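
One simple way to realise the gradual population reduction described in the abstract is to halve the population at predefined points in the run, pairing individuals and keeping the better of each pair. The sketch below illustrates that idea only; the actual reduction schedule and pairing rule used in the paper may differ.

```python
def reduce_population(pop, cost):
    """Halve the population by pairing each member of the first half with one
    from the second half and keeping the better (lower-cost) of each pair.
    Deciding when to trigger the reduction is left to the caller."""
    half = len(pop) // 2
    new_pop, new_cost = [], []
    for i in range(half):
        j = i + half
        if cost[i] <= cost[j]:
            new_pop.append(pop[i])
            new_cost.append(cost[i])
        else:
            new_pop.append(pop[j])
            new_cost.append(cost[j])
    return new_pop, new_cost
```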

Journal ArticleDOI
TL;DR: An optimization procedure designed as a generic multi-objective, multi-optima optimizer that handles a wide variety of optimization problems and automatically degenerates to efficient algorithms for simpler optimization problems.

Journal ArticleDOI
TL;DR: It is shown that the PSO method is efficient both for minimization and for constructing the confidence region of parameter estimates, that the elliptical approximation of confidence regions of nonlinear model parameters can sometimes be very poor, and that more accurate likelihood confidence regions can be constructed with PSO.

Journal Article
TL;DR: Solving Engineering Optimization Problems with the Simple Constrained Particle Swarm Optimizer.
Abstract: Solving Engineering Optimization Problems with the Simple Constrained Particle Swarm Optimizer

Journal ArticleDOI
TL;DR: Improved PSO approaches are proposed for solving EDPs that take into account nonlinear generator features such as ramp-rate limits and prohibited operating zones in power system operation.

Proceedings ArticleDOI
01 Jun 2008
TL;DR: This paper examines several definitions of swarm diversity with the intention of determining their usefulness in quantifying swarm exploration/exploitation, laying the foundations for the development of a suitable means to quantify the rate of change of diversity.
Abstract: An important factor contributing to the success of particle swarm optimization (PSO) is the balance between exploration and exploitation of the swarm. Exploration is typically preferred at the initial stages of the search but is required to gradually give way to exploitation of promising solutions as the search progresses. The diversity of a particle swarm optimization algorithm can be defined, simply, as the degree of dispersion of the particles in the swarm. This dispersion could be defined around some center-point or not. It could also be defined based on the positions of the particles or on their velocities. This paper takes a look at some of the different definitions of swarm diversity with the intention of determining their usefulness in quantifying swarm exploration/exploitation. This work is intended to lay the foundations for the development of a suitable means to quantify the rate of change from exploration to exploitation of a PSO, i.e. the rate of change of diversity.
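
One of the centre-based dispersion measures discussed in the paper can be written in a few lines: the mean Euclidean distance of the particles from the swarm centroid. The sketch below shows that position-based variant; a velocity-based diversity is obtained by passing the velocity vectors instead of the positions.

```python
import math

def swarm_diversity(positions):
    """Diversity as the mean Euclidean distance of the particles from the
    swarm centre (one of several possible dispersion definitions)."""
    n, dim = len(positions), len(positions[0])
    centre = [sum(p[d] for p in positions) / n for d in range(dim)]
    return sum(math.dist(p, centre) for p in positions) / n

# Example: a tight cluster has lower diversity than a spread-out swarm.
print(swarm_diversity([[0.0, 0.0], [0.1, 0.1], [0.2, 0.0]]))
print(swarm_diversity([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]]))
```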

Journal ArticleDOI
TL;DR: An improved particle swarm optimization algorithm (IPSO) is presented to improve the performance of standard PSO; it uses a dynamic inertia weight that decreases as the iteration count increases.
Abstract: The particle swarm optimization (PSO) algorithm has developed rapidly and has been applied widely since it was introduced, as it is easily understood and realized. This paper presents an improved particle swarm optimization algorithm (IPSO) to improve the performance of standard PSO, which uses a dynamic inertia weight that decreases as the iteration count increases. It is tested on a set of 6 benchmark functions in 30, 50 and 150 dimensions and compared with standard PSO. Experimental results indicate that the IPSO improves the search performance on the benchmark functions significantly.
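
A dynamic inertia weight of the kind described in the abstract is typically a simple decay over the iterations. The sketch below shows the widely used linear decrease from 0.9 to 0.4; the exact decay law and bounds used by the IPSO paper may differ.

```python
def inertia_weight(t, t_max, w_start=0.9, w_end=0.4):
    """Linearly decreasing inertia weight: large early in the run to favour
    exploration, small late in the run to favour exploitation."""
    return w_start - (w_start - w_end) * t / t_max

# Example: weight at the start, middle and end of a 500-iteration run.
print(inertia_weight(0, 500), inertia_weight(250, 500), inertia_weight(500, 500))
```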

Journal ArticleDOI
01 Jun 2008
TL;DR: A new hybrid particle swarm optimization that incorporates a wavelet-theory-based mutation operation is proposed that significantly outperforms the existing methods in terms of convergence speed, solution quality, and solution stability.
Abstract: A new hybrid particle swarm optimization (PSO) that incorporates a wavelet-theory-based mutation operation is proposed. It applies the wavelet theory to enhance the PSO in exploring the solution space more effectively for a better solution. A suite of benchmark test functions and three industrial applications (solving the load flow problems, modeling the development of fluid dispensing for electronic packaging, and designing a neural-network-based controller) are employed to evaluate the performance and the applicability of the proposed method. Experimental results empirically show that the proposed method significantly outperforms the existing methods in terms of convergence speed, solution quality, and solution stability.

Journal ArticleDOI
TL;DR: The algorithm is compared with other state-of-the-art SA algorithms and advanced global optimization methods; it found better designs than the other SA-based algorithms and converged to the optimum much more quickly than HPSO and HS.

Journal ArticleDOI
TL;DR: MEPSO performs well on almost all of the test problems adopted in this paper, and outperforms the other algorithms when the dynamic environment is unimodal and changes severely, or has a large number of local optima, as the dynamic Rastrigin function does.

Proceedings ArticleDOI
01 Jun 2008
TL;DR: The performance of the dynamic multi-swarm particle swarm optimizer (DMS-PSO) is reported on the set of benchmark functions provided for the CEC2008 Special Session on Large Scale Optimization.
Abstract: In this paper, the performance of the dynamic multi-swarm particle swarm optimizer (DMS-PSO) on the set of benchmark functions provided for the CEC2008 Special Session on Large Scale Optimization is reported. Different from the existing multi-swarm PSOs and local versions of PSO, the sub-swarms are dynamic and the sub-swarms' size is very small. The whole population is divided into a large number of sub-swarms; these sub-swarms are regrouped frequently using various regrouping schedules, and information is exchanged among the particles in the whole swarm. The Quasi-Newton method is combined to improve its local searching ability.
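
The frequent random regrouping that distinguishes DMS-PSO can be sketched in a few lines: shuffle the particle indices and cut them into many small sub-swarms. The function below illustrates only that regrouping step (the PSO updates within each sub-swarm and the Quasi-Newton local search are omitted), and the sub-swarm size of 3 in the example is an illustrative choice.

```python
import random

def regroup(n_particles, sub_swarm_size):
    """Randomly partition particle indices into many small sub-swarms, as done
    periodically in DMS-PSO so that information can flow between groups."""
    idx = list(range(n_particles))
    random.shuffle(idx)
    return [idx[i:i + sub_swarm_size] for i in range(0, n_particles, sub_swarm_size)]

# Example: 60 particles regrouped into sub-swarms of 3; call again every R generations.
sub_swarms = regroup(60, 3)
```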

Journal ArticleDOI
TL;DR: This work presents a novel Quantum-behaved PSO (QPSO) using chaotic mutation operator, and demonstrates good performance of the QPSO in solving a well-studied continuous optimization problem of mechanical engineering design.
Abstract: Particle swarm optimization (PSO) is a population-based swarm intelligence algorithm that shares many similarities with evolutionary computation techniques. However, the PSO is driven by the simulation of a social psychological metaphor motivated by the collective behaviors of birds and other social organisms instead of the survival of the fittest individual. Inspired by the classical PSO method and quantum mechanics theories, this work presents a novel Quantum-behaved PSO (QPSO) using a chaotic mutation operator. The application of chaotic sequences based on the chaotic Zaslavskii map instead of random sequences in QPSO is a powerful strategy to diversify the QPSO population and improve the QPSO's performance in preventing premature convergence to local minima. The simulation results demonstrate good performance of the QPSO in solving a well-studied continuous optimization problem of mechanical engineering design.
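
To illustrate how a chaotic sequence can drive a mutation operator, the sketch below uses the simple logistic map as the chaos generator in place of the Zaslavskii map used in the paper, and applies a chaos-driven perturbation to a solution vector. Both the map substitution and the mutation formula are illustrative assumptions, not the paper's QPSO operator.

```python
def chaotic_sequence(n, x0=0.7, r=4.0):
    """Logistic-map chaotic sequence (stand-in for the Zaslavskii map):
    deterministic but highly irregular values in (0, 1)."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        seq.append(x)
    return seq

def chaotic_mutation(position, bounds, chaos_value, strength=0.1):
    """Perturb a solution with a chaos-driven offset, clamped to the bounds
    (an illustrative mutation operator, not the paper's exact formulation)."""
    lo, hi = bounds
    span = hi - lo
    return [min(hi, max(lo, xi + strength * span * (2 * chaos_value - 1)))
            for xi in position]

# Example: mutate a 4-D solution with the third value of a chaotic sequence.
chaos = chaotic_sequence(10)
print(chaotic_mutation([1.0, -2.0, 0.5, 3.0], bounds=(-5.0, 5.0), chaos_value=chaos[2]))
```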

Journal ArticleDOI
TL;DR: A novel three-state approach inspired from the discrete version of a powerful heuristic algorithm, particle swarm optimization, is developed and presented to determine the optimum number and locations of two types of switches in radial distribution systems.
Abstract: Achieving high distribution reliability levels and concurrently minimizing capital costs can be considered the main issues in distribution system optimization. Determination of the optimum number and location of switches in distribution system automation is an important issue from the reliability and economical points of view. In this paper, a novel three-state approach inspired from the discrete version of a powerful heuristic algorithm, particle swarm optimization, is developed and presented to determine the optimum number and locations of two types of switches (sectionalizers and breakers) in radial distribution systems. The novelty of the proposed algorithm is to simultaneously consider both sectionalizer and breaker switches. The feasibility of the proposed algorithm is examined by application to two distribution systems. The proposed solution approach provides a global optimal solution for the switch placement problem.

Journal ArticleDOI
TL;DR: An improved particle swarm optimization and discrete PSO (DPSO) with an enhancement operation based on a self-adaptive evolution strategy (ES) is proposed for the joint optimization of a three-layer feedforward artificial neural network's structure and parameters (weights and biases); the resulting network is named ESPNet.

Journal ArticleDOI
Xiaoxin Guo, Jinhui Yang, C. G. Wu, Chaoyong Wang, Yanhua Liang
TL;DR: A novel hyper-parameter selection method for LS-SVMs is presented based on the particle swarm optimization (PSO), which does not need any priori knowledge on the analytic property of the generalization performance measure and can be used to determine multiplehyper-parameters at the same time.

Journal ArticleDOI
TL;DR: A heuristic approach based on the particle swarm optimization algorithm is adopted to solve the task scheduling problem in a grid environment; the results of simulated experiments show that the particle swarm optimization algorithm obtains better schedules than the genetic algorithm.
Abstract: Grid computing is a high-performance computing environment for solving large-scale computational demands. Grid computing involves resource management, task scheduling, security problems, information management and so on. Task scheduling is a fundamental issue in achieving high performance in grid computing systems. However, efficient scheduling algorithm design and implementation is a big challenge. In this paper, a heuristic approach based on the particle swarm optimization algorithm is adopted to solve the task scheduling problem in a grid environment. Each particle represents a possible solution, and the position vector is transformed from continuous variables to discrete variables. This approach aims to generate an optimal schedule so as to achieve the minimum completion time while completing the tasks. The results of simulated experiments show that the particle swarm optimization algorithm is able to obtain better schedules than the genetic algorithm.
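
The abstract's transformation from continuous positions to a discrete schedule can be done in several ways; the sketch below shows one simple rounding-based decoding plus a makespan evaluation. The function names and example data are illustrative assumptions, not the paper's exact encoding.

```python
def decode_schedule(position, n_resources):
    """Map a particle's continuous position vector (one entry per task) to a
    discrete schedule: task i goes to resource round(x_i) mod n_resources."""
    return [int(round(x)) % n_resources for x in position]

def makespan(schedule, task_lengths, resource_speeds):
    """Completion time of the most loaded resource under the decoded assignment."""
    load = [0.0] * len(resource_speeds)
    for task, res in enumerate(schedule):
        load[res] += task_lengths[task] / resource_speeds[res]
    return max(load)

# Example: 5 tasks assigned to 2 resources from a continuous particle position.
schedule = decode_schedule([0.2, 1.7, 3.1, 0.9, 2.4], n_resources=2)
print(schedule, makespan(schedule, [4, 6, 2, 8, 5], [1.0, 1.5]))
```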

Proceedings ArticleDOI
23 Jun 2008
TL;DR: Experimental results demonstrate that the proposed sequential PSO (particle swarm optimization) framework for visual tracking is more robust and effective, especially when the object has an arbitrary motion or undergoes large appearance changes.
Abstract: Visual tracking usually involves an optimization process for estimating the motion of an object from measured images in a video sequence. In this paper, a new evolutionary approach, PSO (particle swarm optimization), is adopted for visual tracking. Since the tracking process is a dynamic optimization problem which is simultaneously influenced by the object state and the time, we propose a sequential particle swarm optimization framework by incorporating the temporal continuity information into the traditional PSO algorithm. In addition, the parameters in PSO are changed adaptively according to the fitness values of particles and the predicted motion of the tracked object, leading to a favourable performance in tracking applications. Furthermore, we show theoretically that, in a Bayesian inference view, the sequential PSO framework is in essence a multilayer importance sampling based particle filter. Experimental results demonstrate that, compared with the state-of-the-art particle filter and its variation - the unscented particle filter, the proposed tracking algorithm is more robust and effective, especially when the object has an arbitrary motion or undergoes large appearance changes.

Journal ArticleDOI
TL;DR: This work has applied one of the variants of this algorithm to two case studies: the Hanoi water distribution network and the New York City water supply tunnel system, and presented a detailed comparison of the new results with those previously obtained by other authors.
Abstract: In the past decade, evolutionary methods have been used by various researchers to tackle optimal design problems for water supply systems (WSS). Particle Swarm Optimization (PSO) is one of these evolutionary algorithms which, in spite of the fact that it has primarily been developed for the solution of optimization problems with continuous variables, has been successfully adapted in other contexts to problems with discrete variables. In this work we have applied one of the variants of this algorithm to two case studies: the Hanoi water distribution network and the New York City water supply tunnel system. Both cases occur frequently in the related literature and provide two standard networks for benchmarking studies. This allows us to present a detailed comparison of our new results with those previously obtained by other authors.

Journal ArticleDOI
01 Sep 2008
TL;DR: A modified priority-based encoding incorporating a heuristic operator for reducing the possibility of loop-formation in the path construction process is proposed for particle representation in PSO, which surpasses those of recently reported genetic algorithm based approaches for this problem.
Abstract: This paper presents the investigations on the application of particle swarm optimization (PSO) to solve shortest path (SP) routing problems. A modified priority-based encoding incorporating a heuristic operator for reducing the possibility of loop-formation in the path construction process is proposed for particle representation in PSO. Simulation experiments have been carried out on different network topologies for networks consisting of 15-70 nodes. It is noted that the proposed PSO-based approach can find the optimal path with good success rates and also can find closer sub-optimal paths with high certainty for all the tested networks. It is observed that the performance of the proposed algorithm surpasses those of recently reported genetic algorithm based approaches for this problem.