Journal Article

A Parallel Particle Swarm Optimization Algorithm with Communication Strategies

01 Jan 2005 - Journal of Information Science and Engineering (Institute of Information Science, Academia Sinica) - Vol. 21, Iss. 4, pp. 809-818
TL;DR: A parallel version of the particle swarm optimization algorithm (PPSO) is presented, together with three communication strategies that can be chosen according to the degree of independence among the solution parameters; experimental results demonstrate the usefulness of the proposed PPSO algorithm.
Abstract: Particle swarm optimization (PSO) is an alternative population-based evolutionary computation technique. It has been shown to be capable of optimizing hard mathematical problems in continuous or binary space. We present here a parallel version of the particle swarm optimization (PPSO) algorithm together with three communication strategies which can be used according to the independence of the data. The first strategy is designed for solution parameters that are independent or are only loosely correlated, such as the Rosenbrock and Rastrigin functions. The second communication strategy can be applied to parameters that are more strongly correlated, such as the Griewank function. In cases where the properties of the parameters are unknown, a third hybrid communication strategy can be used. Experimental results demonstrate the usefulness of the proposed PPSO algorithm.
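
To make the idea concrete, below is a minimal sketch of a multi-population PSO in which sub-swarms evolve independently and periodically exchange their best solutions, in the spirit of the communication strategies described above. It is an illustrative sketch only, not the authors' implementation: the benchmark function, the migration rule (every `exchange_every` iterations, overwrite each sub-swarm's worst particle with the overall best), and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def sphere(x):
    """Simple benchmark objective (assumed here for illustration)."""
    return float(np.sum(x ** 2))

def parallel_pso(f, dim=10, swarms=4, particles=20, iters=500,
                 w=0.7, c1=1.5, c2=1.5, exchange_every=50, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (swarms, particles, dim))            # positions
    v = np.zeros_like(x)                                         # velocities
    pbest = x.copy()                                             # personal bests
    pbest_val = np.apply_along_axis(f, 2, x)
    gbest = pbest[np.arange(swarms), pbest_val.argmin(axis=1)]   # per-swarm best

    for t in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest[:, None, :] - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 2, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.arange(swarms), pbest_val.argmin(axis=1)]

        # "Communication": periodically broadcast the overall best solution to
        # every sub-swarm by replacing its worst personal best.
        if (t + 1) % exchange_every == 0:
            best_swarm = pbest_val.min(axis=1).argmin()
            overall_best = gbest[best_swarm]
            worst = pbest_val.argmax(axis=1)
            pbest[np.arange(swarms), worst] = overall_best
            pbest_val[np.arange(swarms), worst] = f(overall_best)
            gbest = pbest[np.arange(swarms), pbest_val.argmin(axis=1)]

    best_swarm = pbest_val.min(axis=1).argmin()
    return gbest[best_swarm], pbest_val.min()

if __name__ == "__main__":
    best_x, best_f = parallel_pso(sphere)
    print(best_f)
```

Loosely speaking, varying which sub-swarms exchange information, how often, and what is exchanged is what distinguishes communication strategies of this kind.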
Citations
Journal ArticleDOI
TL;DR: This paper aims to offer a compendious and timely review of the field and the challenges and opportunities offered by this welcome addition to the optimization toolbox.
Abstract: Particle Swarm Optimization (PSO), in its present form, has been in existence for roughly a decade, with formative research in related domains (such as social modelling, computer graphics, simulation and animation of natural swarms or flocks) for some years before that; a relatively short time compared with some of the other natural computing paradigms such as artificial neural networks and evolutionary computation. However, in that short period, PSO has gained widespread appeal amongst researchers and has been shown to offer good performance in a variety of application domains, with potential for hybridisation and specialisation, and demonstration of some interesting emergent behaviour. This paper aims to offer a compendious and timely review of the field and the challenges and opportunities offered by this welcome addition to the optimization toolbox. Part I discusses the location of PSO within the broader domain of natural computing, considers the development of the algorithm, and refinements introduced to prevent swarm stagnation and tackle dynamic environments. Part II considers current research in hybridisation, combinatorial problems, multicriteria and constrained optimization, and a range of indicative application areas.

585 citations


Cites methods from "A Parallel Particle Swarm Optimizat..."

  • ...Chang et al. (2005) presented a parallel PSO in which each particle evaluated the fitness function independently, and demonstrated that the efficiency of the algorithm is dependent on the strategy for communicating pg, and that each strategy they investigated had its own merits that could be exploited where the correlation of the solution parameters was known a priori....

    [...]

Journal Article
TL;DR: Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).
Abstract: In this paper, we present a new algorithm of swarm intelligence, namely, Cat Swarm Optimization (CSO). CSO is generated by observing the behaviors of cats, and composed of two sub-models, i.e., tracing mode and seeking mode, which model upon the behaviors of cats. Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).
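
As a rough illustration of the two sub-models named above, the sketch below implements one CSO-style iteration under common textbook assumptions: a fraction of the cats (the mixture ratio `mr`) moves in tracing mode with a PSO-like velocity pull toward the best cat, while the rest perturb copies of themselves in seeking mode and keep the best copy. The constants and the simplified seeking step are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def cso_step(cats, vel, best, f, mr=0.2, smp=5, srd=0.2, c1=2.0):
    """One iteration over all cats; a fraction `mr` traces, the rest seek."""
    n, dim = cats.shape
    tracing = rng.random(n) < mr
    for i in range(n):
        if tracing[i]:
            # Tracing mode: PSO-like velocity update toward the best cat.
            vel[i] += c1 * rng.random(dim) * (best - cats[i])
            cats[i] += vel[i]
        else:
            # Seeking mode (simplified): perturb copies of the cat, keep the best copy.
            copies = cats[i] + srd * cats[i] * rng.uniform(-1, 1, (smp, dim))
            cats[i] = min(copies, key=f)
    best = min(cats, key=f)
    return cats, vel, best.copy()

# Example usage: minimize a simple quadratic with 10 cats in 5 dimensions.
obj = lambda c: float(np.sum(c ** 2))
cats = rng.uniform(-5, 5, (10, 5))
vel = np.zeros_like(cats)
best = min(cats, key=obj).copy()
for _ in range(100):
    cats, vel, best = cso_step(cats, vel, best, obj)
```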

496 citations


Cites background from "A Parallel Particle Swarm Optimizat..."

  • ...All the experiments demonstrate that the proposed Cat Swarm Optimization (CSO) is superior to PSO and to PSO with a weighting factor....

    [...]

  • ...According to the literature, PSO with a weighting factor [4] usually finds a better solution faster than pure PSO, but according to the experimental results, Cat Swarm Optimization (CSO) presents even better performance....

    [...]

  • ...ACO was achieved by studying the behavior of ants, and PSO was realized by examining the movements of flocking gulls....

    [...]

  • ...Genetic Algorithm (GA) [1-2], Ant Colony Optimization (ACO) [6-7], Particle Swarm Optimization (PSO) [3-5], and Simulated Annealing (SA) [8-9] etc....

    [...]

  • ...We applied CSO, PSO, and PSO with a weighting factor to six test functions to compare their performance....

    [...]

Journal ArticleDOI
TL;DR: A novel particle swarm optimization (PSO)-based algorithm for the traveling salesman problem (TSP) is presented and it has been shown that the size of the solved problems could be increased by using the proposed algorithm.

401 citations


Cites background from "A Parallel Particle Swarm Optimizat..."

  • ...of probabilities that a bit will be in one state or the other [5]....

    [...]

Journal ArticleDOI
TL;DR: An enhanced PSO algorithm called GOPSO is presented, which employs generalized opposition-based learning (GOBL) and Cauchy mutation to overcome the problem of premature convergence when solving complex problems.
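
For readers unfamiliar with opposition-based learning, the following is a minimal, hedged sketch of generalized opposition-based learning (GOBL) as it is usually described: a candidate x in [a, b] is reflected through a randomly scaled centre, and the better of the point and its opposite is kept. The function names, the scalar bounds, and the clipping step are illustrative assumptions, not the GOPSO authors' implementation, and the Cauchy mutation component is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

def generalized_opposite(x, a, b, k=None):
    """GOBL opposite of x within [a, b]; k = 1 recovers the classical opposite a + b - x."""
    if k is None:
        k = rng.random()
    return k * (a + b) - x

def gobl_candidates(population, a, b, f):
    """Evaluate each candidate and its generalized opposite, keep the better of the two."""
    opposites = np.clip(generalized_opposite(population, a, b), a, b)
    keep_orig = np.apply_along_axis(f, 1, population) <= np.apply_along_axis(f, 1, opposites)
    return np.where(keep_orig[:, None], population, opposites)

# Example usage on a small random population.
pop = rng.uniform(-10, 10, (6, 3))
pop = gobl_candidates(pop, -10.0, 10.0, lambda v: float(np.sum(v ** 2)))
```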

384 citations


Cites methods from "A Parallel Particle Swarm Optimizat..."

  • ...[5,6] proposed a parallel PSO by employing a novel communication strategy....

    [...]

Book ChapterDOI
07 Aug 2006
TL;DR: Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).
Abstract: In this paper, we present a new algorithm of swarm intelligence, namely, Cat Swarm Optimization (CSO). CSO is generated by observing the behaviors of cats, and composed of two sub-models, i.e., tracing mode and seeking mode, which model upon the behaviors of cats. Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).

316 citations


Cites background from "A Parallel Particle Swarm Optimizat..."

  • ...Genetic Algorithm (GA) [1-2], Ant Colony Optimization (ACO) [6-7], Particle Swarm Optimization (PSO) [3-5], and Simulated Annealing (SA) [8-9] etc....

    [...]

References
Book
01 Sep 1988
TL;DR: In this book, the authors bring together the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields, assuming only minimal background in computer programming and mathematics.
Abstract: From the Publisher: This book brings together, in an informal and tutorial fashion, the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
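
For orientation, the particle swarm update introduced in this line of work is usually written as the following velocity and position rules (reproduced from standard PSO references rather than from the abstract above):

\[
v_{id} \leftarrow v_{id} + c_1 r_1 \,(p_{id} - x_{id}) + c_2 r_2 \,(p_{gd} - x_{id}), \qquad x_{id} \leftarrow x_{id} + v_{id},
\]

where \(x_i\) and \(v_i\) are the position and velocity of particle \(i\), \(p_i\) is its best position found so far, \(p_g\) is the best position found by the swarm, \(c_1, c_2\) are acceleration constants, and \(r_1, r_2\) are uniform random numbers in \([0, 1]\).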

35,104 citations

Proceedings ArticleDOI
04 Oct 1995
TL;DR: The optimization of nonlinear functions using particle swarm methodology is described and implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm.
Abstract: The optimization of nonlinear functions using particle swarm methodology is described. Implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm. Benchmark testing of both paradigms is described, and applications, including neural network training and robot task learning, are proposed. Relationships between particle swarm optimization and both artificial life and evolutionary computation are reviewed.
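
The "locally oriented paradigm" compared here is commonly obtained by replacing the swarm-wide best \(p_g\) with the best position \(p_l\) found within each particle's neighborhood (the usual lbest formulation, stated here for context rather than quoted from the abstract):

\[
v_{id} \leftarrow v_{id} + c_1 r_1 \,(p_{id} - x_{id}) + c_2 r_2 \,(p_{ld} - x_{id}).
\]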

14,477 citations

Proceedings ArticleDOI
04 May 1998
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, an algorithm that resembles a school of flying birds in which each particle adjusts its flight according to its own flying experience and its companions' flying experience.
Abstract: Evolutionary computation techniques, genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions are manipulated according to the rule of survival of the fittest through "genetic" operations, such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm through simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, these individuals are "evolved" by cooperation and competition among the individuals themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.
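
In the commonly cited form of this modification (reproduced from standard PSO references, not from the abstract above), the inertia weight \(w\) scales only the previous velocity:

\[
v_{id} \leftarrow w\, v_{id} + c_1 r_1 \,(p_{id} - x_{id}) + c_2 r_2 \,(p_{gd} - x_{id}),
\]

and \(w\) is often decreased over the run (for example, roughly linearly from 0.9 to 0.4 in later studies) so that the search shifts from exploration toward exploitation.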

9,373 citations

Journal ArticleDOI
TL;DR: This paper analyzes a particle's trajectory as it moves in discrete time, then progresses to the view of it in continuous time, leading to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies.
Abstract: The particle swarm is an algorithm for finding optimal regions of complex search spaces through the interaction of individuals in a population of particles. This paper analyzes a particle's trajectory as it moves in discrete time (the algebraic view), then progresses to the view of it in continuous time (the analytical view). A five-dimensional depiction is developed, which describes the system completely. These analyses lead to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies. Some results of the particle swarm optimizer, implementing modifications derived from the analysis, suggest methods for altering the original algorithm in ways that eliminate problems and increase the ability of the particle swarm to find optima of some well-studied test functions.
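
The best-known practical outcome of this trajectory analysis is the constricted update rule; the following is the standard presentation of the constriction coefficient, included here for orientation rather than quoted from the abstract:

\[
v_{id} \leftarrow \chi \left[ v_{id} + c_1 r_1 \,(p_{id} - x_{id}) + c_2 r_2 \,(p_{gd} - x_{id}) \right], \qquad
\chi = \frac{2}{\left| 2 - \varphi - \sqrt{\varphi^{2} - 4\varphi} \right|}, \quad \varphi = c_1 + c_2 > 4.
\]

With the common choice \(c_1 = c_2 = 2.05\) (\(\varphi = 4.1\)), this gives \(\chi \approx 0.73\).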

8,287 citations

