Topic

Multi-swarm optimization

About: Multi-swarm optimization is a research topic. Over the lifetime of the topic, 19,162 publications have been published, receiving 549,725 citations.


Papers
Proceedings ArticleDOI
06 Oct 2002
TL;DR: This paper introduces a new Particle Swarm Optimisation (PSO) algorithm with strong local convergence properties, which performs much better with a smaller number of particles, compared to the original PSO.
Abstract: This paper introduces a new Particle Swarm Optimisation (PSO) algorithm with strong local convergence properties. The new algorithm performs much better with a smaller number of particles, compared to the original PSO. This property is desirable when designing a niching PSO algorithm.
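
For context, the sketch below shows the canonical global-best PSO update that locally convergent and niching variants such as this one build on. The inertia weight and acceleration coefficients are common textbook defaults, not values taken from the paper, and the sphere function is used only as a toy objective.

```python
import numpy as np

def pso(f, dim=2, n_particles=10, iters=200, w=0.72, c1=1.49, c2=1.49, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal best positions
    pbest_val = np.array([f(p) for p in x])           # personal best values
    gbest = pbest[np.argmin(pbest_val)].copy()        # global best position
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        # Velocity update: inertia term + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(f(gbest))

# Usage on the sphere function; the optimum is at the origin.
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)))
print(best_x, best_f)
```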

428 citations

Journal ArticleDOI
TL;DR: The results indicate that the particle swarm optimization algorithm does locate the constrained minimum design in continuous applications with very good precision, albeit at a much higher computational cost than that of a typical gradient-based optimizer.
Abstract: Gerhard Venter (gventer@vrand.com), Vanderplaats Research and Development, Inc., 1767 S 8th Street, Suite 100, Colorado Springs, CO 80906; Jaroslaw Sobieszczanski-Sobieski (j.sobieski@larc.nasa.gov), NASA Langley Research Center, MS 240, Hampton, VA 23681-2199. The purpose of this paper is to show how the search algorithm known as particle swarm optimization performs. Here, particle swarm optimization is applied to structural design problems, but the method has a much wider range of possible applications. The paper's new contributions are improvements to the particle swarm optimization algorithm and conclusions and recommendations as to the utility of the algorithm. Results of numerical experiments for both continuous and discrete applications are presented in the paper. The results indicate that the particle swarm optimization algorithm does locate the constrained minimum design in continuous applications with very good precision, albeit at a much higher computational cost than that of a typical gradient-based optimizer. However, the true potential of particle swarm optimization is primarily in applications with discrete and/or discontinuous functions and variables. Additionally, particle swarm optimization has the potential of efficient computation with very large numbers of concurrently operating processors.
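
One common way to hand the kind of constrained design problem described above to a swarm optimizer is an exterior penalty function. The sketch below is a generic illustration under that assumption, not the constraint-handling scheme used in the paper; the penalty weight and the toy constraint are invented for the example.

```python
import numpy as np

def penalized(objective, constraints, weight=1e3):
    """Wrap objective(x) with a quadratic penalty for each violated constraint g(x) <= 0."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return wrapped

# Toy example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1, written as 1 - x0 - x1 <= 0.
f_pen = penalized(lambda x: x[0] ** 2 + x[1] ** 2,
                  [lambda x: 1.0 - x[0] - x[1]])

# f_pen is an ordinary unconstrained objective and can be handed to any swarm
# optimizer, e.g. the PSO sketch shown earlier on this page.
print(f_pen(np.array([0.5, 0.5])))   # feasible point, no penalty: 0.5
print(f_pen(np.array([0.0, 0.0])))   # infeasible point: 0 + 1000 * 1^2 = 1000.0
```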

428 citations

Journal ArticleDOI
TL;DR: A novel swarm algorithm called Social Spider Optimization (SSO) is proposed for solving optimization tasks based on a simulation of the cooperative behavior of social spiders, and is compared to other well-known evolutionary methods.
Abstract: Swarm intelligence is a research field that models the collective behavior in swarms of insects or animals. Several algorithms arising from such models have been proposed to solve a wide range of complex optimization problems. In this paper, a novel swarm algorithm called Social Spider Optimization (SSO) is proposed for solving optimization tasks. The SSO algorithm is based on a simulation of the cooperative behavior of social spiders. In the proposed algorithm, individuals emulate a group of spiders which interact with each other based on the biological laws of the cooperative colony. The algorithm considers two different kinds of search agents (spiders): males and females. Depending on gender, each individual is guided by a set of different evolutionary operators which mimic the cooperative behaviors typically found in the colony. In order to illustrate the proficiency and robustness of the proposed approach, it is compared to other well-known evolutionary methods. The comparison examines several standard benchmark functions that are commonly considered in the literature on evolutionary algorithms. The outcome shows the high performance of the proposed method when searching for a global optimum on several benchmark functions.
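
As a rough illustration of two structural ideas in the abstract, namely the split of the colony into female and male agents and the fitness-based weighting of spiders, a simplified sketch follows. The movement rules are deliberate placeholders, not the actual SSO operators from the paper, and the 70/30 split and step size are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(pop):
    return np.sum(pop ** 2, axis=1)           # toy minimization objective (sphere)

n, dim = 20, 2
pop = rng.uniform(-5.0, 5.0, (n, dim))
n_female = int(0.7 * n)                        # female-biased split of the colony
female, male = pop[:n_female], pop[n_female:]

vals = fitness(pop)
# Normalized weight per spider: the best individual gets 1, the worst gets 0.
weights = (vals.max() - vals) / (vals.max() - vals.min() + 1e-12)
best = pop[np.argmin(vals)]

# Placeholder gender-specific moves (NOT the paper's operators): females drift
# toward the best spider, males drift toward the mean female position.
female = female + 0.5 * rng.random(female.shape) * (best - female)
male = male + 0.5 * rng.random(male.shape) * (female.mean(axis=0) - male)
print(fitness(np.vstack([female, male])).min())
```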

427 citations

Book ChapterDOI
01 Jan 2008
TL;DR: This chapter presents two recent algorithms for evolutionary optimization – particle swarm optimization (PSO) and differential evolution (DE) – which are inspired by biological and sociological motivations and can find optima on rough, discontinuous and multimodal surfaces.
Abstract: Since the beginning of the nineteenth century, a significant evolution in optimization theory has been noticed. Classical linear programming and traditional non-linear optimization techniques such as Lagrange's multiplier, Bellman's principle and Pontryagin's principle were prevalent until this century. Unfortunately, these derivative-based optimization techniques can no longer be used to determine the optima on rough non-linear surfaces. One solution to this problem has already been put forward by the evolutionary algorithms research community. Genetic algorithm (GA), enunciated by Holland, is one such popular algorithm. This chapter presents two recent algorithms for evolutionary optimization – well known as particle swarm optimization (PSO) and differential evolution (DE). The algorithms are inspired by biological and sociological motivations and can find optima on rough, discontinuous and multimodal surfaces. The chapter explores several schemes for controlling the convergence behaviors of PSO and DE by a judicious selection of their parameters. Special emphasis is given to the hybridizations of PSO and DE algorithms with other soft computing tools. The chapter finally discusses the mutual synergy of PSO with DE, leading to a more powerful global search algorithm, and its practical applications.
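
The differential evolution side of the chapter can be summarized by the classic DE/rand/1/bin scheme. Below is a minimal sketch of that scheme on the sphere function; the values of F and CR are common defaults, not settings recommended by the chapter.

```python
import numpy as np

def de(f, dim=2, pop_size=20, iters=300, F=0.5, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    vals = np.array([f(ind) for ind in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # Pick three distinct individuals, all different from i.
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)                    # differential mutation
            cross = rng.random(dim) < CR                # binomial crossover mask
            cross[rng.integers(dim)] = True             # guarantee at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            trial_val = f(trial)
            if trial_val <= vals[i]:                    # greedy one-to-one selection
                pop[i], vals[i] = trial, trial_val
    best = int(np.argmin(vals))
    return pop[best], vals[best]

best_x, best_f = de(lambda z: float(np.sum(z ** 2)))   # sphere function
print(best_x, best_f)
```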

426 citations

Journal ArticleDOI
TL;DR: A broad review of SI dynamic optimization (SIDO) focused on several classes of problems, such as discrete, continuous, constrained, multi-objective and classification problems, as well as real-world applications, together with some considerations about future directions in the subject.
Abstract: Swarm intelligence (SI) algorithms, including ant colony optimization, particle swarm optimization, bee-inspired algorithms, bacterial foraging optimization, firefly algorithms, fish swarm optimization and many more, have been proven to be good methods for addressing difficult optimization problems under stationary environments. Most SI algorithms have been developed to address stationary optimization problems and hence they can converge on the (near-) optimum solution efficiently. However, many real-world problems have a dynamic environment that changes over time. For such dynamic optimization problems (DOPs), it is difficult for a conventional SI algorithm to track the changing optimum once the algorithm has converged on a solution. In the last two decades, there has been a growing interest in addressing DOPs using SI algorithms due to their adaptation capabilities. This paper presents a broad review of SI dynamic optimization (SIDO) focused on several classes of problems, such as discrete, continuous, constrained, multi-objective and classification problems, and real-world applications. In addition, this paper focuses on the enhancement strategies integrated in SI algorithms to address dynamic changes, the performance measurements and the benchmark generators used in SIDO. Finally, some considerations about future directions in the subject are given.
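
As a hedged sketch of one kind of enhancement strategy this review surveys, the snippet below detects an environment change by re-evaluating a stored sentry solution and then re-randomizes part of the population so the swarm can keep tracking the moved optimum. The sentry idea and the 50% re-randomization fraction are illustrative assumptions, not a specific method from the paper.

```python
import numpy as np

def respond_to_change(f, population, sentry, sentry_val, rng, frac=0.5, bounds=(-5.0, 5.0)):
    """Re-evaluate a stored sentry point; if its value changed, the landscape has moved,
    so re-randomize a fraction of the population to restore diversity."""
    new_val = f(sentry)
    changed = not np.isclose(new_val, sentry_val)
    if changed:
        n = population.shape[0]
        idx = rng.choice(n, size=int(frac * n), replace=False)
        population[idx] = rng.uniform(bounds[0], bounds[1], (len(idx), population.shape[1]))
    return population, new_val, changed

# Usage with a toy objective whose optimum has shifted since the sentry was stored.
rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, (10, 2))
shifted = lambda x: float(np.sum((x - 1.0) ** 2))        # environment after the change
pop, val, changed = respond_to_change(shifted, pop, np.zeros(2), 0.0, rng)
print(changed)   # True: the sentry's value under the new landscape differs from 0.0
```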

421 citations


Network Information
Related Topics (5)
Fuzzy logic: 151.2K papers, 2.3M citations (88% related)
Optimization problem: 96.4K papers, 2.1M citations (87% related)
Support vector machine: 73.6K papers, 1.7M citations (86% related)
Artificial neural network: 207K papers, 4.5M citations (85% related)
Robustness (computer science): 94.7K papers, 1.6M citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    183
2022    471
2021    10
2020    7
2019    26
2018    171