
Multi-swarm optimization

About: Multi-swarm optimization is a research topic. Over the lifetime, 19,162 publications have been published within this topic, receiving 549,725 citations.


Papers
Journal ArticleDOI
01 Jan 1986
TL;DR: GAs are shown to be effective at both levels of the systems optimization problem; they are applied to the second-level task of identifying efficient GAs for a set of numerical optimization problems.
Abstract: The task of optimizing a complex system presents at least two levels of problems for the system designer. First, a class of optimization algorithms must be chosen that is suitable for application to the system. Second, various parameters of the optimization algorithm need to be tuned for efficiency. A class of adaptive search procedures called genetic algorithms (GAs) has been used to optimize a wide variety of complex systems. Here, GAs are applied to the second-level task of identifying efficient GAs for a set of numerical optimization problems. The results are validated on an image registration problem. GAs are shown to be effective for both levels of the systems optimization problem.

2,924 citations
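A minimal sketch, in Python, of the two-level idea in the entry above: an outer genetic algorithm tunes the parameters (population size, crossover rate, mutation rate) of an inner genetic algorithm, scoring each setting by the inner GA's result on a benchmark. The benchmark function, parameter ranges, and operators here are illustrative assumptions, not the paper's actual setup.

import random

def sphere(x):
    # Benchmark to minimize: sum of squares, optimum 0 at the origin (assumed benchmark).
    return sum(xi * xi for xi in x)

def inner_ga(pop_size, cx_rate, mut_rate, dim=10, gens=50):
    # Plain real-coded GA; returns the best objective value found (lower is better).
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=sphere)
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)
            # Per-gene blend crossover followed by Gaussian mutation.
            child = [(ai + bi) / 2 if random.random() < cx_rate else ai
                     for ai, bi in zip(a, b)]
            child = [ci + random.gauss(0, 0.1) if random.random() < mut_rate else ci
                     for ci in child]
            new_pop.append(child)
        pop = new_pop
        best = min([best] + pop, key=sphere)
    return sphere(best)

def outer_ga(meta_pop=8, meta_gens=10):
    # Outer GA over inner-GA parameters: (population size, crossover rate, mutation rate).
    metapop = [(random.randint(10, 100), random.random(), random.random())
               for _ in range(meta_pop)]
    for _ in range(meta_gens):
        scored = sorted(metapop, key=lambda p: inner_ga(*p))
        parents = scored[:meta_pop // 2]          # keep the better half
        children = []
        while len(parents) + len(children) < meta_pop:
            a, b = random.sample(parents, 2)
            children.append((max(2, (a[0] + b[0]) // 2),
                             min(1.0, max(0.0, (a[1] + b[1]) / 2 + random.gauss(0, 0.05))),
                             min(1.0, max(0.0, (a[2] + b[2]) / 2 + random.gauss(0, 0.05)))))
        metapop = parents + children
    return min(metapop, key=lambda p: inner_ga(*p))

print(outer_ga())   # prints the (pop_size, crossover rate, mutation rate) found by the outer GA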

Proceedings ArticleDOI
16 Jul 2000
TL;DR: It is concluded that the best approach is to use the constriction factor while limiting the maximum velocity Vmax to the dynamic range of the variable Xmax on each dimension.
Abstract: The performance of particle swarm optimization using an inertia weight is compared with performance using a constriction factor. Five benchmark functions are used for the comparison. It is concluded that the best approach is to use the constriction factor while limiting the maximum velocity Vmax to the dynamic range of the variable Xmax on each dimension. This approach provides performance on the benchmark functions superior to any other published results known by the authors.

2,922 citations
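A minimal sketch of the approach recommended in the entry above: a particle swarm optimizer using Clerc's constriction factor, with each velocity component additionally clamped to the dynamic range Xmax of the corresponding variable. The benchmark function and the specific swarm size and iteration count are illustrative assumptions.

import math, random

def sphere(x):
    return sum(xi * xi for xi in x)

def pso_constriction(dim=10, n_particles=30, iters=200, xmax=100.0):
    c1 = c2 = 2.05
    phi = c1 + c2                                                 # must exceed 4 for constriction
    chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi)) # roughly 0.7298
    vmax = xmax                                                   # Vmax set to the variable's dynamic range

    x = [[random.uniform(-xmax, xmax) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    gbest = min(pbest, key=sphere)

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v[i][d] = chi * (v[i][d]
                                 + c1 * r1 * (pbest[i][d] - x[i][d])
                                 + c2 * r2 * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))          # clamp velocity to [-Vmax, Vmax]
                x[i][d] += v[i][d]
            if sphere(x[i]) < sphere(pbest[i]):
                pbest[i] = x[i][:]
        gbest = min(pbest, key=sphere)
    return gbest, sphere(gbest)

best, value = pso_constriction()
print(value)

With c1 = c2 = 2.05 the constriction coefficient works out to about 0.7298, which is why that value is often quoted as an equivalent inertia weight in later PSO work.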

Journal ArticleDOI
TL;DR: A novel parameter automation strategy for the particle swarm algorithm is introduced, together with two extensions that improve its performance after a predefined number of generations; a time-varying mutation step size overcomes the difficulty of selecting an appropriate step size for different problems.
Abstract: This paper introduces a novel parameter automation strategy for the particle swarm algorithm and two further extensions to improve its performance after a predefined number of generations. Initially, to efficiently control the local search and convergence to the global optimum solution, time-varying acceleration coefficients (TVAC) are introduced in addition to the time-varying inertia weight factor in particle swarm optimization (PSO). On the basis of TVAC, two new strategies are discussed to improve the performance of the PSO. First, the concept of "mutation" is introduced to particle swarm optimization along with TVAC (MPSO-TVAC), by adding a small perturbation to a randomly selected modulus of the velocity vector of a random particle with a predefined probability. Second, we introduce a novel particle swarm concept, the "self-organizing hierarchical particle swarm optimizer with TVAC (HPSO-TVAC)". Under this method, only the "social" part and the "cognitive" part of the particle swarm strategy are considered to estimate the new velocity of each particle, and particles are reinitialized whenever they stagnate in the search space. In addition, to overcome the difficulty of selecting an appropriate mutation step size for different problems, a time-varying mutation step size is introduced. Further, for most of the benchmarks, the performance of the MPSO-TVAC method is found to be insensitive to the mutation probability. The effect of the reinitialization velocity on the performance of the HPSO-TVAC method is also observed, and a time-varying reinitialization step size is found to be an efficient parameter optimization strategy for the HPSO-TVAC method. The HPSO-TVAC strategy outperformed all the methods considered in this investigation for most of the functions. Furthermore, both the MPSO and HPSO strategies perform poorly when the acceleration coefficients are fixed at two.

2,753 citations
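A minimal sketch of the time-varying acceleration coefficients (TVAC) idea described in the entry above, with the inertia weight also decreasing linearly over the run. The ranges used here (cognitive coefficient 2.5 to 0.5, social coefficient 0.5 to 2.5, inertia 0.9 to 0.4) are commonly quoted values and should be read as assumptions rather than a restatement of the paper's exact settings.

def tvac_schedule(t, t_max, c1_i=2.5, c1_f=0.5, c2_i=0.5, c2_f=2.5, w_i=0.9, w_f=0.4):
    # Linear interpolation of the cognitive (c1) and social (c2) acceleration
    # coefficients and of the inertia weight w over the course of the run.
    frac = t / float(t_max)
    c1 = (c1_f - c1_i) * frac + c1_i   # large early: favors exploration around pbest
    c2 = (c2_f - c2_i) * frac + c2_i   # large late: pulls the swarm toward gbest
    w = (w_f - w_i) * frac + w_i
    return w, c1, c2

# Inside a standard PSO loop, the per-dimension velocity update would then read:
#   w, c1, c2 = tvac_schedule(t, t_max)
#   v[i][d] = w * v[i][d] + c1 * r1 * (pbest[i][d] - x[i][d]) + c2 * r2 * (gbest[d] - x[i][d])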

Journal ArticleDOI
TL;DR: The particle swarm optimization algorithm is analyzed using standard results from dynamic systems theory, and graphical parameter selection guidelines are derived; the resulting performance is superior to previously published results.

2,554 citations
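The entry above refers to parameter selection guidelines obtained from a dynamic-system analysis. As a rough illustration only, and not the paper's own derivation, the sketch below encodes a commonly cited convergence region for the simplified deterministic PSO model, parameterized by the inertia weight w and a combined acceleration coefficient c (assumed here to stand for c1 + c2): -1 < w < 1 and 0 < c < 2(w + 1).

def in_convergence_region(w, c):
    # Commonly cited convergence triangle for the simplified deterministic PSO model;
    # c is taken to be the combined acceleration coefficient c1 + c2 (assumption).
    return -1.0 < w < 1.0 and 0.0 < c < 2.0 * (w + 1.0)

print(in_convergence_region(0.7298, 1.4962 + 1.4962))   # True: a popular constriction-derived setting
print(in_convergence_region(1.0, 4.0))                  # False: outside the region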

Book
01 Jan 1983
TL;DR: In this book, the complexity of optimization problems and the efficiency of optimization methods are analyzed.
Abstract: Problem Complexity and Method Efficiency in Optimization. Reviewed in the Journal of the Operational Research Society, Vol. 35, No. 5 (1984), p. 455.

2,382 citations


Network Information
Related Topics (5)
Fuzzy logic: 151.2K papers, 2.3M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 87% related
Support vector machine: 73.6K papers, 1.7M citations, 86% related
Artificial neural network: 207K papers, 4.5M citations, 85% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    183
2022    471
2021    10
2020    7
2019    26
2018    171