Topic

Multi-swarm optimization

About: Multi-swarm optimization is a research topic. Over its lifetime, 19,162 publications have been published within this topic, receiving 549,725 citations.


Papers
Journal ArticleDOI
TL;DR: The original version uses a fixed population size, but a method for gradually reducing the population size is proposed; it improves the efficiency and robustness of the algorithm and can be applied to any variant of a Differential Evolution algorithm.
Abstract: This paper studies the efficiency of a recently defined population-based direct global optimization method called Differential Evolution with self-adaptive control parameters. The original version uses a fixed population size, but a method for gradually reducing the population size is proposed in this paper. It improves the efficiency and robustness of the algorithm and can be applied to any variant of a Differential Evolution algorithm. The proposed modification is tested on commonly used benchmark problems for unconstrained optimization and compared with other optimization methods such as Evolutionary Algorithms and Evolution Strategies.

320 citations
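
The population-reduction idea summarized above can be illustrated with a short sketch: a plain DE/rand/1/bin loop whose population is halved (keeping the better half) at a few evenly spaced points of the evaluation budget. The halving schedule, the fixed F and CR values, and the sphere test function are illustrative assumptions for the sketch, not the paper's self-adaptive settings or its exact reduction rule.

```python
import numpy as np

def sphere(x):
    """Simple unconstrained benchmark used only to exercise the sketch."""
    return float(np.sum(x * x))

def de_with_pop_reduction(f, dim=10, bounds=(-5.0, 5.0), np_init=100,
                          max_evals=20000, n_reductions=3, F=0.5, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(np_init, dim))
    fit = np.array([f(x) for x in pop])
    evals = np_init
    # Halve the population at evenly spaced points of the evaluation budget (assumed schedule).
    reduce_at = [max_evals * (k + 1) // (n_reductions + 1) for k in range(n_reductions)]
    while evals < max_evals:
        n = len(pop)
        for i in range(n):
            # DE/rand/1 mutation and binomial crossover.
            r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # guarantee at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = f(trial)
            evals += 1
            if f_trial <= fit[i]:                    # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
        # Population reduction step: keep the better half of the individuals.
        if reduce_at and evals >= reduce_at[0] and len(pop) > 4:
            reduce_at.pop(0)
            keep = np.argsort(fit)[: len(pop) // 2]
            pop, fit = pop[keep], fit[keep]
    best = int(np.argmin(fit))
    return pop[best], fit[best]

if __name__ == "__main__":
    x_best, f_best = de_with_pop_reduction(sphere)
    print(f"best f = {f_best:.3e}")
```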

Journal ArticleDOI
TL;DR: This paper proposes a hybrid method that combines the P&O and PSO methods; the advantage of the proposed hybrid is that the search space for the PSO is reduced, and hence the time required for convergence can be greatly shortened.
Abstract: Conventional maximum power point tracking (MPPT) methods such as perturb-and-observe (P&O) method can only track the first local maximum point and stop progressing to the next maximum point. MPPT methods based on particle swarm optimization (PSO) have been proposed to track the global maximum point (GMP). However, the problem with the PSO method is that the time required for convergence may be long if the range of the search space is large. This paper proposes a hybrid method, which combines P&O and PSO methods. Initially, the P&O method is employed to locate the nearest local maximum. Then, starting from that point, the PSO method is employed to search for the GMP. The advantage of using the proposed hybrid method is that the search space for the PSO is reduced, and hence, the time that is required for convergence can be greatly improved. The excellent performance of the proposed hybrid method is verified by comparing it against the PSO method using an experimental setup.

319 citations
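
The two-stage idea described above can be sketched as follows: a P&O hill-climb first settles on the nearest local peak of a synthetic multi-peak power-voltage curve, and a small PSO swarm then searches a reduced voltage window around that point for the global maximum power point. The synthetic curve, step sizes, swarm parameters, and window width are illustrative assumptions rather than the paper's experimental setup.

```python
import numpy as np

def power_curve(v):
    """Synthetic multi-peak P(V) curve standing in for a partially shaded PV string."""
    return 40.0 * np.exp(-((v - 12.0) ** 2) / 8.0) + 70.0 * np.exp(-((v - 25.0) ** 2) / 10.0)

def perturb_and_observe(v0=5.0, step=0.5, iters=60):
    """Stage 1: classic P&O hill-climbing toward the nearest local power peak."""
    v, p = v0, power_curve(v0)
    direction = 1.0
    for _ in range(iters):
        v_next = v + direction * step
        p_next = power_curve(v_next)
        if p_next < p:               # power dropped, so reverse the perturbation
            direction = -direction
        v, p = v_next, p_next
    return v

def pso_refine(v_center, half_width=15.0, n_particles=8, iters=40, seed=1):
    """Stage 2: PSO over a reduced voltage window centered on the P&O result."""
    rng = np.random.default_rng(seed)
    lo, hi = max(0.0, v_center - half_width), v_center + half_width
    pos = rng.uniform(lo, hi, n_particles)
    vel = np.zeros(n_particles)
    pbest, pbest_val = pos.copy(), power_curve(pos)
    gbest = pbest[np.argmax(pbest_val)]
    w, c1, c2 = 0.6, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        val = power_curve(pos)
        improved = val > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmax(pbest_val)]
    return gbest, float(power_curve(gbest))

if __name__ == "__main__":
    v_local = perturb_and_observe()        # nearest local maximum
    v_gmp, p_gmp = pso_refine(v_local)     # global maximum point in the reduced window
    print(f"P&O peak near V={v_local:.1f}, PSO GMP near V={v_gmp:.1f} (P={p_gmp:.1f})")
```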

Journal ArticleDOI
TL;DR: This paper proposes a distance-based locally informed particle swarm (LIPS) optimizer, which eliminates the need to specify any niching parameter and enhances the fine search ability of PSO.
Abstract: Multimodal optimization amounts to finding multiple global and local optima (as opposed to a single solution) of a function, so that the user can have better knowledge about different optimal solutions in the search space and, when needed, the current solution may be switched to a more suitable one while still maintaining optimal system performance. Niching particle swarm optimizers (PSOs) have been widely used by the evolutionary computation community for solving real-parameter multimodal optimization problems. However, most of the existing PSO-based niching algorithms are difficult to use in practice because of their poor local search ability and requirement of prior knowledge to specify certain niching parameters. This paper addresses these issues by proposing a distance-based locally informed particle swarm (LIPS) optimizer, which eliminates the need to specify any niching parameter and enhances the fine search ability of PSO. Instead of using the global best particle, LIPS uses several local bests to guide the search of each particle. LIPS can operate as a stable niching algorithm by using the information provided by its neighborhoods. The neighborhoods are estimated in terms of Euclidean distance. The algorithm is compared with a number of state-of-the-art evolutionary multimodal optimizers on 30 commonly used multimodal benchmark functions. The experimental results suggest that the proposed technique is able to provide statistically superior and more consistent performance over the existing niching algorithms on the test functions, without incurring any severe computational burdens.

319 citations
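
A minimal sketch of the locally informed update described above, assuming a FIPS-style rule in which each particle is pulled toward a random convex combination of the personal bests of its nearest neighbors (neighbors measured by Euclidean distance among the personal bests). The constriction coefficient, neighborhood size, and Rastrigin test function are illustrative choices, not the exact LIPS formulation.

```python
import numpy as np

def rastrigin(x):
    """Standard multimodal benchmark used only to exercise the sketch."""
    return float(10.0 * x.size + np.sum(x * x - 10.0 * np.cos(2.0 * np.pi * x)))

def lips_sketch(f, dim=5, bounds=(-5.12, 5.12), n_particles=30, nsize=3,
                iters=300, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([f(x) for x in pos])
    chi, phi_total = 0.7298, 4.1                  # constriction-style coefficients (assumed)
    for _ in range(iters):
        for i in range(n_particles):
            # The nsize nearest personal bests (Euclidean distance), excluding particle i's own.
            d = np.linalg.norm(pbest - pbest[i], axis=1)
            d[i] = np.inf
            nbrs = np.argsort(d)[:nsize]
            # Local attractor: random convex combination of the neighbors' personal bests.
            phi = rng.uniform(0.0, phi_total / nsize, (nsize, 1))
            attractor = np.sum(phi * pbest[nbrs], axis=0) / np.sum(phi)
            vel[i] = chi * (vel[i] + np.sum(phi) * (attractor - pos[i]))
            pos[i] = np.clip(pos[i] + vel[i], lo, hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i].copy(), val
    best = int(np.argmin(pbest_val))
    return pbest[best], pbest_val[best]

if __name__ == "__main__":
    x_best, f_best = lips_sketch(rastrigin)
    print(f"best f = {f_best:.4f}")
```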

Journal ArticleDOI
TL;DR: The experimental results show the efficiency of the clustering PSO for locating and tracking multiple optima in dynamic environments in comparison with other particle swarm optimization models based on the multiswarm method.
Abstract: In the real world, many optimization problems are dynamic. This requires an optimization algorithm to not only find the global optimal solution under a specific environment but also to track the trajectory of the changing optima over dynamic environments. To address this requirement, this paper investigates a clustering particle swarm optimizer (PSO) for dynamic optimization problems. This algorithm employs a hierarchical clustering method to locate and track multiple peaks. A fast local search method is also introduced to search optimal solutions in a promising subregion found by the clustering method. An experimental study is conducted based on the moving peaks benchmark to test the performance of the clustering PSO in comparison with several state-of-the-art algorithms from the literature. The experimental results show the efficiency of the clustering PSO for locating and tracking multiple optima in dynamic environments in comparison with other particle swarm optimization models based on the multiswarm method.

317 citations
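
A compact sketch of the clustering multi-swarm idea above: the swarm is partitioned into sub-swarms by hierarchical clustering of the personal-best positions, each sub-swarm follows its own local best, and the memory is re-evaluated when a fixed sentinel point detects a change in the landscape. The cone-shaped peak landscape, the single-linkage distance threshold, re-clustering every iteration, and the sentinel-based change detection are illustrative assumptions, not the paper's specific clustering method or the moving peaks benchmark itself.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def peaks_value(x, centers, heights, widths):
    """Cone-shaped peaks; the highest covering cone defines the fitness (maximized)."""
    d = np.linalg.norm(centers - x, axis=1)
    return float(np.max(heights - widths * d))

def cluster_particles(points, dist_threshold=8.0):
    """Group particles whose personal bests lie within the threshold (single linkage)."""
    labels = fcluster(linkage(points, method="single"), t=dist_threshold, criterion="distance")
    return [np.where(labels == c)[0] for c in np.unique(labels)]

def clustering_pso(dim=2, bounds=(0.0, 100.0), n_particles=40, iters=400, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    centers = rng.uniform(lo, hi, (5, dim))       # five peaks that drift over time (assumed landscape)
    heights = rng.uniform(30.0, 70.0, 5)
    widths = np.full(5, 1.0)
    f = lambda x: peaks_value(x, centers, heights, widths)
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.array([f(x) for x in pos])
    sentinel = rng.uniform(lo, hi, dim)           # fixed probe point for change detection
    sentinel_val = f(sentinel)
    for t in range(iters):
        if t % 100 == 99:                         # the environment changes: peaks drift
            centers = np.clip(centers + rng.normal(0.0, 2.0, centers.shape), lo, hi)
        probe = f(sentinel)
        if abs(probe - sentinel_val) > 1e-12:     # change detected: refresh outdated memory
            pbest_val = np.array([f(x) for x in pbest])
            sentinel_val = probe
        for sub in cluster_particles(pbest):      # one sub-swarm per cluster of personal bests
            lbest = sub[np.argmax(pbest_val[sub])]
            for i in sub:
                r1, r2 = rng.random(dim), rng.random(dim)
                vel[i] = (0.6 * vel[i] + 1.7 * r1 * (pbest[i] - pos[i])
                          + 1.7 * r2 * (pbest[lbest] - pos[i]))
                pos[i] = np.clip(pos[i] + vel[i], lo, hi)
                val = f(pos[i])
                if val > pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i].copy(), val
    return pbest[np.argmax(pbest_val)], float(np.max(pbest_val))

if __name__ == "__main__":
    x_best, h_best = clustering_pso()
    print(f"best peak height found = {h_best:.2f}")
```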

Book ChapterDOI
07 Aug 2006
TL;DR: Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).
Abstract: In this paper, we present a new swarm intelligence algorithm, namely, Cat Swarm Optimization (CSO). CSO was developed by observing the behaviors of cats and is composed of two sub-models, i.e., tracing mode and seeking mode, which model these behaviors. Experimental results using six test functions demonstrate that CSO has much better performance than Particle Swarm Optimization (PSO).

316 citations
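
A minimal sketch of CSO's two modes as summarized above: most cats are in seeking mode (evaluate a handful of locally mutated copies of their position and keep the best), while a small fraction are in tracing mode (a velocity update pulled toward the best cat found so far). The mixture ratio, number of seeking copies, mutation scale, and sphere test function are illustrative assumptions rather than the authors' exact parameterization.

```python
import numpy as np

def sphere(x):
    """Simple benchmark used only to exercise the sketch."""
    return float(np.sum(x * x))

def cat_swarm_sketch(f, dim=10, bounds=(-5.0, 5.0), n_cats=20, iters=200,
                     mixture_ratio=0.2, smp=5, srd=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_cats, dim))
    vel = np.zeros((n_cats, dim))
    fit = np.array([f(x) for x in pos])
    best = pos[np.argmin(fit)].copy()
    for _ in range(iters):
        tracing = rng.random(n_cats) < mixture_ratio       # which cats trace this iteration
        for i in range(n_cats):
            if tracing[i]:
                # Tracing mode: velocity update toward the best cat found so far.
                r = rng.random(dim)
                vel[i] = vel[i] + 2.0 * r * (best - pos[i])
                pos[i] = np.clip(pos[i] + vel[i], lo, hi)
            else:
                # Seeking mode: evaluate smp locally mutated copies and keep the best one.
                copies = np.clip(pos[i] + rng.uniform(-srd, srd, (smp, dim)) * (hi - lo), lo, hi)
                vals = np.array([f(c) for c in copies])
                k = int(np.argmin(vals))
                if vals[k] < fit[i]:
                    pos[i] = copies[k]
            fit[i] = f(pos[i])
        best = pos[np.argmin(fit)].copy()
    return best, float(np.min(fit))

if __name__ == "__main__":
    x_best, f_best = cat_swarm_sketch(sphere)
    print(f"best f = {f_best:.3e}")
```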


Network Information
Related Topics (5)
Fuzzy logic: 151.2K papers, 2.3M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 87% related
Support vector machine: 73.6K papers, 1.7M citations, 86% related
Artificial neural network: 207K papers, 4.5M citations, 85% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    183
2022    471
2021    10
2020    7
2019    26
2018    171