Topic

Multi-swarm optimization

About: Multi-swarm optimization is a research topic. Over its lifetime, 19162 publications have been published on this topic, receiving 549725 citations.


Papers
Book Chapter
Yaochu Jin, Bernhard Sendhoff
TL;DR: This paper suggests a method for constructing dynamic optimization test problems using multi-objective optimization (MOO) concepts; the method is computationally efficient, easily tunable and functionally powerful.
Abstract: Dynamic optimization using evolutionary algorithms is receiving increasing interest. However, typical test functions for comparing the performance of various dynamic optimization algorithms are still lacking. This paper suggests a method for constructing dynamic optimization test problems using multi-objective optimization (MOO) concepts. By aggregating different objectives of an MOO problem and changing the weights dynamically, we are able to construct dynamic single-objective and multi-objective test problems systematically. The proposed method is computationally efficient, easily tunable and functionally powerful. This is mainly because the proposed method associates dynamic optimization with multi-objective optimization, so the rich set of MOO test problems can easily be adapted to dynamic optimization test functions.

133 citations
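The core construction described in the abstract above can be illustrated with a short sketch: two objectives of a simple bi-objective problem are aggregated with a time-varying weight, so the optimum of the resulting single-objective function drifts over time. The particular objectives f1, f2 and the sinusoidal weight schedule below are illustrative assumptions, not the exact test problems from the paper.

```python
import math

# Two objectives of a simple bi-objective problem (an illustrative choice,
# not necessarily the MOO test problems used in the paper).
def f1(x):
    return x ** 2

def f2(x):
    return (x - 2.0) ** 2

def weight(t, period=100.0):
    """Dynamically changing aggregation weight w(t) in [0, 1]."""
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * t / period))

def dynamic_objective(x, t):
    """Weighted aggregation of the two objectives: as w(t) changes over time,
    the minimizer of the single-objective function moves along the Pareto front."""
    w = weight(t)
    return w * f1(x) + (1.0 - w) * f2(x)

# The minimizer drifts between x = 0 and x = 2 as the weight sweeps,
# giving a simple, tunable dynamic single-objective test problem.
for t in range(0, 101, 25):
    xs = [0.002 * i for i in range(1001)]                 # grid over [0, 2]
    best = min(xs, key=lambda x: dynamic_objective(x, t))
    print(f"t={t:3d}  w={weight(t):.2f}  argmin~{best:.2f}")
```

Because the aggregated optimum moves as w(t) sweeps between 0 and 1, the speed and severity of change can be tuned simply by altering the weight schedule, which is the tunability the abstract refers to.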

Book Chapter
01 Jan 2001
TL;DR: The Particle Swarm Optimization method is modified to locate and evaluate all the global minima of an objective function; the modified method separates the swarm properly when a candidate minimizer is detected.
Abstract: In many optimization applications, escaping from local minima as well as computing all the global minima of an objective function is of vital importance. In this paper the Particle Swarm Optimization method is modified in order to locate and evaluate all the global minima of an objective function. The new approach separates the swarm properly when a candidate minimizer is detected. This technique can also be used for escaping from local minima, which is very important in neural network training.

133 citations
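A rough sketch of the swarm-separation idea from the abstract above, on a one-dimensional bimodal function: after each short burst of standard PSO updates, any detected candidate minimizer is recorded and the particles clustered around it are separated from the swarm, so the remaining particles keep searching for further minima. The burst length, detection tolerance, separation radius, and test function are assumptions for illustration; the authors' actual separation criterion is not reproduced here.

```python
import random

def f(x):
    # Illustrative bimodal objective with two global minima at x = -1 and x = +1;
    # the objective functions used in the paper are not reproduced here.
    return (x * x - 1.0) ** 2

def pso_step(swarm, w=0.7, c1=1.5, c2=1.5):
    """One gbest PSO update; each particle is a dict with keys x, v, px, pf."""
    gbest = min(swarm, key=lambda p: p["pf"])
    for p in swarm:
        p["v"] = (w * p["v"]
                  + c1 * random.random() * (p["px"] - p["x"])
                  + c2 * random.random() * (gbest["px"] - p["x"]))
        p["x"] += p["v"]
        fx = f(p["x"])
        if fx < p["pf"]:
            p["px"], p["pf"] = p["x"], fx

def locate_all_minima(n=60, burst=20, rounds=10, tol=1e-3, radius=0.75):
    """Swarm-splitting sketch: after each short burst, if the best particle
    looks like a candidate minimizer, separate the particles gathered around
    it from the swarm and let the rest keep searching."""
    swarm = []
    for _ in range(n):
        x = random.uniform(-3.0, 3.0)
        swarm.append({"x": x, "v": 0.0, "px": x, "pf": f(x)})
    minima = []
    for _ in range(rounds):
        if len(swarm) < 5:
            break
        for _ in range(burst):
            pso_step(swarm)
        best = min(swarm, key=lambda p: p["pf"])
        if best["pf"] < tol:
            if all(abs(best["px"] - m) > radius for m in minima):
                minima.append(best["px"])
            # Separate the swarm: drop the subswarm around the detected
            # minimizer and reset stale personal bests of the survivors.
            swarm = [p for p in swarm if abs(p["x"] - best["px"]) > radius]
            for p in swarm:
                p["px"], p["pf"] = p["x"], f(p["x"])
    return sorted(minima)

# Typically reports one value near -1 and one near +1.
print([round(m, 3) for m in locate_all_minima()])
```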

Journal Article
TL;DR: From the computational results, it can be concluded that the large-step optimization methods outperform the simulated annealing method and find an optimal schedule more frequently than the other studied methods.

133 citations

Book Chapter
01 Jan 1993
TL;DR: A stochastic search procedure that forms the basis of genetic algorithms (GA) is used to develop near-optimal topologies of load-bearing truss structures, as an adaptation of the ground-structure method of topology optimization.
Abstract: The present paper describes the use of a stochastic search procedure that is the basis of genetic algorithms (GA), in developing near-optimal topologies of load bearing truss structures. The problem addressed is one wherein the structural geometry is created from a specification of load conditions and available support points in the design space. The development of this geometry must satisfy kinematic stability requirements in addition to the usual requirements of structural strength and stiffness. The approach is an adaptation of the ground-structure method of topology optimization, and is implemented in a two-level GA based search. In this process, the kinematic stability constraints are imposed at one level, followed by the treatment of response constraints at a second level of optimization. Singular value decomposition is used to assess the kinematic stability constraint at the first level of design, and results in the creation of a finite number of increasing weight, stable topologies. Member sizing is then introduced at a second level of design, where minimal weight and response constraints are simultaneously considered. At this level, the only admissible topologies are those identified during the first stage and any stable combinations thereof. The design variable representation scheme allows for both the removal and addition of structural members during optimization.

132 citations
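The kinematic-stability screening described above can be pictured with a small sketch: assemble the stiffness matrix of a candidate truss topology, restrict it to the free degrees of freedom, and use the singular value decomposition to test for zero-energy mechanisms (near-zero singular values). This is only one plausible way to apply SVD to the stability constraint; the paper's exact formulation at the first design level may differ.

```python
import numpy as np

def truss_stiffness(nodes, members, ea=1.0):
    """Assemble the global stiffness matrix of a 2D pin-jointed truss.
    nodes: (n, 2) array of coordinates; members: list of (i, j) node pairs."""
    n = len(nodes)
    K = np.zeros((2 * n, 2 * n))
    for i, j in members:
        d = nodes[j] - nodes[i]
        length = np.linalg.norm(d)
        c, s = d / length
        k_local = (ea / length) * np.outer([c, s], [c, s])
        dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
        K[np.ix_(dofs, dofs)] += np.block([[k_local, -k_local],
                                           [-k_local, k_local]])
    return K

def is_kinematically_stable(nodes, members, fixed_dofs, tol=1e-8):
    """SVD-based check: the topology is stable only if the stiffness matrix
    restricted to the free DOFs has no (near-)zero singular values, i.e.
    it admits no zero-energy mechanism."""
    K = truss_stiffness(nodes, members)
    free = [d for d in range(K.shape[0]) if d not in fixed_dofs]
    if not free:
        return True
    sv = np.linalg.svd(K[np.ix_(free, free)], compute_uv=False)
    return sv[-1] > tol * sv[0]

# Simple square ground structure: two supported nodes and two free nodes.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
fixed = {0, 1, 2, 3}                       # nodes 0 and 1 fully pinned (DOFs 0-3)
frame = [(0, 3), (1, 2), (2, 3)]           # unbraced quad: sway mechanism
braced = frame + [(0, 2)]                  # add a diagonal: stable
print(is_kinematically_stable(nodes, frame, fixed))   # expected: False
print(is_kinematically_stable(nodes, braced, fixed))  # expected: True
```

A screen like this only filters out mechanisms; member sizing against strength and stiffness constraints would then happen at the second level of the search, as the abstract describes.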

Proceedings Article
20 Oct 2008
TL;DR: Applying the new particle-generation strategy in PSO efficiently improves the global searching capability and the ability to escape from local minima, making it more likely to obtain the global optimum solution.
Abstract: This paper presents a new approach via multi-particle swarm optimization (MPSO) to solve the unit commitment (UC) problem. A new strategy is presented that generates feasible particles and narrows the search space to the feasible solutions. Several particle swarms are generated by the new strategy, and a local optimum solution is searched in each particle swarm; a new particle swarm is then made up of these local optimum solutions, and the global optimum solution is searched in this new particle swarm. The application of the new generating strategy in PSO can efficiently improve the global searching capability and the ability to escape from local minima. The simulation results show that the method is more efficient than a genetic algorithm and is more likely to obtain the global optimum solution.

132 citations
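The two-stage search described in the abstract above can be sketched on a generic multimodal objective: several independent swarms each return their best (locally optimal) solution, and those solutions then seed a final swarm in which the global optimum is sought. The Rastrigin stand-in objective and all parameters below are assumptions; the unit-commitment cost model and the feasibility-preserving particle generation from the paper are not reproduced.

```python
import math
import random

def rastrigin(x):
    # Stand-in multimodal objective; the unit-commitment cost function and
    # constraints from the paper are not reproduced here.
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def pso(objective, init_positions, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain gbest PSO started from the given positions; returns (best_pos, best_val)."""
    dim = len(init_positions[0])
    pos = [list(p) for p in init_positions]
    vel = [[0.0] * dim for _ in pos]
    pbest = [list(p) for p in pos]
    pbest_f = [objective(p) for p in pos]
    for _ in range(iters):
        g = min(range(len(pos)), key=lambda i: pbest_f[i])
        for i, p in enumerate(pos):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - p[d])
                             + c2 * random.random() * (pbest[g][d] - p[d]))
                p[d] += vel[i][d]
            fx = objective(p)
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(p), fx
    g = min(range(len(pos)), key=lambda i: pbest_f[i])
    return pbest[g], pbest_f[g]

def multi_swarm(objective, dim=2, n_swarms=6, swarm_size=20, lo=-5.12, hi=5.12):
    """Stage 1: several independent swarms each return a locally optimal solution.
    Stage 2: those solutions form a new swarm in which the global optimum is sought."""
    stage1_bests = []
    for _ in range(n_swarms):
        starts = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
        best_pos, _ = pso(objective, starts)
        stage1_bests.append(best_pos)
    return pso(objective, stage1_bests, iters=300)

best_pos, best_val = multi_swarm(rastrigin)
print(best_pos, best_val)   # often close to the global minimum at the origin
```

The second-stage swarm can only improve on the best first-stage solution, which mirrors the staged refinement the paper describes; for the actual UC problem, the particles would additionally have to be generated and updated so that unit-commitment constraints stay satisfied.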


Network Information
Related Topics (5)
Fuzzy logic: 151.2K papers, 2.3M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 87% related
Support vector machine: 73.6K papers, 1.7M citations, 86% related
Artificial neural network: 207K papers, 4.5M citations, 85% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    183
2022    471
2021    10
2020    7
2019    26
2018    171