Journal ArticleDOI

Enhancing particle swarm optimization using generalized opposition-based learning

TL;DR: An enhanced PSO algorithm called GOPSO is presented, which employs generalized opposition-based learning (GOBL) and Cauchy mutation to overcome the problem of premature convergence when solving complex problems.
About: This article was published in Information Sciences on 2011-10-01 and has received 384 citations to date. The article focuses on the topics: Multi-swarm optimization & Particle swarm optimization.
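The two ingredients named in the TL;DR can be sketched in a few lines. This is a minimal illustration, not the paper's exact implementation: it assumes the common GOBL form x* = k*(a + b) - x over the current search bounds [a, b] with a random coefficient k, and a standard Cauchy perturbation for the mutation step:

```python
import math
import random

def gobl_opposite(x, lower, upper, k=None):
    """Generalized opposition-based learning: map each coordinate x_j in
    [lower_j, upper_j] to k*(lower_j + upper_j) - x_j, with k drawn
    uniformly from [0, 1) when not given."""
    if k is None:
        k = random.random()
    opposite = [k * (lo + hi) - xj for xj, lo, hi in zip(x, lower, upper)]
    # Clamp back into the search bounds in case the opposite point falls outside.
    return [min(max(oj, lo), hi) for oj, lo, hi in zip(opposite, lower, upper)]

def cauchy_mutation(x, scale=1.0):
    """Cauchy mutation: add heavy-tailed noise, whose occasional long jumps
    help the best particle escape local minima."""
    return [xj + scale * math.tan(math.pi * (random.random() - 0.5)) for xj in x]
```

In a GOPSO-style loop, both the current swarm and its GOBL opposites would be evaluated and the fitter points kept, with Cauchy mutation applied to the best particle.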
Citations
Journal ArticleDOI
01 Jan 2018
TL;DR: This paper introduces the origin and background of PSO, carries out a theoretical analysis, and reviews the present state of research and application in algorithm structure, parameter selection, topology structure, discrete and parallel PSO algorithms, multi-objective optimization PSO, and its engineering applications.
Abstract: Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm motivated by the intelligent collective behavior of some animals, such as flocks of birds or schools of fish. Since it was presented in 1995, it has experienced a multitude of enhancements. As researchers have learned about the technique, they have derived new versions aimed at different demands, developed new applications in a host of areas, published theoretical studies of the effects of the various parameters, and proposed many variants of the algorithm. This paper introduces its origin and background and carries out a theoretical analysis of PSO. Then, we analyze the present state of research and application in algorithm structure, parameter selection, topology structure, discrete and parallel PSO algorithms, multi-objective optimization PSO, and its engineering applications. Finally, the existing problems are analyzed and future research directions are presented.

1,091 citations


Cites methods from "Enhancing particle swarm optimizati..."

  • ...Wang et al. (2011) proposed an iterative multi-objective particle swarm optimization-based control vector parameterization to cope with the dynamic optimization of the state constrained chemical and biochemical engineering problems....


  • ...…operators to the PSO algorithm, such as selection (Angeline 1998a, b; Lovbjerg et al. 2001), crossover (Angeline 1998b; Chen et al. 2014), mutation (Tsafarakis et al. 2013) or Cauchy mutation (Wang et al. 2011) to increase the diversity and improve its ability to escape from the local minima....


Journal ArticleDOI
TL;DR: This survey presented a comprehensive investigation of PSO, including its modifications, extensions, and applications to the following nine fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology.
Abstract: Particle swarm optimization (PSO) is a heuristic global optimization method, proposed originally by Kennedy and Eberhart in 1995. It is now one of the most commonly used optimization techniques. This survey presented a comprehensive investigation of PSO. On one hand, we provided advances with PSO, including its modifications (including quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topology (as fully connected, von Neumann, ring, star, random, etc.), hybridization (with genetic algorithm, simulated annealing, Tabu search, artificial immune system, ant colony algorithm, artificial bee colony, differential evolution, harmonic search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementation (in multicore, multiprocessor, GPU, and cloud computing forms). On the other hand, we offered a survey on applications of PSO to the following nine fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. It is hoped that this survey would be beneficial for the researchers studying PSO algorithms.

836 citations


Cites methods from "Enhancing particle swarm optimizati..."

  • ...[74] presented an enhanced PSO algorithm called GOPSO, which employed generalized OBL (GOBL) and Cauchy mutation....


Journal ArticleDOI
TL;DR: The paper mainly covers the fundamental algorithmic frameworks such as decomposition and non-decomposition methods, and their current applications in the field of large-scale global optimization.

382 citations

Journal ArticleDOI
TL;DR: A hybrid PSO algorithm is proposed, called DNSPSO, which employs a diversity-enhancing mechanism and neighborhood search strategies to achieve a trade-off between exploration and exploitation abilities.

366 citations


Cites background or methods from "Enhancing particle swarm optimizati..."

  • ...Functions D CPSO-H [1] CLPSO [34] APSO [59] GOPSO [53] DNSCLPSO DNSPSO...


  • ...These problems were utilized in previous studies [34,53]....


  • ...Functions CPSO-H [1] CLPSO [34] APSO [59] GOPSO [53] DNSCLPSO DNSPSO...


  • ...Functions CLPSO [34] GOPSO [53] MA-SW-Chains [39] DNSCLPSO DNSPSO...


  • ...For D = 100, MAX_FEs is set to 5000 D [53]....


Journal ArticleDOI
15 Mar 2016 - Energy
TL;DR: The performance of GOTLBO is comprehensively evaluated on thirteen benchmark functions and two parameter identification problems of solar cell models, i.e., the single diode and double diode models.

284 citations

References
Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.

35,104 citations


"Enhancing particle swarm optimizati..." refers methods in this paper

  • ...Shi and Eberhart [30] introduced a parameter called inertia weight w for the original PSO....


  • ...Particle swarm optimization (PSO) is a relatively new optimization technique, which was developed by Kennedy and Eberhart [17]....


  • ...The inertia factor w was proposed by Shi and Eberhart [30], rand1ij and rand2ij are two random numbers generated independently within the range of [0,1], c1 and c2 are two learning factors which control the influence of the social and cognitive components, and t = 1,2, . . . indicates the iteration number....

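The velocity update described in the snippets above (inertia weight w, random factors rand1ij and rand2ij in [0, 1], learning factors c1 and c2) can be sketched as follows. The default values w = 0.729 and c1 = c2 = 1.49445 are common choices from the PSO literature, not values taken from this paper:

```python
import random

def pso_step(x, v, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445):
    """One PSO update for a single particle with position x and velocity v.
    pbest is this particle's best-known position, gbest the swarm's best."""
    new_v = [
        w * vj                                # inertia component
        + c1 * random.random() * (pj - xj)    # cognitive component
        + c2 * random.random() * (gj - xj)    # social component
        for vj, xj, pj, gj in zip(v, x, pbest, gbest)
    ]
    new_x = [xj + vj for xj, vj in zip(x, new_v)]
    return new_x, new_v
```

When x, pbest, and gbest coincide, the cognitive and social terms vanish and the velocity simply decays by the factor w, which is why w < 1 dampens the swarm over time.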

Journal Article
TL;DR: A set of simple, yet safe and robust non-parametric tests for statistical comparisons of classifiers is recommended: the Wilcoxon signed ranks test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparisons of more classifiers over multiple data sets.
Abstract: While methods for comparing two learning algorithms on a single data set have been scrutinized for quite some time already, the issue of statistical tests for comparisons of more algorithms on multiple data sets, which is even more essential to typical machine learning studies, has been all but ignored. This article reviews the current practice and then theoretically and empirically examines several suitable tests. Based on that, we recommend a set of simple, yet safe and robust non-parametric tests for statistical comparisons of classifiers: the Wilcoxon signed ranks test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparison of more classifiers over multiple data sets. Results of the latter can also be neatly presented with the newly introduced CD (critical difference) diagrams.

10,306 citations


"Enhancing particle swarm optimizati..." refers methods in this paper

  • ...To compare the performance differences between GOPSO and the other eight PSO algorithms, we conduct a Wilcoxon signed-ranks test [9,14]....

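A pairwise Wilcoxon signed-ranks comparison of two algorithms over a set of benchmark functions, like the one referenced above, can be run with SciPy. The error values below are hypothetical placeholders, not results from the paper:

```python
from scipy.stats import wilcoxon

# Hypothetical final mean errors of two PSO variants on the same eight benchmarks.
gopso_errors = [1.2e-8, 3.4e-5, 0.021, 1.7, 0.003, 4.1e-6, 0.54, 0.09]
other_errors = [6.5e-7, 2.9e-4, 0.035, 2.3, 0.004, 8.8e-5, 0.61, 0.15]

# Wilcoxon signed-ranks test on the paired differences; a small p-value
# indicates a systematic performance difference between the two algorithms.
stat, p_value = wilcoxon(gopso_errors, other_errors)
print(stat, p_value)
```

The test ranks the absolute paired differences and compares the positive and negative rank sums, making it robust to the non-normal error distributions typical of optimization benchmarks.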

Proceedings ArticleDOI
04 May 1998
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, in which each particle adjusts its flying according to its own flying experience and its companions' flying experience, resembling a school of flying birds.
Abstract: Evolutionary computation techniques, genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions are manipulated according to the rule of survival of the fittest through "genetic" operations, such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm through simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, these individuals are "evolved" by cooperation and competition among the individuals themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.

9,373 citations


"Enhancing particle swarm optimizati..." refers background or methods in this paper

  • ...PSO with inertia weight (PSO-w) [30]; PSO with constriction factor (PSO-cf) [7]; unified PSO (UPSO) [24]; fitness-distance-ratio based PSO (FDR-PSO) [25]; fully informed particle swarm (FIPS) [22]; cooperative PSO (CPSO-H) [1]; comprehensive learning PSO (CLPSO) [19]; opposition-based PSO with Cauchy mutation (OPSO) [37]; our approach (GOPSO)....


  • ...The inertia factor w was proposed by Shi and Eberhart [30], rand1ij and rand2ij are two random numbers generated independently within the range of [0,1], c1 and c2 are two learning factors which control the influence of the social and cognitive components, and t = 1,2, . . . indicates the iteration number....


  • ...Particle swarm optimization (PSO) is a relatively new optimization technique, which was developed by Kennedy and Eberhart [17]....



  • ...Shi and Eberhart [30] introduced a parameter called inertia weight w for the original PSO....
