Enhancing particle swarm optimization using generalized opposition-based learning
Citations
1,091 citations
Cites methods from "Enhancing particle swarm optimizati..."
...Wang et al. (2011) proposed an iterative multi-objective particle swarm optimization-based control vector parameterization to cope with the dynamic optimization of state-constrained chemical and biochemical engineering problems....
[...]
...…operators to the PSO algorithm, such as selection (Angeline 1998a, b; Lovbjerg et al. 2001), crossover (Angeline 1998b; Chen et al. 2014), mutation (Tsafarakis et al. 2013), or Cauchy mutation (Wang et al. 2011) to increase the diversity and improve its ability to escape from local minima....
[...]
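The Cauchy mutation mentioned in the snippet above can be sketched as follows; the function name and the inverse-transform sampling are illustrative assumptions, not the cited papers' exact implementation:

```python
import math
import random

def cauchy_mutation(x, scale=1.0):
    """Perturb a position vector with Cauchy-distributed noise (sketch).

    Samples are drawn by inverse-transform sampling: scale * tan(pi * (u - 0.5))
    for u ~ U(0, 1). The Cauchy distribution's heavy tails make large jumps
    more likely than Gaussian mutation, which helps particles escape local minima.
    """
    return [xi + scale * math.tan(math.pi * (random.random() - 0.5)) for xi in x]
```

In practice such a mutation is typically applied to the global best particle (or a random subset of particles) once per generation, so most of the swarm still follows the standard update.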
836 citations
Cites methods from "Enhancing particle swarm optimizati..."
...[74] presented an enhanced PSO algorithm called GOPSO, which employed generalized OBL (GOBL) and Cauchy mutation....
[...]
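The generalized OBL (GOBL) step referenced in the snippet above maps a candidate to a "generalized opposite" point inside the current search bounds. A minimal sketch, assuming the transform k·(a + b) − x with k drawn uniformly from [0, 1] (the function name is an illustrative choice; k = 1 recovers classic opposition-based learning):

```python
import random

def gobl_opposite(x, a, b, k=None):
    """Generalized opposition-based learning transform (sketch).

    x      -- candidate position, one value per dimension
    a, b   -- lower and upper bounds per dimension
    k      -- opposition factor in [0, 1]; drawn at random if not given

    Returns the generalized opposite point k*(a + b) - x per dimension.
    """
    if k is None:
        k = random.random()  # a fresh random k, as in generalized OBL
    return [k * (ai + bi) - xi for xi, ai, bi in zip(x, a, b)]
```

The opposite candidate is then evaluated alongside the original, and the fitter of the two survives, which is what makes OBL-style schemes useful for jump-starting convergence.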
382 citations
366 citations
Cites background or methods from "Enhancing particle swarm optimizati..."
...Functions D CPSO-H [1] CLPSO [34] APSO [59] GOPSO [53] DNSCLPSO DNSPSO...
[...]
...These problems were utilized in previous studies [34,53]....
[...]
...Functions CPSO-H [1] CLPSO [34] APSO [59] GOPSO [53] DNSCLPSO DNSPSO...
[...]
...Functions CLPSO [34] GOPSO [53] MA-SW-Chains [39] DNSCLPSO DNSPSO...
[...]
...For D = 100, MAX_FEs is set to 5000 × D [53]....
[...]
284 citations
References
35,104 citations
"Enhancing particle swarm optimizati..." refers methods in this paper
...Shi and Eberhart [30] introduced a parameter called inertia weight w for the original PSO....
[...]
...Particle swarm optimization (PSO) is a relatively new optimization technique, which was developed by Kennedy and Eberhart [17]....
[...]
...The inertia factor w was proposed by Shi and Eberhart [30]; rand1ij and rand2ij are two random numbers generated independently within the range of [0,1]; c1 and c2 are two learning factors which control the influence of the social and cognitive components; and t = 1,2, . . . indicates the iteration number....
[...]
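The velocity and position update described in these snippets can be written as a short per-particle sketch; the default values for w, c1, and c2 below are common choices from the PSO literature, not values taken from the paper:

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7298, c1=1.49618, c2=1.49618):
    """One PSO update for a single particle (illustrative sketch).

    w       -- inertia weight (Shi and Eberhart)
    c1, c2  -- cognitive and social learning factors
    rand1, rand2 ~ U[0, 1] are drawn independently for each dimension j:
        v[j] <- w*v[j] + c1*rand1*(pbest[j] - x[j]) + c2*rand2*(gbest[j] - x[j])
        x[j] <- x[j] + v[j]
    """
    new_v, new_x = [], []
    for j in range(len(x)):
        r1, r2 = random.random(), random.random()
        vj = w * v[j] + c1 * r1 * (pbest[j] - x[j]) + c2 * r2 * (gbest[j] - x[j])
        new_v.append(vj)
        new_x.append(x[j] + vj)
    return new_x, new_v
```

Note that if a particle sits at both its personal best and the global best with zero velocity, the update leaves it in place, which is why diversity-preserving operators such as mutation are often layered on top.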
16,450 citations
10,306 citations
"Enhancing particle swarm optimizati..." refers methods in this paper
...To compare the performance differences between GOPSO and the other eight PSO algorithms, we conduct a Wilcoxon signed-ranks test [9,14]....
[...]
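The Wilcoxon signed-ranks test mentioned above compares paired per-function results of two algorithms. A self-contained sketch of the test statistic W = min(R+, R−), with zero differences discarded and tied absolute differences given average ranks (the final significance lookup against critical values or a normal approximation is omitted):

```python
def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-ranks statistic W = min(R+, R-) for paired samples.

    Zero differences are discarded; tied absolute differences receive
    the average of the ranks they span. A small W indicates a significant
    difference between the two paired result sets.
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied absolute differences
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    r_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    r_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(r_plus, r_minus)
```

In practice one would use a library routine (e.g. `scipy.stats.wilcoxon`) that also returns a p-value; the sketch above only reproduces the rank-sum bookkeeping behind the statistic.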
9,373 citations
"Enhancing particle swarm optimizati..." refers background or methods in this paper
...PSO with inertia weight (PSO-w) [30]; PSO with constriction factor (PSO-cf) [7]; unified PSO (UPSO) [24]; fitness-distance-ratio based PSO (FDR-PSO) [25]; fully informed particle swarm (FIPS) [22]; cooperative PSO (CPSO-H) [1]; comprehensive learning PSO (CLPSO) [19]; opposition-based PSO with Cauchy mutation (OPSO) [37]; our approach (GOPSO)....
[...]
...The inertia factor w was proposed by Shi and Eberhart [30]; rand1ij and rand2ij are two random numbers generated independently within the range of [0,1]; c1 and c2 are two learning factors which control the influence of the social and cognitive components; and t = 1,2, . . . indicates the iteration number....
[...]
...Particle swarm optimization (PSO) is a relatively new optimization technique, which was developed by Kennedy and Eberhart [17]....
[...]
...Shi and Eberhart [30] introduced a parameter called inertia weight w for the original PSO....
[...]