Diversity enhanced particle swarm optimization with neighborhood search
Citations
1,091 citations
Cites methods from "Diversity enhanced particle swarm optimization with neighborhood search"
...…2002), Bayesian optimization model (Monson and Seppi 2005), chemical reaction optimization (Li et al. 2015b), neighborhood search mechanism (Wang et al. 2013), collision-avoiding mechanism (Blackwell and Bentley 2002), information sharing mechanism (Li et al. 2015a), local search technique…...
836 citations
Additional excerpts
...[91] proposed a hybrid PSO algorithm called...
566 citations
532 citations
Cites methods from "Diversity enhanced particle swarm optimization with neighborhood search"
...An algorithm called “Diversity enhanced with Neighborhood Search Particle Swarm Optimization,” DNSPSO, was proposed (Wang et al., 2013) in which the explorative behavior was controlled by enhancing the diversity of the particles in the swarm....
427 citations
References
35,104 citations
24,053 citations
11,224 citations
"Diversity enhanced particle swarm optimization with neighborhood search" refers background in this paper
...In the past decades, several variant swarm intelligence-based algorithms have been proposed to solve complex benchmark and real-world optimization problems, e.g., Particle Swarm Optimization (PSO) [29], Ant Colony Optimization (ACO) [14], Artificial Bee Colony (ABC) [27], Cat Swarm Optimization [7], etc. Owing to its simple concept, easy implementation, and effectiveness, PSO has become popular in the evolutionary optimization community....
10,306 citations
"Diversity enhanced particle swarm optimization with neighborhood search" refers methods in this paper
...To compare the performance differences among DNSPSO and the other four PSO algorithms, we conduct a Wilcoxon signed-rank test [12,19]....
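The Wilcoxon signed-rank comparison mentioned in this excerpt can be illustrated with a small pure-Python computation of the test statistic W (the smaller of the positive and negative rank sums over the paired differences, with zero differences dropped and tied absolute differences given average ranks). This is a minimal sketch for illustration only; the function name is my own, the p-value computation is omitted, and in practice one would use a statistics library.

```python
def wilcoxon_w(a, b):
    """Wilcoxon signed-rank statistic W for paired samples a and b."""
    diffs = [x - y for x, y in zip(a, b) if x != y]  # drop zero differences
    # Rank the absolute differences, averaging ranks over ties.
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

For two algorithms compared over the same benchmark problems, `a` and `b` would be the paired per-problem results; a small W indicates the differences are consistently in one direction.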
9,373 citations
"Diversity enhanced particle swarm optimization with neighborhood search" refers background or methods in this paper
...Shi and Eberhart [43] introduced a parameter called inertia weight w for the classical PSO....
...Hu and Eberhart [25] used dynamic neighborhood PSO to solve multi-objective optimization problems....
...The parameter w, called the inertia factor, is used to balance the global and local search abilities of particles [43]; rand1ij and rand2ij are two uniform random numbers generated independently within the range [0,1]; c1 and c2 are two learning factors that control the influence of the cognitive and social components; and t = 1, 2, ... is the iteration counter....
...Hu and Eberhart [25] updated the neighborhood of each particle by dynamically selecting m particles that are the nearest to the current particle....
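The dynamic-neighborhood idea attributed to Hu and Eberhart [25] above can be sketched as follows: at each iteration, a particle's neighborhood is recomputed as the m particles nearest to it. This is an illustrative sketch only (the function name is my own, and Euclidean distance is an assumption, since the excerpt does not specify the metric).

```python
import math

def nearest_neighbors(positions, i, m):
    """Return the indices of the m particles nearest to particle i,
    measured by Euclidean distance in the search space."""
    dists = []
    for k, p in enumerate(positions):
        if k == i:
            continue  # a particle is not its own neighbor
        dists.append((math.dist(p, positions[i]), k))
    dists.sort()  # ascending by distance
    return [k for _, k in dists[:m]]
```

The neighborhood-best used in the velocity update would then be the best `pbest` among the returned indices, recomputed every iteration as the particles move.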
...During a search process, each particle is attracted by its previous best particle (pbest) and the global best particle (gbest) as follows [43]....
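The pbest/gbest attraction described in this excerpt is the classical inertia-weight velocity and position update [43]; a minimal sketch for a single particle follows. This is an illustrative implementation of the generic PSO update only, not the DNSPSO code from the cited paper, and the function name and default parameter values are my own assumptions.

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One classical PSO update for a single particle:
    v_ij <- w*v_ij + c1*rand1_ij*(pbest_ij - x_ij) + c2*rand2_ij*(gbest_j - x_ij)
    x_ij <- x_ij + v_ij
    where rand1_ij, rand2_ij are independent uniform draws from [0, 1]."""
    new_v, new_x = [], []
    for j in range(len(x)):
        r1, r2 = random.random(), random.random()
        vj = w * v[j] + c1 * r1 * (pbest[j] - x[j]) + c2 * r2 * (gbest[j] - x[j])
        new_v.append(vj)
        new_x.append(x[j] + vj)
    return new_x, new_v
```

Note that when a particle sits exactly at both its pbest and the gbest with zero velocity, the update leaves it in place, which is one motivation for the diversity-enhancing mechanisms discussed above.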