# Particle swarm hybridized with differential evolution: black box optimization benchmarking for noisy functions

## Summary (1 min read)

### 1. INTRODUCTION

- Sections 3 and 4 present the experimentation procedure and the results obtained, respectively.
- Finally, conclusions and remarks are given in Section 6.

### 2. THE ALGORITHM: DEPSO

- Algorithm 1 shows pseudocode of the hybrid DEPSO algorithm developed for this work.
- First, all particles in the swarm S are initialized (as stated in [6]) and evaluated (line 1).
- After this, in each evolution step the particles' positions are updated following the differential variation model of the equations explained previously (lines 4 to 18).
- In addition, the global best position reached so far is updated in order to guide the rest of the swarm.
- Finally, the algorithm returns the best solution found during the whole process.
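
The loop described above can be sketched in Python. This is a minimal illustration only, not the authors' exact operators: the particular variation model (global best plus a scaled difference of two personal bests, with binomial crossover), the parameter names `F` and `CR`, and the bound handling are all assumptions made for the sketch.

```python
import random

def depso(f, dim, bounds, swarm_size=20, max_evals=None, F=0.5, CR=0.9):
    """Illustrative DEPSO-style loop: a PSO swarm whose particles move via a
    DE-style differential variation of personal bests (sketch, not the
    authors' exact algorithm)."""
    lo, hi = bounds
    max_evals = max_evals or 1000 * dim  # budget noted in the paper: 1000 x DIM
    # Initialization and first evaluation of all particles in the swarm S.
    swarm = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    pbest = [p[:] for p in swarm]                   # personal best positions
    pfit = [f(p) for p in pbest]                    # personal best fitness
    evals = swarm_size
    g = min(range(swarm_size), key=lambda i: pfit[i])  # global best index
    while evals < max_evals:
        for i in range(swarm_size):
            # DE-style variation: add a scaled difference of two other
            # personal bests to the global best, then binomial crossover.
            a, b = random.sample([j for j in range(swarm_size) if j != i], 2)
            jrand = random.randrange(dim)
            trial = []
            for j in range(dim):
                if random.random() < CR or j == jrand:
                    v = pbest[g][j] + F * (pbest[a][j] - pbest[b][j])
                else:
                    v = swarm[i][j]
                trial.append(min(hi, max(lo, v)))   # clamp to the box bounds
            ft = f(trial)
            evals += 1
            swarm[i] = trial
            if ft < pfit[i]:                        # update personal best
                pbest[i], pfit[i] = trial, ft
                if pfit[i] < pfit[g]:               # update global best
                    g = i
    # Return the best solution found during the whole process.
    return pbest[g], pfit[g]
```

A short run on a sphere function, e.g. `depso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))`, shows the intended usage.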

### 3. EXPERIMENTAL PROCEDURE

- The authors' proposal was tested by performing 15 independent runs for each noisy function and each dimension.
- Table 1 shows the parameter setting used to configure DEPSO.
- These parameters were tuned in the context of the special session of CEC'05 for real parameter optimization [11, 5], reaching results statistically similar to the best participant algorithms (G-CMA-ES [2] and K-PCX [10]) in that session.
- This parameterization was kept the same for all the experiments, and therefore the crafting effort [6] is zero.
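
An experiment of this shape, with one fixed parameterization reused across every function and dimension, can be sketched as follows. The parameter values and names below are placeholders, not the tuned values from Table 1, and `run_benchmark` is a hypothetical harness, not BBOB code.

```python
# One fixed (placeholder) parameter setting, reused everywhere: because no
# per-problem tuning is done, the crafting effort is zero.
PARAMS = {"swarm_size": 20, "F": 0.5, "CR": 0.9}

def run_benchmark(optimizer, functions, dims=(2, 3, 5, 10, 20, 40), runs=15):
    """Perform `runs` independent runs per (function, dimension) pair,
    always with the same PARAMS, collecting the best fitness of each run."""
    results = {}
    for name, f in functions.items():
        for dim in dims:
            results[(name, dim)] = [
                optimizer(f, dim, max_evals=1000 * dim, **PARAMS)
                for _ in range(runs)
            ]
    return results
```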

### 5. CPU TIMING EXPERIMENT

- For the timing experiment, the same DEPSO algorithm was run on f8 until at least 30 seconds had passed (according to the `exampletiming` procedure available in BBOB 2009 [6]).
- These experiments were conducted on an Intel(R) Core(TM)2 CPU at 1.66 GHz with 1 GB of RAM, running Linux Ubuntu 8.10, using the C code provided.
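
The timing procedure amounts to repeating full runs until a wall-clock threshold is reached and reporting the average cost. A minimal sketch, assuming a hypothetical `run_once` callable that performs one complete run (e.g. DEPSO on f8):

```python
import time

def time_per_run(run_once, min_seconds=30.0):
    """Repeat complete runs until at least `min_seconds` of wall-clock time
    have elapsed, then return the average time per run (sketch of a
    BBOB-style timing experiment, not the official exampletiming code)."""
    runs = 0
    start = time.perf_counter()
    while time.perf_counter() - start < min_seconds:
        run_once()
        runs += 1
    elapsed = time.perf_counter() - start
    return elapsed / runs
```

For a quick check one can pass a cheap workload and a small threshold, e.g. `time_per_run(lambda: sum(i * i for i in range(10_000)), min_seconds=0.1)`.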

### 6. CONCLUSION

- The authors constructed a simple and easy-to-implement optimization algorithm by hybridizing the Particle Swarm Optimizer with Differential Evolution operations.
- The experiments were carried out in the context of the special session on real parameter Black-Box Optimization Benchmarking (GECCO BBOB 2009), following the complete procedure established there and dealing with noisy functions of dimension 2, 3, 5, 10, 20, and 40.
- The authors' proposal obtained a good coverage rate for dimensions 2, 3, 5, and 10, specifically on multimodal functions with moderate and severe noise.
- The fact that the same parameter setting was used for all functions (and dimensions), together with the relatively small number of function evaluations used (1000 × DIM), leads the authors to think that DEPSO can be easily improved to better cover noisy functions of higher dimension.
