Journal ArticleDOI

Differential evolution algorithm with ensemble of parameters and mutation strategies

TL;DR: The performance of EPSDE is evaluated on a set of bound-constrained problems and is compared with conventional DE and several state-of-the-art parameter adaptive DE variants.
Abstract: Differential evolution (DE) has attracted much attention recently as an effective approach for solving numerical optimization problems. However, the performance of DE is sensitive to the choice of the mutation strategy and associated control parameters. Thus, to obtain optimal performance, time-consuming parameter tuning is necessary. Different mutation strategies with different parameter settings can be appropriate during different stages of the evolution. In this paper, we propose to employ an ensemble of mutation strategies and control parameters with the DE (EPSDE). In EPSDE, a pool of distinct mutation strategies along with a pool of values for each control parameter coexists throughout the evolution process and competes to produce offspring. The performance of EPSDE is evaluated on a set of bound-constrained problems and is compared with conventional DE and several state-of-the-art parameter adaptive DE variants.
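The ensemble idea described in the abstract can be sketched roughly as follows: each individual carries its own (strategy, F, Cr) combination drawn from candidate pools, and a combination survives only while it keeps producing successful offspring. The pool values and the retention rule below are illustrative, not necessarily the paper's exact settings:

```python
import random

# Illustrative pools -- the paper's actual strategy and parameter pools
# may differ; consult the EPSDE paper for the exact values.
STRATEGIES = ["rand/1/bin", "best/2/bin", "current-to-rand/1"]
F_POOL = [0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
CR_POOL = [0.1, 0.3, 0.5, 0.7, 0.9]

def assign(rng=random):
    """Draw a random (strategy, F, Cr) combination for one individual."""
    return (rng.choice(STRATEGIES), rng.choice(F_POOL), rng.choice(CR_POOL))

def update(combo, trial_succeeded, rng=random):
    """Keep the combination if its trial beat the target; otherwise
    re-draw, so successful settings survive and failing ones are
    replaced -- the pools thereby 'compete to produce offspring'."""
    return combo if trial_succeeded else assign(rng)
```

In a full implementation each population member would call `update` once per generation after the usual DE selection step.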
Citations
More filters
Journal ArticleDOI
TL;DR: A detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far are presented.
Abstract: Differential evolution (DE) is arguably one of the most powerful stochastic real-parameter optimization algorithms in current use. DE operates through similar computational steps as employed by a standard evolutionary algorithm (EA). However, unlike traditional EAs, the DE-variants perturb the current-generation population members with the scaled differences of randomly selected and distinct population members. Therefore, no separate probability distribution has to be used for generating the offspring. Since its inception in 1995, DE has drawn the attention of many researchers all over the world resulting in a lot of variants of the basic algorithm with improved performance. This paper presents a detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far. Also, it provides an overview of the significant engineering applications that have benefited from the powerful nature of DE.
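The perturbation described above (adding the scaled difference of randomly selected, distinct members to a third member) is the classic DE/rand/1 mutation; combined with binomial crossover it yields a trial vector without any separate offspring distribution. A minimal sketch, with illustrative defaults:

```python
import numpy as np

def de_rand_1_bin(pop, i, F=0.5, Cr=0.9, rng=None):
    """One DE/rand/1/bin trial vector for individual i.

    Mutation adds the scaled difference of two randomly chosen,
    distinct members to a third; binomial crossover then mixes the
    mutant with the target vector pop[i]."""
    rng = rng if rng is not None else np.random.default_rng()
    NP, D = pop.shape
    # r1, r2, r3 distinct from each other and from i
    r1, r2, r3 = rng.choice([j for j in range(NP) if j != i],
                            size=3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])
    # inherit each gene from the mutant with probability Cr, forcing
    # at least one mutant gene via the jrand index
    mask = rng.random(D) < Cr
    mask[rng.integers(D)] = True
    return np.where(mask, mutant, pop[i])
```

The greedy selection step (keep the trial only if it is at least as good as the target) completes one DE generation.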

4,321 citations

Journal ArticleDOI
TL;DR: The authors find it high time to provide a critical review of the latest literature published on DE and to point out some important future avenues of research.
Abstract: Differential Evolution (DE) is arguably one of the most powerful and versatile evolutionary optimizers for continuous parameter spaces in recent times. Almost five years have passed since the first comprehensive survey article on DE was published by Das and Suganthan in 2011. Several developments have been reported on various aspects of the algorithm in these five years, and research on and with DE has now reached an impressive state. Considering the huge progress of research with DE and its applications in diverse domains of science and technology, we find it is high time to provide a critical review of the latest literature and to point out some important future avenues of research. The purpose of this paper is to summarize and organize the information on these current developments in DE. Beginning with a comprehensive foundation of the basic DE family of algorithms, we proceed through recent proposals on parameter adaptation of DE, DE-based single-objective global optimizers, DE adapted for various optimization scenarios including constrained, large-scale, multi-objective, multi-modal and dynamic optimization, hybridization of DE with other optimizers, and the multi-faceted literature on applications of DE. The paper also presents a dozen interesting open problems and future research issues on DE.

1,265 citations


Cites methods from "Differential evolution algorithm wi..."

  • ...We begin with a very competitive method, called DE with Ensemble of Parameters and mutation Strategies (EPSDE) [120] to select the individual-specific strategy from a pool of mutation variants as well as values of F and Cr from sets of discrete candidate values within certain ranges....

  • ...EPSDE has since then been enhanced by integrating SaDE-based adaptation [150,151]....

  • ...Recently, [117] proposed a variant of EPSDE where the ensemble of F and Cr values are evolved by using the optimization process of another metaheuristic algorithm called Harmony Search (HS)....

  • ...EPSDE exhibited highly competitive performance on the IEEE CEC 2005 benchmark suite for real parameter optimization....

Journal ArticleDOI
TL;DR: A novel method, called composite DE (CoDE), has been proposed, which uses three trial vector generation strategies and three control parameter settings and randomly combines them to generate trial vectors.
Abstract: Trial vector generation strategies and control parameters have a significant influence on the performance of differential evolution (DE). This paper studies whether the performance of DE can be improved by combining several effective trial vector generation strategies with some suitable control parameter settings. A novel method, called composite DE (CoDE), has been proposed in this paper. This method uses three trial vector generation strategies and three control parameter settings. It randomly combines them to generate trial vectors. CoDE has been tested on all the CEC2005 contest test instances. Experimental results show that CoDE is very competitive.
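The random combination described in the abstract can be sketched as follows: each of three generation strategies produces one trial with a randomly drawn control-parameter setting, and the best trial competes with the target. The (F, Cr) settings below are those commonly reported for CoDE; the strategy names are illustrative labels, and `generate` stands in for the actual trial-vector generators:

```python
import random

# Three control-parameter settings (as commonly reported for CoDE)
# and three strategy labels; check the paper for the exact generators.
SETTINGS = [(1.0, 0.1), (1.0, 0.9), (0.8, 0.2)]
STRATEGIES = ["rand/1/bin", "rand/2/bin", "current-to-rand/1"]

def code_trials(generate, target, fitness):
    """Generate one trial per strategy, each paired with a randomly
    drawn (F, Cr) setting, and return the best of the three.

    generate(strategy, target, F, Cr) is a user-supplied trial-vector
    generator; fitness maps a vector to a value to be minimized."""
    trials = [generate(s, target, *random.choice(SETTINGS))
              for s in STRATEGIES]
    return min(trials, key=fitness)
```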

1,207 citations


Cites background or methods from "Differential evolution algorithm wi..."

  • ...CoDE differs from EPSDE [18] in the following major aspects....

  • ...To the best of our knowledge, EPSDE [18] was the first attempt to provide a systematic framework for combining different trial vector generation strategies with different control parameter settings....

  • ...[18] very recently proposed an ensemble of trial vector generation strategies and control parameters of DE (EPSDE)....

  • ..., JADE [12], jDE [16], SaDE [4], and EPSDE [18]....

Proceedings ArticleDOI
06 Jul 2014
TL;DR: L-SHADE is proposed, extending SHADE with Linear Population Size Reduction (LPSR), which continually decreases the population size according to a linear function; L-SHADE is quite competitive with state-of-the-art evolutionary algorithms.
Abstract: SHADE is an adaptive DE which incorporates success-history based parameter adaptation and one of the state-of-the-art DE algorithms. This paper proposes L-SHADE, which further extends SHADE with Linear Population Size Reduction (LPSR), which continually decreases the population size according to a linear function. We evaluated the performance of L-SHADE on CEC2014 benchmarks and compared its search performance with state-of-the-art DE algorithms, as well as the state-of-the-art restart CMA-ES variants. The experimental results show that L-SHADE is quite competitive with state-of-the-art evolutionary algorithms.
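The linear reduction described above can be written as a one-line schedule: the target population size shrinks linearly from its initial value to a small minimum as the evaluation budget is consumed, and the worst individuals are deleted after each generation to meet the target. A sketch (parameter names are illustrative):

```python
def lpsr_pop_size(nfe, max_nfe, np_init, np_min=4):
    """Linear Population Size Reduction: target population size after
    nfe of max_nfe fitness evaluations, interpolating linearly from
    np_init down to np_min over the full evaluation budget."""
    return round(np_init + (np_min - np_init) * nfe / max_nfe)
```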

1,048 citations


Cites background or methods from "Differential evolution algorithm wi..."

  • ...1 and also outperforms previous DE variants, including JADE [5], CoDE [18], EPSDE [6], SaDE [4], and dynNP-jDE [15]....

  • ...Since this is a significant problem in practice, adaptive mechanisms for adjusting the control parameters on-line during the search process have been studied by many researchers [3]–[6]....

  • ...We compared L-SHADE with the state-of-the-art DE algorithms SHADE 1.1 [9], CoDE [18], EPSDE [6], SaDE [4], JADE [5], and dynNP-jDE [15] on the CEC2014 benchmarks....

  • ...We show experimentally that L-SHADE significantly improves upon the performance of SHADE 1.1 and also outperforms previous DE variants, including JADE [5], CoDE [18], EPSDE [6], SaDE [4], and dynNP-jDE [15]....

  • ...These were used for the experiments in [18], and are based on code originally received from the original authors of CoDE, EPSDE, SaDE and JADE....

Journal ArticleDOI
TL;DR: A fresh treatment is introduced that classifies and discusses existing work within three rational aspects: what and how EA components contribute to exploration and exploitation; when and how exploration and exploitation are controlled; and how balance between exploration and exploitation is achieved.
Abstract: “Exploration and exploitation are the two cornerstones of problem solving by search.” For more than a decade, Eiben and Schippers' advocacy for balancing between these two antagonistic cornerstones still greatly influences the research directions of evolutionary algorithms (EAs) [1998]. This article revisits nearly 100 existing works and surveys how such works have answered the advocacy. The article introduces a fresh treatment that classifies and discusses existing work within three rational aspects: (1) what and how EA components contribute to exploration and exploitation; (2) when and how exploration and exploitation are controlled; and (3) how balance between exploration and exploitation is achieved. With a more comprehensive and systematic understanding of exploration and exploitation, more research in this direction may be motivated and refined.

1,029 citations


Cites background or methods from "Differential evolution algorithm wi..."

  • ...This is due to the use of unconventional concepts (e.g., concepts of age and compact GA) or blending different approaches (e.g., ensembles [Mallipeddi et al. 2011])....

  • ...Hybrid [Ghosh et al. 1996; Harik et al. 1999; Paenke et al. 2009; Qin et al. 2009; Lee et al. 2011; Mallipeddi et al. 2011]...

References
Journal ArticleDOI
Rainer Storn, Kenneth Price
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
Abstract: A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed it is demonstrated that the new method converges faster and with more certainty than many other acclaimed global optimization methods. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
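A complete minimal optimizer in the spirit of the method described above fits in a few dozen lines; this is an illustrative DE/rand/1/bin sketch with common default settings, not the authors' reference implementation:

```python
import numpy as np

def differential_evolution(f, bounds, NP=20, F=0.5, Cr=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin minimizer (illustrative sketch).

    f: objective mapping a 1-D array to a scalar to minimize;
    bounds: sequence of (low, high) pairs, one per dimension."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    D = len(bounds)
    pop = lo + rng.random((NP, D)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(NP):
            # mutation: scaled difference of two random members added
            # to a third, all distinct from i
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i],
                                    size=3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # binomial crossover with one forced mutant gene
            mask = rng.random(D) < Cr
            mask[rng.integers(D)] = True
            trial = np.where(mask, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

For example, minimizing the 3-D sphere function `sum(x**2)` over [-5, 5] per dimension drives the best fitness close to zero well within this budget.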

24,053 citations

Book
13 Dec 2005
TL;DR: This volume explores the differential evolution (DE) algorithm in both principle and practice and is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.
Abstract: Problems demanding globally optimal solutions are ubiquitous, yet many are intractable when they involve constrained functions having many local optima and interacting, mixed-type variables.The differential evolution (DE) algorithm is a practical approach to global numerical optimization which is easy to understand, simple to implement, reliable, and fast. Packed with illustrations, computer code, new insights, and practical advice, this volume explores DE in both principle and practice. It is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.

5,607 citations

Book
25 Nov 2014
TL;DR: As discussed by the authors, the differential evolution (DE) algorithm is a practical approach to global numerical optimization that is easy to understand, simple to implement, reliable, and fast; the volume is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.
Abstract: Problems demanding globally optimal solutions are ubiquitous, yet many are intractable when they involve constrained functions having many local optima and interacting, mixed-type variables.The differential evolution (DE) algorithm is a practical approach to global numerical optimization which is easy to understand, simple to implement, reliable, and fast. Packed with illustrations, computer code, new insights, and practical advice, this volume explores DE in both principle and practice. It is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.

4,273 citations

Journal ArticleDOI
TL;DR: This paper proposes a self- Adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions.
Abstract: Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimization problems over continuous space, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem crucially depends on appropriately choosing trial vector generation strategies and their associated control parameter values. Employing a trial-and-error scheme to search for the most suitable strategy and its associated parameter settings requires high computational costs. Moreover, at different stages of evolution, different strategies coupled with different parameter settings may be required in order to achieve the best performance. In this paper, we propose a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions. Consequently, a more suitable generation strategy along with its parameter settings can be determined adaptively to match different phases of the search process/evolution. The performance of the SaDE algorithm is extensively evaluated (using codes available from P. N. Suganthan) on a suite of 26 bound-constrained numerical optimization problems and compares favorably with the conventional DE and several state-of-the-art parameter adaptive DE variants.
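The learning mechanism described above can be sketched as a success-rate update: each strategy's selection probability is recomputed from the successes and failures it accumulated over a learning period. This is a simplified sketch of the scheme commonly described for SaDE; the small `eps` floor is an assumption to keep every strategy selectable, and the exact formula should be checked against the paper:

```python
def sade_strategy_probs(ns, nf, eps=0.01):
    """Selection probabilities from per-strategy success counts ns and
    failure counts nf accumulated over a learning period.

    Each strategy's rate is ns/(ns+nf) plus a small eps floor (an
    illustrative assumption, so no probability collapses to zero);
    rates are then normalized to sum to 1."""
    rates = [n / (n + f) + eps if (n + f) > 0 else eps
             for n, f in zip(ns, nf)]
    total = sum(rates)
    return [r / total for r in rates]
```

A strategy that generated more promising solutions in the recent past is then drawn more often in the next learning period.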

3,085 citations