Journal ArticleDOI

A memory based differential evolution algorithm for unconstrained optimization

01 Jan 2016-Vol. 38, pp 501-517
TL;DR: Numerical, statistical, and graphical analysis reveals the competency of the proposed MBDE, which is employed to solve 12 basic, 25 CEC 2005, and 30 CEC 2014 unconstrained benchmark functions.
Abstract: A novel "Memory Based DE" (MBDE) algorithm is proposed for unconstrained optimization. The algorithm relies on "swarm mutation" and "swarm crossover", and its robustness is vastly increased by a "use of memory" mechanism. It obtains competitive performance against state-of-the-art methods, with a better convergence rate and better efficiency. In optimization, the performance of differential evolution (DE) and of the hybrid DE versions that exist in the literature is highly affected by an inappropriate choice of operators such as mutation and crossover. In general practice, DE does not employ any strategy for memorizing the best-so-far results obtained in earlier generations. In this paper, a new "Memory Based DE (MBDE)" is presented in which two "swarm operators" are introduced. These operators are based on the pBEST and gBEST mechanisms of particle swarm optimization. The proposed MBDE is employed to solve 12 basic, 25 CEC 2005, and 30 CEC 2014 unconstrained benchmark functions. In order to further test its efficacy, five different test systems of the model order reduction (MOR) problem for single-input single-output systems are solved by MBDE. The results of MBDE are compared with those of state-of-the-art algorithms that have also solved these problems. Numerical, statistical, and graphical analysis reveals the competency of the proposed MBDE.
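The abstract names the pBEST/gBEST-based "swarm operators" but not their exact form, so the following is only a hypothetical sketch of what a memory-based swarm mutation could look like: each member is pulled toward its personal best and the global best stored in memory, on top of a usual DE difference term. The function name, constants, and operator details are assumptions for illustration, not the paper's actual equations.

```python
import numpy as np

def swarm_mutation(pop, pbest, gbest, F=0.5, rng=None):
    """Hypothetical 'swarm mutation' sketch: perturb each member toward
    its personal best (pBEST) and the global best (gBEST) kept in memory,
    plus a standard DE difference term.  The MBDE paper's exact operator
    may differ from this illustration."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    mutants = np.empty_like(pop)
    for i in range(n):
        # pick two distinct random members, both different from i
        r1, r2 = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
        mutants[i] = (pop[i]
                      + F * (pbest[i] - pop[i])      # pull toward personal memory
                      + F * (gbest - pop[i])         # pull toward global memory
                      + F * (pop[r1] - pop[r2]))     # DE-style difference term
    return mutants

rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(10, 3))
pbest = pop.copy()                               # each member's best-so-far position
gbest = pop[np.argmin((pop ** 2).sum(axis=1))]   # best-so-far on the sphere function
print(swarm_mutation(pop, pbest, gbest).shape)   # (10, 3)
```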
Citations
Journal ArticleDOI
TL;DR: A well-defined “generation rate” term is proved to invigorate EO’s ability in exploration, exploitation, and local minima avoidance, and its performance is statistically similar to SHADE and LSHADE-SPACMA.
Abstract: This paper presents a novel optimization algorithm called Equilibrium Optimizer (EO), inspired by control-volume mass-balance models used to estimate both dynamic and equilibrium states. In EO, each particle (solution) with its concentration (position) acts as a search agent. The search agents randomly update their concentrations with respect to the best-so-far solutions, namely the equilibrium candidates, to finally reach the equilibrium state (optimal result). A well-defined “generation rate” term is shown to invigorate EO’s ability in exploration, exploitation, and local-minima avoidance. The proposed algorithm is benchmarked on 58 unimodal, multimodal, and composition functions and three engineering application problems. Results of EO are compared to three categories of existing optimization methods: (i) the most well-known meta-heuristics, including the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO); (ii) recently developed algorithms, including the Grey Wolf Optimizer (GWO), Gravitational Search Algorithm (GSA), and Salp Swarm Algorithm (SSA); and (iii) high-performance optimizers, including CMA-ES, SHADE, and LSHADE-SPACMA. Using the average rank of the Friedman test over all 58 mathematical functions, EO outperforms PSO, GWO, GA, GSA, SSA, and CMA-ES by 60%, 69%, 94%, 96%, 77%, and 64%, respectively, while it is outperformed by SHADE and LSHADE-SPACMA by 24% and 27%, respectively. The Bonferroni–Dunn and Holm’s tests for all functions showed that EO is a significantly better algorithm than PSO, GWO, GA, GSA, SSA, and CMA-ES, while its performance is statistically similar to SHADE and LSHADE-SPACMA. The source code of EO is publicly available at https://github.com/afshinfaramarzi/Equilibrium-Optimizer, http://built-envi.com/portfolio/equilibrium-optimizer/, and http://www.alimirjalili.com/SourceCodes/EOcode.zip.
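The update described above (search agents moving toward equilibrium candidates, plus a "generation rate" term) can be sketched roughly as below. The form of the exploration term F and the generation-rate term G is a simplified reading of EO, and the constants are illustrative; the published equations should be consulted for the exact formulation.

```python
import numpy as np

def eo_update(C, C_eq, lam, t, a1=2.0, V=1.0, rng=None):
    """Simplified sketch of an Equilibrium-Optimizer-style update: each
    concentration C moves toward an equilibrium candidate C_eq via an
    exponential term F, plus a 'generation rate' contribution G.  The
    exact form and constants follow the paper only loosely."""
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(C.shape)
    F = a1 * np.sign(r - 0.5) * (np.exp(-lam * t) - 1.0)   # exploration/exploitation term
    G = 0.5 * rng.random(C.shape) * (C_eq - lam * C) * F   # generation-rate term
    return C_eq + (C - C_eq) * F + (G / (lam * V)) * (1.0 - F)

rng = np.random.default_rng(0)
C = rng.uniform(-1, 1, size=(5, 2))        # particle concentrations (positions)
C_eq = np.zeros(2)                         # an equilibrium candidate (a best-so-far solution)
lam = rng.uniform(0.1, 1.0, size=(5, 2))   # random turnover rate, kept away from zero
print(eo_update(C, C_eq, lam, t=0.5, rng=rng).shape)   # (5, 2)
```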

1,085 citations


Cites background from "A memory based differential evoluti..."

  • ...This mechanism aids in exploitation capability but can increase the chance of getting trapped in local minima if the method does not benefit from global exploration ability [23]....


Journal ArticleDOI
TL;DR: The statistical post hoc analysis revealed that MPA can be nominated as a high-performance optimizer: it is significantly superior to GA, PSO, GSA, CS, SSA, and CMA-ES, while its performance is statistically similar to SHADE and LSHADE-cnEpSin.
Abstract: This paper presents a nature-inspired metaheuristic called the Marine Predators Algorithm (MPA) and its application in engineering. The main inspiration of MPA is the widespread foraging strategy of ocean predators, namely Lévy and Brownian movements, along with the optimal encounter-rate policy in the biological interaction between predator and prey. MPA follows the rules that naturally govern the optimal foraging strategy and encounter-rate policy between predator and prey in marine ecosystems. This paper evaluates MPA's performance on twenty-nine test functions, the CEC-BC-2017 test suite, a randomly generated landscape, three engineering benchmarks, and two real-world engineering design problems in the areas of ventilation and building energy performance. MPA is compared with three classes of existing optimization methods: (1) GA and PSO as the most well-studied metaheuristics, (2) GSA, CS, and SSA as recently developed algorithms, and (3) CMA-ES, SHADE, and LSHADE-cnEpSin as high-performance optimizers and winners of IEEE CEC competitions. Among all methods, MPA gained the second rank and demonstrated very competitive results compared to LSHADE-cnEpSin, the best-performing method and one of the winners of the CEC 2017 competition. The statistical post hoc analysis revealed that MPA can be nominated as a high-performance optimizer: it is significantly superior to GA, PSO, GSA, CS, SSA, and CMA-ES, while its performance is statistically similar to SHADE and LSHADE-cnEpSin. The source code is publicly available at: https://github.com/afshinfaramarzi/Marine-Predators-Algorithm, http://built-envi.com/portfolio/marine-predators-algorithm/, https://www.mathworks.com/matlabcentral/fileexchange/74578-marine-predators-algorithm-mpa, and http://www.alimirjalili.com/MPA.html.

863 citations

Journal ArticleDOI
TL;DR: The journey of Differential Evolution is traced through its basic aspects (population generation, mutation schemes, crossover schemes, parameter variation, and hybridized variants), along with various successful applications of DE.

316 citations

Proceedings ArticleDOI
05 Jun 2017
TL;DR: jSO is an improved variant of the iL-SHADE algorithm, mainly featuring a new weighted version of the mutation strategy; it obtained the best final score among the three compared algorithms under the CEC 2017 evaluation method.
Abstract: Solving single objective real-parameter optimization problems, also known as bound-constrained optimization, is still a challenging task. Such problems arise in engineering optimization, scientific applications, and other real-world settings, and they are usually very complex and computationally expensive. A new algorithm, called jSO, is presented in this paper. The algorithm is an improved variant of the iL-SHADE algorithm, mainly featuring a new weighted version of the mutation strategy. The experiments were performed on the CEC 2017 benchmark functions, which differ from previous competition benchmark functions. A comparison of the proposed jSO algorithm with the L-SHADE algorithm is presented first; from the obtained results we can conclude that jSO performs better than L-SHADE. Next, a comparison of jSO and iL-SHADE is also performed, and jSO obtained better or competitive results. Using the CEC 2017 evaluation method, jSO obtained the best final score among these three algorithms.
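The "weighted version of the mutation strategy" mentioned above can be sketched as a current-to-pBest-w/1 rule in which the pBest attraction weight Fw changes with the fraction of consumed function evaluations. The threshold and scaling constants below are a recollection of the jSO paper, not a verified transcription, and should be checked against the source.

```python
import numpy as np

def jso_mutation(x_i, x_pbest, x_r1, x_r2, F, nfes, max_nfes):
    """Weighted current-to-pBest-style mutation in the spirit of jSO:
    the attraction toward a pBest member is scaled by Fw, which grows
    as the search progresses.  Constants are recalled, not verified."""
    if nfes < 0.2 * max_nfes:
        Fw = 0.7 * F       # early phase: weaker pull toward pBest
    elif nfes < 0.4 * max_nfes:
        Fw = 0.8 * F       # middle phase
    else:
        Fw = 1.2 * F       # late phase: stronger exploitation
    return x_i + Fw * (x_pbest - x_i) + F * (x_r1 - x_r2)

x = np.array([1.0, 2.0])
# with all inputs equal, every difference term vanishes and x is returned
print(jso_mutation(x, x, x, x, F=0.5, nfes=100, max_nfes=1000))  # [1. 2.]
```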

309 citations


Cites background from "A memory based differential evoluti..."

  • ...We can find that DE shows excellent performance and has been applied in many applications and research studies [8], [9], [10], [11], [12], [13]....


Journal ArticleDOI
TL;DR: The main focus of this paper is the family of evolutionary algorithms and their real-life applications; each technique is presented in pseudo-code form, which allows its easy implementation in any programming language.
Abstract: The main focus of this paper is on the family of evolutionary algorithms and their real-life applications. We present the following algorithms: genetic algorithms, genetic programming, differential evolution, evolution strategies, and evolutionary programming. Each technique is presented in the pseudo-code form, which can be used for its easy implementation in any programming language. We present the main properties of each algorithm described in this paper. We also show many state-of-the-art practical applications and modifications of the early evolutionary methods. The open research issues are indicated for the family of evolutionary algorithms.

207 citations

References
Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced; the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
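The particle swarm methodology referred to here is commonly written as the velocity/position update below. This is a minimal sketch with textbook default coefficients; the inertia weight w and acceleration constants c1, c2 are common choices, not values from this paper.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One canonical PSO update: inertia-weighted velocity plus random
    attractions toward each particle's personal best and the swarm's
    global best.  Coefficients are common textbook defaults."""
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new

rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=(6, 2))            # particle positions
v = np.zeros_like(x)                           # initial velocities
x_new, v_new = pso_step(x, v, pbest=x, gbest=x[0], rng=rng)
print(x_new.shape)  # (6, 2)
```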

35,104 citations


"A memory based differential evoluti..." refers background in this paper

  • ...Although quite a good number of DE variants exist in the literature, DE yields further improved results when hybridized with Particle Swarm Optimization (PSO) [20]....


Journal ArticleDOI
Rainer Storn1, Kenneth Price
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
Abstract: A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed it is demonstrated that the new method converges faster and with more certainty than many other acclaimed global optimization methods. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
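The heuristic approach described here is the classic differential evolution scheme, usually written as DE/rand/1/bin: a scaled difference of two random members is added to a third, then binomially crossed with the target vector. A minimal self-contained sketch follows; the parameter values F and CR are common defaults, not prescriptions from the paper.

```python
import numpy as np

def de_rand_1_bin(pop, F=0.5, CR=0.9, rng=None):
    """Classic DE/rand/1/bin trial-vector generation in the style of
    Storn and Price: mutant = x_r1 + F*(x_r2 - x_r3), then binomial
    crossover with the target vector."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    trials = pop.copy()
    for i in range(n):
        # three distinct random members, all different from the target i
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i],
                                size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])
        jrand = rng.integers(d)            # guarantees at least one mutant gene
        mask = rng.random(d) < CR
        mask[jrand] = True
        trials[i] = np.where(mask, mutant, pop[i])
    return trials

rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(6, 3))
print(de_rand_1_bin(pop, rng=rng).shape)  # (6, 3)
```

In a full DE loop, each trial vector would replace its target only if it achieves an equal or better objective value (greedy selection).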

24,053 citations

Journal ArticleDOI
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.
Abstract: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class. These theorems result in a geometric interpretation of what it means for an algorithm to be well suited to an optimization problem. Applications of the NFL theorems to information-theoretic aspects of optimization and benchmark measures of performance are also presented. Other issues addressed include time-varying optimization problems and a priori "head-to-head" minimax distinctions between optimization algorithms, distinctions that result despite the NFL theorems' enforcing of a type of uniformity over all algorithms.

10,771 citations


"A memory based differential evoluti..." refers background in this paper

  • ...Unfortunately, according to the 'No Free Lunch' theorem [19], no single optimization method exists that is able to solve all global optimization problems consistently....


  • ...[19] D.H. Wolpert, W.G. Macready, No Free Lunch Theorems for Optimization, IEEE Transactions on Evolutionary Computation 1 (1) (1997) 67-82....


Book
13 Dec 2005
TL;DR: This volume explores the differential evolution (DE) algorithm in both principle and practice and is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.
Abstract: Problems demanding globally optimal solutions are ubiquitous, yet many are intractable when they involve constrained functions having many local optima and interacting, mixed-type variables. The differential evolution (DE) algorithm is a practical approach to global numerical optimization which is easy to understand, simple to implement, reliable, and fast. Packed with illustrations, computer code, new insights, and practical advice, this volume explores DE in both principle and practice. It is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.

5,607 citations


"A memory based differential evoluti..." refers background in this paper

  • ...Like other EAs, it does not guarantee finding a global optimal solution in a finite time interval [3]....


  • ...Therefore, in order to improve the performance of basic DE, a number of attempts have been made in the literature [3-16]....


  • ...It fails to maintain diversity in the population during simulation [3]....


  • ...The space complexity of DE is also low compared to some of the most competitive real-parameter optimizers [3]....


  • ...So far, DE has received extensive attention and has been applied to many engineering optimization problems, such as the mechanical engineering design problem [6], fuzzy clustering of image pixels [7], economic load dispatch [8], and many others [3]....


Journal ArticleDOI
TL;DR: A detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far are presented.
Abstract: Differential evolution (DE) is arguably one of the most powerful stochastic real-parameter optimization algorithms in current use. DE operates through similar computational steps as employed by a standard evolutionary algorithm (EA). However, unlike traditional EAs, the DE-variants perturb the current-generation population members with the scaled differences of randomly selected and distinct population members. Therefore, no separate probability distribution has to be used for generating the offspring. Since its inception in 1995, DE has drawn the attention of many researchers all over the world resulting in a lot of variants of the basic algorithm with improved performance. This paper presents a detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far. Also, it provides an overview of the significant engineering applications that have benefited from the powerful nature of DE.

4,321 citations


"A memory based differential evoluti..." refers background or result in this paper

  • ...A detailed survey on the variants of DE can be found in [4, 5]....


  • ...Therefore, in order to improve the performance of basic DE, a number of attempts have been made in the literature [3-16]....


  • ...As evidenced by the recent studies on DE [4, 5], it exhibits much better performance in comparison with several other EAs....
