Author

R. Storn

Bio: R. Storn is an academic researcher. The author has contributed to research in topics: Global optimization & Simple (abstract algebra). The author has an h-index of 1 and has co-authored 1 publication, which has received 2,821 citations.

Cited by
Journal ArticleDOI
TL;DR: A detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far are presented.
Abstract: Differential evolution (DE) is arguably one of the most powerful stochastic real-parameter optimization algorithms in current use. DE operates through similar computational steps as employed by a standard evolutionary algorithm (EA). However, unlike traditional EAs, the DE-variants perturb the current-generation population members with the scaled differences of randomly selected and distinct population members. Therefore, no separate probability distribution has to be used for generating the offspring. Since its inception in 1995, DE has drawn the attention of many researchers all over the world, resulting in many variants of the basic algorithm with improved performance. This paper presents a detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far. Also, it provides an overview of the significant engineering applications that have benefited from the powerful nature of DE.
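
As a concrete illustration of the mutation step the abstract describes, the sketch below implements the classic DE/rand/1/bin scheme in Python with NumPy: each member is perturbed by the scaled difference of two randomly selected, distinct members added to a third, followed by binomial crossover and greedy selection. The objective, bounds, and control-parameter defaults (F, CR, population size) are illustrative choices, not settings taken from the paper.

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=30, F=0.5, CR=0.9, generations=200, seed=0):
    """Minimal DE/rand/1/bin: perturb with the scaled difference of two
    distinct, randomly chosen members added to a third."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct members, all different from the target i
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])  # scaled difference
            # binomial crossover: inherit at least one mutant component
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.clip(np.where(mask, mutant, pop[i]), lo, hi)
            # greedy one-to-one selection
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = np.argmin(fit)
    return pop[best], fit[best]

# usage: minimize the sphere function in 5 dimensions (illustrative objective)
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x * x))
    b = np.array([[-5.0, 5.0]] * 5)
    x_best, f_best = de_rand_1_bin(sphere, b)
    print(x_best, f_best)
```

Note how no separate probability distribution is needed to generate offspring: the population's own spread supplies the perturbation scale, which is the point the abstract makes.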

4,321 citations

Journal ArticleDOI
TL;DR: This paper proposes a self- Adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions.
Abstract: Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimization problems over continuous space, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem crucially depends on appropriately choosing trial vector generation strategies and their associated control parameter values. Employing a trial-and-error scheme to search for the most suitable strategy and its associated parameter settings requires high computational costs. Moreover, at different stages of evolution, different strategies coupled with different parameter settings may be required in order to achieve the best performance. In this paper, we propose a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions. Consequently, a more suitable generation strategy along with its parameter settings can be determined adaptively to match different phases of the search process/evolution. The performance of the SaDE algorithm is extensively evaluated (using codes available from P. N. Suganthan) on a suite of 26 bound-constrained numerical optimization problems and compares favorably with the conventional DE and several state-of-the-art parameter adaptive DE variants.
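
The sketch below illustrates the self-adaptation idea in a deliberately simplified form: a pool of two mutation strategies whose selection probabilities are refreshed from success/failure counts accumulated over a learning period, with CR re-sampled around the median of recently successful values. The strategy pool, sampling distributions, and update rule here are stand-ins chosen for brevity, not the paper's exact specification.

```python
import numpy as np

def sade_sketch(f, bounds, pop_size=30, generations=300, lp=50, seed=0):
    """Simplified SaDE-style loop: strategy probabilities and CR adapt
    from recent successes; F is sampled fresh per trial vector."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    n_strat = 2
    success, failure = np.zeros(n_strat), np.zeros(n_strat)
    probs = np.full(n_strat, 1.0 / n_strat)
    crm, cr_ok = 0.5, []                       # CR memory and success log
    for g in range(generations):
        for i in range(pop_size):
            k = rng.choice(n_strat, p=probs)   # pick a strategy stochastically
            F = abs(rng.normal(0.5, 0.3))
            CR = float(np.clip(rng.normal(crm, 0.1), 0.0, 1.0))
            idx = [j for j in range(pop_size) if j != i]
            r1, r2, r3 = rng.choice(idx, size=3, replace=False)
            best = pop[np.argmin(fit)]
            if k == 0:                         # DE/rand/1
                mutant = pop[r1] + F * (pop[r2] - pop[r3])
            else:                              # DE/current-to-best/1
                mutant = pop[i] + F * (best - pop[i]) + F * (pop[r2] - pop[r3])
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.clip(np.where(mask, mutant, pop[i]), lo, hi)
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
                success[k] += 1
                cr_ok.append(CR)
            else:
                failure[k] += 1
        if (g + 1) % lp == 0:                  # refresh learned quantities
            rates = (success + 1e-9) / (success + failure + 1e-9)
            probs = rates / rates.sum()
            if cr_ok:
                crm = float(np.median(cr_ok))
            success[:], failure[:] = 0, 0
            cr_ok.clear()
    b = np.argmin(fit)
    return pop[b], fit[b]
```

The design choice to learn from accumulated successes rather than trial-and-error restarts is what saves the computational cost the abstract criticizes.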

3,085 citations

Journal ArticleDOI
TL;DR: Results show that the performance of the ABC is better than or similar to those of other population-based algorithms with the advantage of employing fewer control parameters.

2,835 citations

Journal ArticleDOI
TL;DR: This paper presents a novel algorithm to accelerate the differential evolution (DE), which employs opposition-based learning (OBL) for population initialization and also for generation jumping and results confirm that the ODE outperforms the original DE and FADE in terms of convergence speed and solution accuracy.
Abstract: Evolutionary algorithms (EAs) are well-known optimization approaches to deal with nonlinear and complex problems. However, these population-based algorithms are computationally expensive due to the slow nature of the evolutionary process. This paper presents a novel algorithm to accelerate the differential evolution (DE). The proposed opposition-based DE (ODE) employs opposition-based learning (OBL) for population initialization and also for generation jumping. In this work, opposite numbers have been utilized to improve the convergence rate of DE. A comprehensive set of 58 complex benchmark functions including a wide range of dimensions is employed for experimental verification. The influence of dimensionality, population size, jumping rate, and various mutation strategies are also investigated. Additionally, the contribution of opposite numbers is empirically verified. We also provide a comparison of ODE to fuzzy adaptive DE (FADE). Experimental results confirm that the ODE outperforms the original DE and FADE in terms of convergence speed and solution accuracy.
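
A minimal sketch of the two opposition-based ingredients the abstract names, assuming box constraints [lo, hi]: the opposite of a point x is lo + hi - x; initialization keeps the fitter half of a random population together with its opposites, and generation jumping occasionally repeats that comparison within the population's current per-dimension bounds. The function names and the jumping rate jr are illustrative; these helpers would slot into a DE loop like the one sketched earlier.

```python
import numpy as np

def opposite(pop, lo, hi):
    # Opposition-based learning: the opposite of x in [lo, hi] is lo + hi - x.
    return lo + hi - pop

def obl_init(f, lo, hi, pop_size, rng):
    # Evaluate random points and their opposites; keep the fitter half.
    pop = lo + rng.random((pop_size, len(lo))) * (hi - lo)
    both = np.vstack([pop, opposite(pop, lo, hi)])
    fit = np.array([f(x) for x in both])
    keep = np.argsort(fit)[:pop_size]
    return both[keep], fit[keep]

def generation_jump(f, pop, fit, rng, jr=0.3):
    # With probability jr, compare each member with its opposite taken
    # inside the population's current per-dimension bounds; keep the fitter.
    if rng.random() < jr:
        lo_d, hi_d = pop.min(axis=0), pop.max(axis=0)
        opp = lo_d + hi_d - pop
        ofit = np.array([f(x) for x in opp])
        better = ofit < fit
        pop[better], fit[better] = opp[better], ofit[better]
    return pop, fit
```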

1,419 citations

Journal ArticleDOI
TL;DR: The authors find that it is high time to provide a critical review of the latest literature published on DE and to point out some important future avenues of research.
Abstract: Differential Evolution (DE) is arguably one of the most powerful and versatile evolutionary optimizers for continuous parameter spaces in recent times. Almost 5 years have passed since the first comprehensive survey article on DE was published by Das and Suganthan in 2011. Several developments have been reported on various aspects of the algorithm in these 5 years, and research on and with DE has now reached an impressive state. Considering the huge progress of research with DE and its applications in diverse domains of science and technology, we find that it is high time to provide a critical review of the latest literature and to point out some important future avenues of research. The purpose of this paper is to summarize and organize the information on these current developments in DE. Beginning with a comprehensive foundation of the basic DE family of algorithms, we proceed through recent proposals on parameter adaptation of DE, DE-based single-objective global optimizers, DE adapted for various optimization scenarios including constrained, large-scale, multi-objective, multi-modal, and dynamic optimization, hybridization of DE with other optimizers, and the multi-faceted literature on applications of DE. The paper also presents a dozen interesting open problems and future research issues on DE.

1,265 citations