Author

M. Montaz Ali

Bio: M. Montaz Ali is an academic researcher from the University of the Witwatersrand. The author has contributed to research in the topics of global optimization and differential evolution. The author has an h-index of 26 and has co-authored 101 publications receiving 3,093 citations. Previous affiliations of M. Montaz Ali include Loughborough University and Rio de Janeiro State University.


Papers
Journal ArticleDOI
TL;DR: A collection of test problems, some better known than others, provides an easily accessible set of standard benchmarks for continuous global optimization; the microscopic behavior of the tested algorithms is also investigated through quartile sequential plots.
Abstract: There is a need for a methodology to fairly compare and present evaluation study results of stochastic global optimization algorithms. This need raises two important questions: (i) an appropriate set of benchmark test problems on which the algorithms may be tested, and (ii) a methodology to compactly and completely present the results. To address the first question, we compiled a collection of test problems, some of which are better known than others. Although the compilation is not exhaustive, it provides an easily accessible collection of standard test problems for continuous global optimization. Five different stochastic global optimization algorithms have been tested on these problems, and a performance profile plot based on the improvement of objective function values is constructed to investigate the macroscopic behavior of the algorithms. The paper also investigates the microscopic behavior of the algorithms through quartile sequential plots, and contrasts the information gained from these two kinds of plots. The effect of the length of run is explored by using three maximum numbers of function evaluations, and this is shown to significantly impact the behavior of the algorithms.
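The performance-profile construction lends itself to a short sketch. The following is a minimal illustration, not the paper's exact procedure: the "lower is better" score per algorithm-problem pair stands in for the paper's improvement-of-objective-function-value measure, and the function name and tau grid are invented for illustration.

```python
import numpy as np

# A minimal sketch of a performance profile, assuming a strictly positive
# "lower is better" score per (algorithm, problem) pair; the paper's actual
# measure is based on the improvement of objective function values.
def performance_profile(scores, taus):
    """scores[a, p]: score of algorithm a on problem p; taus: ratio thresholds."""
    best = scores.min(axis=0)      # best score achieved on each problem
    ratios = scores / best         # performance ratio of each algorithm
    # rho_a(tau): fraction of problems on which algorithm a is within a
    # factor tau of the best algorithm; plotting rho against tau gives the
    # macroscopic comparison described in the abstract.
    return np.array([[np.mean(ratios[a] <= t) for t in taus]
                     for a in range(scores.shape[0])])
```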

545 citations

Journal ArticleDOI
TL;DR: This paper studies the efficiency and robustness of some recent and well-known population set-based direct search global optimization methods, such as Controlled Random Search, Differential Evolution, and the Genetic Algorithm.

401 citations

Journal ArticleDOI
TL;DR: Numerical experiments indicate that the resulting algorithms are considerably better than the original differential evolution algorithm, and offer a reasonable alternative to many currently available stochastic algorithms, especially for problems requiring ‘direct search type’ methods.

273 citations

Journal ArticleDOI
TL;DR: Modifications to the controlled random search (CRS) algorithm are suggested; in particular, point generation schemes using linear interpolation and mutation are introduced, offering a reasonable alternative to many currently available stochastic algorithms, especially for problems requiring direct search type methods.
Abstract: We suggest some modifications to the controlled random search (CRS) algorithm for global optimization. We introduce new trial point generation schemes in CRS, in particular, point generation schemes using linear interpolation and mutation. Central to our modifications is the probabilistic adaptation of point generation schemes within the CRS algorithm. A numerical study is carried out using a set of 50 test problems, many of which are inspired by practical applications. Numerical experiments indicate that the resulting algorithms are considerably better than the previous versions. Thus, they offer a reasonable alternative to many currently available stochastic algorithms, especially for problems requiring direct search type methods.
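The two point-generation schemes named here can be illustrated in a few lines. This is a hedged sketch of the general flavor only: the paper's exact interpolation and mutation formulas, and its probabilistic adaptation rule, may differ, and the function name, the fixed scheme probability p_interp, and the noise scale are illustrative assumptions.

```python
import numpy as np

# A hedged sketch of CRS-style trial point generation with two schemes,
# linear interpolation and mutation. The exact formulas and the probabilistic
# adaptation of the scheme choice in the paper may differ; p_interp and the
# noise scale here are illustrative assumptions.
def crs_trial_point(pop, fit, rng, p_interp=0.5):
    best = pop[np.argmin(fit)]          # best point in the current population
    rand = pop[rng.integers(len(pop))]  # a randomly chosen population member
    if rng.random() < p_interp:
        # Linear interpolation: a random convex combination of the best
        # point and a random member.
        beta = rng.random()
        return beta * best + (1.0 - beta) * rand
    # Mutation: reflect the random member through the best point and add
    # a small random perturbation.
    return 2.0 * best - rand + rng.normal(scale=0.01, size=best.shape)
```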

160 citations

Journal ArticleDOI
TL;DR: A classification of essentially unconstrained global optimization problems into unimodal, easy, moderately difficult, and difficult problems is proposed to remedy the lack of a representative set of test problems for comparing global optimization methods.
Abstract: There is a lack of a representative set of test problems for comparing global optimization methods. To remedy this, a classification of essentially unconstrained global optimization problems into unimodal, easy, moderately difficult, and difficult problems is proposed. The problem features giving this classification are the chance to miss the region of attraction of the global minimum, the embeddedness of the global minimum, and the number of minimizers. The classification of some often-used test problems is given, and it is recognized that most of them are easy and some even unimodal. The global optimization solution techniques treated are global, local, and adaptive search, and their use for tackling different classes of problems is discussed. The problem of fair comparison of methods is then addressed. Finally, possible components of a general global optimization tool based on the problem classes and solution techniques are presented.

125 citations


Cited by
Journal ArticleDOI
TL;DR: A detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far are presented.
Abstract: Differential evolution (DE) is arguably one of the most powerful stochastic real-parameter optimization algorithms in current use. DE operates through similar computational steps as employed by a standard evolutionary algorithm (EA). However, unlike traditional EAs, the DE-variants perturb the current-generation population members with the scaled differences of randomly selected and distinct population members. Therefore, no separate probability distribution has to be used for generating the offspring. Since its inception in 1995, DE has drawn the attention of many researchers all over the world resulting in a lot of variants of the basic algorithm with improved performance. This paper presents a detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far. Also, it provides an overview of the significant engineering applications that have benefited from the powerful nature of DE.
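The perturbation scheme described here, the scaled difference of randomly selected, distinct population members, is the core of the classic DE/rand/1/bin variant, sketched below. The control parameter values NP, F, and CR are conventional illustrative choices, not values taken from the survey.

```python
import numpy as np

# A minimal sketch of classic DE/rand/1/bin; NP, F, and CR are conventional
# illustrative settings, not values prescribed by the survey.
def differential_evolution(f, bounds, NP=30, F=0.8, CR=0.9, max_gen=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = lo + rng.random((NP, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gen):
        for i in range(NP):
            # Mutation: base vector plus the scaled difference of two other
            # randomly selected, distinct population members.
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            # Binomial crossover, forcing at least one mutant coordinate.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.clip(np.where(mask, mutant, pop[i]), lo, hi)
            # Greedy selection: the trial replaces the target if no worse.
            f_trial = f(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)], fit.min()

# Example: minimize the 2-D sphere function on [-5, 5]^2.
best_x, best_f = differential_evolution(lambda x: float(np.sum(x**2)),
                                        np.array([[-5.0, 5.0]] * 2))
```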

4,321 citations

Journal ArticleDOI
TL;DR: This paper proposes a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions.
Abstract: Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimization problems over continuous space, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem crucially depends on appropriately choosing trial vector generation strategies and their associated control parameter values. Employing a trial-and-error scheme to search for the most suitable strategy and its associated parameter settings requires high computational costs. Moreover, at different stages of evolution, different strategies coupled with different parameter settings may be required in order to achieve the best performance. In this paper, we propose a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions. Consequently, a more suitable generation strategy along with its parameter settings can be determined adaptively to match different phases of the search process/evolution. The performance of the SaDE algorithm is extensively evaluated (using codes available from P. N. Suganthan) on a suite of 26 bound-constrained numerical optimization problems and compares favorably with the conventional DE and several state-of-the-art parameter adaptive DE variants.
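The learning mechanism can be caricatured in a few lines. Below is a hedged sketch of the success-rate bookkeeping only, assuming per-strategy success and failure counts accumulated over a learning period; the function name and the smoothing constant eps are illustrative, and the paper's exact update rule differs in detail.

```python
import numpy as np

# A hedged sketch of SaDE-style strategy adaptation: selection probabilities
# are recomputed from success/failure counts gathered over a learning period.
# The smoothing constant eps is an illustrative assumption, and the paper's
# exact update rule differs in detail.
def update_strategy_probs(successes, failures, eps=0.01):
    """successes[k], failures[k]: counts for strategy k in the last period."""
    rate = (successes + eps) / (successes + failures + eps)
    return rate / rate.sum()  # normalized probabilities for strategy selection
```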

3,085 citations

Journal ArticleDOI
TL;DR: The results show that the algorithm with self-adaptive control parameter settings is better than, or at least comparable to, the standard DE algorithm and evolutionary algorithms from the literature when considering the quality of the solutions obtained.
Abstract: We describe an efficient technique for adapting control parameter settings associated with differential evolution (DE). The DE algorithm has been used in many practical cases and has demonstrated good convergence properties. It has only a few control parameters, which are kept fixed throughout the entire evolutionary process. However, it is not an easy task to properly set control parameters in DE. We present a new version of the DE algorithm for obtaining self-adaptive control parameter settings that show good performance on numerical benchmark problems. The results show that our algorithm with self-adaptive control parameter settings is better than, or at least comparable to, the standard DE algorithm and evolutionary algorithms from the literature when considering the quality of the solutions obtained.
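The self-adaptation rule itself is compact: each individual carries its own F and CR, which are regenerated with small probabilities before the individual produces its trial vector, and the new values survive only if the trial replaces the parent. The sketch below uses the commonly cited jDE settings (tau1 = tau2 = 0.1, F drawn from [0.1, 1.0]); the function name is an illustrative choice.

```python
import numpy as np

# A sketch of jDE-style parameter self-adaptation: each individual's F and CR
# are regenerated with small probabilities tau1 and tau2 before its trial
# vector is built; the new values persist only if the trial replaces the
# parent. tau1, tau2, F_l, F_u follow the commonly cited jDE settings.
def self_adapt(F, CR, rng, tau1=0.1, tau2=0.1, F_l=0.1, F_u=0.9):
    if rng.random() < tau1:
        F = F_l + rng.random() * F_u  # new F uniform in [0.1, 1.0]
    if rng.random() < tau2:
        CR = rng.random()             # new CR uniform in [0, 1]
    return F, CR
```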

2,820 citations

Book
22 Jun 2009
TL;DR: This book provides a complete background on metaheuristics and shows readers how to design and implement efficient algorithms to solve complex optimization problems across a diverse range of applications, from networking and bioinformatics to engineering design, routing, and scheduling.
Abstract: A unified view of metaheuristics. This book provides a complete background on metaheuristics and shows readers how to design and implement efficient algorithms to solve complex optimization problems across a diverse range of applications, from networking and bioinformatics to engineering design, routing, and scheduling. It presents the main design questions for all families of metaheuristics and clearly illustrates how to implement the algorithms under a software framework to reuse both the design and code. Throughout the book, the key search components of metaheuristics are considered as a toolbox for:

- Designing efficient metaheuristics (e.g. local search, tabu search, simulated annealing, evolutionary algorithms, particle swarm optimization, scatter search, ant colonies, bee colonies, artificial immune systems) for optimization problems
- Designing efficient metaheuristics for multi-objective optimization problems
- Designing hybrid, parallel, and distributed metaheuristics
- Implementing metaheuristics on sequential and parallel machines

Using many case studies and treating design and implementation independently, this book gives readers the skills necessary to solve large-scale optimization problems quickly and efficiently. It is a valuable reference for practicing engineers and researchers from diverse areas dealing with optimization or machine learning, and for graduate students in computer science, operations research, control, engineering, business and management, and applied mathematics.

2,735 citations