Journal ArticleDOI
Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems
Yiying Zhang, Zhigang Jin, +1 more
TL;DR: A new metaheuristic algorithm called the group teaching optimization algorithm (GTOA) is presented; inspired by the group teaching mechanism, it needs only the essential population size and stopping criterion, without extra control parameters, and has great potential to be used widely.
Abstract:
In the last 30 years, many metaheuristic algorithms have been developed to solve optimization problems. However, most existing metaheuristic algorithms have extra control parameters beyond the essential population size and stopping criterion. Because different optimization problems have different characteristics, adjusting these extra control parameters is a great challenge for these algorithms. To address this challenge, a new metaheuristic algorithm called the group teaching optimization algorithm (GTOA) is presented in this paper. The proposed GTOA is inspired by the group teaching mechanism. To adapt group teaching for use as an optimization technique, four simple rules are first defined without loss of generality. A group teaching model is then built under the guidance of these four rules, consisting of a teacher allocation phase, an ability grouping phase, a teacher phase and a student phase. Note that GTOA needs only the essential population size and stopping criterion, with no extra control parameters, so it has great potential to be used widely. GTOA is first examined on 28 well-known unconstrained benchmark problems, and the optimization results are compared with those of nine state-of-the-art algorithms. Experimental results show the superior performance of the proposed GTOA on these problems in terms of solution quality, convergence speed and stability. Furthermore, GTOA is applied to four real-world constrained engineering design optimization problems. Simulation results demonstrate that the proposed GTOA finds better solutions faster than the reported optimizers.
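The four phases named in the abstract can be sketched as a single optimization loop. This is only an illustrative reading of the abstract, not the paper's method: the update rules below are simplified stand-ins, and `gtoa_sketch` together with its parameters is a hypothetical name chosen for this sketch.

```python
import numpy as np

def gtoa_sketch(f, dim, pop_size=50, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Schematic loop over the phases named in the abstract (simplified rules)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop_size, dim))       # learners (candidate solutions)
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        # Ability grouping: rank learners by fitness (best first).
        order = np.argsort(fit)
        X, fit = X[order], fit[order]
        # Teacher allocation: the current best learner acts as the teacher.
        teacher = X[0].copy()
        for i in range(pop_size):
            # Teacher phase: move toward the teacher (simplified rule).
            cand = X[i] + rng.random(dim) * (teacher - X[i])
            # Student phase: learn from a random peer -- toward a better one,
            # away from a worse one (simplified rule).
            j = rng.integers(pop_size)
            cand = cand + rng.random(dim) * np.sign(fit[i] - fit[j]) * (X[j] - X[i])
            cand = np.clip(cand, lb, ub)
            fc = f(cand)
            if fc < fit[i]:                        # keep the candidate only if it improves
                X[i], fit[i] = cand, fc
    best = np.argmin(fit)
    return X[best], fit[best]

# Usage on the 5-dimensional sphere function.
best_x, best_f = gtoa_sketch(lambda x: float(np.sum(x**2)), dim=5)
```

Note that, matching the abstract's claim, the only inputs are the population size and the iteration budget (plus bounds); no algorithm-specific control parameter appears.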
Citations
Journal ArticleDOI
Dwarf Mongoose Optimization Algorithm
TL;DR: In this article, a metaheuristic algorithm called the dwarf mongoose optimization algorithm (DMO) is proposed and evaluated on classical and CEC 2020 benchmark functions and 12 continuous/discrete engineering optimization problems.
Journal ArticleDOI
Golden eagle optimizer: A nature-inspired metaheuristic algorithm
TL;DR: A nature-inspired, swarm-based metaheuristic for solving global optimization problems, called the Golden Eagle Optimizer (GEO), is proposed; experimental results show GEO's superiority and indicate that it can find the global optimum and avoid local optima effectively.
Journal ArticleDOI
Comprehensive Taxonomies of Nature- and Bio-inspired Optimization: Inspiration Versus Algorithmic Behavior, Critical Analysis Recommendations
Daniel Molina, Javier Poyatos, Javier Del Ser, Salvador García, Amir Hussain, Francisco Herrera, +7 more
TL;DR: In this paper, the authors present a taxonomy of nature- and bio-inspired algorithms, provide a critical summary of design trends and similarities between them, and identify the most similar classical algorithm for each reviewed paper.
Journal ArticleDOI
An adaptive regeneration framework based on search space adjustment for differential evolution
Gaoji Sun, Chunlei Li, Libao Deng, +2 more
TL;DR: An adaptive regeneration framework based on search space adjustment (ARSA) is proposed that can be easily embedded into various DE variants and notably improves the performance of two basic DE algorithms and six state-of-the-art DE variants.
References
Journal ArticleDOI
Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces
Rainer Storn, Kenneth Price, +1 more
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous-space functions is presented; it requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
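The "few control variables" claim can be illustrated with the classic DE/rand/1/bin scheme, which needs only the scale factor F and crossover rate CR beyond the population size and stopping criterion. This is a standard textbook sketch, not code from the cited paper.

```python
import numpy as np

def de_rand_1_bin(f, dim, pop_size=30, F=0.5, CR=0.9, iters=200, lb=-5.0, ub=5.0, seed=0):
    """DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        for i in range(pop_size):
            # Mutation: base vector plus scaled difference of two others (all distinct from i).
            a, b, c = rng.choice([k for k in range(pop_size) if k != i], 3, replace=False)
            v = X[a] + F * (X[b] - X[c])
            # Binomial crossover: take each gene from the mutant with probability CR,
            # forcing at least one mutant gene into the trial vector.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            u = np.clip(np.where(mask, v, X[i]), lb, ub)
            # Greedy selection: the trial replaces the target only if it is no worse.
            fu = f(u)
            if fu <= fit[i]:
                X[i], fit[i] = u, fu
    best = np.argmin(fit)
    return X[best], fit[best]

# Usage on the 5-dimensional sphere function.
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x**2)), dim=5)
```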
Journal ArticleDOI
No free lunch theorems for optimization
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they solve, and a number of "no free lunch" (NFL) theorems are presented which establish that, for any algorithm, elevated performance over one class of problems is offset by performance over another class.
Journal ArticleDOI
Grey Wolf Optimizer
TL;DR: The results of the classical engineering design problems and real application prove that the proposed GWO algorithm is applicable to challenging problems with unknown search spaces.
Proceedings ArticleDOI
A modified particle swarm optimizer
Yuhui Shi, Russell C. Eberhart, +1 more
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer; the resulting swarm resembles a school of flying birds, since each particle adjusts its flight according to its own flying experience and that of its companions.
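A minimal sketch of the modified update, assuming the standard form in which the inertia weight w scales the previous velocity before the cognitive and social pulls are added; the parameter values below are illustrative, not taken from the cited paper.

```python
import numpy as np

def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, lb=-5.0, ub=5.0, seed=0):
    """Particle swarm optimization with an inertia weight w on the velocity."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))              # positions
    V = np.zeros((n, dim))                         # velocities
    P = X.copy()                                   # personal best positions
    pfit = np.apply_along_axis(f, 1, X)
    g = P[np.argmin(pfit)].copy()                  # global best position
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        # w damps the previous velocity; c1/c2 pull toward personal and global bests.
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lb, ub)
        fit = np.apply_along_axis(f, 1, X)
        better = fit < pfit                        # update personal bests
        P[better], pfit[better] = X[better], fit[better]
        g = P[np.argmin(pfit)].copy()              # update global best
    return g, pfit.min()

# Usage on the 5-dimensional sphere function.
best_x, best_f = pso(lambda x: float(np.sum(x**2)), dim=5)
```

With w near 1 the swarm keeps exploring; smaller w damps velocities and favors convergence, which is the tuning role the inertia weight was introduced to play.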
Journal ArticleDOI
The Whale Optimization Algorithm
Seyedali Mirjalili, Andrew Lewis, +1 more
TL;DR: Optimization results prove that the WOA algorithm is very competitive with state-of-the-art metaheuristic algorithms as well as conventional methods.