Journal ArticleDOI

Bat algorithm: a novel approach for global engineering optimization

13 Jul 2012-Engineering Computations (Emerald Group Publishing Limited)-Vol. 29, Iss: 5, pp 464-483
TL;DR: A new nature‐inspired metaheuristic optimization algorithm, called bat algorithm (BA), based on the echolocation behavior of bats, is introduced; the optimal solutions it obtains are better than the best solutions obtained by existing methods.
Abstract: Nature‐inspired algorithms are among the most powerful algorithms for optimization. The purpose of this paper is to introduce a new nature‐inspired metaheuristic optimization algorithm, called bat algorithm (BA), for solving engineering optimization tasks. The proposed BA is based on the echolocation behavior of bats. After a detailed formulation and explanation of its implementation, BA is verified using eight nonlinear engineering optimization problems reported in the specialized literature, and a comparison is made between the proposed algorithm and other existing algorithms. The optimal solutions obtained by the proposed algorithm are better than the best solutions obtained by the existing methods. The unique search features used in BA are analyzed, and their implications for future research are also discussed in detail.
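The echolocation-inspired mechanics summarised above (frequency tuning, velocity and position updates, a local random walk scaled by the average loudness, and loudness/pulse-rate adaptation) can be sketched in a few dozen lines. The sketch below is an illustrative reconstruction following the common presentation of BA, not the paper's reference implementation; the parameter values (`f_max = 2.0`, `alpha = gamma = 0.9`, the `0.01` walk scale) and the sphere test function are assumptions.

```python
import math
import random

def bat_algorithm(objective, dim=2, n_bats=20, n_iter=200,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0, seed=1):
    rng = random.Random(seed)
    # Initialise bat positions, velocities, loudness A and pulse rate r
    x = [[rng.uniform(lower, upper) for _ in range(dim)] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    loud = [1.0] * n_bats      # loudness A_i, decreases as a bat homes in
    rate = [0.5] * n_bats      # pulse emission rate r_i, increases over time
    fit = [objective(p) for p in x]
    b = min(range(n_bats), key=lambda i: fit[i])
    x_best, f_best = list(x[b]), fit[b]

    for t in range(1, n_iter + 1):
        avg_loud = sum(loud) / n_bats
        for i in range(n_bats):
            # Frequency tuning controls the step relative to the current best
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] = [v[i][d] + (x[i][d] - x_best[d]) * freq for d in range(dim)]
            cand = [min(max(x[i][d] + v[i][d], lower), upper)
                    for d in range(dim)]
            if rng.random() > rate[i]:
                # Local random walk around the best solution found so far
                cand = [min(max(x_best[d] + 0.01 * avg_loud * rng.gauss(0, 1),
                               lower), upper) for d in range(dim)]
            f_new = objective(cand)
            if f_new <= fit[i] and rng.random() < loud[i]:
                x[i], fit[i] = cand, f_new
                loud[i] *= alpha                             # quieter ...
                rate[i] = 0.5 * (1 - math.exp(-gamma * t))   # ... pulses faster
            if f_new <= f_best:
                x_best, f_best = list(cand), f_new
    return x_best, f_best

sphere = lambda xs: sum(xi * xi for xi in xs)
best_x, best_f = bat_algorithm(sphere)
```

On a 2-D sphere function the loop typically drives the best fitness close to zero within a few hundred iterations; real engineering use would add constraint handling, which the paper exercises through the eight benchmark problems.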
Citations
Journal ArticleDOI
TL;DR: The statistical results and comparisons show that the HHO algorithm provides very promising and occasionally competitive results compared to well-established metaheuristic techniques.

2,871 citations


Cites methods from "Bat algorithm: a novel approach for..."

  • ...Details of the CM test problems are also reported in Table 19 in Appendix A. Figure 8 demonstrates three of the composition test problems. The results and performance of the proposed HHO is compared with other well-established optimization techniques such as the GA [22], BBO [22], DE [22], PSO [22], CS [34], TLBO [29], BA/BAT [52], FPA [53], FA [54], GWO [55], and MFO [56] algorithms based on the best, worst, standard deviation (STD) and average of the results (AVG)....

  • ...The settings of GA, PSO, DE and BBO algorithms are same with those set by Dan Simon in the original work of BBO [22], while for the BA [52], FA [58], TLBO [29], GWO [55], FPA [53], CS [34], and MFO [56], the parameters are same with the recommended settings in the original works....

Journal ArticleDOI
TL;DR: The proposed KH algorithm, based on the simulation of the herding behavior of krill individuals, is capable of efficiently solving a wide range of benchmark optimization problems and outperforms the existing algorithms.

1,556 citations


Cites background from "Bat algorithm: a novel approach for..."

  • ...Several extensions to the major categories of the swarm algorithms have been presented in the literature [18]....

Journal ArticleDOI
TL;DR: Experimental results show that the AOA provides very promising results in solving challenging optimization problems compared with eleven other well-known optimization algorithms.

1,218 citations


Cites background from "Bat algorithm: a novel approach for..."

  • ...Table 16 gives the Wilcoxon signed-rank test results with a significance level at α = 0.05 among seven comparative algorithms (GWO, BAT, FA, CS, MFO, GSA, and DE) for twenty-nine test functions (F1–F29)....

  • ...Moreover, Friedman ranking test has also been applied for these results; the proposed AOA got the first ranking compared to other comparative methods followed by DE, CS, TLBO, FA, MFO, GWO, FPA, BBO, PSO, GA, and BAT....

  • ...The results are compared with the following algorithms:
      • Genetic Algorithm (GA) [32]
      • Particle Swarm Optimization (PSO) [33]
      • Biogeography-based Optimization (BBO) [34]
      • Flower Pollination Algorithm (FPA) [35]
      • Grey Wolf Optimizer (GWO) [36]
      • Bat Algorithm (BAT) [37]
      • Firefly Algorithm (FA) [38]
      • Cuckoo Search Algorithm (CS) [39]
      • Moth-Flame Optimization (MFO) [40]
      • Gravitational Search Algorithm (GSA) [25]
      • Differential Evolution (DE) [16]....

  • ...11 that the proposed AOA has a steady convergence and a slow convergence acceleration on these test functions compared with other comparative algorithms (GA, FPA, BBO, BAT, PSO, and GWO)....

  • ...The obtained results show that the proposed AOA is ranked first compared to other comparative algorithms, followed by GWO (second), CS (third), FA (fourth), GSA (fifth), BBO (sixth), FPA (seventh), GA (eighth), DE (ninth), MFO (tenth), PSO (eleventh), and finally BAT (twelfth)....

Book
17 Feb 2014
TL;DR: This book can serve as an introduction for graduates, doctoral students and lecturers in computer science, engineering and natural sciences; researchers, engineers and experienced experts will also find it a handy reference.
Abstract: Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization. This book can serve as an introductory book for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers and engineers as well as experienced experts will also find it a handy reference. It discusses and summarizes the latest developments in nature-inspired algorithms with comprehensive, timely literature, provides a theoretical understanding as well as practical implementation hints, and gives a step-by-step introduction to each algorithm.

901 citations

Journal ArticleDOI
TL;DR: A timely review of the bat algorithm and its new variants; a wide range of diverse applications and case studies are also reviewed and summarised briefly.
Abstract: Bat algorithm (BA) is a bio-inspired algorithm developed by Xin-She Yang in 2010, and BA has been found to be very efficient. As a result, the literature has expanded significantly in the last three years. This paper provides a timely review of the bat algorithm and its new variants. A wide range of diverse applications and case studies are also reviewed and summarised briefly here. In addition, we also discuss the essence of an algorithm and the links between algorithms and self-organisation. Further research topics are also discussed.

791 citations


Cites background from "Bat algorithm: a novel approach for..."

  • ...…continuous optimization in the context of engineering design optimization has been extensively studied, which demonstrated that BA can deal with highly nonlinear problems efficiently and can find the optimal solutions accurately (Yang, 2010; Yang and Gandomi, 2012; Yang, 2012; Yang et al., 2012a)....

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
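The annealing analogy described above reduces, in its simplest form, to a Metropolis acceptance loop with a slowly decreasing temperature. The sketch below is a generic illustration of that idea, not code from the paper; the geometric cooling schedule, initial temperature, step size, and the test function with many local minima are all assumptions.

```python
import math
import random

def simulated_annealing(objective, x0, t0=20.0, cooling=0.995,
                        n_iter=5000, step=0.5, seed=0):
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best_x, best_f = x, fx
    temp = t0
    for _ in range(n_iter):
        # Propose a random neighbour of the current state
        x_new = x + rng.uniform(-step, step)
        f_new = objective(x_new)
        # Metropolis criterion: always accept improvements; accept uphill
        # moves with probability exp(-delta / temperature)
        if f_new < fx or rng.random() < math.exp((fx - f_new) / temp):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling  # geometric cooling schedule (an assumption)
    return best_x, best_f

# A 1-D function with many local minima; global minimum 0 at x = 0
bumpy = lambda x: x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
x_star, f_star = simulated_annealing(bumpy, x0=4.0)
```

The high early temperature lets the walker climb out of local basins; as the temperature drops, the loop degenerates into greedy descent, mirroring the slow-cooling requirement of physical annealing.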

41,772 citations

Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.

35,104 citations

Proceedings ArticleDOI
04 May 1998
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, which resembles a school of flying birds in which each particle adjusts its flying according to its own flying experience and its companions' flying experience.
Abstract: Evolutionary computation techniques, genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions are manipulated according to the rule of survival of the fittest through "genetic" operations, such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm through simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, these individuals are "evolved" by cooperation and competition among the individuals themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.
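The velocity update with the proposed inertia weight can be sketched directly from the description above: `w` scales the previous velocity, while the cognitive and social terms pull each particle toward its own best and the swarm's best positions. The coefficient values (`w = 0.7`, `c1 = c2 = 1.5`) are common defaults and an assumption here, not values prescribed by the paper.

```python
import random

def pso(objective, dim=2, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, lower=-5.0, upper=5.0, seed=3):
    rng = random.Random(seed)
    x = [[rng.uniform(lower, upper) for _ in range(dim)]
         for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in x]                 # personal best positions
    pbest_f = [objective(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]  # global best

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia weight w damps the previous velocity; c1/c2 weight
                # attraction to the personal and global bests
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(max(x[i][d] + v[i][d], lower), upper)
            f = objective(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = list(x[i]), f
                if f < gbest_f:
                    gbest, gbest_f = list(x[i]), f
    return gbest, gbest_f

sphere = lambda xs: sum(xi * xi for xi in xs)
best_x, best_f = pso(sphere)
```

With `w < 1` the particle velocities contract over time, trading early exploration for late exploitation, which is the "significant and effective impact" the abstract attributes to the new parameter.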

9,373 citations

Journal ArticleDOI
01 Feb 2001
TL;DR: A new heuristic algorithm, mimicking the improvisation of music players, has been developed and named Harmony Search (HS), which is illustrated with a traveling salesman problem (TSP), a specific academic optimization problem, and a least-cost pipe network design problem.
Abstract: Many optimization problems in various fields have been solved using diverse optimization al gorithms. Traditional optimization techniques such as linear programming (LP), non-linear programming (NL...

5,136 citations

Proceedings ArticleDOI
05 Jul 1995
TL;DR: Culling is near optimal for this problem, highly noise tolerant, and the best known approach in some regimes, and some new large deviation bounds on this submartingale enable us to determine the running time of the algorithm.
Abstract: We analyze the performance of a Genetic Type Algorithm we call Culling and a variety of other algorithms on a problem we refer to as ASP. Culling is near optimal for this problem, highly noise tolerant, and the best known approach in some regimes. We show that the problem of learning the Ising perceptron is reducible to noisy ASP. These results provide an example of a rigorous analysis of GA's and give insight into when and how GA's can beat competing methods. To analyze the genetic algorithm, we view it as a special type of submartingale. We prove some new large deviation bounds on this submartingale which enable us to determine the running time of the algorithm.

4,520 citations