Journal ArticleDOI

Focusing on the Golden Ball Metaheuristic: An Extended Study on a Wider Set of Problems

03 Aug 2014-The Scientific World Journal (Hindawi Publishing Corporation)-Vol. 2014, pp 563259-563259
TL;DR: The Golden Ball is applied to four different combinatorial optimization problems. Two of them are routing problems more complex than the previously used ones: the asymmetric traveling salesman problem and the vehicle routing problem with backhauls.
Abstract: Nowadays, the development of new metaheuristics for solving optimization problems is a topic of interest in the scientific community, and a large number of techniques of this kind can be found in the literature. Among them are many recently proposed techniques, such as the artificial bee colony and the imperialist competitive algorithm. This paper focuses on one recently published technique, the Golden Ball (GB). The GB is a multiple-population metaheuristic based on soccer concepts. Although it was designed to solve combinatorial optimization problems, until now it has only been tested on two simple routing problems: the traveling salesman problem and the capacitated vehicle routing problem. In this paper, the GB is applied to four different combinatorial optimization problems. Two of them are routing problems more complex than the previously used ones: the asymmetric traveling salesman problem and the vehicle routing problem with backhauls. Additionally, one constraint satisfaction problem (the n-queen problem) and one combinatorial design problem (the one-dimensional bin packing problem) have also been used. The outcomes obtained by the GB are compared with those obtained by two different genetic algorithms and two distributed genetic algorithms. Additionally, two statistical tests are conducted to compare these results.
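The soccer analogy translates into a simple program structure. Below is a minimal illustrative sketch of the GB scheme on a small TSP instance, under our own assumptions (it is not the authors' implementation; all names and parameter values here are hypothetical): teams are subpopulations of solutions ("players"), and each season alternates a training phase (local search) with a transfer phase that exchanges players between teams.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour under a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def train(player, dist, steps=30):
    """'Training' phase: keep random 2-swaps that shorten the tour."""
    best = player[:]
    for _ in range(steps):
        i, j = random.sample(range(len(best)), 2)
        cand = best[:]
        cand[i], cand[j] = cand[j], cand[i]
        if tour_length(cand, dist) < tour_length(best, dist):
            best = cand
    return best

def golden_ball_sketch(dist, n_teams=2, team_size=4, seasons=5, seed=0):
    """Illustrative multi-population skeleton: teams of player-solutions,
    training each season, then a transfer between best and worst teams."""
    random.seed(seed)
    n = len(dist)
    teams = [[random.sample(range(n), n) for _ in range(team_size)]
             for _ in range(n_teams)]
    for _ in range(seasons):
        teams = [[train(p, dist) for p in team] for team in teams]
        # 'Transfer' phase: swap the worst player of the best team with
        # the best player of the worst team.
        scores = [min(tour_length(p, dist) for p in t) for t in teams]
        best_t, worst_t = scores.index(min(scores)), scores.index(max(scores))
        if best_t != worst_t:
            bw = max(teams[best_t], key=lambda p: tour_length(p, dist))
            wb = min(teams[worst_t], key=lambda p: tour_length(p, dist))
            teams[best_t].remove(bw); teams[worst_t].remove(wb)
            teams[best_t].append(wb); teams[worst_t].append(bw)
    return min((p for t in teams for p in t), key=lambda p: tour_length(p, dist))
```

The published GB includes further soccer-inspired operators (matches over a season of matchdays, per-team training methods); this sketch keeps only the population/training/transfer skeleton.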
Citations
Journal ArticleDOI
TL;DR: A survey of metaheuristic research comprising 1222 publications from 1983 to 2016 is performed to highlight potential open questions and critical issues raised in the literature, providing guidance so that future research can be conducted more meaningfully.
Abstract: Because of successful implementations and high intensity, metaheuristic research has been extensively reported in the literature, covering algorithms, applications, comparisons, and analysis. However, little insightful analysis of metaheuristic performance issues has been reported, and it is still a “black box” why certain metaheuristics perform better on specific optimization problems and not as well on others. The performance-related analyses of algorithms are mostly quantitative, using validation metrics such as mean error, standard deviation, and correlations. Moreover, performance tests are often performed on specific benchmark functions; few studies involve real data from scientific or engineering optimization problems. In order to draw a comprehensive picture of metaheuristic research, this paper surveys the literature, covering 1222 publications from 1983 to 2016 (33 years). Based on the collected evidence, the paper addresses four dimensions of metaheuristic research: introduction of new algorithms, modifications and hybrids, comparisons and analysis, and research gaps and future directions. The objective is to highlight potential open questions and critical issues raised in the literature. The work provides guidance so that future research can be conducted more meaningfully and serve the good of this area of research.

467 citations


Cites methods from "Focusing on the Golden Ball Metaheu..."

  • ...The researchers have also used metaphors from our daily life; such as, interior design (Gandomi 2014), sports (Osaba et al. 2014), music (Geem et al. 2001), and vocational skills (Qin 2009), etc. Interestingly, some of the metaphors have also been adopted from the disciplines that deal with how…...

    [...]

  • ...Table 4 Acronyms of metaheuristics abbreviations used in this study Abbreviations Acronyms AAA Alienated Ant Algorithm (Uymaz et al. 2015) AAA2 Artificial Algae Algorithm (Bandieramonte et al. 2010) ABC Artificial Bee Colony (Karaboga 2005) ABO African Buffalo Optimization (Odili and Mohmad Kahar 2016) ACO Ant Colony Optimization (Dorigo et al. 2006) ACS Ant Colony System (Dorigo and Gambardella 1997) ADS Adaptive Dimensional Search (Hasançebi and Azad 2015) AE Adaptive Evolution (Viveros Jiménez et al. 2009) AFSA Artificial Fish Swarm Algorithm (Huang and Chen 2015) ANS Across Neighborhood Search (Wu 2016) AntStar AntStar (Faisal et al. 2016) ARFO Artificial Root Foraging Algorithm (Ma et al. 2014) BA Bet Algorithm (Yang 2010) BB-BC Big BangBig Crunch (Erol and Eksin 2006) BBMO Bumble Bees Mating Optimization (Marinakis and Marinaki 2014) BBO Biogeography Based Optimization (Simon 2008) BCO Bacterial Colony Optimization (Niu and Wang 2012) BDO Bottlenose Dolphin Optimization (Kiruthiga et al. 2015) Beehive Beehive (Munoz et al. 2009) BFOA Bacterial Foraging Optimization Algorithm (Zhao and Wang 2016) BSA Backtracking Search Optimization Algorithm (Civicioglu 2013) BSO Bees Swarm Optimization (Djenouri et al. 2012) BSOA Brain Storm Optimization Algorithm (Shi 2011) CA Cultural Algorithm (Ali et al. 2016) CBM Coalition-Based Metaheuristic (Meignan et al. 2010) CBO Colliding Bodies Optimization (Kaveh and Mahdavi 2014) CFO Central Force Optimization (Liu and Tian 2015) CGO Contour Gradient Optimization (Wu et al. 2013) CGS Consultant-Guided Search (Iordache 2010) CPDE Cloud Particles Differential Evolution (Li et al. 2015) Table 4 continued Abbreviations Acronyms Cricket Cricket Algorithm (Canayaz and Karcı 2015) CRO Chemical Reaction Optimization (Li et al. 2015) CROA Coral Reefs Optimization Algorithm (Salcedo-Sanz et al. 2014) CrowSA Crow Search Algorithm (Askarzadeh 2016) CS Cuckoo Search (Yang and Deb 2014) CSO Chicken Swarm Optimization (Meng et al. 
2014) CSOA Cat Swarm Optimization Algorithm (Crawford et al. 2015) CSS Charged System Search (Kaveh and Talatahari 2010) CyberSA Cyber Swarm Algorithm (Yin et al. 2010) DE Differential Evaluation (Storn and Price 1997) DEO Dolphin Echolocation Optimization (Kaveh and Farhoudi 2016) DS Differential Search (Sulaiman et al. 2014) EA Evolutionary Algorithm (Angeline et al. 1994) EBO Ecogeography-Based Optimization (Zhang et al. 2017) eBPA enhanced Best Performance Algorithm (Chetty and Adewumi 2015) EFO Electromagnetic Field Optimization (Abedinpourshotorban et al. 2016) EM Electromagnetism Metaheuristic (Filipović et al. 2013) EO Extremal Optimization (Chen et al. 2006) EP Evolutionary Programming (Yao et al. 1999) ES Evolution Strategies (Beyer and Schwefel 2002) ESA Elephant Search Algorithm (Deb et al. 2015) FA Firefly Algorithm (Yang 2008) FASO Foraging Agent Swarm Optimization (Barresi 2014) FEO Fish Electrolocation Optimization (Haldar and Chakraborty 2017) FFO Fruit Fly Optimization (Pan 2012) FPA Flower Pollination Algorithm (Wang and Zhou 2014) FWA Fireworks Algorithm (Tan and Zhu 2010) GA Genetic Algorithm (Holland 1992) GB Golden Ball (Osaba et al. 2014) GBMO Gases Brownian Motion Optimization (Abdechiri et al. 2013) GEA Gradient Evolution Algorithm (Kuo and Zulvia 2015) GGS Gradient Gravitational Search (Dash and Sahu 2015) GHOA Green Herons Optimization Algorithm (Sur and Shukla 2013) GRASP Greedy Randomized Adaptive Search Procedures (Feo and Resende 1989) GSA Gravitational Search Algorithm (Rashedi et al. 2009) GSO Glowworm Swarm Optimization (He et al. 2006) GSOA Galactic Swarm Optimization Algorithm (Muthiah-Nakarajan and Noel 2016) GWO Grey Wolf Optimizer (Li and Wang 2015) HBMO Honey Bees Mating Optimization (Marinakis and Marinaki 2011) HS Harmony Search (Geem et al. 2001) Table 4 continued Abbreviations Acronyms HSS Hyper-Spherical Search (Karami et al. 2014) IBA Improved Bees Algorithm (Sharma et al. 
2015) ICA Imperialistic Competitive Algorithm (Kashani et al. 2016) ILS Iterative Local Search (Aarts and Lenstra 1997) ISA Interior Search Algorithm (Gandomi 2014) ISOA Importance Search Optimization Algorithm (Sun 2010) IWD Intelligent Water Drops (Shah-Hosseini 2008) IWO Invasive Weed Optimization (Karimkashi and Kishk 2010) JA Jaguar Algorithm (Chen et al. 2015) JOA Joint Operations Algorithm (Sun et al. 2016) KCA Key Cutting Algorithm (Qin 2009) KHA Krill Herd Algorithm (Amudhavel et al. 2015) LaF Leaders and followers (Gonzalez-Fernandez and Chen 2015) LASDA Adaptive Spiral Dynamics Algorithm (Nasir et al. 2016) LOA Lion Optimization Algorithm (Yazdani and Jolai 2016) LS Local Search (Aarts and Lenstra 1997) LSA Locust Swarm Algorithm (Cuevas et al. 2015) LSO Lifecycle-based Swarm Optimization (Shen et al. 2014) MBA Mine Blast Algorithm (Sadollah et al. 2013) MBO Marriage in honey Bees Optimization (Bandieramonte et al. 2010) MBOA Migrating Birds Optimization Algorithm (Duman et al. 2012) MCSS Magnetic Charged System Search (Kaveh et al. 2013) MFO Moth-Flame Optimization (Mirjalili 2015) MHSA Mosquito Host-Seeking Algorithm (Feng et al. 2009) Monkey Monkey Algorithm (Zhao and Tang 2008) ODMA Open Source Development Model Algorithm (Hajipour et al. 2016) Plant Plant (Caraveo et al. 2015) PSO Particle Swarm Optimization (Eberhart and Kennedy 1995) PVS Passing Vehicle Search (Savsani and Savsani 2016) RA Raindrop Algorithm (Wei 2013) RMO Radial Movement Optimization (Rahmani and Yusof 2014) RO Ray Optimization (Kaveh and Khayatazad 2012) RRA Runner-Root Algorithm (Merrikh-Bayat 2015) RROA Raven Roosting Optimisation Algorithm (Brabazon et al. 2016) SA Simulated Annealing (Kirkpatrick et al. 1983) SAC Simple Adaptive Climbing (Viveros-Jiménez et al. 2014) SCA Sine Cosine Algorithm (Mirjalili 2016) SCE Shuffled Complex Evolution (Duan et al. 
1993) SDMSFA Smart Dispatching and Metaheuristic Swarm Flow Algorithm (Rodzin 2014) SDS Stochastic Diffusion Search (al Rifaie et al. 2011) Table 4 continued Abbreviations Acronyms SEOA Social Emotional Optimization Algorithm (Xu et al. 2010) SFLA Shuffled Frog Leaping Algorithm (Eusuff et al. 2006) SFS Stochastic Fractal Search (Salimi 2015) SGA Search Group Algorithm (Gonçalves et al. 2015) SSmell Shark Smell Optimization (Abedinia et al. 2014) SLO Seven-spot Ladybird Optimization (Wang et al. 2013) SMO Spider Monkey Optimization (Gupta and Deep 2016) SNSO Social Network-based Swarm Optimization (Liang et al. 2015) SOA Seeker Optimization Algorithm (Zhu et al. 2014) SOS Symbiotic Organism Search (Abdullahi et al. 2016) SS&PR Scatter Search and Path Relinking (Glover 1997) SSO Simplified Swarm Optimization (Yeh et al. 2015) SSOA Social Spider Optimization Algorithm (Cuevas et al. 2013) TLBO Teaching-Learning-Based Optimization (Rao et al. 2011) TS Tabu Search (Glover 1989) VCS Virus Colony Search (Li et al. 2016) VDSA Variable Depth Search Algorithm (Bouhmala 2015) VNS Variable Neighborhood Search (Mladenovi and Hansen 1997) VSA Vortex Search Algorithm (Doan and lmez 2015) WCA Water Cycle Algorithm (Sadollah et al. 2015) WDO Wind Driven Optimization (Bayraktar et al. 2010) WEO Water Evaporation Optimization (Kaveh and Bakhshpoori 2016) WFA Water Flow-like Algorithm (Yang and Wang 2007) WPA Wolf Pack Algorithm (Wu and Zhang 2014) WS Warping Search (Gonçalves et al. 2008) WSA Weighted Superposition Attraction (Baykasoğlu and Akpinar 2015) WWO Water Wave Optimization (Zheng 2015)...

    [...]


01 Jan 2016

119 citations

Journal ArticleDOI
Bilal Alatas
TL;DR: All of the computational intelligence algorithms based on sports, and their applications, have been searched and collected for the first time, and a performance comparison of these sports-based algorithms with other popular algorithms such as the Genetic Algorithm, Particle Swarm Optimization, and Differential Evolution has been performed on unconstrained global optimization benchmark problems with different characteristics.
Abstract: Many classical search and optimization algorithms are insufficient for solving very hard, large-scale nonlinear problems with stringent constraints. Hence, computational intelligence optimization algorithms have been proposed and used to find good-enough solutions in reasonable computation time when the classical algorithms are not applicable or do not provide good solutions due to the unmanageable search space. Many existing algorithms are nature-inspired, working by simulating or modeling different natural processes. Because of the continual search for better methods and the absence of a single most efficient method for all types of problems, novel algorithms and new variants of current algorithms are being proposed, and will likely continue to be proposed, to see whether they can cope with challenging optimization problems. Studies in recent years have shown that processes, concepts, rules, and events in various sports can be modelled as novel, efficient search and optimization methods with effective exploration capabilities, in many cases able to outperform existing classical and computational intelligence based optimization methods within different types of search spaces (Kashan in Appl Soft Comput 16:171–200, 2014; Bouchekara in Oper Res 1–57, 2017; Razmjooy in J Control Autom Electr Syst 1–22, 2016; Osaba et al. in Appl Intell 41(1):145–166, 2014a, Sci World J, 2014b). These novel and interesting sports-based algorithms have proven more effective and robust than alternative approaches in a large number of applications. In this work, all of the computational intelligence algorithms based on sports, and their applications, have been searched and collected for the first time.
Specific modelling of real sports games for computational intelligence algorithms, and their novelty in comparison with alternative existing optimization algorithms, is reviewed with specific characteristics, computational implementation details, and main application capabilities, in the frame of hard optimization problems. Information is given about search and optimization algorithms such as the League Championship Algorithm, Soccer League Optimization, Soccer Game Optimization, the Soccer League Competition Algorithm, the Golden Ball Algorithm, World Cup Optimization, the Football Optimization Algorithm, the Football Game Inspired Algorithm, and the Most Valuable Player Algorithm. A performance comparison of these sports-based algorithms with other popular algorithms such as the Genetic Algorithm, Particle Swarm Optimization, and Differential Evolution on unconstrained global optimization benchmark problems with different characteristics has been performed for the first time. A general evaluation is also discussed, along with further research directions.

43 citations


Cites background or methods from "Focusing on the Golden Ball Metaheu..."

  • ...Osaba et al. (2014b) have applied GBA to four new problems, namely the asymmetric traveling salesman problem, the vehicle routing problem with backhauls, the n-queen problem, and the one-dimensional bin packing problem....

    [...]

  • ...…completely new and effective search and optimization procedures, with effective exploration capabilities in many cases, which are able to outperform existing classical and metaheuristic based optimization approaches (Kashan 2014; Bouchekara 2017; Razmjooy et al. 2016; Osaba et al. 2014a, b)....

    [...]

  • ...Any player can switch from his team to another by using a transfer procedure (Osaba et al. 2013, 2014a, b)....

    [...]

  • ...Finally, a season has as many training phases as matchdays (Osaba et al. 2014a)....

    [...]

  • ...…concepts, rules, and events in various sports, especially in football, have also been considered and modelled as novel efficient search and optimization methods with effective exploration capabilities in many cases (Kashan 2014; Bouchekara 2017; Razmjooy et al. 2016; Osaba et al. 2014a, b)....

    [...]

Journal ArticleDOI
TL;DR: The author’s modification suggested in this work enables the harmony memory to be improved while the Harmony Search technique is running, making it possible to increase the effectiveness of the technique by almost 59% and to eliminate the imperfections revealed in the previous research.
Abstract: This article continues the work on adapting the Harmony Search algorithm to effectively solve Asymmetric Traveling Salesman Problem (ATSP) instances. The author’s modification suggested in this work enables the harmony memory to be improved while the technique is running, making it possible to increase its effectiveness by almost 59% (the summary average error was reduced from 13.42 to 5.54%) and to eliminate the imperfections revealed in the previous research. The article includes a description of the approach and a comparative study spanning various variants of implementing the improvement, conducted on 19 ATSP instances containing between 17 and 443 nodes. The achieved results were also compared with results found in reference articles, showing the significant effectiveness of the modification.
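The harmony-memory mechanism this article builds on can be illustrated with a basic continuous-domain Harmony Search sketch. The article itself works on ATSP instances; this generic version, with assumed parameter values (hmcr, par, bw) and an assumed search range, only shows how a new harmony is improvised from memory and how the memory is updated during a run.

```python
import random

def harmony_search(f, dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=500, seed=0):
    """Basic Harmony Search: improvise a new harmony from memory (prob.
    hmcr), pitch-adjust it (prob. par), and replace the worst stored
    harmony whenever the new one is better."""
    rng = random.Random(seed)
    lo, hi = -5.0, 5.0
    mem = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [f(h) for h in mem]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:            # memory consideration
                v = mem[rng.randrange(hms)][d]
                if rng.random() < par:         # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                              # random consideration
                v = rng.uniform(lo, hi)
            new.append(v)
        fn = f(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if fn < scores[worst]:                 # harmony memory update
            mem[worst], scores[worst] = new, fn
    best = min(range(hms), key=lambda i: scores[i])
    return mem[best], scores[best]
```

The article's contribution is precisely a stronger rule for improving this memory during the run; the update step above is only the textbook baseline.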

36 citations

Book ChapterDOI
01 Jan 2020
TL;DR: This chapter aims to take a step forward in the field by proposing an experiment hybridizing three reputed bio-inspired computational metaheuristics (namely, particle swarm optimization, the firefly algorithm, and the bat algorithm) with the novelty search mechanism.
Abstract: The traveling salesman problem (TSP) is one of the most studied problems in computational intelligence and operations research. Since its first formulation, a myriad of works has been published proposing different alternatives for its solution. Additionally, a plethora of advanced formulations have been proposed by practitioners, trying to enhance the applicability of the basic TSP. This chapter is firstly devoted to providing an informed overview of the TSP. To this end, we first review the recent history of this research area, placing emphasis on milestone studies contributed in recent years. Next, we aim to take a step forward in the field by proposing an experiment that hybridizes three reputed bio-inspired computational metaheuristics (namely, particle swarm optimization, the firefly algorithm, and the bat algorithm) with the novelty search mechanism. For assessing the quality of the implemented methods, 15 different datasets taken from the well-known TSPLIB have been used. We end this chapter by sharing our envisioned status of the field, identifying opportunities and challenges which should stimulate research efforts in years to come.

30 citations

References
Book
01 Sep 1988
TL;DR: In this book, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Abstract: From the Publisher: This book brings together - in an informal and tutorial fashion - the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
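The annealing analogy described above translates directly into a short routine: worsening moves are accepted with probability exp(-Δ/T), and the temperature T is lowered over time. This is a generic one-dimensional sketch with illustrative parameter values of our own choosing, not code tied to the paper's experiments.

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.95, steps=500, seed=0):
    """Minimise f via the annealing analogy: always accept downhill moves,
    accept uphill moves with probability exp(-delta/T), and cool T
    geometrically so the search gradually freezes into a low-cost state."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0, 1)          # random local perturbation
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc                # Metropolis acceptance rule
        if fx < fbest:
            best, fbest = x, fx             # track best state ever visited
        t *= cooling                        # cooling schedule
    return best, fbest
```

At high temperature the chain explores freely; as T falls, the acceptance rule degenerates into greedy descent, mirroring the slow-cooling requirement of physical annealing.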

41,772 citations

Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced; the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
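The particle swarm concept can be sketched in a few lines: each particle carries a velocity that is pulled toward its own best-known position and the swarm's best-known position. This is a generic illustrative implementation with assumed coefficient values (w, c1, c2) and search bounds, not the original paradigm's exact code.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise f: velocities blend inertia (w), attraction to the
    particle's personal best (c1), and attraction to the global best (c2)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:                # update personal best
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:               # update global best
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```

The social (global-best) term is what distinguishes the swarm from a set of independent hill-climbers, which is the link to artificial life that the abstract mentions.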

35,104 citations

Journal ArticleDOI
TL;DR: This clearly written, mathematically rigorous text includes a novel algorithmic exposition of the simplex method and also discusses the Soviet ellipsoid algorithm for linear programming; efficient algorithms for network flow, matching, spanning trees, and matroids; the theory of NP-complete problems; approximation algorithms; local search heuristics for NP-complete problems; and more.
Abstract: This clearly written, mathematically rigorous text includes a novel algorithmic exposition of the simplex method and also discusses the Soviet ellipsoid algorithm for linear programming; efficient algorithms for network flow, matching, spanning trees, and matroids; the theory of NP-complete problems; approximation algorithms; local search heuristics for NP-complete problems; and more. All chapters are supplemented by thought-provoking problems. A useful work for graduate-level students with backgrounds in computer science, operations research, and electrical engineering. "Mathematicians wishing a self-contained introduction need look no further." - American Mathematical Monthly. 1982 ed.

7,221 citations


"Focusing on the Golden Ball Metaheu..." refers background in this paper

  • ...There are several types of optimization, such as numerical [1], linear [2], continuous [3], or combinatorial optimization [4]....

    [...]

Journal ArticleDOI
TL;DR: The Artificial Bee Colony (ABC) Algorithm, an optimization algorithm based on the intelligent behaviour of a honey bee swarm, is used for optimizing multivariable functions; the results showed that ABC outperforms the other algorithms.
Abstract: Swarm intelligence is a research branch that models populations of interacting agents or swarms that are able to self-organize. An ant colony, a flock of birds, or an immune system is a typical example of a swarm system. Bees swarming around their hive is another example of swarm intelligence. The Artificial Bee Colony (ABC) Algorithm is an optimization algorithm based on the intelligent behaviour of a honey bee swarm. In this work, the ABC algorithm is used for optimizing multivariable functions, and the results produced by ABC, the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Particle Swarm Inspired Evolutionary Algorithm (PS-EA) have been compared. The results showed that ABC outperforms the other algorithms.
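A simplified sketch of the ABC scheme can make the foraging metaphor concrete: employed bees greedily perturb food sources (candidate solutions), and a source that fails to improve for a set number of trials is abandoned and re-scouted at random. Parameter values and bounds here are our own assumptions, and the onlooker-bee selection step of the full algorithm is omitted for brevity.

```python
import random

def abc_sketch(f, dim=2, n_sources=10, iters=200, limit=20, seed=0):
    """Simplified ABC: greedy perturbation of food sources plus a scout
    phase that resets sources exhausted after 'limit' failed trials."""
    rng = random.Random(seed)
    lo, hi = -5.0, 5.0
    src = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    fit = [f(s) for s in src]
    trials = [0] * n_sources
    b = min(range(n_sources), key=lambda i: fit[i])
    best, best_f = src[b][:], fit[b]           # memorise best source so far
    for _ in range(iters):
        for i in range(n_sources):
            k = rng.randrange(n_sources)       # random neighbour source
            d = rng.randrange(dim)
            cand = src[i][:]
            cand[d] += rng.uniform(-1, 1) * (src[i][d] - src[k][d])
            fc = f(cand)
            if fc < fit[i]:                    # greedy selection
                src[i], fit[i], trials[i] = cand, fc, 0
                if fc < best_f:
                    best, best_f = cand[:], fc
            else:
                trials[i] += 1
            if trials[i] > limit:              # abandoned source re-scouted
                src[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fit[i] = f(src[i])
                trials[i] = 0
    return best, best_f
```

The perturbation step scales with the distance between sources, so moves shrink automatically as the colony converges, which is the mechanism behind ABC's performance on multivariable functions.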

6,377 citations