
Showing papers on "Metaheuristic" published in 2014


Journal ArticleDOI
TL;DR: Results on classical engineering design problems and a real application show that the proposed GWO algorithm is applicable to challenging problems with unknown search spaces.

10,082 citations


Book ChapterDOI
01 Jan 2014
TL;DR: This chapter provides an overview of the fundamentals of algorithms and their links to self-organization, exploration, and exploitation.
Abstract: Algorithms are important tools for solving problems computationally. All computation involves algorithms, and the efficiency of an algorithm largely determines its usefulness. This chapter provides an overview of the fundamentals of algorithms and their links to self-organization, exploration, and exploitation. A brief history of recent nature-inspired algorithms for optimization is outlined in this chapter.

8,285 citations


Journal ArticleDOI
TL;DR: Results confirm the excellent performance of the SOS method in solving various complex numerical problems in comparison with well-known optimization methods.

1,152 citations


Book ChapterDOI
01 Jan 2014
TL;DR: This chapter discusses the fundamental principles of multi-objective optimization, the differences between multi-objective and single-objective optimization, and describes a few well-known classical and evolutionary algorithms for multi-objective optimization.
Abstract: Multi-objective optimization is an integral part of optimization activities and has a tremendous practical importance, since almost all real-world optimization problems are ideally suited to be modeled using multiple conflicting objectives. The classical means of solving such problems were primarily focused on scalarizing multiple objectives into a single objective, whereas the evolutionary means have been to solve a multi-objective optimization problem as it is. In this chapter, we discuss the fundamental principles of multi-objective optimization, the differences between multi-objective optimization and single-objective optimization, and describe a few well-known classical and evolutionary algorithms for multi-objective optimization. Two application case studies reveal the importance of multi-objective optimization in practice. A number of research challenges are then highlighted. The chapter concludes by suggesting a few tricks of the trade and mentioning some key resources to the field of multi-objective optimization.

1,072 citations
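
As a concrete illustration of the scalarization approach mentioned in the chapter above, here is a minimal sketch of a weighted-sum formulation; the two toy objectives and the weight values are assumptions chosen purely for illustration, not taken from the chapter.

```python
import numpy as np

def weighted_sum(x, weights=(0.5, 0.5)):
    """Classical scalarization: collapse two conflicting objectives into a
    single objective via a weighted sum (toy objectives, assumed for illustration)."""
    f1 = float(np.sum(x ** 2))             # objective 1: stay near the origin
    f2 = float(np.sum((x - 2.0) ** 2))     # objective 2: stay near (2, ..., 2)
    return weights[0] * f1 + weights[1] * f2

# Minimizing the scalarized function for different weight vectors yields
# different trade-off solutions; for convex fronts this traces out the Pareto front.
x = np.array([1.0, 1.0])
print(weighted_sum(x, weights=(0.7, 0.3)))
```

Evolutionary multi-objective methods, by contrast, keep a whole population of trade-off solutions and approximate the Pareto front in a single run.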


Book
17 Feb 2014
TL;DR: This book can serve as an introductory book for graduates, doctoral students and lecturers in computer science, engineering and natural sciences; researchers and engineers as well as experienced experts will also find it a handy reference.
Abstract: Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization. This book can serve as an introductory book for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers and engineers as well as experienced experts will also find it a handy reference. The book discusses and summarizes the latest developments in nature-inspired algorithms with comprehensive, timely literature, provides a theoretical understanding as well as practical implementation hints, and gives a step-by-step introduction to each algorithm.

901 citations


Journal ArticleDOI
TL;DR: The fundamental ideas of cuckoo search, its latest developments, and its applications are reviewed, and insight into its search mechanisms is gained.
Abstract: Cuckoo search (CS) is a relatively new algorithm, developed by Yang and Deb in 2009, and CS is efficient in solving global optimization problems. In this paper, we review the fundamental ideas of cuckoo search and the latest developments as well as its applications. We analyze the algorithm and gain insight into its search mechanisms and find out why it is efficient. We also discuss the essence of algorithms and its link to self-organizing systems, and finally we propose some important topics for further research.

762 citations
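
Since the review above summarizes the fundamental ideas of cuckoo search, a minimal sketch of the standard CS loop may help; the Lévy-flight step via Mantegna's algorithm, the parameter values, and the sphere test function are conventional choices assumed here, not details drawn from this particular review.

```python
import numpy as np

def levy_step(dim, beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step (conventional in CS)."""
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma, dim)
    v = np.random.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim=5, n_nests=15, pa=0.25, alpha=0.01, iters=200, lb=-5, ub=5):
    nests = np.random.uniform(lb, ub, (n_nests, dim))
    fitness = np.apply_along_axis(f, 1, nests)
    best = nests[fitness.argmin()].copy()
    for _ in range(iters):
        # Global exploration: Levy flight around each nest, biased toward the best.
        for i in range(n_nests):
            new = np.clip(nests[i] + alpha * levy_step(dim) * (nests[i] - best), lb, ub)
            fn = f(new)
            j = np.random.randint(n_nests)          # compare with a random nest
            if fn < fitness[j]:
                nests[j], fitness[j] = new, fn
        # Abandon a fraction pa of the worst nests and build new ones at random.
        n_abandon = int(pa * n_nests)
        worst = fitness.argsort()[-n_abandon:]
        nests[worst] = np.random.uniform(lb, ub, (n_abandon, dim))
        fitness[worst] = np.apply_along_axis(f, 1, nests[worst])
        best = nests[fitness.argmin()].copy()
    return best, fitness.min()

# Illustrative use on the sphere function.
best, val = cuckoo_search(lambda x: float(np.sum(x ** 2)))
```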


Journal ArticleDOI
TL;DR: Cuckoo search (CS) is a relatively new algorithm, developed by Yang and Deb in 2009, and the same has been found to be efficient in solving global optimization problems.
Abstract: Cuckoo search (CS) is a relatively new algorithm, developed by Yang and Deb in 2009, and the same has been found to be efficient in solving global optimization problems. In this paper, we review the fundamental ideas of cuckoo search and the latest developments as well as its applications. We analyze the algorithm and gain insight into its search mechanisms and find out why it is efficient. We also discuss the essence of algorithms and its link to self-organizing systems, and finally, we propose some important topics for further research.

582 citations


Journal ArticleDOI
TL;DR: Results indicate that the hybrid approach achieves better solutions compared to others, and that the crowding distance method for LSP outperforms the former Grids method.

492 citations


Journal ArticleDOI
TL;DR: An up-to-date review of all major nature-inspired metaheuristic algorithms employed to date for partitional clustering; key issues involved in formulating various metaheuristics for the clustering problem and major application areas are also discussed.
Abstract: The partitional clustering concept started with the K-means algorithm, which was published in 1957. Since then, many classical partitional clustering algorithms have been reported based on the gradient descent approach. The 1990s kick-started a new era in cluster analysis with the application of nature-inspired metaheuristics. Nearly two decades have passed since these initial formulations, and researchers have developed numerous new algorithms in this field. This paper embodies an up-to-date review of all major nature-inspired metaheuristic algorithms employed to date for partitional clustering. Further, key issues involved in formulating various metaheuristics as a clustering problem and major application areas are discussed.

457 citations
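
One of the formulation issues such reviews discuss is how a partitional clustering solution is encoded for a metaheuristic; a common (assumed here, not necessarily this survey's) choice is to flatten the k centroids into one real-valued vector and score it by the within-cluster sum of squared errors, as sketched below.

```python
import numpy as np

def sse_objective(flat_centroids, data, k):
    """Decode a flat solution vector into k centroids and return the
    within-cluster sum of squared errors (a usual clustering fitness)."""
    centroids = flat_centroids.reshape(k, data.shape[1])
    # Assign every point to its nearest centroid, then sum the squared distances.
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return float(np.sum(d.min(axis=1) ** 2))

# Any continuous metaheuristic (PSO, ABC, CS, ...) can now minimize this
# objective over vectors of length k * n_features.
data = np.random.rand(100, 2)
x = np.random.rand(3 * 2)       # k=3 centroids in 2-D, flattened
print(sse_objective(x, data, k=3))
```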


Journal ArticleDOI
TL;DR: A comparison of the proposed algorithm with other algorithms shows that the FPA is efficient with a good convergence rate, and the importance of further parametric studies and theoretical analysis is highlighted and discussed.
Abstract: Multiobjective design optimization problems require multiobjective optimization techniques to solve, and it is often very challenging to obtain high-quality Pareto fronts accurately. In this article, the recently developed flower pollination algorithm (FPA) is extended to solve multiobjective optimization problems. The proposed method is used to solve a set of multiobjective test functions and two bi-objective design benchmarks, and a comparison of the proposed algorithm with other algorithms has been made, which shows that the FPA is efficient with a good convergence rate. Finally, the importance of further parametric studies and theoretical analysis is highlighted and discussed.

454 citations


Journal ArticleDOI
TL;DR: Chaos is introduced into the bat algorithm (BA) to increase its global search mobility for robust global optimization; results show that some variants of chaotic BAs can clearly outperform the standard BA on these benchmarks.
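
The summary above only states that chaos is injected into the bat algorithm; a typical way to do this, sketched below as an assumption rather than the paper's exact scheme, is to drive one of BA's tunable parameters with a chaotic sequence such as the logistic map.

```python
def logistic_map(x, r=4.0):
    """One step of the logistic map, a common chaotic sequence generator."""
    return r * x * (1.0 - x)

# Example: replace a fixed schedule for a BA parameter (here, hypothetically,
# the loudness) with a chaotic sequence to vary it irregularly over iterations.
value = 0.7
for t in range(5):
    value = logistic_map(value)
    loudness = value          # hypothetical: chaotic loudness at iteration t
    print(t, round(loudness, 4))
```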

Book ChapterDOI
17 Oct 2014
TL;DR: In this paper, a new bio-inspired algorithm, chicken swarm optimization (CSO), is proposed for optimization applications; it mimics the hierarchical order in the chicken swarm and the behaviors of the chickens, including roosters, hens and chicks.
Abstract: A new bio-inspired algorithm, Chicken Swarm Optimization (CSO), is proposed for optimization applications. Mimicking the hierarchical order in the chicken swarm and the behaviors of the chickens, including roosters, hens and chicks, CSO can efficiently extract the chickens' swarm intelligence to optimize problems. Experiments on twelve benchmark problems and a speed reducer design were conducted to compare the performance of CSO with that of other algorithms. The results show that CSO can achieve good optimization results in terms of both optimization accuracy and robustness. Directions for future research on CSO are finally suggested.

Journal ArticleDOI
TL;DR: An improved and discrete version of the Cuckoo Search (CS) algorithm is presented to solve the famous traveling salesman problem (TSP), an NP-hard combinatorial optimisation problem.
Abstract: In this paper, we present an improved and discrete version of the Cuckoo Search (CS) algorithm to solve the famous traveling salesman problem (TSP), an NP-hard combinatorial optimisation problem. CS is a metaheuristic search algorithm which was recently developed by Xin-She Yang and Suash Deb in 2009, inspired by the breeding behaviour of cuckoos. This new algorithm has proved to be very effective in solving continuous optimisation problems. We now extend and improve CS by reconstructing its population and introducing a new category of cuckoos so that it can solve combinatorial problems as well as continuous problems. The performance of the proposed discrete cuckoo search (DCS) is tested against a set of benchmarks of symmetric TSP from the well-known TSPLIB library. The results of the tests show that DCS is superior to some other metaheuristics.
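
The abstract does not spell out the discrete operators used in DCS, so the snippet below only illustrates the kind of neighborhood move a discrete metaheuristic typically applies to a TSP tour; the 2-opt reversal and the tiny distance matrix are illustrative assumptions.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_move(tour):
    """Reverse a random segment of the tour (classic 2-opt neighborhood move)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

# A discrete metaheuristic repeatedly proposes such moves and keeps the ones
# that shorten the tour (possibly with a probabilistic acceptance rule).
dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
tour = [0, 1, 2, 3]
candidate = two_opt_move(tour)
if tour_length(candidate, dist) < tour_length(tour, dist):
    tour = candidate
```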

Journal ArticleDOI
TL;DR: Most of the papers in the field of supply chain network design focus on economic performance, but recently, some studies have considered environmental dimensions.

Journal ArticleDOI
TL;DR: The proposed interior search algorithm (ISA) is inspired by interior design and decoration and it only has one parameter to tune and can outperform the other well-known algorithms.
Abstract: This paper presents the interior search algorithm (ISA) as a novel method for solving optimization tasks. The proposed ISA is inspired by interior design and decoration. The algorithm is different from other metaheuristic algorithms and provides new insight for global optimization. The proposed method is verified using some benchmark mathematical and engineering problems commonly used in the area of optimization. ISA results are further compared with well-known optimization algorithms. The results show that the ISA is efficiently capable of solving optimization problems. The proposed algorithm can outperform the other well-known algorithms. Further, the proposed algorithm is very simple and it only has one parameter to tune.

Journal ArticleDOI
TL;DR: The literature on the unconstrained binary quadratic program is surveyed, providing an overview of the applications and solution methods.
Abstract: In recent years the unconstrained binary quadratic program (UBQP) has grown in importance in the field of combinatorial optimization due to its application potential and its computational challenge. Research on UBQP has generated a wide range of solution techniques for this basic model that encompasses a rich collection of problem types. In this paper we survey the literature on this important model, providing an overview of the applications and solution methods.
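
For reference, the model surveyed here can be written compactly as follows (notation assumed, with Q = (q_ij) an n-by-n coefficient matrix):

```latex
\[
  \text{UBQP:}\qquad \max_{x}\; x^{\top} Q\, x
  \;=\; \sum_{i=1}^{n}\sum_{j=1}^{n} q_{ij}\, x_i x_j,
  \qquad x \in \{0,1\}^{n}.
\]
```

Many combinatorial problems, such as max-cut, can be recast in this form, which is part of the application potential the survey refers to.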

Journal ArticleDOI
TL;DR: The proposed Unified Hybrid Genetic Search metaheuristic relies on problem-independent unified local search, genetic operators, and advanced diversity management methods and shows remarkable performance, which matches or outperforms the current state-of-the-art problem-tailored algorithms.

Journal ArticleDOI
TL;DR: This work proposes a new method for solving chance constrained optimization problems that lies between robust optimization and scenario-based methods, and imposes certain assumptions on the dependency of the constraint functions with respect to the uncertainty.
Abstract: We propose a new method for solving chance constrained optimization problems that lies between robust optimization and scenario-based methods. Our method does not require prior knowledge of the underlying probability distribution as in robust optimization methods, nor is it based entirely on randomization as in the scenario approach. It instead involves solving a robust optimization problem with bounded uncertainty, where the uncertainty bounds are randomized and are computed using the scenario approach. To guarantee that the resulting robust problem is solvable we impose certain assumptions on the dependency of the constraint functions with respect to the uncertainty and show that tractability is ensured for a wide class of systems. Our results lead immediately to guidelines under which the proposed methodology or the scenario approach is preferable in terms of providing less conservative guarantees or reducing the computational cost.
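
In symbols (notation assumed here, not taken from the paper), the method replaces a chance constraint by a robust constraint over a bounded uncertainty set whose size is calibrated from sampled scenarios:

```latex
\[
  \min_{x}\; c^{\top}x \quad \text{s.t.} \quad
  \mathbb{P}_{\delta}\!\left[g(x,\delta) \le 0\right] \ge 1-\varepsilon
  \qquad\leadsto\qquad
  \min_{x}\; c^{\top}x \quad \text{s.t.} \quad
  g(x,\delta) \le 0 \;\; \forall\, \delta \in \Delta_N ,
\]
```

where the bounded set \(\Delta_N\) is constructed from N randomly drawn scenarios so that, under the paper's assumptions on how g depends on the uncertainty, feasibility of the robust problem guarantees the original chance constraint with high confidence.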

Journal ArticleDOI
01 Oct 2014
TL;DR: Quick artificial bee colony (qABC) is a new version of ABC algorithm which models the behaviour of onlooker bees more accurately and improves the performance of standard ABC in terms of local search ability.
Abstract: The artificial bee colony (ABC) algorithm, inspired by the foraging behaviour of honey bees, is one of the most popular swarm intelligence based optimization techniques. Quick artificial bee colony (qABC) is a new version of the ABC algorithm which models the behaviour of onlooker bees more accurately and improves the performance of standard ABC in terms of local search ability. In this study, the qABC method is described and its performance is analysed depending on the neighbourhood radius, on a set of benchmark problems. Some analyses of the effect of the parameter limit and the colony size on qABC optimization are also carried out. Moreover, the performance of qABC is compared with that of state-of-the-art algorithms.

Journal ArticleDOI
01 Nov 2014
TL;DR: The present study is the first ever comprehensive review on ICA, which indicates a statistically significant increase in the amount of published research on this metaheuristic algorithm, especially research addressing discrete optimization problems.
Abstract: This is the first paper that reviews the application of the Imperialist Competitive Algorithm in different engineering disciplines. The development trend of the ICA's applications is analyzed statistically in order to show its popularity. Future research opportunities and directions are discussed to motivate future researchers. The Imperialist Competitive Algorithm (ICA), derived from the field of human social evolution, is a component of swarm intelligence theory. It was first introduced in 2007 to deal with continuous optimization problems, but recently has been extensively applied to solve discrete optimization problems. This paper reviews the underlying ideas of how ICA emerged and its application to the engineering disciplines, mainly industrial engineering. The present study is the first ever comprehensive review of ICA, which indicates a statistically significant increase in the amount of published research on this metaheuristic algorithm, especially research addressing discrete optimization problems. Future research directions and trends are also described.

Journal ArticleDOI
TL;DR: A DE algorithm is proposed that uses a new mechanism to dynamically select the best performing combinations of parameters for a problem during the course of a single run and shows better performance over the state-of-the-art algorithms.
Abstract: Over the last few decades, a number of differential evolution (DE) algorithms have been proposed with excellent performance on mathematical benchmarks. However, like any other optimization algorithm, the success of DE is highly dependent on the search operators and control parameters that are often decided a priori. The selection of the parameter values is itself a combinatorial optimization problem. Although a considerable number of investigations have been conducted with regards to parameter selection, it is known to be a tedious task. In this paper, a DE algorithm is proposed that uses a new mechanism to dynamically select the best performing combinations of parameters (amplification factor, crossover rate, and the population size) for a problem during the course of a single run. The performance of the algorithm is judged by solving three well known sets of optimization test problems (two constrained and one unconstrained). The results demonstrate that the proposed algorithm not only saves the computational time, but also shows better performance over the state-of-the-art algorithms. The proposed mechanism can easily be applied to other population-based algorithms.
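
For context, the control parameters the paper adapts (the amplification factor F, the crossover rate CR, and the population size) appear in the classic DE/rand/1/bin step sketched below; the adaptation mechanism itself is not reproduced, and the bounds and test function are assumptions.

```python
import numpy as np

def de_rand_1_bin(pop, i, F=0.8, CR=0.9, lb=-5.0, ub=5.0):
    """One DE/rand/1/bin trial vector for individual i.
    F is the amplification (scale) factor, CR the crossover rate."""
    n, dim = pop.shape
    r1, r2, r3 = np.random.choice([j for j in range(n) if j != i], 3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])            # differential mutation
    cross = np.random.rand(dim) < CR
    cross[np.random.randint(dim)] = True                  # guarantee one gene crosses
    trial = np.where(cross, mutant, pop[i])               # binomial crossover
    return np.clip(trial, lb, ub)

# Selection: keep the trial vector only if it is at least as good.
f = lambda x: float(np.sum(x ** 2))
pop = np.random.uniform(-5, 5, (20, 10))
trial = de_rand_1_bin(pop, i=0)
if f(trial) <= f(pop[0]):
    pop[0] = trial
```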

Journal ArticleDOI
TL;DR: A novel multi-strategy ensemble ABC (MEABC) algorithm, where a pool of distinct solution search strategies coexists throughout the search process and competes to produce offspring.

Journal ArticleDOI
TL;DR: To solve the single-objective constrained optimization problem, an exact solution method as well as a "math-heuristic" technique building on a MILP formulation with a heuristically generated constraint pool are proposed.

Journal ArticleDOI
TL;DR: A recently developed discrete firefly algorithm is extended to solve hybrid flowshop scheduling problems with two objectives and shows that the proposed algorithm outperforms many other metaheuristics in the literature.
Abstract: Hybrid flowshop scheduling problems include the generalization of flowshops with parallel machines in some stages. Hybrid flowshop scheduling problems are known to be NP-hard. Hence, researchers have proposed many heuristics and metaheuristic algorithms to tackle such challenging tasks. In this letter, a recently developed discrete firefly algorithm is extended to solve hybrid flowshop scheduling problems with two objectives. Makespan and mean flow time are the objective functions considered. Computational experiments are carried out to evaluate the performance of the proposed algorithm. The results show that the proposed algorithm outperforms many other metaheuristics in the literature.
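
To make the objectives concrete, the sketch below evaluates the makespan and mean flow time of a job permutation in a hybrid flowshop under a simple earliest-available-machine dispatching rule; this decoding convention and the toy data are assumptions, not the paper's exact procedure.

```python
def evaluate(permutation, proc_times, machines_per_stage):
    """Decode a job permutation into a hybrid flowshop schedule using the
    earliest-available-machine rule at each stage; return makespan and
    mean flow time. proc_times[j][s] = processing time of job j at stage s."""
    n_stages = len(machines_per_stage)
    machine_free = [[0.0] * m for m in machines_per_stage]   # per-stage machine clocks
    completion = {}
    for job in permutation:
        ready = 0.0
        for s in range(n_stages):
            m = min(range(machines_per_stage[s]), key=lambda k: machine_free[s][k])
            start = max(ready, machine_free[s][m])
            ready = start + proc_times[job][s]
            machine_free[s][m] = ready
        completion[job] = ready
    makespan = max(completion.values())
    mean_flow = sum(completion.values()) / len(completion)
    return makespan, mean_flow

# Two stages with 2 and 1 parallel machines; three jobs.
print(evaluate([0, 2, 1], {0: [3, 2], 1: [2, 4], 2: [4, 1]}, [2, 1]))
```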

Journal ArticleDOI
TL;DR: In this paper, the authors used the teaching-learning-based optimization technique to solve the optimal power flow problem with different complexities and compared the obtained results with those obtained using other techniques reported in the literature.

Journal ArticleDOI
TL;DR: An ant colony optimization (ACO) algorithm that extends the ACOR algorithm for continuous optimization to tackle mixed-variable optimization problems, and a novel procedure to generate artificial, mixed-variable benchmark functions that is used to automatically tune ACOMV's parameters.
Abstract: In this paper, we introduce ACOMV: an ant colony optimization (ACO) algorithm that extends the ACOR algorithm for continuous optimization to tackle mixed-variable optimization problems. In ACOMV, the decision variables of an optimization problem can be explicitly declared as continuous, ordinal, or categorical, which allows the algorithm to treat them adequately. ACOMV includes three solution generation mechanisms: a continuous optimization mechanism (ACOR), a continuous relaxation mechanism (ACOMV-o) for ordinal variables, and a categorical optimization mechanism (ACOMV-c) for categorical variables. Together, these mechanisms allow ACOMV to tackle mixed-variable optimization problems. We also define a novel procedure to generate artificial, mixed-variable benchmark functions, and we use it to automatically tune ACOMV's parameters. The tuned ACOMV is tested on various real-world continuous and mixed-variable engineering optimization problems. Comparisons with results from the literature demonstrate the effectiveness and robustness of ACOMV on mixed-variable optimization problems.

Journal ArticleDOI
TL;DR: This paper presents a scatter search (SS) method for this problem to optimize makespan and shows that the proposed scatter search algorithm produces better results than existing algorithms by a significant margin.

Journal Article
TL;DR: BayesOpt is a library with state-of-the-art Bayesian optimization methods to solve nonlinear optimization, stochastic bandit, and sequential experimental design problems.
Abstract: BayesOpt is a library with state-of-the-art Bayesian optimization methods to solve nonlinear optimization, stochastic bandit, and sequential experimental design problems. Bayesian optimization is characterized by being sample efficient, as it builds a posterior distribution to capture the evidence and prior knowledge of the target function. Built in standard C++, the library is extremely efficient while being portable and flexible. It includes a common interface for C, C++, Python, Matlab and Octave.
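
Rather than guess BayesOpt's exact API, the sketch below illustrates the underlying idea the abstract describes, a posterior surrogate plus an acquisition rule, using scikit-learn's Gaussian process regressor; the library choice, the lower-confidence-bound acquisition, and the toy objective are assumptions external to BayesOpt.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def bayes_opt_sketch(f, bounds=(-2.0, 2.0), n_init=4, n_iter=10):
    """Toy 1-D Bayesian optimization loop: fit a GP posterior to the
    evaluations so far, then pick the next point by a lower-confidence-bound
    acquisition (mean - kappa * std)."""
    X = np.random.uniform(*bounds, size=(n_init, 1))
    y = np.array([f(x[0]) for x in X])
    grid = np.linspace(*bounds, 200).reshape(-1, 1)
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        mu, std = gp.predict(grid, return_std=True)
        x_next = grid[np.argmin(mu - 2.0 * std)]          # exploit + explore
        X = np.vstack([X, [x_next]])
        y = np.append(y, f(x_next[0]))
    return X[np.argmin(y)], y.min()

x_best, y_best = bayes_opt_sketch(lambda x: (x - 0.7) ** 2 + 0.1 * np.sin(5 * x))
```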

Journal ArticleDOI
TL;DR: This paper begins with a brief retrospect of traditional scheduling, followed by a detailed review of metaheuristic algorithms for solving the scheduling problems by placing them in a unified framework.
Abstract: Cloud computing has become an increasingly important research topic given the strong evolution and migration of many network services to such computational environments. The problem that arises is related to the efficient management and utilization of large amounts of computing resources. This paper begins with a brief retrospect of traditional scheduling, followed by a detailed review of metaheuristic algorithms for solving scheduling problems, placing them in a unified framework. Armed with these two technologies, this paper surveys the most recent literature on metaheuristic scheduling solutions for the cloud. In addition to applications using metaheuristics, some important issues and open questions are presented for the reference of future research on scheduling for the cloud.

Journal ArticleDOI
TL;DR: This paper proposes two mathematical programming formulations which generalize the non-periodic train timetabling problem on a single line under a dynamic demand pattern and introduces a fast adaptive large neighborhood search (ALNS) metaheuristic, demonstrating the computational superiority of the ALNS compared with a truncated branch-and-cut algorithm.
Abstract: Railway planning is a complex activity which is usually decomposed into several stages, traditionally network design, line design, timetabling, rolling stock, and staffing. In this paper, we study the design and optimization of train timetables for a rail rapid transit (RRT) line adapted to a dynamic demand environment, which focuses on creating convenient timetables for passengers. The objective is to minimize the average passenger waiting time at the stations, thus focusing on passenger welfare. We first propose two mathematical programming formulations which generalize the non-periodic train timetabling problem on a single line under a dynamic demand pattern. We then analyze the properties of the problem before introducing a fast adaptive large neighborhood search (ALNS) metaheuristic in order to solve large instances of the problem within short computation times. The algorithm yields timetables that may not be regular or periodic, but are adjusted to a dynamic demand behavior. Through extensive computational experiments on artificial and real-world based instances, we demonstrate the computational superiority of our ALNS compared with a truncated branch-and-cut algorithm. The average reduction in passenger waiting times is 26%, while the computational time of our metaheuristic is less than 1% of that required by the alternative CPLEX-based algorithm. Out of 120 open instances, we obtain 84 new best known solutions and we reach the optimum on 10 out of 14 instances with known optimal solutions.
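
The paper's fast adaptive large neighborhood search follows the general ALNS template: destroy/repair operator pairs are selected by roulette wheel, and their selection weights are updated according to recent success. The skeleton below sketches that template under assumed scoring and acceptance rules; it is not the paper's specific operators or parameter settings.

```python
import math, random

def alns(initial, cost, destroy_ops, repair_ops, iters=1000, temp=100.0, cooling=0.995):
    """Generic adaptive large neighborhood search skeleton.
    destroy_ops / repair_ops are lists of functions; their roulette-wheel
    weights grow when they help find improving or accepted solutions."""
    current = best = initial
    w_d = [1.0] * len(destroy_ops)
    w_r = [1.0] * len(repair_ops)
    for _ in range(iters):
        d = random.choices(range(len(destroy_ops)), weights=w_d)[0]
        r = random.choices(range(len(repair_ops)), weights=w_r)[0]
        candidate = repair_ops[r](destroy_ops[d](current))
        delta = cost(candidate) - cost(current)
        accepted = delta < 0 or random.random() < math.exp(-delta / temp)
        if accepted:
            current = candidate
        # Reward the operator pair according to the outcome (standard ALNS scoring idea).
        score = 3.0 if cost(candidate) < cost(best) else (1.0 if accepted else 0.1)
        w_d[d] = 0.9 * w_d[d] + 0.1 * score
        w_r[r] = 0.9 * w_r[r] + 0.1 * score
        if cost(candidate) < cost(best):
            best = candidate
        temp *= cooling
    return best

# Toy usage: perturb a real number toward minimizing (x - 3)^2.
destroy = [lambda x: x + random.uniform(-1, 1)]
repair = [lambda x: x]
print(alns(0.0, lambda x: (x - 3) ** 2, destroy, repair))
```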