
Showing papers on "Simulated annealing published in 2017"


Journal ArticleDOI
TL;DR: The experimental results confirm the efficiency of the proposed approaches in improving classification accuracy compared to other wrapper-based algorithms, which demonstrates the ability of the WOA algorithm to search the feature space and select the most informative attributes for classification tasks.

853 citations


Journal ArticleDOI
TL;DR: The main application is a nearly optimal lower bound on the complexity of any statistical query algorithm for detecting planted bipartite clique distributions when the planted clique has size O(n^(1/2 − δ)) for any constant δ > 0.
Abstract: We introduce a framework for proving lower bounds on computational problems over distributions against algorithms that can be implemented using access to a statistical query oracle. For such algorithms, access to the input distribution is limited to obtaining an estimate of the expectation of any given function on a sample drawn randomly from the input distribution rather than directly accessing samples. Most natural algorithms of interest in theory and in practice, for example, moments-based methods, local search, standard iterative methods for convex optimization, MCMC, and simulated annealing, can be implemented in this framework. Our framework is based on, and generalizes, the statistical query model in learning theory [Kearns 1998]. Our main application is a nearly optimal lower bound on the complexity of any statistical query algorithm for detecting planted bipartite clique distributions (or planted dense subgraph distributions) when the planted clique has size O(n^(1/2 − δ)) for any constant δ > 0. The assumed hardness of variants of these problems has been used to prove hardness of several other problems and as a guarantee for security in cryptographic applications. Our lower bounds provide concrete evidence of hardness, thus supporting these assumptions.
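As a toy illustration of the statistical query model described in the abstract, an algorithm can be restricted to expectation estimates computed over samples, never touching individual samples directly (all names below are illustrative, not from the paper):

```python
import random

def sq_oracle(dist_sampler, f, n_samples=10_000):
    """Estimate E[f(x)] for x ~ D using only a sample average, as in the
    statistical query model: the caller sees the estimate, not the samples."""
    return sum(f(dist_sampler()) for _ in range(n_samples)) / n_samples

# Example: estimate the second moment of a fair coin over {-1, +1}.
coin = lambda: random.choice([-1, 1])
second_moment = sq_oracle(coin, lambda x: x * x)
print(second_moment)  # 1.0, since x*x == 1 for both outcomes
```

Real statistical query algorithms additionally specify a tolerance for each query; here the tolerance is controlled implicitly by the sample size.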

178 citations


Journal ArticleDOI
TL;DR: The experiments have shown that the LAHC approach is simple, easy to implement, and yet an effective search procedure, and that it has an additional advantage over cooling-schedule-based methods in its scale independence.
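The Late Acceptance Hill Climbing idea summarized above can be sketched as follows (a hedged illustration, not the authors' code): the method keeps a short history of past costs and accepts a candidate that beats either the current cost or the cost recorded a fixed number of iterations ago, so its single parameter is the history length.

```python
import random

def lahc(init, neighbor, cost, history_len=50, iters=20_000):
    """Late Acceptance Hill Climbing: accept a candidate if it is no worse
    than the current solution OR no worse than the solution cost recorded
    `history_len` iterations ago. No cooling schedule is needed."""
    cur, cur_cost = init, cost(init)
    history = [cur_cost] * history_len
    best, best_cost = cur, cur_cost
    for i in range(iters):
        cand = neighbor(cur)
        cand_cost = cost(cand)
        v = i % history_len  # circular index into the cost history
        if cand_cost <= cur_cost or cand_cost <= history[v]:
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        history[v] = cur_cost  # record the current cost after the decision
    return best, best_cost

# Toy usage: minimize f(x) = x^2 over the integers via +/-1 moves.
random.seed(0)
x, fx = lahc(100, lambda x: x + random.choice([-1, 1]), lambda x: x * x)
print(fx)  # 0 on this trivial landscape
```

The scale independence noted in the TL;DR comes from the acceptance rule comparing costs only to other costs, never to a temperature parameter.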

177 citations


Journal ArticleDOI
01 Apr 2017
TL;DR: A simulated annealing (SA) heuristic is proposed to solve the hybrid vehicle routing problem (HVRP), an extension of the Green Vehicle Routing Problem (G-VRP); results show that the proposed SA effectively solves the HVRP.
Abstract: This study proposes the Hybrid Vehicle Routing Problem (HVRP), which is an extension of the Green Vehicle Routing Problem (G-VRP). We focus on vehicles that use a hybrid power source, known as Plug-in Hybrid Electric Vehicles (PHEVs), and develop a mathematical model to minimize the total travel cost of driving PHEVs. Moreover, the model considers the utilization of electric and fuel power depending on the availability of electric charging or fuel stations. We develop simulated annealing with a restart strategy (SA_RS) to solve this problem, in two versions. The first version determines the acceptance probability of a worse solution using the Boltzmann function, denoted SA_RSBF. The second version employs the Cauchy function for the same purpose, denoted SA_RSCF. The proposed SA algorithm is first verified on benchmark data for the capacitated vehicle routing problem (CVRP), where it performs well, confirming its efficiency. Further analysis shows that SA_RSCF is preferable to SA_RSBF and that SA with a restart strategy performs better than SA without one. We then apply SA_RSCF to the HVRP. The numerical experiments show that vehicle type and the number of electric charging stations have an impact on the total travel cost, as confirmed by a sensitivity analysis.
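The Boltzmann and Cauchy acceptance rules contrasted in the abstract can be sketched as follows (a common textbook form of each; the paper's exact expressions may differ):

```python
import math

def accept_boltzmann(delta, temp):
    """SA_RSBF-style rule: classical Boltzmann acceptance probability for a
    worse solution with cost increase `delta` at temperature `temp`."""
    return math.exp(-delta / temp)

def accept_cauchy(delta, temp):
    """SA_RSCF-style rule: a Cauchy-shaped acceptance probability, whose
    heavier tail accepts large uphill moves more readily at the same temp."""
    return 1.0 / (1.0 + (delta / temp) ** 2)

# At equal temperature, the Cauchy rule is more willing to accept a large
# uphill move, which helps escape deep local optima.
print(accept_boltzmann(5.0, 1.0))  # ~0.0067
print(accept_cauchy(5.0, 1.0))     # ~0.0385
```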

158 citations


Proceedings ArticleDOI
02 May 2017
TL;DR: This paper presents a novel method for generating datasets that are identical over a number of statistical properties yet produce dissimilar graphs, and it allows control over the graphical appearance of the resulting output.
Abstract: Datasets which are identical over a number of statistical properties, yet produce dissimilar graphs, are frequently used to illustrate the importance of graphical representations when exploring data. This paper presents a novel method for generating such datasets, along with several examples. Our technique varies from previous approaches in that new datasets are iteratively generated from a seed dataset through random perturbations of individual data points, and can be directed towards a desired outcome through a simulated annealing optimization strategy. Our method has the benefit of being agnostic to the particular statistical properties that are to remain constant between the datasets, and allows for control over the graphical appearance of resulting output.
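The perturb-and-check core of such a generator can be sketched as follows (a simplified illustration: the directed simulated annealing toward a desired graphical shape is omitted, and all names are assumptions, not the paper's code):

```python
import random
import statistics

def perturb_preserving_stats(data, stat_fns, tol=1e-2, steps=5000, scale=0.1):
    """Iteratively jitter individual points, keeping only perturbations that
    leave every statistic in `stat_fns` within `tol` of its original value.
    The full method would also bias accepted moves toward a target shape."""
    targets = [f(data) for f in stat_fns]
    out = list(data)
    for _ in range(steps):
        i = random.randrange(len(out))
        old = out[i]
        out[i] = old + random.gauss(0, scale)  # perturb one data point
        if any(abs(f(out) - t) > tol for f, t in zip(stat_fns, targets)):
            out[i] = old  # revert: a preserved statistic drifted too far
    return out

random.seed(1)
data = [float(x) for x in range(20)]
new = perturb_preserving_stats(data, [statistics.mean, statistics.stdev])
# The shape of `new` differs from `data`, but mean and stdev are preserved
# to within the tolerance by construction.
assert abs(statistics.mean(new) - statistics.mean(data)) <= 1e-2
```

This is the sense in which the method is agnostic to the preserved statistics: any list of statistic functions can be passed in unchanged.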

144 citations


Journal ArticleDOI
TL;DR: The empirical analysis shows that, in terms of both final solution quality and convergence rate, the new algorithm in some cases produced solutions superior to the best known TSP benchmark results.
Abstract: The Symbiotic Organisms Search (SOS) algorithm is an effective new metaheuristic search algorithm that has recently seen wider application in solving complex optimization problems. SOS mimics the symbiotic relationship strategies adopted by organisms in an ecosystem for survival. This paper presents a study on the application of SOS with Simulated Annealing (SA) to the well-known traveling salesman problem (TSP). The TSP is NP-hard, with a solution space of (n − 1)!/2 feasible tours. The intent of the proposed hybrid method is to evaluate the convergence behaviour and scalability of SOS with simulated annealing on both small- and large-scale TSP instances. The SA-based SOS (SOS-SA) algorithm was implemented in the MATLAB environment. To inspect the performance of the proposed hybrid optimization method, experiments were conducted on solution convergence, average execution time, and the percentage deviations of both the best and average solutions from the best known solution. Similarly, to obtain unbiased and comprehensive comparisons, descriptive statistics such as mean, standard deviation, minimum, maximum, and range are used to describe each of the algorithms in the analysis section. Friedman's test (with post hoc tests) was further used to assess the significance of performance differences between SOS-SA and the other selected state-of-the-art algorithms. The performances of SOS-SA and SOS are evaluated on different sets of TSP benchmarks obtained from TSPLIB (a library containing sample TSP instances). The empirical analysis shows that, in terms of both final solution quality and convergence rate, the new algorithm in some cases produced solutions superior to the best known TSP benchmark results.
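A plain simulated annealing baseline for the TSP with 2-opt moves, of the kind hybridized with SOS in the paper, can be sketched as follows (illustrative only; the SOS component and the paper's parameter choices are not reproduced):

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour, including the wrap-around edge."""
    return sum(dist[tour[i - 1]][tour[i]] for i in range(len(tour)))

def sa_tsp(dist, temp=10.0, cooling=0.999, iters=20_000):
    """Simulated annealing for the TSP using random 2-opt segment reversals."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    cost = tour_length(tour, dist)
    best, best_cost = tour[:], cost
    for _ in range(iters):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt move
        cand_cost = tour_length(cand, dist)
        # Accept improvements always; accept worse tours with Boltzmann prob.
        if cand_cost < cost or random.random() < math.exp((cost - cand_cost) / temp):
            tour, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = tour[:], cost
        temp *= cooling
    return best, best_cost

# Toy instance: 6 points on a unit circle; the optimal tour is the hexagon.
pts = [(math.cos(2 * math.pi * k / 6), math.sin(2 * math.pi * k / 6))
       for k in range(6)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
random.seed(2)
_, length = sa_tsp(dist)
print(round(length, 3))  # 6.0, the hexagon's circumference
```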

142 citations


Journal ArticleDOI
01 Jul 2017
TL;DR: A binary version of the Black Hole Algorithm, called BBHA, is proposed for solving the feature selection problem in biological data; the experiments demonstrate that Random Forest is the best decision tree algorithm and that the proposed BBHA wrapper-based feature selection approach outperforms the other algorithms.
Abstract: Biological data often contain redundant and irrelevant features. These features can mislead modeling algorithms and cause overfitting. Without a feature selection method, it is difficult for existing models to accurately capture the patterns in the data. The aim of feature selection is to choose a small number of relevant or significant features to enhance classification performance. Existing feature selection methods suffer from problems such as becoming stuck in local optima and being computationally expensive; to solve these problems, an efficient global search technique is needed. The Black Hole Algorithm (BHA) is an efficient new global search technique, inspired by the behavior of black holes, that has been applied to several optimization problems. However, the potential of BHA for feature selection has not yet been investigated. This paper proposes a binary version of the Black Hole Algorithm, called BBHA, for solving the feature selection problem in biological data. The BBHA extends the existing BHA through an appropriate binarization.
Moreover, the performances of six well-known decision tree classifiers (Random Forest (RF), Bagging, C5.0, C4.5, Boosted C5.0, and CART) are compared in this study to employ the best one as the evaluator of the proposed algorithm. The performance of the proposed algorithm is tested on eight publicly available biological datasets and compared with Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Simulated Annealing (SA), and Correlation-based Feature Selection (CFS) in terms of accuracy, sensitivity, specificity, Matthews Correlation Coefficient (MCC), and Area Under the receiver operating characteristic (ROC) Curve (AUC). To verify the applicability and generality of the BBHA, it was also integrated with a Naive Bayes (NB) classifier and applied to further datasets in the text and image domains. The experimental results confirm that RF performs better than the other decision tree algorithms and that the proposed BBHA wrapper-based feature selection method is superior to BPSO, GA, SA, and CFS on all criteria. BBHA performs significantly better than BPSO and GA in terms of CPU time, the number of parameters needed to configure the model, and the number of chosen optimized features. BBHA also shows competitive or better performance than the other methods in the literature, runs much faster, needs only a single parameter to configure the model, and is simple to understand.

135 citations


Journal ArticleDOI
01 Sep 2017
TL;DR: This work tackles a real-world newspaper distribution problem with a recycling policy as an asymmetric and clustered vehicle routing problem with simultaneous pickups and deliveries, variable costs, and forbidden paths (AC-VRP-SPDVCFP), which is the first study of such a problem in the literature.
Abstract: A real-world newspaper distribution problem with a recycling policy is tackled in this work. To meet all the complex restrictions contained in such a problem, it has been modeled as a rich vehicle routing problem, which can be more specifically considered an asymmetric and clustered vehicle routing problem with simultaneous pickups and deliveries, variable costs, and forbidden paths (AC-VRP-SPDVCFP). This is the first study of such a problem in the literature. For this reason, a benchmark composed of 15 instances has also been proposed. In the design of this benchmark, real geographical positions have been used, located in the province of Bizkaia, Spain. For the proper treatment of this AC-VRP-SPDVCFP, a discrete firefly algorithm (DFA) has been developed; this is the first application of the firefly algorithm to any rich vehicle routing problem. To show that the proposed DFA is a promising technique, its performance has been compared with two other well-known techniques: an evolutionary algorithm and an evolutionary simulated annealing. Our results show that the DFA outperforms these two classic metaheuristics.

126 citations


Journal ArticleDOI
TL;DR: A novel dynamic hybrid metaheuristic algorithm, based on simulated annealing and particle swarm optimization, is developed for the formulated profit maximization problem; it guarantees that differentiated service qualities can be provided with higher overall performance and lower energy cost.
Abstract: A key factor in a win-win cloud economy is how to trade off the application performance required by customers against the profit of cloud providers. Current research on cloud resource allocation does not sufficiently address the issues of minimizing energy cost and maximizing revenue for the various applications running in virtualized cloud data centers (VCDCs). This paper presents a new approach to optimizing the profit of a VCDC based on the service-level agreements (SLAs) between service providers and customers. A precise model of the external and internal request arrival rates is proposed for virtual machines at different service classes. An analytic probabilistic model is then developed for non-steady VCDC states. In addition, a smart controller is developed for fine-grained resource provisioning and sharing among multiple applications. Furthermore, a novel dynamic hybrid metaheuristic algorithm is developed for the formulated profit maximization problem, based on simulated annealing and particle swarm optimization. The proposed algorithm guarantees that differentiated service qualities can be provided with higher overall performance and lower energy cost. The advantage of the proposed approach is validated with trace-driven simulations.

114 citations


Journal ArticleDOI
TL;DR: The proposed extension of the hill climbing method, called β-hill climbing, is a very efficient enhancement of hill climbing, providing powerful results when compared with other advanced methods on the same global optimization functions.
Abstract: The hill climbing method is an optimization technique that builds a search trajectory in the search space until reaching a local optimum. Because it accepts only uphill moves, it easily gets stuck in local optima. Several extensions of hill climbing, such as Simulated Annealing and Tabu Search, have been proposed to overcome this problem. In this paper, an extended version of the hill climbing method, called β-hill climbing, is proposed. A stochastic operator called the β-operator is utilized in hill climbing to control the balance between exploration and exploitation during the search. The proposed method has been evaluated on the IEEE CEC2005 global optimization functions. The results show that the proposed method is a very efficient enhancement of hill climbing, providing powerful results when compared with other advanced methods on the same global optimization functions.
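The β-operator idea can be sketched as follows (a minimal illustration assuming a continuous search space; the parameter names and the per-dimension reset form are assumptions, not necessarily the paper's exact formulation):

```python
import random

def beta_hill_climbing(init, cost, beta=0.05, step=0.1, iters=10_000,
                       lower=-5.0, upper=5.0):
    """beta-hill climbing sketch: after the usual neighborhood move, each
    coordinate is reset to a random value with probability `beta` (the
    beta-operator), injecting exploration into an otherwise greedy search."""
    cur = list(init)
    cur_cost = cost(cur)
    for _ in range(iters):
        cand = [x + random.uniform(-step, step) for x in cur]  # local move
        cand = [random.uniform(lower, upper) if random.random() < beta else x
                for x in cand]  # beta-operator: random reset per dimension
        cand_cost = cost(cand)
        if cand_cost <= cur_cost:  # greedy acceptance, as in hill climbing
            cur, cur_cost = cand, cand_cost
    return cur, cur_cost

# Toy usage: minimize the sphere function in 3 dimensions.
random.seed(3)
sol, val = beta_hill_climbing([4.0, -4.0, 4.0],
                              lambda v: sum(x * x for x in v))
print(val < 1e-2)  # True: the optimum at the origin is approached closely
```

With beta = 0 this reduces to plain stochastic hill climbing; larger beta trades exploitation for exploration.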

106 citations


Journal ArticleDOI
01 Nov 2017
TL;DR: The proposed hybrid PSO-SA algorithm demonstrates improved performance on these problems compared to other evolutionary methods and can be used reliably and effectively for various optimization problems.
Abstract: A novel hybrid particle swarm and simulated annealing stochastic optimization method is proposed. The proposed hybrid method uses PSO and SA in sequence, integrating the good exploration capability of PSO with the good local search properties of SA. Numerical simulation was performed to select near-optimum parameters for the method. The performance of this hybrid optimization technique was evaluated by comparing optimization results on thirty benchmark functions of different dimensions with those obtained by other numerical methods, using three criteria: stability, average trial function evaluations for successful runs, and total average trial function evaluations over both successful and failed runs. The design of laminated composite materials with required effective stiffness properties and the minimum-weight design of a three-bar truss are addressed as typical applications of the proposed algorithm to various types of optimization problems. In general, the proposed hybrid PSO-SA algorithm demonstrates improved performance on these problems compared to other evolutionary methods. The results of this research show that the proposed algorithm can be used reliably and effectively for various optimization problems.
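The PSO-then-SA sequence described above can be sketched as follows (bare-bones versions of both phases with illustrative parameters; this is not the authors' implementation):

```python
import math
import random

def pso(cost, dim, n=20, iters=200, lo=-5.0, hi=5.0):
    """Exploration phase: a bare-bones particle swarm optimizer."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_c = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_c[i])
    gbest, gbest_c = pbest[g][:], pbest_c[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_c[i]:
                pbest[i], pbest_c[i] = pos[i][:], c
                if c < gbest_c:
                    gbest, gbest_c = pos[i][:], c
    return gbest, gbest_c

def sa_refine(cost, start, temp=1.0, cooling=0.995, iters=2000, step=0.1):
    """Local search phase: simulated annealing seeded with the PSO result."""
    cur, cur_c = start[:], cost(start)
    best, best_c = cur[:], cur_c
    for _ in range(iters):
        cand = [x + random.gauss(0, step) for x in cur]
        c = cost(cand)
        if c < cur_c or random.random() < math.exp((cur_c - c) / temp):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand[:], c
        temp *= cooling
    return best, best_c

random.seed(4)
sphere = lambda v: sum(x * x for x in v)
seed_sol, _ = pso(sphere, dim=2)   # global exploration
sol, val = sa_refine(sphere, seed_sol)  # local refinement
print(val < 1e-3)  # True on this smooth test function
```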

Journal ArticleDOI
Yupeng Chen, Ying Li, Gang Wang, Yuefeng Zheng, Qian Xu, Jiahao Fan, Xueting Cui
TL;DR: Two novel BFO algorithms are proposed, named the adaptive chemotaxis bacterial foraging optimization algorithm (ACBFO) and the improved swarming and elimination-dispersal bacterial foraging optimization algorithm (ISEDBFO), which can provide important support for expert and intelligent systems.
Abstract: The bacterial foraging optimization (BFO) algorithm is a swarm intelligence method with satisfactory performance on continuous optimization problems, based on chemotaxis, swarming, reproduction, and elimination-dispersal steps. However, BFO is rarely used for the feature selection problem. In this paper, we propose two novel BFO algorithms, named the adaptive chemotaxis bacterial foraging optimization algorithm (ACBFO) and the improved swarming and elimination-dispersal bacterial foraging optimization algorithm (ISEDBFO), respectively. Two improvements are presented in ACBFO. First, to handle the discrete problem, the data structure of each bacterium is redefined to establish a mapping between the bacterium and the feature subset. Second, an adaptive method for evaluating the importance of features is designed, so that the primary features in a feature subset are preserved. ISEDBFO builds on ACBFO with two further modifications. First, to describe the cell-to-cell attraction-repulsion relationship more accurately, the swarming representation is improved by introducing the hyperbolic tangent function. Second, to retain the primary features of eliminated bacteria, a roulette technique is applied in the elimination-dispersal phase. In this study, ACBFO and ISEDBFO are tested on 10 public UCI data sets.
The performance of the proposed methods is compared with particle swarm optimization, genetic algorithm, simulated annealing, ant lion optimization, binary bat algorithm, and cuckoo search based approaches. The experimental results demonstrate that the average classification accuracy of the proposed algorithms is nearly 3 percentage points higher than that of the other tested methods. Furthermore, the improved algorithms reduce the length of the feature subset by almost 3 features in comparison to the other methods. In addition, the modified methods achieve excellent performance on the Wilcoxon signed-rank test and the sensitivity-specificity test. In conclusion, the novel BFO algorithms can provide important support for expert and intelligent systems.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed an adaptation of the imperialist competitive algorithm hybridized by a simulated annealing-based local search to solve the problem of flexible job-shop scheduling.

Journal ArticleDOI
01 Jun 2017-Energy
TL;DR: In this article, an elitist-Jaya algorithm is proposed to optimize the setup cost and operational cost of a shell-and-tube heat exchanger (STHE) simultaneously.

Journal ArticleDOI
TL;DR: In this article, the authors describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the QAA.
Abstract: I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm (QAA). Such protocols will have numerous advantages over simple quantum annealing. By using such searches the effect of problem mis-specification can be reduced, as only energy differences between the searched states will be relevant. The QAA is an analogue of simulated annealing, a classical numerical technique which has now been superseded. Hence, I explore two strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms. Specifically, I show how sequential calls to quantum annealers can be used to construct analogues of population annealing and parallel tempering which use quantum searches as subroutines. The techniques given here can be applied not only to optimization, but also to sampling. I examine the feasibility of these protocols on real devices and note that implementing such protocols should require minimal if any change to the current design of the flux qubit-based annealers by D-Wave Systems Inc. I further provide proof-of-principle numerical experiments based on quantum Monte Carlo that demonstrate simple examples of the discussed techniques.

Journal ArticleDOI
TL;DR: This work proposes an architecture in which the qubits are robustly encoded in continuous variable degrees of freedom and demonstrates the robustness of this architecture by simulating the optimal solution of a small instance of the nondeterministic polynomial-time hard (NP-hard) and fully connected number partitioning problem in the presence of dissipation.
Abstract: Quantum phenomena have the potential to speed up the solution of hard optimization problems. For example, quantum annealing, based on the quantum tunneling effect, has recently been shown to scale exponentially better with system size than classical simulated annealing. However, current realizations of quantum annealers with superconducting qubits face two major challenges. First, the connectivity between the qubits is limited, excluding many optimization problems from a direct implementation. Second, decoherence degrades the success probability of the optimization. We address both of these shortcomings and propose an architecture in which the qubits are robustly encoded in continuous variable degrees of freedom. By leveraging the phenomenon of flux quantization, all-to-all connectivity with sufficient tunability to implement many relevant optimization problems is obtained without overhead. Furthermore, we demonstrate the robustness of this architecture by simulating the optimal solution of a small instance of the nondeterministic polynomial-time hard (NP-hard) and fully connected number partitioning problem in the presence of dissipation.

Journal ArticleDOI
Jiaqiang Liu, Yong Li, Ying Zhang, Li Su, Depeng Jin
TL;DR: The formulation and proposed algorithms make no special assumptions about network topology or policy specifications; therefore, they have a broad range of applications in various types of networks, such as enterprise, data center, and broadband access networks.
Abstract: Previous works have proposed various approaches to implement service chaining by routing traffic through the desired middleboxes according to pre-defined policies. However, no matter what routing scheme is used, the performance of service chaining depends on where these middleboxes are placed. Thus, in this paper, we study the middlebox placement problem: given network information and policy specifications, we attempt to determine the optimal locations for the middleboxes so that performance is optimized. The performance metrics studied in this paper are end-to-end delay and bandwidth consumption, which cover both users' and network providers' interests. We first formulate it as a 0-1 programming problem and prove it is NP-hard. We then propose two heuristic algorithms to obtain sub-optimal solutions: the first is a greedy algorithm, and the second is based on simulated annealing. Through extensive simulations, we show that in comparison with a baseline algorithm, the proposed algorithms reduce end-to-end delay by 22 percent and bandwidth consumption by 38 percent on average. The formulation and proposed algorithms make no special assumptions about network topology or policy specifications; therefore, they have a broad range of applications in various types of networks, such as enterprise, data center, and broadband access networks.

Journal ArticleDOI
01 May 2017
TL;DR: The experimental results show that using DE in general, and the proposed MPDE algorithm in particular, is more suitable for fine-tuning NB than all other methods, including the other two metaheuristic methods (GA and SA).
Abstract: The Naive Bayes (NB) learning algorithm is simple and effective in many domains, including text classification. However, its performance depends on the accuracy of the estimated conditional probability terms, which are sometimes hard to estimate accurately, especially when training data is scarce. This work transforms the probability estimation problem into an optimization problem and exploits three metaheuristic approaches to solve it: Genetic Algorithms (GA), Simulated Annealing (SA), and Differential Evolution (DE). We also propose a novel DE algorithm that uses multi-parent mutation and crossover operations (MPDE) and three different methods to select the final solution. We create an initial population by manipulating the solution generated by a method used for fine-tuning NB, namely FTNB. We evaluate the proposed methods by using their resulting solutions to build NB classifiers and compare the results with those obtained from classical NB and the Fine-Tuning Naive Bayes (FTNB) algorithm on 53 UCI benchmark data sets. We name the resulting classifiers NBGA, NBSA, NBDE, and NB-MPDE, respectively. We also evaluate the performance of NB-MPDE for text classification using 18 text-classification data sets and compare its results with those obtained from FTNB, BNB, and MNB. The experimental results show that using DE in general, and the proposed MPDE algorithm in particular, is more suitable for fine-tuning NB than all other methods, including the other two metaheuristic methods (GA and SA).
They also indicate that NB-MPDE achieves superiority over classical NB, FTNB, NBDE, NBGA, NBSA, MNB, and BNB.
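For context, a classic DE/rand/1/bin loop, the baseline that MPDE's multi-parent mutation and crossover modify, can be sketched as follows (illustrative parameters; the NB probability-tuning objective is replaced by a toy function):

```python
import random

def differential_evolution(cost, bounds, np_=20, f=0.5, cr=0.9, gens=150):
    """Classic DE/rand/1/bin sketch: mutate with one base vector and one
    scaled difference, binomial crossover, then greedy one-to-one selection.
    (MPDE replaces these mutation/crossover steps with multi-parent ones.)"""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    costs = [cost(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = random.sample([j for j in range(np_) if j != i], 3)
            jrand = random.randrange(dim)  # guarantee at least one mutated gene
            trial = [pop[a][d] + f * (pop[b][d] - pop[c][d])
                     if (random.random() < cr or d == jrand) else pop[i][d]
                     for d in range(dim)]
            tc = cost(trial)
            if tc <= costs[i]:  # greedy one-to-one replacement
                pop[i], costs[i] = trial, tc
    best = min(range(np_), key=lambda i: costs[i])
    return pop[best], costs[best]

# Toy usage: minimize the 3-dimensional sphere function.
random.seed(5)
sol, val = differential_evolution(lambda v: sum(x * x for x in v),
                                  bounds=[(-5.0, 5.0)] * 3)
print(val < 1e-3)  # True: DE converges easily on this smooth landscape
```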

Journal ArticleDOI
TL;DR: A new, efficient and effective Shuffled Complex Evolutionary Global Optimization Algorithm with Principal Component Analysis - University of California Irvine (SP-UCI) is applied to the weight training process of a three-layer feed-forward ANN, suggesting that the SP-UCI algorithm has good potential for supporting the weight training of ANNs in real-world problems.

Posted Content
TL;DR: In this article, a new synthetic problem class that addresses the limitations of the Google inputs while retaining their strengths is provided, where more emphasis is placed on creating computational hardness through frustrated global interactions like those seen in interesting real-world inputs.
Abstract: A recent Google study [Phys. Rev. X, 6:031015 (2016)] compared a D-Wave 2X quantum processing unit (QPU) to two classical Monte Carlo algorithms: simulated annealing (SA) and quantum Monte Carlo (QMC). The study showed the D-Wave 2X to be up to 100 million times faster than the classical algorithms. The Google inputs are designed to demonstrate the value of collective multiqubit tunneling, a resource available to D-Wave QPUs but not to simulated annealing. But the computational hardness in these inputs is highly localized in gadgets, with only a small amount of complexity coming from global interactions, meaning that the relevance to real-world problems is limited. In this study we provide a new synthetic problem class that addresses the limitations of the Google inputs while retaining their strengths. We use simple clusters instead of more complex gadgets and more emphasis is placed on creating computational hardness through frustrated global interactions like those seen in interesting real-world inputs. The logical problems used to generate these inputs can be solved in polynomial time [J. Phys. A, 15:10 (1982)]. However, for general heuristic algorithms that are unaware of the planted problem class, the frustration creates meaningful difficulty in a controlled environment ideal for study. We use these inputs to evaluate the new 2000-qubit D-Wave QPU. We include the HFS algorithm---the best performer in a broader analysis of Google inputs---and we include state-of-the-art GPU implementations of SA and QMC. The D-Wave QPU solidly outperforms the software solvers: when we consider pure annealing time (computation time), the D-Wave QPU reaches ground states up to 2600 times faster than the competition. In the task of zero-temperature Boltzmann sampling from challenging multimodal inputs, the D-Wave QPU holds a similar advantage as quantum sampling bias does not seem significant.

Journal ArticleDOI
TL;DR: The proposed two-level no-split HEN synthesis hybrid method is able to produce near-optimal solutions by exploring the search space more efficiently and using simple moves for local searches.

Journal ArticleDOI
TL;DR: This paper describes the experience with four hyper-heuristic selection and acceptance mechanisms namely Exponential Monte Carlo with counter (EMCQ), Choice Function (CF), Improvement Selection Rules (ISR), and newly developed Fuzzy Inference Selection (FIS), using the t-way test generation problem as a case study.

Journal ArticleDOI
TL;DR: A comprehensive review and evaluation of heuristics and meta-heuristics for the two-sided assembly line balancing problem; computational results demonstrate that proper selection of the encoding scheme, decoding procedure, and objective function improves algorithm performance by a significant margin.

Journal ArticleDOI
TL;DR: The Wind Driven Optimization (WDO) technique is proposed as a new method for identifying the parameters of solar PV; the obtained results clearly reveal that the WDO algorithm can provide accurate optimized values with fewer iterations under different environmental conditions.

Journal Article
TL;DR: This paper presents the design of Takagi-Sugeno fuzzy controllers in state feedback form using swarm intelligence optimization algorithms using Particle Swarm Optimization, Simulated Annealing and Gravitational Search Algorithms.
Abstract: This paper presents the design of Takagi-Sugeno fuzzy controllers in state feedback form using swarm intelligence optimization algorithms. Three such algorithms are used: Particle Swarm Optimization, Simulated Annealing and Gravitational Search Algorithms. Sufficient stability conditions are expressed in terms of linear matrix inequalities considered as constraints in the optimization problem solved by swarm intelligence algorithms. Simulation results concerning an inverted pendulum system are given for illustration of the proposed design.
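The abstract gives no implementation details, so the following is only a rough sketch of the particle swarm component. The objective function, bounds, and coefficients below are illustrative assumptions; in the paper's setting, the LMI stability conditions would have to enter the objective, e.g. as penalty terms.

```python
import random

def pso(f, dim, n=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization sketch (not the paper's code)."""
    rng = random.Random(seed)
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in x]          # each particle's best position
    pbest_f = [f(p) for p in x]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # Inertia plus cognitive and social attraction terms.
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = max(lo, min(hi, x[i][d] + v[i][d]))
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(x[i]), fx
                if fx < gbest_f:
                    gbest, gbest_f = list(x[i]), fx
    return gbest, gbest_f

# Toy quadratic objective standing in for the controller-design cost.
gbest, gbest_f = pso(lambda p: sum(u * u for u in p), dim=4)
```

In the fuzzy-controller setting, each particle would encode the controller gains, and candidate gains violating the LMI constraints would be penalized or repaired.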

Journal ArticleDOI
TL;DR: A hybrid meta-heuristic method is presented that was able to provide the lowest-cost solutions reported so far for six cases well studied in the literature; it was written in C++, which is free and faster than many other languages.
Abstract: Heat Exchanger Network (HEN) synthesis is an important field of study in process engineering. However, obtaining an optimal HEN design is a complex task. When mathematically formulated, it may require sophisticated methods to achieve good solutions. The complexity increases even more for large-scale HENs. In this work, a hybrid meta-heuristic method is presented. A rather simple Simulated Annealing approach is used for the combinatorial level, while a strategy named Rocket Fireworks Optimization is developed and applied to the continuous domain. An advantage over other approaches is that the algorithm was written in C++, which is free and faster than many other languages. The developed method was able to provide the lowest-cost solutions reported so far for six cases well studied in the literature. An important feature of the approach proposed here is that, unlike other approaches, it does not split the HEN into smaller problems during the optimization.
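The paper's HEN-specific moves and cost model are not reproduced here; as a minimal, generic sketch of the kind of simulated-annealing skeleton used at the combinatorial level, the following uses geometric cooling, Metropolis acceptance, and a toy permutation objective. All parameter values and helper names are illustrative assumptions, not taken from the paper.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.995,
                        steps=20000, seed=0):
    """Generic SA skeleton: geometric cooling, Metropolis acceptance."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp((fx - fy) / t).
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha          # geometric cooling schedule
    return best, fbest

# Toy combinatorial objective: sort a permutation by minimizing inversions.
def inversions(p):
    return sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))

def swap_two(p, rng):
    q = list(p)
    i, j = rng.sample(range(len(q)), 2)
    q[i], q[j] = q[j], q[i]
    return q

start = list(range(10))
random.Random(1).shuffle(start)
best, fbest = simulated_annealing(inversions, swap_two, start)
```

In the HEN context, the state would instead encode stream matches, and the cost function would be the total annual cost returned by the continuous-domain optimizer.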

Journal ArticleDOI
TL;DR: A joint location-inventory model for the network design of a supply chain with multiple Distribution Centers (DCs) and retailers is presented in this paper, where the uncertain nature of demands and replenishment lead times is incorporated into the model using a queuing approach.

Journal ArticleDOI
TL;DR: A comparative study shows that the results obtained by GWO are either superior or competitive to those obtained by the well-known metaheuristics mentioned earlier.
Abstract: For the past two decades, nature-inspired optimization algorithms have gained enormous popularity among researchers. On the other hand, complex system reliability optimization problems, which are nonlinear programming problems in nature, are proved to be non-deterministic polynomial-time hard (NP-hard) from a computational point of view. In this work, a few complex reliability optimization problems are solved by using a very recent nature-inspired metaheuristic called the gray wolf optimizer (GWO) algorithm. GWO mimics the chasing, hunting, and hierarchical behavior of gray wolves. The results obtained by GWO are compared with those of some recent and popular metaheuristics such as the cuckoo search algorithm, particle swarm optimization, ant colony optimization, and simulated annealing. This comparative study shows that the results obtained by GWO are either superior or competitive to those obtained by the well-known metaheuristics mentioned earlier. Copyright © 2016 John Wiley & Sons, Ltd.
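To make the hierarchy-following idea concrete, here is a minimal GWO sketch on a toy objective. It is not the paper's implementation; the elitist choice of keeping the three leaders unchanged each iteration is a simplification of standard GWO, and the bounds and parameters are illustrative assumptions.

```python
import random

def gwo(f, dim, n_wolves=20, iters=300, lo=-5.0, hi=5.0, seed=0):
    """Minimal Gray Wolf Optimizer sketch: the pack moves toward the
    three best wolves found so far (alpha, beta, delta)."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1.0 - t / iters)   # control parameter decays 2 -> 0
        for i in range(3, n_wolves):  # leaders kept elitist (simplification)
            new = []
            for d in range(dim):
                est = []
                for leader in (alpha, beta, delta):
                    A = a * (2.0 * rng.random() - 1.0)  # exploration scale
                    C = 2.0 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    est.append(leader[d] - A * D)
                # Average the three leader-guided estimates, clamp to bounds.
                new.append(max(lo, min(hi, sum(est) / 3.0)))
            wolves[i] = new
    return min(wolves, key=f)

sphere = lambda x: sum(v * v for v in x)   # toy stand-in objective
best = gwo(sphere, dim=5)
```

For the reliability problems in the paper, f would instead be the (penalized) system-reliability objective over the component choices.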

Journal ArticleDOI
TL;DR: A Harmony Search-Based Remodularization Algorithm (HSBRA) is proposed to solve the software remodularization problem for object-oriented software (OOS) systems; the results show that HSBRA outperforms the SA, HC, and GA algorithms and performs better than the ABC algorithm.

Journal ArticleDOI
15 Dec 2017-Energy
TL;DR: In this paper, a hybrid algorithm combining a nonlinear regression-based breeder genetic algorithm with simulated annealing is proposed, with the objective of forecasting natural gas demand with a smaller error rate.