
Showing papers on "Simulated annealing published in 2009"


Book
14 Aug 2009
TL;DR: In this article, a cost minimization model for the design of water distribution networks is presented. The model uses a recently developed harmony search optimization algorithm to select pipe diameters while satisfying all the design constraints.
Abstract: This study presents a cost minimization model for the design of water distribution networks. The model uses a recently developed harmony search optimization algorithm while satisfying all the design constraints. The harmony search algorithm mimics a jazz improvisation process in order to find better design solutions, in this case pipe diameters in a water distribution network. The model also interfaces with a popular hydraulic simulator, EPANET, to check the hydraulic constraints. If the design solution vector violates the hydraulic constraints, the amount of violation is considered in the cost function as a penalty. The model was applied to five water distribution networks, and obtained designs that were either the same or cost 0.28–10.26% less than those of competitive meta-heuristic algorithms, such as the genetic algorithm, simulated annealing and tabu search under similar or less favorable conditions. The results show that the harmony search-based model is suitable for water network design.
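For orientation, the sketch below shows the core harmony search loop with a penalized cost, roughly as the abstract describes (memory consideration, pitch adjustment, random improvisation, penalty for constraint violation). It is an illustration only, not the paper's EPANET-coupled model: the candidate diameters, the `unit_cost` model and the `hydraulic_violation` check are hypothetical placeholders.

```python
import random

# Minimal harmony search sketch for discrete pipe sizing (illustration only).
DIAMETERS = [100, 150, 200, 250, 300, 350, 400]   # candidate pipe diameters (mm), assumed
N_PIPES, HMS, HMCR, PAR, ITERS, PENALTY = 8, 20, 0.9, 0.3, 5000, 1e6

def unit_cost(d):                 # assumed cost-per-diameter model
    return 0.01 * d ** 1.5

def hydraulic_violation(sol):     # stand-in for a hydraulic (e.g. pressure) check
    return max(0.0, 2000 - sum(sol))   # toy "total capacity" constraint

def cost(sol):
    # design cost plus penalty proportional to the amount of constraint violation
    return sum(unit_cost(d) for d in sol) + PENALTY * hydraulic_violation(sol)

harmony_memory = [[random.choice(DIAMETERS) for _ in range(N_PIPES)] for _ in range(HMS)]
for _ in range(ITERS):
    new = []
    for j in range(N_PIPES):
        if random.random() < HMCR:                       # memory consideration
            d = random.choice(harmony_memory)[j]
            if random.random() < PAR:                    # pitch adjustment: neighbouring size
                i = DIAMETERS.index(d)
                d = DIAMETERS[max(0, min(len(DIAMETERS) - 1, i + random.choice([-1, 1])))]
        else:                                            # random improvisation
            d = random.choice(DIAMETERS)
        new.append(d)
    worst = max(harmony_memory, key=cost)
    if cost(new) < cost(worst):                          # replace the worst harmony
        harmony_memory[harmony_memory.index(worst)] = new

print(min(harmony_memory, key=cost))
```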

516 citations


Proceedings ArticleDOI
17 Aug 2009
TL;DR: A Virtual Network Mapping (VNM) algorithm based on subgraph isomorphism detection: it maps nodes and links during the same stage, yields better mappings, and is faster than the two-stage approach, especially for large virtual networks with high resource consumption that are hard to map.
Abstract: Assigning the resources of a virtual network to the components of a physical network, called Virtual Network Mapping, plays a central role in network virtualization. Existing approaches use classical heuristics like simulated annealing or attempt a two-stage solution by solving the node mapping in a first stage and doing the link mapping in a second stage. The contribution of this paper is a Virtual Network Mapping (VNM) algorithm based on subgraph isomorphism detection: it maps nodes and links during the same stage. Our experimental evaluations show that this method results in better mappings and is faster than the two-stage approach, especially for large virtual networks with high resource consumption which are hard to map.
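The sketch below illustrates only the subgraph-isomorphism primitive that such one-stage mapping builds on, using networkx's VF2 matcher; it is not the paper's VNM algorithm, and the node and link resource constraints it handles are omitted here. The substrate and request graphs are arbitrary examples.

```python
import networkx as nx
from networkx.algorithms import isomorphism

# Sketch of one-stage mapping via subgraph isomorphism detection (illustrative only).
substrate = nx.gnm_random_graph(20, 60, seed=1)   # physical network (example)
virtual = nx.cycle_graph(4)                       # virtual network request (example)

matcher = isomorphism.GraphMatcher(substrate, virtual)
if matcher.subgraph_is_isomorphic():
    # matcher.mapping assigns substrate nodes to virtual nodes in a single step,
    # so node and link placement are decided together rather than in two stages.
    print({v: s for s, v in matcher.mapping.items()})
else:
    print("request cannot be embedded on this substrate")
```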

502 citations


Journal ArticleDOI
TL;DR: A variety of metaheuristic search techniques are found to be applicable for non-functional testing including simulated annealing, tabu search, genetic algorithms, ant colony methods, grammatical evolution, genetic programming and swarm intelligence methods.
Abstract: Search-based software testing is the application of metaheuristic search techniques to generate software tests. The test adequacy criterion is transformed into a fitness function, and a set of solutions in the search space is evaluated with respect to the fitness function using a metaheuristic search technique. The application of metaheuristic search techniques for testing is promising because exhaustive testing is infeasible given the size and complexity of software under test. Search-based software testing has been applied across the spectrum of test case design methods; this includes white-box (structural), black-box (functional) and grey-box (combination of structural and functional) testing. In addition, metaheuristic search techniques have also been applied to test non-functional properties. The overall objective of undertaking this systematic review is to examine existing work into non-functional search-based software testing (NFSBST). We are interested in the types of non-functional testing targeted using metaheuristic search techniques, the different fitness functions used in different types of search-based non-functional testing, and the challenges in the application of these techniques. The systematic review is based on a comprehensive set of 35 articles obtained after a multi-stage selection process and published in the time span 1996-2007. The results of the review show that metaheuristic search techniques have been applied for non-functional testing of execution time, quality of service, security, usability and safety. A variety of metaheuristic search techniques are found to be applicable for non-functional testing, including simulated annealing, tabu search, genetic algorithms, ant colony methods, grammatical evolution, genetic programming (and its variants including linear genetic programming) and swarm intelligence methods. The review reports on the different fitness functions used to guide the search for each of the categories of execution time, safety, usability, quality of service and security, along with a discussion of possible challenges in the application of metaheuristic search techniques.

421 citations


Journal ArticleDOI
TL;DR: This investigation elucidates the feasibility of applying the chaotic particle swarm optimization (CPSO) algorithm to choose a suitable parameter combination for an SVR model; the resulting model outperforms the two models tuned with other algorithms, the genetic algorithm (GA) and the simulated annealing algorithm (SA).

269 citations


Journal ArticleDOI
TL;DR: A solution procedure based on steady-state genetic algorithms (ssGA) with a new encoding structure for the design of a single-source, multi-product, multi-stage SCN is presented.

244 citations


Journal ArticleDOI
TL;DR: The results reveal that simulated annealing and evolution strategies are the most powerful techniques, whereas harmony search and the simple genetic algorithm are characterized by slow convergence rates and unreliable search performance in large-scale problems.

219 citations


Journal ArticleDOI
TL;DR: A novel strategy for the control of the Particle Swarm Optimization (PSO) parameters based on the Nelder-Mead algorithm (Simplex method) is presented; consequently, the convergence of the PSOS becomes independent of the heuristic constants and its stability and confidence are enhanced.

196 citations


Journal ArticleDOI
TL;DR: A metaheuristic based on simulated annealing is applied that strikes a compromise between intensification and diversification mechanisms to augment the competitive performance of the proposed SA.
Abstract: In this communication, we apply a novel simulated annealing algorithm to hybrid flowshop scheduling problems with the objective of minimizing both total completion time and total tardiness. To narrow the gap between the theory and the practice of hybrid flowshop scheduling, we integrate two realistic and practical assumptions, sequence-dependent setup times and transportation times, into our problem. We apply a metaheuristic based on simulated annealing (SA) which strikes a compromise between intensification and diversification mechanisms to augment the competitive performance of our proposed SA. A comprehensive calibration of the different parameters and operators is carried out. We employ the Taguchi method to select the optimum parameters with the least possible number of experiments. For the purpose of performance evaluation of our proposed algorithm, we generate a benchmark against which adaptations of high-performing algorithms from the literature are compared. Moreover, we investigate the impact of increasing the number of jobs on the performance of our algorithm. The efficiency and effectiveness of our hybrid simulated annealing are inferred from all the computational results obtained in various situations.
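As a point of reference, the skeleton below shows the generic simulated annealing loop on a job permutation, with the Metropolis acceptance rule that trades off intensification (cooling) against diversification (accepting worse moves). It is a sketch under simplifying assumptions, not the paper's algorithm: the `evaluate` objective is a toy stand-in for the total completion time plus total tardiness, and no setup or transportation times are modelled.

```python
import math, random

def evaluate(perm):
    # Hypothetical toy objective; a real model would compute completion times and tardiness.
    return sum(i * j for i, j in enumerate(perm, 1))

def anneal(jobs, t0=100.0, alpha=0.98, moves_per_temp=50, t_min=1e-3):
    current, best, t = jobs[:], jobs[:], t0
    while t > t_min:
        for _ in range(moves_per_temp):
            cand = current[:]
            i, j = random.sample(range(len(cand)), 2)    # swap-neighbourhood move
            cand[i], cand[j] = cand[j], cand[i]
            delta = evaluate(cand) - evaluate(current)
            # Metropolis criterion: always accept improvements, accept
            # deteriorations with probability exp(-delta / t) for diversification.
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = cand
                if evaluate(current) < evaluate(best):
                    best = current[:]
        t *= alpha                                       # geometric cooling (intensification)
    return best

print(anneal(list(range(10, 0, -1))))
```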

177 citations


Journal ArticleDOI
TL;DR: The IPPS problem has been developed as a combinatorial optimisation model, and a modern evolutionary algorithm, i.e., the particle swarm optimisation (PSO) algorithm, has been modified and applied to solve it effectively.
Abstract: Integration of process planning and scheduling (IPPS) is an important research issue to achieve manufacturing planning optimisation. In both process planning and scheduling, vast search spaces and complex technical constraints are significant barriers to the effectiveness of the processes. In this paper, the IPPS problem has been developed as a combinatorial optimisation model, and a modern evolutionary algorithm, i.e., the particle swarm optimisation (PSO) algorithm, has been modified and applied to solve it effectively. Initial solutions are formed and encoded into particles of the PSO algorithm. The particles "fly" intelligently in the search space to achieve the best sequence according to the optimisation strategies of the PSO algorithm. Meanwhile, to explore the search space comprehensively and to avoid being trapped into local optima, several new operators have been developed to improve the particles' movements to form a modified PSO algorithm. Case studies have been conducted to verify the performance and efficiency of the modified PSO algorithm. A comparison has been made between the result of the modified PSO algorithm and the previous results generated by the genetic algorithm (GA) and the simulated annealing (SA) algorithm, respectively, and the different characteristics of the three algorithms are indicated. Case studies show that the developed PSO can generate satisfactory results in both applications.
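For context, the canonical PSO update that such work starts from is sketched below on a continuous test function. This is the baseline only: the IPPS paper uses a discrete particle encoding and bespoke movement operators that are not shown here, and the inertia/acceleration constants are arbitrary example values.

```python
import random

def sphere(x):                     # simple continuous benchmark objective
    return sum(xi * xi for xi in x)

DIM, SWARM, ITERS, W, C1, C2 = 5, 20, 200, 0.7, 1.5, 1.5
pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)[:]

for _ in range(ITERS):
    for k in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            # velocity = inertia + cognitive pull toward pbest + social pull toward gbest
            vel[k][d] = (W * vel[k][d]
                         + C1 * r1 * (pbest[k][d] - pos[k][d])
                         + C2 * r2 * (gbest[d] - pos[k][d]))
            pos[k][d] += vel[k][d]
        if sphere(pos[k]) < sphere(pbest[k]):
            pbest[k] = pos[k][:]
            if sphere(pbest[k]) < sphere(gbest):
                gbest = pbest[k][:]

print(sphere(gbest))
```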

169 citations


Journal ArticleDOI
TL;DR: Genetic Design through Local Search (GDLS), a scalable, heuristic, algorithmic method that employs an approach based on local search with multiple search paths, which results in effective, low‐complexity search of the space of genetic manipulations.
Abstract: In the past decade, computational methods have been shown to be well suited to unraveling the complex web of metabolic reactions in biological systems. Methods based on flux–balance analysis (FBA) and bi-level optimization have been used to great effect in aiding metabolic engineering. These methods predict the result of genetic manipulations and allow for the best set of manipulations to be found computationally. Bi-level FBA is, however, limited in applicability because the required computational time and resources scale poorly as the size of the metabolic system and the number of genetic manipulations increase. To overcome these limitations, we have developed Genetic Design through Local Search (GDLS), a scalable, heuristic, algorithmic method that employs an approach based on local search with multiple search paths, which results in effective, low-complexity search of the space of genetic manipulations. Thus, GDLS is able to find genetic designs with greater in silico production of desired metabolites than can feasibly be found using a globally optimal search and performs favorably in comparison with heuristic searches based on evolutionary algorithms and simulated annealing.

158 citations


Journal ArticleDOI
TL;DR: A new mathematical model and a simulated annealing algorithm are presented for the mixed-model two-sided assembly line balancing problem and the experimental results show that the proposed approach performs well.

Journal ArticleDOI
TL;DR: In this article, a hybrid of genetic algorithm and simulated annealing (GA-SA Hybrid) is proposed for generic multi-project scheduling problems with multiple resource constraints, which has better performance than GA, SA, MSA and some most popular heuristic methods.

Journal ArticleDOI
TL;DR: In this paper, the authors describe a methodology to design reinforced concrete (RC) building frames based on minimum embedded CO2 emissions and the economic cost of RC framed structures, which involves optimization by a simulated annealing (SA) algorithm applied to two objective functions, namely the embedded carbon dioxide emissions and economic cost.

Journal ArticleDOI
TL;DR: The results showed that the genetic algorithm produces a better calibrated model than parallel simulated annealing, and that the model containing all primary drivers and all interactions was the best performing calibrated model overall.

Journal ArticleDOI
TL;DR: A new iterative heuristic for the two-dimensional knapsack problem, based on the sequence pair representation proposed by Murata et al., is presented; it is able to handle problem instances where rotation is allowed.

Journal ArticleDOI
TL;DR: The results obtained from the computational study have shown that the multi-phase algorithm is a viable and effective approach and several new approaches to evaluate non-dominated solution sets are suggested.
Abstract: This paper considers the problem of sequence-dependent setup time hybrid flowshop scheduling with the objectives of minimizing the makespan and the sum of the earliness and tardiness of jobs, and presents a multi-phase method. In the initial phase, the population is decomposed into several subpopulations. In this phase we develop a random key genetic algorithm, and the goal is to obtain a good approximation of the Pareto front. In the second phase, to improve the Pareto front, the non-dominated solutions are unified into one large population. In this phase, based on the concept of local search in Pareto space, we propose a multi-objective hybrid metaheuristic. Finally, in phase 3, we propose a novel method using an e-constraint covering hybrid metaheuristic to cover the gaps between the non-dominated solutions and improve the Pareto front. Across the three phases, we consider appropriate combinations of multi-objective methods to improve the overall performance. The hybrid algorithm used in phases 2 and 3 combines elements from both simulated annealing and a variable neighborhood search. The aim of using a hybrid metaheuristic is to raise the level of generality so as to be able to apply the same solution method to several problems. Furthermore, in this study, to evaluate non-dominated solution sets, we suggest several new approaches. The non-dominated sets obtained from each phase and from the global archive sub-population genetic algorithm presented previously in the literature are compared. The results obtained from the computational study have shown that the multi-phase algorithm is a viable and effective approach.
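The helper below shows the non-dominated filtering step that any such multi-phase method relies on when maintaining a Pareto front for two minimisation objectives (e.g. makespan and earliness/tardiness). It is a generic sketch, not the paper's algorithm, and the solution points are hypothetical.

```python
# Generic non-dominated filtering for bi-objective minimisation (illustrative helper).
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (makespan, total earliness + tardiness) pairs:
solutions = [(120, 35), (115, 50), (130, 20), (115, 40), (125, 20)]
print(pareto_front(solutions))   # -> [(120, 35), (115, 40), (125, 20)]
```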

Journal ArticleDOI
TL;DR: A hybrid metaheuristic for the minimization of makespan in scheduling problems with parallel machines and sequence-dependent setup times, obtained by hybridizing ACO and SA with VNS and thereby combining the advantages of these three individual components.
Abstract: This paper proposes a hybrid metaheuristic for the minimization of makespan in scheduling problems with parallel machines and sequence-dependent setup times. The solution approach is robust, fast, and simply structured, and comprises three components: an initial population generation method based on ant colony optimization (ACO), a simulated annealing (SA) for solution evolution, and a variable neighborhood search (VNS) which involves three local search procedures to improve the population. The hybridization of ACO and SA with VNS, combining the advantages of these three individual components, is the key innovative aspect of the approach. Two hybrid VNS-based algorithms, SA/VNS and ACO/VNS, as well as a previously presented VNS algorithm, are compared with the proposed hybrid algorithm to highlight its advantages in terms of generality and quality for large instances.

Proceedings ArticleDOI
20 Apr 2009
TL;DR: An analytical model is proposed to estimate the lifetime reliability of multiprocessor platforms when executing periodical tasks, and a novel lifetime reliability-aware task allocation and scheduling algorithm based on simulated annealing technique is presented.
Abstract: With the relentless scaling of semiconductor technology, the lifetime reliability of embedded multiprocessor platforms has become one of the major concerns for the industry. If this is not taken into consideration during the task allocation and scheduling process, some processors might age much faster than the others and become the reliability bottleneck for the system, thus significantly reducing the system's service life. To tackle this problem, in this paper, we propose an analytical model to estimate the lifetime reliability of multiprocessor platforms when executing periodical tasks, and we present a novel lifetime reliability-aware task allocation and scheduling algorithm based on simulated annealing technique. In addition, to speed up the annealing process, several techniques are proposed to simplify the design space exploration process with satisfactory solution quality. Experimental results on various multiprocessor platforms and task graphs demonstrate the efficacy of the proposed approach.

Journal ArticleDOI
TL;DR: Proposing equations analogous to the classical PSO equations, a discrete PSO algorithm (DPSO) is presented to minimize the makespan (C_max) criterion; results signify that the proposed DPSO algorithm is very competitive and can be rapidly guided when hybridized with a local search heuristic.

Journal ArticleDOI
TL;DR: A hybrid algorithm of simulated annealing and tabu search is applied to solve the capacitated vehicle routing problem; the results show that the proposed algorithm is competitive with other existing algorithms for solving CVRP.
Abstract: The capacitated vehicle routing problem (CVRP) is one of the most important problems in the optimization of distribution networks. The objective of the CVRP is to determine the minimum-cost set of routes, each originating and terminating at a delivery depot, for a fleet of vehicles to serve customers with known demands. The CVRP is known to be NP-hard, so it is difficult to solve directly when the problem size is large. In this paper, a hybrid algorithm of simulated annealing and tabu search is applied to solve the CVRP, taking advantage of the strengths of both simulated annealing and tabu search. Simulation results are reported on the classical fourteen instances and on twenty large-scale benchmark instances. The proposed algorithm finds the best solutions for eight of the fourteen classical instances, and its solutions for the twenty large-scale benchmark instances also perform admirably. This shows that the proposed algorithm is competitive with other existing algorithms for solving the CVRP.
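The sketch below illustrates one simple way simulated annealing acceptance can be coupled with a tabu list on recent moves, applied to a toy single-route tour. It is a conceptual sketch under simplifying assumptions, not the paper's CVRP algorithm: vehicle capacities, multiple routes and customer demands are omitted, and the coordinates are random examples.

```python
import math, random

random.seed(0)
CITIES = [(random.random(), random.random()) for _ in range(15)]   # example coordinates

def tour_length(tour):
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour, best = list(range(len(CITIES))), list(range(len(CITIES)))
tabu, tabu_len, t = [], 10, 1.0
while t > 1e-3:
    i, j = sorted(random.sample(range(len(tour)), 2))
    if (i, j) in tabu:                     # tabu list: skip recently applied reversals
        t *= 0.999
        continue
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt style segment reversal
    delta = tour_length(cand) - tour_length(tour)
    if delta < 0 or random.random() < math.exp(-delta / t):   # SA acceptance rule
        tour = cand
        tabu.append((i, j))                # forbid undoing this move for a while
        tabu = tabu[-tabu_len:]
        if tour_length(tour) < tour_length(best):
            best = tour[:]
    t *= 0.999                             # geometric cooling
print(round(tour_length(best), 3))
```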

Journal ArticleDOI
TL;DR: It is found that among the constructive algorithms the insertion-based approach is superior to the others, whereas the proposed SA algorithms are better than TS and genetic algorithms among the iterative metaheuristic algorithms.

Journal ArticleDOI
TL;DR: In this paper, an optimization methodology for the selection of the best process parameters in electro-discharge machining is proposed, which simultaneously maximizes the material removal rate and minimizes the surface roughness using simulated annealing.

Journal ArticleDOI
TL;DR: In this paper, a hybrid evolutionary programming based clustering algorithm, called PSO-SA, was proposed by combining particle swarm optimization (PSO) and simulated annealing (SA), which increased the information exchange among particles using a mutation operator to escape local optima.
Abstract: The K-means algorithm is one of the most popular techniques in clustering. Nevertheless, the performance of the K-means algorithm depends highly on initial cluster centers and converges to local minima. This paper proposes a hybrid evolutionary programming based clustering algorithm, called PSO-SA, by combining particle swarm optimization (PSO) and simulated annealing (SA). The basic idea is to search around the global solution by SA and to increase the information exchange among particles using a mutation operator to escape local optima. Three datasets, Iris, Wisconsin Breast Cancer, and Ripley’s Glass, have been considered to show the effectiveness of the proposed clustering algorithm in providing optimal clusters. The simulation results show that the PSO-SA clustering algorithm not only has a better response but also converges more quickly than the K-means, PSO, and SA algorithms.
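The snippet below shows the clustering fitness that a PSO/SA hybrid of this kind typically minimises: a candidate solution encodes K cluster centres and is scored by the total within-cluster squared distance. It is an illustration only; the synthetic dataset and K are placeholders, not the paper's Iris, Breast Cancer or Glass experiments.

```python
import random

random.seed(1)
# Synthetic 2-D data around three assumed cluster centres.
DATA = [(random.gauss(mx, 0.3), random.gauss(my, 0.3))
        for mx, my in [(0, 0), (3, 3), (0, 4)] for _ in range(30)]
K = 3

def fitness(centres):
    # Sum of squared distances of each point to its nearest centre (to be minimised).
    return sum(min((x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centres)
               for x, y in DATA)

# A candidate "particle": K centres drawn from the data; PSO/SA would evolve this.
particle = random.sample(DATA, K)
print(round(fitness(particle), 2))
```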

Journal ArticleDOI
TL;DR: In this article, a novel simulated annealing (SA) with a new concept, called migration mechanism, and a new operator, called giant leap, was introduced to bolster the competitive performance of SA through striking a compromise between the lengths of neighborhood search structures.
Abstract: This article addresses the problem of scheduling hybrid flowshops where the setup times are sequence dependent to minimize makespan and maximum tardiness. To solve such an NP-hard problem, we introduce a novel simulated annealing (SA) with a new concept, called “Migration mechanism”, and a new operator, called “Giant leap”, to bolster the competitive performance of SA through striking a compromise between the lengths of neighborhood search structures. We hybridize the SA (HSA) with a simple local search to further equip our algorithm with a new strong tool to promote the quality of final solution of our proposed SA. We employ the Taguchi method as an optimization technique to extensively tune different parameters and operators of our algorithm. Taguchi orthogonal array analysis is specifically used to pick the best parameters for the optimum design process with the least number of experiments. We established a benchmark to draw an analogy between the performance of SA with other algorithms. Two basically different objective functions, minimization of makespan and maximum tardiness, are taken into consideration to evaluate the robustness and effectiveness of the proposed HSA. Furthermore, we explore the effects of the increase in the number of jobs on the performance of our algorithm to make sure it is effective in terms of both the acceptability of the solution quality and robustness. The excellence and strength of our HSA are concluded from all the results acquired in various circumstances.

Proceedings ArticleDOI
13 Oct 2009
TL;DR: A heuristic search-based approach for automatically optimizing inter-package connectivity (i.e., dependencies) based on Simulated Annealing technique to help maintainers improve the quality of software modularization.
Abstract: Object-oriented (OO) software is usually organized into subsystems using the concepts of package or module. Such a modular structure helps applications to evolve when facing new requirements. However, studies show that as software evolves to meet requirements and environment changes, modularization quality degrades. To help maintainers improve the quality of software modularization, we have designed and implemented a heuristic search-based approach for automatically optimizing inter-package connectivity (i.e., dependencies). In this paper, we present our approach and its underlying techniques and algorithm. We show through a case study how it enables maintainers to optimize the OO package structure of source code. Our optimization approach is based on the Simulated Annealing technique.

Journal ArticleDOI
01 Jan 2009
TL;DR: Simulated Annealing, Simulated Quenching and Real-coded Genetic Algorithms can be utilized for efficient planning of any irrigation system with suitable modifications.
Abstract: The present study deals with the application of non-traditional optimization techniques, namely Simulated Annealing (SA), Simulated Quenching (SQ) and Real-coded Genetic Algorithms (RGA), to a case study of the Mahi Bajaj Sagar Project, India. The objective of the study is to maximize the annual net benefits subject to various irrigation planning constraints for a 75% dependable flow scenario. Extensive sensitivity analysis on the various parameters used in the above techniques indicated that they yielded the same solution corresponding to a set of optimal parameter combinations. It is concluded that SA, SQ and RGA can be utilized for efficient planning of any irrigation system with suitable modifications.

Journal ArticleDOI
TL;DR: This paper illustrates how improvements in solution quality can be achieved by the hybridisation of the best-fit heuristic together with simulated annealing and the bottom-left-fill algorithm.
Abstract: The best-fit heuristic is a simple yet powerful one-pass approach for the two-dimensional rectangular stock-cutting problem. It had achieved the best published results on a wide range of benchmark problems until the development of the approaches described in this paper. Here, we illustrate how improvements in solution quality can be achieved by the hybridisation of the best-fit heuristic together with simulated annealing and the bottom-left-fill algorithm. We compare and contrast the new hybrid approach with other approaches from the literature in terms of execution times and the quality of the solutions achieved. Using a range of standard benchmark problems from the literature, we demonstrate how the new approach achieves significantly better results than previously published methods on almost all of the problem instances. In addition, we provide results on 10 new benchmark problems to encourage further research and greater comparison between current and future methods.

Journal ArticleDOI
TL;DR: The continuous network design problem has been studied using SA and GA on a simulated network and it is found that when demand is large, SA is more efficient than GA in solving CNDP, and much more computational effort is needed for GA to achieve the same optimal solution as SA.
Abstract: In general, a continuous network design problem (CNDP) is formulated as a bilevel program. The objective function at the upper level is defined as the total travel time on the network plus the total investment cost of link capacity expansions. The lower level problem is formulated as a certain traffic assignment model. It is well known that such a bilevel program is nonconvex, and algorithms capable of finding globally optimal solutions are preferable for solving it. Simulated annealing (SA) and genetic algorithms (GA) are two such global methods and can therefore be used to determine the optimal solution of the CNDP. Since applying SA and GA to continuous network design on a real transportation network requires solving the traffic assignment model many times at each iteration of the algorithm, the computation time needed is tremendous. It is therefore important to compare the efficacy of the two methods and choose the more efficient one as a reference method in practice. In this paper, the continuous network design problem has been studied using SA and GA on a simulated network. The lower level program is formulated as a user equilibrium traffic assignment model and the Frank-Wolfe method is used to solve it. It is found that when demand is large, SA is more efficient than GA in solving the CNDP, and much more computational effort is needed for GA to achieve the same optimal solution as SA. However, when demand is light, GA can reach a better solution at the expense of more computation time. It is also found that increasing the number of iterations at each temperature in SA does not necessarily improve the solution. The finding in this example differs from that of [Karoonsoontawong, A., & Waller, S. T. (2006). Dynamic continuous network design problem - Linear bilevel programming and metaheuristic approaches. Transportation Research Record (1964), 104-117, Network Modeling 2006.]. The reason might be that the bi-level model in this example is nonlinear while the bi-level model in their study is linear.
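Conceptually, every SA or GA iteration in such a bilevel setup evaluates the upper-level objective for a candidate vector of capacity expansions by first solving the lower-level assignment problem, which is what makes the computation so expensive. The sketch below shows that evaluation shape only; `solve_user_equilibrium` is a hypothetical stand-in for the Frank-Wolfe traffic assignment step, and all link data and cost coefficients are invented examples.

```python
def solve_user_equilibrium(capacities):
    # Stand-in: pretend flows fill a fixed share of each capacity and travel time
    # falls with capacity; a real model would run Frank-Wolfe to equilibrium here.
    flows = {link: 0.8 * cap for link, cap in capacities.items()}
    times = {link: 10.0 / (1.0 + cap / 100.0) for link, cap in capacities.items()}
    return flows, times

def upper_level_cost(expansions, base_capacity=100.0, unit_invest_cost=2.0):
    capacities = {link: base_capacity + dx for link, dx in expansions.items()}
    flows, times = solve_user_equilibrium(capacities)       # lower-level problem
    total_travel_time = sum(flows[l] * times[l] for l in capacities)
    investment = unit_invest_cost * sum(expansions.values())
    return total_travel_time + investment                   # upper-level objective

# SA or GA would repeatedly call this for candidate expansion vectors:
print(round(upper_level_cost({"a": 20.0, "b": 0.0, "c": 50.0}), 2))
```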

Journal ArticleDOI
TL;DR: In this paper, a unified representation model for integrated process planning and scheduling (IPPS) has been developed; based on this model, a modern evolutionary algorithm, i.e., the particle swarm optimisation (PSO) algorithm, has been employed to optimise the IPPS problem.
Abstract: Traditionally, process planning and scheduling are two independent essential functions in a job shop manufacturing environment. In this paper, a unified representation model for integrated process planning and scheduling (IPPS) has been developed. Based on this model, a modern evolutionary algorithm, i.e., the particle swarm optimisation (PSO) algorithm, has been employed to optimise the IPPS problem. To explore the search space comprehensively, and to avoid being trapped into local optima, the PSO algorithm has been enhanced with new operators to improve its performance, and different criteria, such as makespan, total job tardiness and balanced level of machine utilisation, have been used to evaluate the job performance. To improve flexibility and agility, a re-planning method has been developed to address the conditions of machine breakdown and new order arrival. Case studies have been used to verify the performance and efficiency of the modified PSO algorithm under different criteria. A comparison has been made between the result of the modified PSO algorithm and those of the genetic algorithm (GA) and the simulated annealing (SA) algorithm, respectively, and the different characteristics of the three algorithms are indicated. Case studies show that the developed PSO can generate satisfactory results in optimising the IPPS problem.

Journal ArticleDOI
TL;DR: In this article, the authors present a near-optimal reduction from approximately counting the cardinality of a discrete set to approximately sampling elements of the set, which can be used to approximate the partition function Z of the Ising model, matchings or colorings of a graph.
Abstract: We present a near-optimal reduction from approximately counting the cardinality of a discrete set to approximately sampling elements of the set. An important application of our work is to approximating the partition function Z of a discrete system, such as the Ising model, matchings or colorings of a graph. The typical approach to estimating the partition function Z(β_a) at some desired inverse temperature β_a is to define a sequence, which we call a cooling schedule, β_0 = 0
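For orientation, the standard identity that cooling schedules of this kind exploit, a well-known fact about partition functions stated here independently of the abstract above, is:

```latex
Z(\beta_a) \;=\; Z(0)\,\prod_{i=0}^{\ell-1}\frac{Z(\beta_{i+1})}{Z(\beta_i)},
\qquad 0=\beta_0<\beta_1<\cdots<\beta_\ell=\beta_a ,
```

where Z(0) is easy to compute and each ratio Z(β_{i+1})/Z(β_i) is estimated by sampling from the Gibbs distribution at inverse temperature β_i; the length and spacing of the schedule determine the overall cost of the simulated annealing estimator.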