
Showing papers on "Simulated annealing published in 2007"


Journal ArticleDOI
TL;DR: This work presents a new iterated greedy algorithm that applies two phases iteratively, named destruction, where some jobs are eliminated from the incumbent solution, and construction, where the eliminated jobs are reinserted into the sequence using the well-known NEH construction heuristic.
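The destruction and construction phases described above can be sketched in a few lines of Python (a minimal illustration, not the authors' implementation; the data layout `p[job][machine]` and the parameter names `d`, `iters` are assumptions):

```python
import random

def makespan(perm, p):
    """Completion time of the last job on the last machine
    for a permutation flow shop with processing times p[job][machine]."""
    m = len(p[0])
    c = [0.0] * m
    for j in perm:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def neh_insert(partial, job, p):
    """Insert `job` at the position that minimizes makespan (NEH step)."""
    best = None
    for i in range(len(partial) + 1):
        cand = partial[:i] + [job] + partial[i:]
        if best is None or makespan(cand, p) < makespan(best, p):
            best = cand
    return best

def iterated_greedy(p, d=2, iters=100, seed=0):
    rng = random.Random(seed)
    seq = list(range(len(p)))
    best = seq[:]
    for _ in range(iters):
        # destruction: remove d random jobs from the incumbent
        removed = rng.sample(seq, d)
        partial = [j for j in seq if j not in removed]
        # construction: reinsert each removed job at its best position
        for j in removed:
            partial = neh_insert(partial, j, p)
        if makespan(partial, p) <= makespan(seq, p):
            seq = partial
        if makespan(seq, p) < makespan(best, p):
            best = seq[:]
    return best
```

A fuller version would also add the acceptance temperature the iterated greedy literature uses; this sketch simply accepts non-worsening sequences.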

923 citations


Journal ArticleDOI
TL;DR: This work solves a non-convex stochastic optimisation problem with the help of simulated annealing of Lévy flights with a variable stability index, applied to the search for the ground state of an unknown potential.

590 citations


Proceedings ArticleDOI
17 Jun 2007
TL;DR: An efficient implementation of the "probing" technique is discussed, which simplifies the MRF while preserving the global optimum, and a new technique which takes an arbitrary input labeling and tries to improve its energy is presented.
Abstract: Many computer vision applications rely on the efficient optimization of challenging, so-called non-submodular, binary pairwise MRFs. A promising graph cut based approach for optimizing such MRFs known as "roof duality" was recently introduced into computer vision. We study two methods which extend this approach. First, we discuss an efficient implementation of the "probing" technique introduced recently by Boros et al. (2006). It simplifies the MRF while preserving the global optimum. Our code is 400-700 times faster on some graphs than the implementation of the work of Boros et al. (2006). Second, we present a new technique which takes an arbitrary input labeling and tries to improve its energy. We give theoretical characterizations of local minima of this procedure. We applied both techniques to many applications, including image segmentation, new view synthesis, super-resolution, diagram recognition, parameter learning, texture restoration, and image deconvolution. For several applications we see that we are able to find the global minimum very efficiently, and considerably outperform the original roof duality approach. In comparison to existing techniques, such as graph cut, TRW, BP, ICM, and simulated annealing, we nearly always find a lower energy.

518 citations


Journal ArticleDOI
01 Feb 2007
TL;DR: This paper proposes an effective particle swarm optimization (PSO)-based memetic algorithm (MA) for the permutation flow shop scheduling problem (PFSSP) with the objective to minimize the maximum completion time, which is a typical non-deterministic polynomial-time (NP) hard combinatorial optimization problem.
Abstract: This paper proposes an effective particle swarm optimization (PSO)-based memetic algorithm (MA) for the permutation flow shop scheduling problem (PFSSP) with the objective to minimize the maximum completion time, which is a typical non-deterministic polynomial-time (NP) hard combinatorial optimization problem. In the proposed PSO-based MA (PSOMA), both PSO-based searching operators and some special local searching operators are designed to balance the exploration and exploitation abilities. In particular, the PSOMA applies the evolutionary searching mechanism of PSO, which is characterized by individual improvement, population cooperation, and competition to effectively perform exploration. On the other hand, the PSOMA utilizes several adaptive local searches to perform exploitation. First, to make PSO suitable for solving PFSSP, a ranked-order value rule based on random key representation is presented to convert the continuous position values of particles to job permutations. Second, to generate an initial swarm with certain quality and diversity, the famous Nawaz-Enscore-Ham (NEH) heuristic is incorporated into the initialization of population. Third, to balance the exploration and exploitation abilities, after the standard PSO-based searching operation, a new local search technique named NEH_1 insertion is probabilistically applied to some good particles selected by using a roulette wheel mechanism with a specified probability. Fourth, to enrich the searching behaviors and to avoid premature convergence, a simulated annealing (SA)-based local search with multiple different neighborhoods is designed and incorporated into the PSOMA. Meanwhile, an effective adaptive meta-Lamarckian learning strategy is employed to decide which neighborhood to be used in SA-based local search. Finally, to further enhance the exploitation ability, a pairwise-based local search is applied after the SA-based search. 
Simulation results based on benchmarks demonstrate the effectiveness of the PSOMA. Additionally, the effects of some parameters on optimization performance are also discussed.
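The ranked-order-value rule mentioned above can be illustrated very compactly (a hedged sketch of the random-key idea, not the paper's code):

```python
def ranked_order_value(position):
    """Ranked-order-value (ROV) rule: map a particle's continuous
    position vector to a job permutation via the random-key idea;
    the job with the smallest position value is scheduled first."""
    return [j for _, j in sorted((v, j) for j, v in enumerate(position))]

# e.g. position [0.8, 0.1, 0.5] maps to permutation [1, 2, 0]
```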

451 citations


Journal ArticleDOI
TL;DR: A mathematical model and heuristic approaches for flexible job shop scheduling problems (FJSP) are considered, and it is concluded that the hierarchical algorithms perform better than the integrated algorithms, and that the algorithm which uses tabu search and simulated annealing heuristics for the assignment and sequencing problems consecutively is more suitable than the other algorithms.
Abstract: Scheduling for the flexible job shop is very important in both fields of production management and combinatorial optimization. However, it is quite difficult to achieve an optimal solution to this problem for medium and actual-size instances with traditional optimization approaches owing to the high computational complexity. For solving realistic cases with more than two jobs, two types of approaches have been used: hierarchical approaches and integrated approaches. In hierarchical approaches the assignment of operations to machines and the sequencing of operations on the resources or machines are treated separately, i.e., assignment and sequencing are considered independently, whereas in integrated approaches assignment and sequencing are not differentiated. In this paper, a mathematical model and heuristic approaches for flexible job shop scheduling problems (FJSP) are considered. The mathematical model is used to achieve optimal solutions for small-size problems. Since FJSP is an NP-hard problem, two heuristic approaches, one integrated and one hierarchical, are developed to solve the real-size problems. Six different hybrid searching structures, depending on the searching approach and heuristics used, are presented in this paper. Numerical experiments are used to evaluate the performance of the developed algorithms. It is concluded that the hierarchical algorithms perform better than the integrated algorithms, and that the algorithm which uses tabu search and simulated annealing heuristics for the assignment and sequencing problems consecutively is more suitable than the other algorithms. The numerical experiments also validate the quality of the proposed algorithms.

318 citations


Journal ArticleDOI
TL;DR: The various meta-heuristics that have been specifically developed to solve lot sizing problems are reviewed, discussing their main components such as representation, evaluation, neighborhood definition and genetic operators.

258 citations


Journal ArticleDOI
TL;DR: A general geometry-based algorithm is developed that efficiently samples conformational space under constraints imposed by low-resolution density maps obtained from electron microscopy or X-ray crystallography experiments and is robust even for noise-added density maps.

237 citations


Journal ArticleDOI
TL;DR: A unified representation model and a simulated annealing-based approach have been developed to facilitate the integration and optimization process to achieve the global optimization of product development and manufacturing.
Abstract: A job shop needs to deal with a lot of make-to-order business, in which the orders are usually diverse in types but each one is small in volume. To increase the flexibility and responsiveness of the job shop in the more competitive market, process planning and scheduling modules have been actively developed and deployed. The functions of the two modules are usually complementary. It is ideal to integrate them more tightly to achieve the global optimization of product development and manufacturing. In this paper, a unified representation model and a simulated annealing-based approach have been developed to facilitate the integration and optimization process. In the approach, three strategies, including processing flexibility, operation sequencing flexibility and scheduling flexibility, have been used for exploring the search space to support the optimization process effectively. Performance criteria, such as makespan, the balanced level of machine utilization, job tardiness and manufacturing cost, have been systematically defined to make the algorithm adaptive to meet various practical requirements. Case studies under various working conditions and the comparisons of this approach with two modern evolutionary approaches are given. The merits and characteristics of the approach are thereby highlighted.
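The simulated annealing skeleton underlying approaches like this one can be written generically (a textbook-style sketch, not the paper's algorithm; the cooling schedule and parameter values are illustrative assumptions):

```python
import math
import random

def simulated_annealing(init, cost, neighbour, t0=100.0, alpha=0.95,
                        steps_per_t=50, t_min=1e-3, seed=0):
    """Generic SA loop: at each temperature level, sample neighbours and
    accept worse solutions with probability exp(-delta / T); the best
    solution seen anywhere during the search is returned."""
    rng = random.Random(seed)
    cur, cur_c = init, cost(init)
    best, best_c = cur, cur_c
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            cand = neighbour(cur, rng)
            delta = cost(cand) - cur_c
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                cur, cur_c = cand, cur_c + delta
                if cur_c < best_c:
                    best, best_c = cur, cur_c
        t *= alpha  # geometric cooling
    return best, best_c
```

In the paper's setting, `neighbour` would apply the processing, sequencing and scheduling flexibility moves, and `cost` would evaluate criteria such as makespan or tardiness.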

203 citations


Journal ArticleDOI
TL;DR: This paper attempts to hybridize this concept with other meta-heuristic concepts such as genetic algorithm, simulated annealing, and tabu search to solve the network design problem in transportation.

182 citations


Journal ArticleDOI
TL;DR: It is described how the heuristic optimization technique simulated annealing (SA) can be effectively used for estimating the parameters of S-systems from time-course biochemical data.
Abstract: Motivation: High-throughput technologies now allow the acquisition of biological data, such as comprehensive biochemical time-courses at unprecedented rates. These temporal profiles carry topological and kinetic information regarding the biochemical network from which they were drawn. Retrieving this information will require systematic application of both experimental and computational methods. Results: S-systems are non-linear mathematical approximative models based on the power-law formalism. They provide a general framework for the simulation of integrated biological systems exhibiting complex dynamics, such as genetic circuits, signal transduction and metabolic networks. We describe how the heuristic optimization technique simulated annealing (SA) can be effectively used for estimating the parameters of S-systems from time-course biochemical data. We demonstrate our methods using three artificial networks designed to simulate different network topologies and behavior. We then end with an application to a real biochemical network by creating a working model for the cadBA system in Escherichia coli. Availability: The source code written in C++ is available at http://www.engg.upd.edu.ph/~naval/bioinformcode.html. All the necessary programs including the required compiler are described in a document archived with the source code. Contact: gonzalez@bio.ifi.lmu.de Supplementary information: Supplementary material is available at Bioinformatics online.

173 citations


Journal ArticleDOI
TL;DR: A mathematical model is provided to formulate the problem and a simulated annealing algorithm is developed to solve the proposed model and Numerical experiments are conducted to test the performance of the proposed SA algorithm.

Journal Article
TL;DR: The experimental results reveal that the SVM model with simulated annealing algorithms (SVMSA) results in better predictions than the other methods, and the proposed model is a valid and promising alternative for forecasting software reliability.
Abstract: Support vector machines (SVMs) have been successfully employed to solve non-linear regression and time series problems. However, SVMs have rarely been applied to forecasting software reliability. This investigation elucidates the feasibility of the use of SVMs to forecast software reliability. Simulated annealing algorithms (SA) are used to select the parameters of an SVM model. Numerical examples taken from the existing literature are used to demonstrate the performance of software reliability forecasting. The experimental results reveal that the SVM model with simulated annealing algorithms (SVMSA) results in better predictions than the other methods. Hence, the proposed model is a valid and promising alternative for forecasting software reliability.

Journal ArticleDOI
TL;DR: A new algorithm is established that efficient, greedy, one‐test‐at‐a‐time methods can indeed produce a logarithmic worst‐case guarantee on the test suite size and can be done while still producing test suites that are of competitive size, and in a time that is comparable to the published methods.
Abstract: SUMMARY There are many published algorithms for generating interaction test suites for software testing, exemplified by AETG, IPO, TCG, TConfig, simulated annealing and other heuristic search, and combinatorial design techniques. Among these, greedy one-test-at-a-time methods (such as AETG and TCG) have proven to be a reasonable compromise between the needs for small test suites, fast test-suite generation, and flexibility to accommodate a variety of testing scenarios. However, such methods suffer from the lack of a worst-case logarithmic guarantee on test suite size, while methods that provide such a guarantee at present are less efficient or flexible, or do not produce test suites that are competitive in size for practical testing scenarios. In this paper, a new algorithm establishes that efficient, greedy, one-test-at-a-time methods can indeed produce a logarithmic worst-case guarantee on the test suite size. In addition, this can be done while still producing test suites that are of competitive size, and in a time that is comparable to the publishedmethods. It is deterministic, guaranteeing reproducibility. It generates only one candidate test at a time, permits users to ‘seed’ the test suite with specified tests, and allows users to specify constraints of combinations that should be avoided. Further, statistical analysis examines the impact of five variables used to tune this density algorithm for execution time and test suite size: weighting of density for factors, scaling of density, tie-breaking, use of multiple candidates, and multiple repetitions using randomization. Copyright c
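The greedy one-test-at-a-time idea can be sketched for the pairwise (2-way) case (an illustrative sketch, not the paper's density algorithm; the candidate-sampling strategy and parameter names are assumptions):

```python
from itertools import combinations
import random

def greedy_pairwise(domains, seed=0, n_cand=30):
    """Greedy one-test-at-a-time pairwise covering suite: each new test is
    seeded with a still-uncovered pair (so progress is guaranteed) and the
    remaining parameters are filled to maximize newly covered pairs."""
    rng = random.Random(seed)
    k = len(domains)
    uncovered = {((i, a), (j, b))
                 for i, j in combinations(range(k), 2)
                 for a in domains[i] for b in domains[j]}
    suite = []
    while uncovered:
        (i, a), (j, b) = next(iter(uncovered))
        best, best_gain = None, -1
        for _ in range(n_cand):
            t = [rng.choice(d) for d in domains]
            t[i], t[j] = a, b  # fix an uncovered pair into the candidate
            pairs = {((x, t[x]), (y, t[y]))
                     for x, y in combinations(range(k), 2)}
            gain = len(pairs & uncovered)
            if gain > best_gain:
                best, best_gain = tuple(t), gain
        uncovered -= {((x, best[x]), (y, best[y]))
                      for x, y in combinations(range(k), 2)}
        suite.append(best)
    return suite
```

Unlike the paper's deterministic density algorithm, this sketch uses random candidate completion; it only shows the one-test-at-a-time structure.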

Journal ArticleDOI
TL;DR: It is found that no single method can outperform all the other methods in all cases, as different methods behave differently on different types of problems.

Journal ArticleDOI
TL;DR: This is the first time the PSO technique has been used to perform global optimization of minimum structure search for chemical systems and successfully found the lowest-energy structures of the LJ26 Lennard-Jones cluster, anionic silicon hydride Si2H5−, and triply hydrated hydroxide ion OH−(H2O)3.
Abstract: Novel implementation of the evolutionary approach known as particle swarm optimization (PSO) capable of finding the global minimum of the potential energy surface of atomic assemblies is reported. This is the first time the PSO technique has been used to perform global optimization of minimum structure search for chemical systems. Significant improvements have been introduced to the original PSO algorithm to increase its efficiency and reliability and adapt it to chemical systems. The developed software has successfully found the lowest-energy structures of the LJ26 Lennard-Jones cluster, anionic silicon hydride Si2H5−, and triply hydrated hydroxide ion OH−(H2O)3. It requires relatively small population sizes and demonstrates fast convergence. Efficiency of PSO has been compared with simulated annealing, and the gradient embedded genetic algorithm.

Journal ArticleDOI
TL;DR: Four such modifications, all based on properties of the physical problem, are introduced and incorporated into a hyperheuristic-driven simulated annealing solution approach.

Journal ArticleDOI
TL;DR: A variant of simulated annealing incorporating a variable penalty method to solve the traveling-salesman problem with time windows (TSPTW) compares favorably with benchmark results in the literature, obtaining best known results for numerous instances.
Abstract: This paper describes a variant of simulated annealing incorporating a variable penalty method to solve the traveling-salesman problem with time windows (TSPTW). Augmenting temperature from traditional simulated annealing with the concept of pressure (analogous to the value of the penalty multiplier), compressed annealing relaxes the time-window constraints by integrating a penalty method within a stochastic search procedure. Computational results validate the value of a variable-penalty method versus a static-penalty approach. Compressed annealing compares favorably with benchmark results in the literature, obtaining best known results for numerous instances.
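The acceptance rule of compressed annealing can be written as a Metropolis test on a penalized objective (an illustration of the idea, not the authors' code; parameter names and the example schedule in the comment are assumptions):

```python
import math

def compressed_annealing_accept(d_cost, d_viol, temp, pressure, rng):
    """Metropolis test on the augmented objective of compressed annealing:
    augmented cost = travel cost + pressure * time-window violation.
    Over the run, temp is cooled (e.g. temp *= 0.995 per iteration) while
    pressure rises (e.g. pressure *= 1.005), so infeasible tours are
    explored freely early on and squeezed out late in the search."""
    delta = d_cost + pressure * d_viol
    return delta <= 0 or rng.random() < math.exp(-delta / temp)
```

At low pressure a move that shortens the tour but violates a time window can be accepted; at high pressure the same move is effectively rejected.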

Journal ArticleDOI
TL;DR: This first application of a metaheuristic technique to the very popular and NP-complete puzzle known as ‘sudoku’ is presented and it is seen that this stochastic search-based algorithm is able to complete logic-solvable puzzle-instances that feature daily in many of the UK's national newspapers.
Abstract: In this paper we present, to our knowledge, the first application of a metaheuristic technique to the very popular and NP-complete puzzle known as 'sudoku'. We see that this stochastic search-based algorithm, which uses simulated annealing, is able to complete logic-solvable puzzle-instances that feature daily in many of the UK's national newspapers. We also introduce a new method for producing sudoku problem instances (that are not necessarily logic-solvable) and use this together with the proposed SA algorithm to try and discover for what types of instances this algorithm is best suited. Consequently we notice the presence of an 'easy-hard-easy' style phase-transition similar to other problems encountered in operational research.
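The cost function and neighbourhood move typical of SA approaches to sudoku can be sketched as follows (an illustrative sketch consistent with the description above, not the paper's code; keeping each 3x3 box as a permutation of 1..9 is the assumed solution representation):

```python
import random

def cost(grid):
    """Count missing values across rows and columns. Each 3x3 box is kept
    as a permutation of 1..9 by construction, so only rows and columns
    can contain duplicates; a solved grid has cost 0."""
    c = 0
    for i in range(9):
        c += 9 - len(set(grid[i]))                       # row duplicates
        c += 9 - len(set(grid[r][i] for r in range(9)))  # column duplicates
    return c

def neighbour(grid, fixed, rng):
    """SA move: swap two non-fixed cells inside one random 3x3 box,
    preserving the box-permutation property."""
    g = [row[:] for row in grid]
    while True:
        b = rng.randrange(9)
        cells = [(3 * (b // 3) + r, 3 * (b % 3) + c)
                 for r in range(3) for c in range(3)]
        cells = [rc for rc in cells if rc not in fixed]
        if len(cells) >= 2:
            (r1, c1), (r2, c2) = rng.sample(cells, 2)
            g[r1][c1], g[r2][c2] = g[r2][c2], g[r1][c1]
            return g
```

`fixed` is the set of (row, col) positions given by the puzzle; the SA loop itself would accept or reject each swap by the usual Metropolis rule.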

Journal ArticleDOI
TL;DR: By applying the derived upper bound for the number of hubs the proposed heuristic is capable of obtaining optimal solutions for all small-scaled problems very efficiently and outperforms a genetic algorithm and a simulated annealing method in solving USAHLP.
Abstract: The uncapacitated single allocation hub location problem (USAHLP), with the hub-and-spoke network structure, is a decision problem in regard to the number of hubs and location–allocation. In a pure hub-and-spoke network, all hubs, which act as switching points for internodal flows, are interconnected and none of the non-hubs (i.e., spokes) are directly connected. The key factors for designing a successful hub-and-spoke network are to determine the optimal number of hubs, to properly locate hubs, and to allocate the non-hubs to the hubs. In this paper two approaches to determine the upper bound for the number of hubs along with a hybrid heuristic based on the simulated annealing method, tabu list, and improvement procedures are proposed to resolve the USAHLP. Computational experiences indicate that by applying the derived upper bound for the number of hubs the proposed heuristic is capable of obtaining optimal solutions for all small-scaled problems very efficiently. Computational results also demonstrate that the proposed hybrid heuristic outperforms a genetic algorithm and a simulated annealing method in solving USAHLP.

Journal ArticleDOI
TL;DR: Results obtained are promising and show that the hybrid approaches are less sensitive to the variations of technique parameters and offer an effective alternative for solving the generator maintenance scheduling problem.

Journal ArticleDOI
Yang Yu, Yu Xinjie
TL;DR: Comparisons with another genetic algorithm-based digital IIR filter design method by numerical experiments show that the suggested algorithm is effective and robust in digital IIR filter design.
Abstract: A novel algorithm for digital infinite-impulse response (IIR) filter design is proposed in this paper. The suggested algorithm is a kind of cooperative coevolutionary genetic algorithm. It considers the magnitude response and the phase response simultaneously and also tries to find the lowest filter order. The structure and the coefficients of the digital IIR filter are coded separately, and they evolve coordinately as two different species, i.e., the control species and the coefficient species. The nondominated sorting genetic algorithm-II is used for the control species to guide the algorithms toward three objectives simultaneously. The simulated annealing is used for the coefficient species to keep the diversity. These two strategies make the cooperative coevolutionary process work effectively. Comparisons with another genetic algorithm-based digital IIR filter design method by numerical experiments show that the suggested algorithm is effective and robust in digital IIR filter design.

Journal ArticleDOI
TL;DR: A multi-objective simulated annealing approach is proposed to tackle a production scheduling problem in a flexible job-shop with particular constraints: batch production; existence of two steps: production of several sub-products followed by the assembly of the final product.

Journal ArticleDOI
TL;DR: This paper considers minimizing the total weighted earliness and tardiness with a restrictive common due date in a single machine environment, which has been proved to be NP-hard.

Journal ArticleDOI
TL;DR: An effective hybrid algorithm based on particle swarm optimization (PSO) for no-wait flow shop scheduling with the criterion to minimize the maximum completion time (makespan) is proposed.
Abstract: The no-wait flow shop scheduling that requires jobs to be processed without interruption between consecutive machines is a typical NP-hard combinatorial optimization problem, and represents an important area in production scheduling. This paper proposes an effective hybrid algorithm based on particle swarm optimization (PSO) for no-wait flow shop scheduling with the criterion to minimize the maximum completion time (makespan). In the algorithm, a novel encoding scheme based on random key representation is developed, and an efficient population initialization, an effective local search based on the Nawaz-Enscore-Ham (NEH) heuristic, as well as a local search based on simulated annealing (SA) with an adaptive meta-Lamarckian learning strategy are proposed and incorporated into PSO. Simulation results based on well-known benchmarks and comparisons with some existing algorithms demonstrate the effectiveness of the proposed hybrid algorithm.

Journal ArticleDOI
TL;DR: It is shown that the problem of schedule construction on the basis of a given operation processing order can be reduced to a linear programming task.

Journal ArticleDOI
TL;DR: A new approach is presented for the detection and inference of irregularly shaped spatial clusters, using a genetic algorithm that is an order of magnitude faster and exhibits less variance compared to the simulated annealing scan, and is more flexible than the elliptic scan.

Journal ArticleDOI
TL;DR: The Multiple Server location problem is introduced and it turns out to be a very difficult combinatorial problem when the total demand is very close to the total capacity of the servers.
Abstract: In this paper, we introduce the Multiple Server location problem. A given number of servers are to be located at nodes of a network. Demand for these servers is generated at each node, and a subset of nodes need to be selected for locating one or more servers in each. There is no limit on the number of servers that can be established at each node. Each customer at a node selects the closest server (with demand divided equally when the closest distance is measured to more than one node). The objective is to minimize the sum of the travel time and the average time spent at the server, for all customers. The problem is formulated and analysed. Results using heuristic solution procedures: descent, simulated annealing, tabu search and a genetic algorithm are reported. The problem turns out to be a very difficult combinatorial problem when the total demand is very close to the total capacity of the servers.

Journal ArticleDOI
TL;DR: A knowledge-informed Pareto simulated annealing approach is developed and demonstrated to tackle specifically multi-objective spatial allocation problems that consider spatial patterns as objectives, and the results suggest that the solutions generated are more effective in approximating the set of Pareto optimal solutions than those generated by standard Pareto simulated annealing.

Journal ArticleDOI
TL;DR: Turbulence in the Particle Swarm Optimisation (TPSO) algorithm is introduced to overcome the problem of stagnation, and empirical results illustrate that the FATPSO could prevent premature convergence very effectively and clearly outperforms SPSO and GA.
Abstract: Particle Swarm Optimisation (PSO) algorithm is a stochastic search technique, which has exhibited good performance across a wide range of applications. However, very often for multimodal problems involving high dimensions, the algorithm tends to suffer from premature convergence. Analysis of the behaviour of the particle swarm model reveals that such premature convergence is mainly due to the decrease of velocity of particles in the search space that leads to a total implosion and ultimately fitness stagnation of the swarm. This paper introduces Turbulence in the Particle Swarm Optimisation (TPSO) algorithm to overcome the problem of stagnation. The algorithm uses a minimum velocity threshold to control the velocity of particles. The minimum velocity threshold of the particles is tuned adaptively by a fuzzy logic controller embedded in the TPSO algorithm, which is then called Fuzzy Adaptive TPSO (FATPSO). We evaluated the performance of FATPSO and compared it with the Standard PSO (SPSO), Genetic Algorithm (GA) and Simulated Annealing (SA). The comparison was performed on a suite of 10 widely used benchmark problems for 30 and 100 dimensions. Empirical results illustrate that the FATPSO could prevent premature convergence very effectively and it clearly outperforms SPSO and GA.
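The minimum-velocity-threshold ("turbulence") update can be sketched as follows (an illustrative sketch, not the authors' code; FATPSO adapts `v_min` with a fuzzy controller, whereas here it is a fixed input, and the coefficient values are assumptions):

```python
import random

def tpso_step(pos, vel, pbest, gbest, v_min, rng,
              w=0.7, c1=1.5, c2=1.5, v_max=4.0):
    """One velocity/position update for a single particle with a minimum
    velocity threshold: any component whose speed falls below v_min is
    re-energized (the "turbulence" move) to fight stagnation."""
    new_v, new_x = [], []
    for x, v, pb, gb in zip(pos, vel, pbest, gbest):
        v = w * v + c1 * rng.random() * (pb - x) + c2 * rng.random() * (gb - x)
        if abs(v) < v_min:
            # turbulence: replace a stagnant component with a random
            # velocity of magnitude at least v_min
            v = (v_min if v >= 0 else -v_min) * (1 + rng.random())
        v = max(-v_max, min(v_max, v))  # standard velocity clamping
        new_v.append(v)
        new_x.append(x + v)
    return new_x, new_v
```

When a particle sits on both its personal and global best (velocity collapses to zero), the threshold kicks the component back to a nonzero speed instead of letting the swarm implode.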

Journal ArticleDOI
TL;DR: A new heuristic algorithm based on two important concepts, namely the corner-occupying action and the caving degree, is recommended and proves fairly efficient for solving the rectangle packing problem.