
Showing papers on "Metaheuristic published in 1987"


Book
01 Oct 1987
TL;DR: This book discusses the nature and organization of optimization problems, develops models and methods for optimization, and applies them to large-scale plant design and operation.
Abstract:
I. Problem Formulation
  1. The Nature and Organization of Optimization Problems
  2. Developing Models for Optimization
  3. Formulation of the Objective Function
II. Optimization Theory and Methods
  4. Basic Concepts of Optimization
  5. Optimization for Unconstrained Functions: One-Dimensional Search
  6. Unconstrained Multivariable Optimization
  7. Linear Programming and Applications
  8. Nonlinear Programming with Constraints
  9. Mixed-Integer Programming
  10. Global Optimization for Problems Containing Continuous and Discrete Variables
III. Applications of Optimization
  11. Heat Transfer and Energy Conservation
  12. Separation Processes
  13. Fluid Flow Systems
  14. Chemical Reactor Design and Operation
  15. Optimization in Large-Scale Plant Design and Operations
  16. Integrated Planning, Scheduling, and Control in the Process Industries
Appendixes

967 citations


Journal ArticleDOI
TL;DR: The application of a genetic algorithm to the steady state optimization of a serial liquid pipeline is considered and computer results show surprising speed as near-optimal results are obtained after examining a small fraction of the search space.
Abstract: The application of a genetic algorithm to the steady state optimization of a serial liquid pipeline is considered. Genetic algorithms are search procedures based upon the mechanics of natural genetics.
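
The mechanics alluded to above (selection, crossover, mutation) are easy to illustrate. Below is a minimal genetic-algorithm sketch over bit strings; the encoding, parameters, and toy fitness function are assumptions for the demonstration, not the paper's pipeline model.

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=50,
                      crossover_rate=0.9, mutation_rate=0.02):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)

    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < crossover_rate:
                cut = random.randrange(1, n_bits)          # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            children += [[1 - g if random.random() < mutation_rate else g for g in c]
                         for c in (p1, p2)]                # bit-flip mutation
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# Toy fitness (placeholder objective): number of ones in the string.
print(genetic_algorithm(sum))
```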

264 citations


Journal ArticleDOI
TL;DR: Fitting some of these techniques to continuous-variable problems has given very promising results; that question is not discussed in detail in the paper, but useful references for further study are given.
Abstract: We present a review of the main "global optimization" methods. The paper comprises an introduction and two parts. In the introduction, we recall some generalities about nonlinear unconstrained optimization and list some classifications that have been proposed for global optimization methods. In the first part, we describe various "classical" global optimization methods, most of which were available long before the appearance of Simulated Annealing (a key event in this field). Plenty of papers and books deal with these methods, studying in particular their convergence properties. The second part of the paper is devoted to more recent or atypical methods, mostly originating in combinatorial optimization. The three main methods are "metaheuristics": Simulated Annealing (and derived techniques), Tabu Search, and Genetic Algorithms; we also describe three other less known methods. For these methods, theoretical studies of convergence are less abundant in the literature, and convergence results are far more limited in practical use. However, fitting some of these techniques to continuous-variable problems has given very promising results; that question is not discussed in detail in the paper, but useful references for further study are given.
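
Of the metaheuristics named, Simulated Annealing is the most compact to convey. Here is a minimal sketch for continuous minimization, assuming a Gaussian proposal and a geometric cooling schedule; the objective and all parameters are illustrative, not taken from the review.

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, cooling=0.95, steps_per_temp=100,
                        t_min=1e-3, step=0.5):
    """Minimal SA: Gaussian moves, Metropolis acceptance, geometric cooling."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            y = x + random.gauss(0.0, step)
            fy = f(y)
            # Always accept improvements; accept uphill moves with
            # the Boltzmann probability exp(-(fy - fx) / t).
            if fy <= fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy multimodal objective (an assumption for the demo).
print(simulated_annealing(lambda x: x**2 + 10 * math.sin(3 * x), x0=5.0))
```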

249 citations


Journal ArticleDOI
TL;DR: Two new versions of the controlled random search procedure for global optimization (CRS) are described; the second is intended to drive an optimizing accelerator, based on a concurrent processing architecture, which can be attached to a workstation for a significant increase in speed.
Abstract: This paper describes two new versions of the controlled random search procedure for global optimization (CRS). Designed primarily to suit the user of a CAD workstation, these algorithms can also be used effectively in other contexts. The first, known as CRS3, speeds the final convergence of the optimization by combining a local optimization algorithm with the global search procedure. The second, called CCRS, is a concurrent version of CRS3. This algorithm is intended to drive an optimizing accelerator, based on a concurrent processing architecture, which can be attached to a workstation to achieve a significant increase in speed. Results are given of comparative trials involving both unconstrained and constrained optimization.
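
For orientation, the basic CRS step (after Price) can be sketched as below: hold a population of random points, form trial points by reflecting one sampled point through the centroid of n others, and replace the current worst point on improvement. The CRS3 local-search coupling and the concurrent CCRS variant are not reproduced here; the objective, bounds, and parameters are placeholders.

```python
import random

def crs(f, bounds, pop_size=50, iterations=2000):
    """Basic controlled random search: population of points, trial points by
    reflection through a centroid, replacement of the worst on improvement."""
    n = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    vals = [f(p) for p in pop]
    for _ in range(iterations):
        idx = random.sample(range(pop_size), n + 1)
        centroid = [sum(pop[i][j] for i in idx[:-1]) / n for j in range(n)]
        trial = [2 * centroid[j] - pop[idx[-1]][j] for j in range(n)]
        if all(lo <= trial[j] <= hi for j, (lo, hi) in enumerate(bounds)):
            ft = f(trial)
            worst = max(range(pop_size), key=vals.__getitem__)
            if ft < vals[worst]:
                pop[worst], vals[worst] = trial, ft
    best = min(range(pop_size), key=vals.__getitem__)
    return pop[best], vals[best]

# Toy 2-D objective (illustrative only).
print(crs(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [(-5, 5), (-5, 5)]))
```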

159 citations




Book ChapterDOI
Emile H. L. Aarts, Jan Korst
15 Jun 1987
TL;DR: A formal model of the Boltzmann machine is presented, and two different applications of the model are discussed, viz. solving combinatorial optimization problems and carrying out learning tasks.
Abstract: In this paper we present a formal model of the Boltzmann machine and a discussion of two different applications of the model, viz. (i) solving combinatorial optimization problems and (ii) carrying out learning tasks. Numerical results of computer simulations are presented to demonstrate the characteristic features of the Boltzmann machine.
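
The optimization use of the model can be caricatured in a few lines: binary units joined by symmetric weights are flipped stochastically at a slowly decreasing temperature so as to maximize the consensus function. The 4-unit weight matrix below is an arbitrary illustration, not an encoding from the paper.

```python
import math
import random

def boltzmann_optimize(w, t0=2.0, cooling=0.9, sweeps_per_temp=50, t_min=0.05):
    """Toy Boltzmann machine as optimizer: maximize the consensus
    sum_ij w[i][j] * s[i] * s[j] over binary unit states s."""
    n = len(w)
    s = [random.randint(0, 1) for _ in range(n)]
    t = t0
    while t > t_min:
        for _ in range(sweeps_per_temp):
            k = random.randrange(n)
            # Consensus change if unit k flips (w symmetric, zero diagonal).
            delta = (1 - 2 * s[k]) * sum(w[k][j] * s[j] for j in range(n))
            if random.random() < 1.0 / (1.0 + math.exp(-delta / t)):
                s[k] = 1 - s[k]
        t *= cooling
    return s

# Arbitrary 4-unit symmetric weights (an assumption, not from the paper).
w = [[0, 2, -1, 0], [2, 0, 0, -1], [-1, 0, 0, 2], [0, -1, 2, 0]]
print(boltzmann_optimize(w))
```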

52 citations


Journal ArticleDOI
TL;DR: It is shown that combining thermodynamic annealing with competition and selection of the fittest yields an effective simulation procedure for solving optimization problems.
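
One way to realize such a combination, sketched under assumptions since no abstract is shown: a population of Metropolis annealing walkers with a periodic selection step that clones the fittest walker over the least fit. The objective and all parameters are illustrative.

```python
import math
import random

def anneal_with_selection(f, n_walkers=10, t0=1.0, cooling=0.95, t_min=1e-2,
                          moves_per_temp=20, step=0.5):
    """Annealing plus selection: each walker does Metropolis moves at the
    current temperature; after each stage the worst walker is replaced
    by a copy of the best (survival of the fittest)."""
    xs = [random.uniform(-10.0, 10.0) for _ in range(n_walkers)]
    t = t0
    while t > t_min:
        for i in range(n_walkers):
            for _ in range(moves_per_temp):
                y = xs[i] + random.gauss(0.0, step)
                if f(y) <= f(xs[i]) or random.random() < math.exp((f(xs[i]) - f(y)) / t):
                    xs[i] = y
        xs.sort(key=f)
        xs[-1] = xs[0]  # selection step: clone the fittest over the least fit
        t *= cooling
    return min(xs, key=f)

# Toy multimodal objective (an assumption for the demo).
print(anneal_with_selection(lambda x: x * x + 10 * math.sin(3 * x)))
```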

44 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss some basic opportunities for the use of multiprocessing in the solution of optimization problems, including unconstrained optimization and global optimization, in the important case when function evaluation is expensive and gradients are evaluated by finite differences.
Abstract: This paper discusses some basic opportunities for the use of multiprocessing in the solution of optimization problems. We consider two fundamental optimization problems, unconstrained optimization and global optimization, in the important case when function evaluation is expensive and gradients are evaluated by finite differences. First we discuss some simple parallel strategies based upon the use of concurrent function evaluations to evaluate the finite difference gradient. These include the speculative evaluation of the gradient concurrently with the evaluation of the function before it is known whether the gradient value at this point will be required. We present examples that indicate the effectiveness of these parallel strategies for unconstrained optimization. We also give experimental results that show the effect of using these strategies to parallelize each of the multiple local minimizations within a recently proposed concurrent global optimization algorithm. We briefly discuss several parallel optimization strategies that are related to these approaches but make more fundamental changes to standard sequential optimization algorithms.
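
The speculative strategy is straightforward to sketch with a process pool: the n forward-difference points are evaluated concurrently alongside f(x), before it is known whether the gradient at x will be needed. A minimal sketch, with a cheap stand-in for an expensive objective:

```python
from concurrent.futures import ProcessPoolExecutor

def f(x):
    """Stand-in for an expensive objective function."""
    return sum((xi - 1.0) ** 2 for xi in x)

def speculative_value_and_gradient(x, h=1e-6, workers=8):
    """Evaluate f(x) and all n forward-difference points concurrently.
    If the algorithm later discards the gradient (e.g. the line search
    rejects x), the extra evaluations were speculative work done on
    otherwise idle processors."""
    points = [list(x)] + [
        [xj + (h if j == i else 0.0) for j, xj in enumerate(x)]
        for i in range(len(x))
    ]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        values = list(pool.map(f, points))
    fx = values[0]
    grad = [(fi - fx) / h for fi in values[1:]]
    return fx, grad

if __name__ == "__main__":
    print(speculative_value_and_gradient([0.0, 2.0, -1.0]))
```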

41 citations


01 Mar 1987
TL;DR: A hill-climbing attachment called Iterated Descent, useful in conjunction with any local search algorithm, including neural net algorithms, is proposed.
Abstract: We propose a hill-climbing attachment called Iterated Descent, useful in conjunction with any local search algorithm, including neural net algorithms.
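
A generic sketch of such an attachment: descend to a local optimum, perturb it, descend again, and keep the better of the two optima. The local search, perturbation, and acceptance rule below are plug-in assumptions, not necessarily the paper's choices.

```python
import random

def iterated_descent(cost, local_search, perturb, x0, kicks=20):
    """Wrap any local search: repeatedly kick the current local optimum
    and re-descend, keeping the better result."""
    x = local_search(x0)
    for _ in range(kicks):
        y = local_search(perturb(x))
        if cost(y) < cost(x):
            x = y
    return x

# --- Toy instance over the integers (all assumptions) ---
cost = lambda x: (x % 17) + 0.1 * abs(x - 40)

def greedy_descent(x):
    """Local search: greedy descent over the +/-1 neighborhood."""
    while True:
        nxt = min((x - 1, x, x + 1), key=cost)
        if nxt == x:
            return x
        x = nxt

perturb = lambda x: x + random.randint(-10, 10)

print(iterated_descent(cost, greedy_descent, perturb, x0=0))
```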

32 citations


Journal ArticleDOI
TL;DR: The neighborhood structures of two classes of problems, 0–1 integer programming and the mean tardiness job sequencing problem, are examined from the viewpoint of state-space graphs in artificial intelligence.
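
The two neighborhood structures can be made concrete, as in the sketch below: for 0–1 integer programs, the single bit-flip neighborhood; for job sequences, for instance, the adjacent-transposition neighborhood. These particular moves are common textbook choices, assumed rather than taken from the paper.

```python
def bit_flip_neighbors(x):
    """Neighbors of a 0-1 vector: every solution one bit-flip away."""
    return [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(len(x))]

def adjacent_swap_neighbors(seq):
    """Neighbors of a job sequence: every adjacent transposition."""
    return [seq[:i] + (seq[i + 1], seq[i]) + seq[i + 2:]
            for i in range(len(seq) - 1)]

print(bit_flip_neighbors((0, 1, 0)))       # [(1, 1, 0), (0, 0, 0), (0, 1, 1)]
print(adjacent_swap_neighbors((1, 2, 3)))  # [(2, 1, 3), (1, 3, 2)]
```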

Proceedings ArticleDOI
01 Dec 1987
TL;DR: A strategy for comparing optimization techniques for computer simulation models is presented, along with several interesting findings for quasi-Newton methods, simplex search, and others.
Abstract: There is increasing interest in science and industry in the optimization of computer simulation models. Often these models are not Monte-Carlo simulations, but consist of systems of differential equations, or other mathematical models. These models can present special problems to numerical optimization methods. First, derivatives are often unavailable. Second, function evaluations can be extremely expensive (e.g. 1 hour on an IBM 3090). Third, the numerical accuracy of each function value may depend on a complicated chain of calculations, and so be impractical to pre-specify. This last point makes it difficult to calibrate optimization routines that use finite difference approximations for gradients. This paper presents a strategy for comparing optimization techniques for these problems, and reviews several interesting findings for quasi-Newton methods, simplex search, and others.
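
The calibration difficulty raised above is easy to reproduce: if each function value carries noise of unknown amplitude eps, the forward-difference error behaves roughly like eps/h for small h and like h for large h, so a good step size cannot be pre-specified without knowing eps. All values in this small experiment are illustrative.

```python
import math
import random

def noisy_f(x, eps=1e-6):
    """Smooth function plus simulated evaluation noise of amplitude eps,
    standing in for the unknown numerical error of a long calculation."""
    return math.sin(x) + random.uniform(-eps, eps)

def forward_diff(f, x, h):
    return (f(x + h) - f(x)) / h

x, true_grad = 1.0, math.cos(1.0)
for h in (1e-1, 1e-2, 1e-4, 1e-6, 1e-8):
    errs = [abs(forward_diff(noisy_f, x, h) - true_grad) for _ in range(100)]
    print(f"h={h:.0e}  mean abs error={sum(errs) / len(errs):.2e}")
```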

Book ChapterDOI
01 Jan 1987
TL;DR: The investigation of the structure of the parameter set leads to a necessary and sufficient criterion for the non-emptiness of the set of efficient points, and the results are applied to a scalarization of multi-objective optimization problems.
Abstract: In this paper, efficient and weakly efficient points of a set are characterized by an optimization problem with a parameter in the bottleneck objective function. The investigation of the structure of the parameter set leads to a necessary and sufficient criterion for the non-emptiness of the set of efficient points. The continuous dependence of the optimization problem on the parameter is investigated. Finally, the results are applied to a scalarization of multi-objective optimization problems.
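
For a finite set of objective vectors, the notions involved can be sketched directly: efficiency as non-dominance, and a weighted Chebyshev (bottleneck-type) scalarization whose minimizers are weakly efficient. The parameterization below is a generic stand-in for the paper's construction, not a reproduction of it.

```python
def dominates(a, b):
    """a dominates b (minimization): a <= b componentwise, strictly somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def efficient_points(points):
    """Efficient (Pareto-optimal) members of a finite set."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

def chebyshev_min(points, weights, ref):
    """Minimize the bottleneck objective max_i w_i * (f_i - ref_i);
    its minimizers are weakly efficient points of the set."""
    return min(points, key=lambda p: max(w * (x - r)
                                         for w, x, r in zip(weights, p, ref)))

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(efficient_points(pts))               # [(1, 5), (2, 2), (4, 1)]
print(chebyshev_min(pts, (1, 1), (0, 0)))  # (2, 2)
```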

Journal ArticleDOI
TL;DR: An auxiliary system of nonlinear equations F(z)=0 is constructed; utilizing its special structure, an efficient algorithm for solving the optimization problem is proposed, and assertions about the convergence of the generated sequences are shown.
Abstract: The paper is concerned with computing a solution x∗ of a constrained nonlinear optimization problem. An auxiliary system of nonlinear equations F(z)=0 is constructed, the solution z∗ of which contains x∗. Utilizing the special structure of F(z)=0, an efficient algorithm for solving the optimization problem is proposed, and assertions about the convergence of the generated sequences are shown.

Journal ArticleDOI
01 Mar 1987
TL;DR: An optimization strategy is presented that provides a framework in which optimization algorithms and heuristic procedures can be coupled to solve nonlinearly constrained design optimization problems.
Abstract: An optimization strategy is presented that provides a framework in which optimization algorithms and heuristic procedures can be coupled to solve nonlinearly constrained design optimization problems. These problems cannot be efficiently solved by either approach independently. The approach is based on an optimization algorithm dealing with local monotonicity and sequential quadratic programming techniques, combined with heuristic procedures that are statistically derived from observations obtained by applying the optimization algorithm to different classes of test problems.



Journal ArticleDOI
TL;DR: An approach to optimization is presented that uses performance criteria from tests conducted on a large variety of classes of nonlinearly constrained design problems to improve the design optimization process.
Abstract: Presented is an approach to optimization which uses performance criteria from tests conducted on a large variety of different classes of nonlinearly constrained design problems to improve the design optimization process. The knowledge-gathering methods and evaluation techniques are discussed, along with how heuristic procedures are integrated to enhance the program's optimization capabilities.

Journal ArticleDOI
TL;DR: The methods described in the paper exploit the specific problem structure for the reduction of an optimization problem to a sequence of related but simpler and more easily solvable optimization problems.