
Showing papers on "Metaheuristic published in 1978"



Journal ArticleDOI
TL;DR: In the general framework of infinite-dimensional convex programming, two fundamental principles are demonstrated and used to derive several basic algorithms for solving a so-called "master" (constrained optimization) problem.
Abstract: In the general framework of infinite-dimensional convex programming, two fundamental principles are demonstrated and used to derive several basic algorithms to solve a so-called "master" (constrained optimization) problem. These algorithms consist in solving an infinite sequence of "auxiliary" problems whose solutions converge to the master's optimal one. By making particular choices for the auxiliary problems, one can recover either classical algorithms (gradient, Newton-Raphson, Uzawa) or decomposition-coordination (two-level) algorithms. The advantages of the theory are that it clearly establishes the connection between classical and two-level algorithms, provides a framework for classifying the two-level algorithms, and gives a systematic way of deriving new algorithms.

186 citations
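The auxiliary-problem idea above can be made concrete with a minimal sketch: choosing a quadratic auxiliary cost recovers the classical gradient algorithm, one of the special cases the abstract names. The objective f(x) = (x − 3)² and the step size are illustrative assumptions, not taken from the paper.

```python
# Sketch: auxiliary problem principle with a quadratic auxiliary cost.
# At each iteration the "auxiliary" problem
#   min_x <grad_f(x_k), x> + (1/(2*eps)) * (x - x_k)^2
# has the closed-form minimizer x_k - eps*grad_f(x_k), i.e. a gradient step.
# The master problem f(x) = (x - 3)^2 is purely illustrative.

def grad_f(x):
    return 2.0 * (x - 3.0)

def solve_auxiliary(x_k, eps=0.1):
    # Closed-form solution of the quadratic auxiliary problem.
    return x_k - eps * grad_f(x_k)

x = 0.0
for _ in range(200):
    x = solve_auxiliary(x)
print(round(x, 4))  # the iterates converge to the master optimum x* = 3
```

Swapping in a different auxiliary cost (e.g. a second-order model) would recover Newton-type methods, which is the classification the abstract describes.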


Journal ArticleDOI
01 Jan 1978
TL;DR: In this paper, the authors report new methods for obtaining a noninferior solution to a multicriterion optimization problem without employing an artificial single objective, and also propose interactive methods for locating a noninferior solution that will be satisfactory to the decision maker.
Abstract: In this paper we report new methods for obtaining a noninferior solution to a multicriterion optimization problem. Unlike existing methods, which convert vector-valued optimization problems into scalar-valued ones, our methods treat the multiple objectives as they are, without employing an artificial single objective. Each of the new methods is readily implementable and convergent to a local noninferior solution under suitable assumptions. We also propose interactive methods for locating a noninferior solution that will be satisfactory to the decision maker.

42 citations
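For background on the term used above: a noninferior (Pareto-optimal) point is one not dominated by any other feasible point. The generic dominance check below is standard material, not the authors' algorithm; the sample points are illustrative.

```python
# Sketch: Pareto dominance and a noninferiority filter (minimization).

def dominates(a, b):
    # a dominates b if a is no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def noninferior(points):
    # Keep the points no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 2), (4, 1), (3, 3)]
print(noninferior(pts))  # (3, 3) is dominated by (2, 2) and is filtered out
```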


Journal ArticleDOI
TL;DR: In this article, a technique is presented for calculating Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems.
Abstract: A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage, both to limit the area of search and to mathematically guarantee convergence to a Pareto optimum.

23 citations
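The staged scheme above can be sketched in miniature: minimize one objective while the other is capped by a threshold of acceptability, then tighten the threshold and repeat. A coarse grid search stands in for a real single-objective solver, and the two objectives are illustrative assumptions, not taken from the paper.

```python
# Sketch: a sequence of single-objective problems with a
# threshold-of-acceptability constraint on the second objective.

f1 = lambda x: x * x              # first objective, prefers x = 0
f2 = lambda x: (x - 2.0) ** 2     # second objective, prefers x = 2

grid = [i / 1000.0 for i in range(0, 2001)]  # candidate x in [0, 2]
threshold = 4.0                               # initial cap on f2
for _ in range(5):
    # Single-objective stage: minimize f1 subject to f2 <= threshold.
    feasible = [x for x in grid if f2(x) <= threshold]
    best = min(feasible, key=f1)
    # Tighten the threshold so the next stage must improve f2.
    threshold = f2(best) - 0.5

print(round(best, 3))  # the iterates trade f1 off against f2
```

Each stage's cap forces movement along the trade-off curve, which is how the thresholds both limit the search and drive convergence toward a Pareto optimum.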


Journal ArticleDOI
TL;DR: This paper presents an efficient implementation scheme for optimization algorithms in the family of gradient projection, reduced gradient, and gradient restoration methods.
Abstract: This paper presents an efficient implementation scheme for optimization algorithms in the family of gradient projection, reduced gradient, and gradient restoration methods.

20 citations
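The family of methods named above shares one core step, which a minimal sketch can show: take a gradient step, then project the result back onto the feasible set. With box constraints the projection is a simple clip; the objective and bounds are illustrative, not from the paper.

```python
# Sketch: projected gradient descent, the basic move shared by
# gradient-projection-type methods. Feasible set: the box [1, 4].

def project(x, lo=1.0, hi=4.0):
    # Euclidean projection onto the interval [lo, hi].
    return max(lo, min(hi, x))

def grad(x):
    return 2.0 * x  # gradient of the illustrative objective f(x) = x^2

x = 3.5
for _ in range(100):
    x = project(x - 0.1 * grad(x))
print(x)  # the constrained minimizer of x^2 on [1, 4] is x = 1.0
```

Reduced-gradient and gradient-restoration methods refine how the feasible direction is built and how feasibility is restored, but the step-then-restore pattern is the same.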





Dissertation
01 Jan 1978
TL;DR: This thesis, produced using the GATE text editing system at Liverpool University, studies the 1½-dimensional and 2-dimensional trim-loss problems.
Abstract: Acknowledgements: This thesis owes its birth to Austin Tate, its growth to Robert Ross, and its completion to Brian Boffey. Particular thanks are due to them. I am grateful for access to computing services at Edinburgh University and the Edinburgh Regional Computing Centre. Further computing facilities were provided by the Liverpool University Computer Laboratory. This thesis was produced using the GATE text editing system on the Computational and Statistical Science departmental computer system at Liverpool University. Many of the diagrams and tables were prepared by Miss M. Ross. Other secretarial services were provided by Liverpool University. Contents: Chapter 1, Introduction; 1.1, Motivation and structure; 1.2, The 1½-dimensional trim-loss problem; 1.3, The 2-dimensional trim-loss problem.

2 citations


Book ChapterDOI
01 Jan 1978
TL;DR: In particular, the authors note that while the number of simplex iterations required to reach an optimum varies widely from one linear program to another, the average number of iterations for problems with m constraints is of the order of 2m, much less than the number of vertices of the constraint set might suggest, and that nonlinear problems reducible to a modified simplex method are usually most efficiently solved that way.
Abstract: Many algorithms have been proposed for computing the optimum (or optima) of a mathematical programming problem, but there is no universal method. The simplex method for linear programming (3.3) is a highly efficient algorithm; while the number of iterations required to reach an optimum varies widely from one problem to another, the average number of iterations, for problems with constraints Ax − b ∈ ℝ^m_+ with A an m × n matrix with n much greater than m, is of the order of 2m, much less than might be expected from the number of vertices of the constraint set. (This remark does not apply to programming restricted to integer values.) If a nonlinear programming problem can be arranged so as to be solvable by a modified simplex method, this is commonly the most efficient procedure. In particular, a problem which allows an adequate approximation by piecewise linear functions, of not too many variables, may be computed as a separable programming problem (3.4). Also a problem with a quadratic objective and linear constraints can be solved by a modified simplex method (4.6 and 7.6).

1 citations
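The separable-programming idea mentioned above rests on replacing a nonlinear function with a piecewise-linear interpolant on breakpoints, which a (modified) simplex method can then handle. The function and breakpoints below are illustrative assumptions.

```python
# Sketch: piecewise-linear approximation, the building block of
# separable programming. f and the breakpoints are illustrative.

def piecewise_linear(f, breakpoints):
    ys = [f(b) for b in breakpoints]
    def approx(x):
        # Find the segment containing x and interpolate linearly on it.
        for (x0, y0), (x1, y1) in zip(zip(breakpoints, ys),
                                      zip(breakpoints[1:], ys[1:])):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return (1 - t) * y0 + t * y1
        raise ValueError("x outside breakpoint range")
    return approx

f = lambda x: x * x
approx = piecewise_linear(f, [0.0, 1.0, 2.0, 3.0])
print(approx(1.5))  # 2.5, versus the true value f(1.5) = 2.25
```

Refining the breakpoint grid tightens the approximation, at the cost of more variables in the resulting linear program.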


Journal ArticleDOI
H. Mosteller1
TL;DR: A new heuristic direct-search minimization algorithm is presented that consists of global and local search strategies and shows improved performance versus other direct-search algorithms.
Abstract: A new heuristic direct-search minimization algorithm is presented. The algorithm consists of global and local search strategies. Improved performance versus other direct-search algorithms is shown by a performance comparison on standard test functions.
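A global-plus-local direct search in the spirit described above can be sketched as follows; this is a generic illustration, not Mosteller's algorithm. Random sampling plays the global phase, and a derivative-free compass search plays the local phase; the test function is an illustrative assumption.

```python
# Sketch: a two-phase direct search. Global phase: random sampling.
# Local phase: compass search, which probes axis directions and
# shrinks the step when no probe improves; no gradients are used.
import random

def f(x, y):
    return (x - 1.0) ** 2 + (y + 2.0) ** 2  # test function, minimum at (1, -2)

random.seed(0)

# Global phase: pick the best of 50 random sample points in a box.
best = min(((random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(50)),
           key=lambda p: f(*p))

# Local phase: compass search started from the global winner.
x, y, step = best[0], best[1], 1.0
while step > 1e-6:
    moves = [(x + step, y), (x - step, y), (x, y + step), (x, y - step)]
    cand = min(moves, key=lambda p: f(*p))
    if f(*cand) < f(x, y):
        x, y = cand          # accept the improving probe
    else:
        step /= 2.0          # no probe helped: refine the step
print(round(x, 3), round(y, 3))  # prints 1.0 -2.0
```

The division of labor mirrors the abstract: the global phase guards against poor starting regions, while the cheap local phase does the final refinement.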