Journal ArticleDOI

How bad are the BFGS and DFP methods when the objective function is quadratic?

01 Jan 1986-Mathematical Programming (Springer-Verlag New York, Inc.)-Vol. 34, Iss: 1, pp 34-47
TL;DR: The results help to explain why the DFP method is often less suitable than the BFGS algorithm for general unconstrained optimization calculations, and they show that quadratic functions provide much information about efficiency when the current vector of variables is too far from the solution for an asymptotic convergence analysis.
Abstract: We study the use of the BFGS and DFP algorithms with step-lengths of one for minimizing quadratic functions of only two variables. The updating formulae in this case imply nonlinear three term recurrence relations between the eigenvalues of consecutive second derivative approximations, which are analysed in order to explain some gross inefficiencies that can occur. Specifically, the BFGS algorithm may require more than 10 iterations to achieve the first decimal place of accuracy, while the performance of the DFP method is far worse. The results help to explain why the DFP method is often less suitable than the BFGS algorithm for general unconstrained optimization calculations, and they show that quadratic functions provide much information about efficiency when the current vector of variables is too far from the solution for an asymptotic convergence analysis.
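The setting studied in the paper — the BFGS algorithm with step lengths of one applied to a quadratic of two variables — can be sketched in a few lines. This is a minimal illustration only: the matrix A, the starting point, and the initial approximation B are illustrative choices, not the examples analysed by Powell.

```python
import numpy as np

def bfgs_unit_steps(A, x0, iters=20):
    """Run BFGS with fixed step length one on f(x) = 0.5 x^T A x.

    Sketch of the paper's setting; A and x0 are illustrative choices.
    """
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                   # initial second-derivative approximation
    for _ in range(iters):
        g = A @ x                        # gradient of the quadratic
        if np.linalg.norm(g) < 1e-12:
            break
        s = -np.linalg.solve(B, g)       # full quasi-Newton step, alpha = 1
        y = A @ (x + s) - g              # change in gradient along the step
        Bs = B @ s
        # Standard BFGS update; it enforces the secant condition B_new s = y
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        x = x + s
    return x, B

# Illustrative run on an ill-conditioned two-variable quadratic
A = np.diag([1.0, 100.0])
x, B = bfgs_unit_steps(A, [1.0, 1.0], iters=30)
```

For a positive definite A both curvature terms s·Bs and y·s stay positive, so the update is well defined at every iteration; the paper's analysis concerns how slowly the eigenvalues of B can approach those of A under these unit steps.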
Citations
Journal ArticleDOI
TL;DR: This paper attempts to give an overview of deformable registration methods, putting emphasis on the most recent advances in the domain, and provides an extensive account of registration techniques in a systematic manner.
Abstract: Deformable image registration is a fundamental task in medical image processing. Among its most important applications, one may cite: 1) multi-modality fusion, where information acquired by different imaging devices or protocols is fused to facilitate diagnosis and treatment planning; 2) longitudinal studies, where temporal structural or anatomical changes are investigated; and 3) population modeling and statistical atlases used to study normal anatomical variability. In this paper, we attempt to give an overview of deformable registration methods, putting emphasis on the most recent advances in the domain. Additional emphasis has been given to techniques applied to medical images. In order to study image registration methods in depth, their main components are identified and studied independently. The most recent techniques are presented in a systematic fashion. The contribution of this paper is to provide an extensive account of registration techniques in a systematic manner.

1,434 citations


Cites methods from "How bad are the BFGS and DFP method..."

  • ...BFGS is considered to be more efficient than DFP [410], [411]....

Journal ArticleDOI
TL;DR: To make the task more tractable, I decided to consider only algorithms for unconstrained optimization, selecting the best optimization methods known to date – those methods that deserve to be in a subroutine library.
Abstract: A few months ago, while preparing a lecture to an audience that included engineers and numerical analysts, I asked myself the question: from the point of view of a user of nonlinear optimization routines, how interesting and practical is the body of theoretical analysis developed in this field? To make the question a bit more precise, I decided to select the best optimization methods known to date – those methods that deserve to be in a subroutine library – and for each method ask: what do we know about the behaviour of this method, as implemented in practice? To make my task more tractable, I decided to consider only algorithms for unconstrained optimization.

327 citations


Cites background or methods or result from "How bad are the BFGS and DFP method..."

  • ...A. Griewank and Ph.L. Toint (1982a), 'On the unconstrained optimization of partially separable objective functions', in Nonlinear Optimization 1981 (M.J.D. Powell, ed.)...

  • ...Therefore it is common to restrict these studies to convex problems (Nemirovsky and Yudin, 1983), or even to strictly convex quadratic objective functions (Powell, 1986)....

  • ...M.J.D. Powell (1971), 'On the convergence of the variable metric algorithm', J. Inst. Math....

  • ...Since Powell's example requires that some consecutive search directions become almost contrary, and since this can only be achieved (in the case of exact line searches) when βk < 0, (Powell, 1986) suggests modifying the Polak-Ribière method by setting βk = max{βk^PR, 0} (4.8). Thus if a negative value of βk^PR occurs, this strategy will restart the iteration along the steepest descent direction....

  • ...M.J.D. Powell (1986), 'How bad are the BFGS and DFP methods when the objective function is quadratic?...

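The restart rule quoted in the excerpts above, βk = max{βk^PR, 0}, is simple to state in code. This is a sketch of that single formula, not of a full conjugate-gradient implementation; the function name is ours.

```python
import numpy as np

def pr_plus_beta(g_new, g_old):
    """Polak-Ribiere beta with the non-negativity safeguard max{beta, 0}.

    When the returned value is zero, the next search direction
    d = -g_new + beta * d_old reduces to the steepest descent
    direction -g_new, i.e. the iteration restarts.
    """
    g_new = np.asarray(g_new, dtype=float)
    g_old = np.asarray(g_old, dtype=float)
    beta_pr = g_new @ (g_new - g_old) / (g_old @ g_old)
    return max(beta_pr, 0.0)
```

Truncating a negative βk^PR to zero is exactly the restart strategy the excerpt describes: it prevents consecutive search directions from becoming almost contrary.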
Journal ArticleDOI
TL;DR: The authors describe the results of a series of tests for a class of new trust-region methods.
Abstract: A description of the results of a series of tests for a class of new trust-region methods.

214 citations


Cites background from "How bad are the BFGS and DFP method..."

  • ...) method(s) perform rather poorly but the rank-one formula would prove highly efficient (see Powell [24])....

Journal ArticleDOI
TL;DR: This paper reviews some of the most successful methods for unconstrained, constrained and nondifferentiable optimization calculations and suggests that practical considerations provide the main new ideas, and that subsequent theoretical studies give improvements to algorithms, coherence to the subject, and better understanding.
Abstract: This paper reviews some of the most successful methods for unconstrained, constrained and nondifferentiable optimization calculations. Particular attention is given to the contribution that theoretical analysis has made to the development of algorithms. It seems that practical considerations provide the main new ideas, and that subsequent theoretical studies give improvements to algorithms, coherence to the subject, and better understanding.

191 citations

References
Book
01 Jan 2009
TL;DR: The aim of this book is to provide a discussion of unconstrained and constrained optimization and their applications to linear programming and other optimization problems.
Abstract: Preface Table of Notation Part 1: Unconstrained Optimization Introduction Structure of Methods Newton-like Methods Conjugate Direction Methods Restricted Step Methods Sums of Squares and Nonlinear Equations Part 2: Constrained Optimization Introduction Linear Programming The Theory of Constrained Optimization Quadratic Programming General Linearly Constrained Optimization Nonlinear Programming Other Optimization Problems Non-Smooth Optimization References Subject Index.

7,278 citations


"How bad are the BFGS and DFP method..." refers background or result in this paper

  • ...creates new difficulties, and many published results, like the ones in [2], show that it is often very suitable to set αk = 1 on most iterations of a BFGS algorithm....

  • ...Similar tendencies have been noted already for more elaborate objective functions in several publications, for example see page 56 of [2], but the advantage of the function (1.1) is that it is straightforward to explain the numerical results theoretically....