Conditioning of Quasi-Newton Methods for Function Minimization
TLDR
In this paper, a class of approximating matrices parameterized by a scalar is presented; the problem of optimally conditioning these matrices under an appropriate norm is investigated, and a set of computational results verifies that the new methods arising from conditioning considerations outperform known methods.

Abstract
Quasi-Newton methods accelerate the steepest-descent technique for function minimization by using computational history to generate a sequence of approximations to the inverse of the Hessian matrix. This paper presents a class of approximating matrices as a function of a scalar parameter. The problem of optimal conditioning of these matrices under an appropriate norm, as a function of the scalar parameter, is investigated. A set of computational results verifies the superiority of the new methods arising from conditioning considerations over known methods.
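The scalar-parameter class of inverse-Hessian approximations described in the abstract can be sketched with the standard one-parameter (Broyden-family) update, in which the parameter interpolates between the DFP and BFGS formulas. This is a minimal illustration, not the paper's exact parameterization; the function name and test values below are illustrative assumptions.

```python
import numpy as np

def broyden_family_update(H, s, y, phi=1.0):
    """One-parameter family of inverse-Hessian updates.

    H   : current inverse-Hessian approximation
    s   : step, x_{k+1} - x_k
    y   : gradient change, g_{k+1} - g_k
    phi : scalar parameter; phi = 0 gives DFP, phi = 1 gives BFGS.

    Note: this is the standard Broyden-family form, shown only to
    illustrate a scalar-parameter class of updates; the paper's own
    parameterization is not reproduced here.
    """
    Hy = H @ y
    sy = s @ y          # curvature s^T y (assumed positive)
    yHy = y @ Hy
    v = s / sy - Hy / yHy
    return (H
            - np.outer(Hy, Hy) / yHy   # rank-one correction removing H y
            + np.outer(s, s) / sy      # rank-one correction adding s
            + phi * yHy * np.outer(v, v))

# Every member of the family satisfies the quasi-Newton (secant)
# condition H_new @ y == s, regardless of phi:
H = np.eye(3)
s = np.array([1.0, 0.5, -0.2])
y = np.array([0.8, 0.3, 0.1])
for phi in (0.0, 0.5, 1.0):
    H_new = broyden_family_update(H, s, y, phi)
    assert np.allclose(H_new @ y, s)
```

The conditioning question the paper studies is, informally, how the choice of the scalar parameter affects the condition number of the resulting approximating matrices.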
Citations
Journal Article (DOI)
Transition state structures and reaction profiles from constrained optimization procedure. Implementation in the framework of density functional theory
Yuri G. Abashkin, Nino Russo +1 more
TL;DR: A method for finding the transition structures (TS) based on constrained optimization techniques is proposed, which can be considered as a step‐by‐step walking uphill process along the minimum energy path, followed by a refining procedure of TS parameters in the saddle point vicinity.
Journal Article (DOI)
Quasi-Newton's method for multiobjective optimization
TL;DR: A quasi-Newton method for unconstrained multiobjective optimization of strongly convex objective functions is presented; a new algorithm is proposed and its convergence is proved to be superlinear.
Proceedings Article (DOI)
Aerodynamic design optimization and shape exploration using generative adversarial networks
Wei Chen, Kevin Chiu, Mark Fuge +2 more
TL;DR: This work proposes to use a deep generative model of aerodynamic designs (specifically airfoils) that reduces the dimensionality of the optimization problem by learning from shape variations in the UIUC airfoil database and shows that this model empirically accelerates optimization convergence by over an order of magnitude.
Journal Article (DOI)
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
TL;DR: An acceleration scheme able to improve the efficiency of the algorithm is suggested and it is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in the function values is significantly improved.
Journal Article (DOI)
Modelling of electrical energy consumption in an electric arc furnace using artificial neural networks
Dragoljub Gajic, Ivana Savic-Gajic, Ivan M. Savic, Olga Georgieva, Stefano Di Gennaro +7 more
TL;DR: In this paper, the authors used a state-of-the-art artificial neural network approach to estimate the extent and effect of fluctuations in the chemical composition of stainless steel at tapping of an electric arc furnace, and thus of scrap and alloy weights in the charge material mix, on the specific electrical energy consumption.
References
Journal Article (DOI)
A Rapidly Convergent Descent Method for Minimization
Roger Fletcher, M. J. D. Powell +1 more
TL;DR: A number of theorems are proved to show that it always converges and that it converges rapidly, and this method has been used to solve a system of one hundred non-linear simultaneous equations.
Journal Article (DOI)
A family of variable-metric methods derived by variational means
TL;DR: In this paper, a rank-two variable-metric method was derived using Greenstadt's variational approach, which preserves the positive-definiteness of the approximating matrix.
Journal Article (DOI)
A Class of Methods for Solving Nonlinear Simultaneous Equations
TL;DR: In this article, the authors discuss certain modifications to Newton's method designed to reduce the number of function evaluations required during the iterative solution process, on the premise that the most efficient process is the one requiring the smallest number of function evaluations.
Journal Article (DOI)
Quasi-Newton methods and their application to function minimisation
TL;DR: The Newton-Raphson method as mentioned in this paper is one of the most commonly used methods for solving nonlinear problems, where the corrections are computed as linear combinations of the residuals.
Journal Article (DOI)
A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
TL;DR: Transformations whereby inequality constraints of certain forms can be eliminated from the formulation of an optimization problem are described, and examples of their use are compared with other methods for handling such constraints.