
Showing papers by "Margaret H. Wright published in 1986"


ReportDOI
01 Jan 1986
TL;DR: NPSOL as discussed by the authors is a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints.
Abstract: This report forms the user's guide for Version 4.0 of NPSOL, a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints. (NPSOL may also be used for unconstrained, bound-constrained and linearly constrained optimization.) The user must provide subroutines that define the objective and constraint functions and (optionally) their gradients. All matrices are treated as dense, and hence NPSOL is not intended for large sparse problems. NPSOL uses a sequential quadratic programming (SQP) algorithm, in which the search direction is the solution of a quadratic programming (QP) subproblem. The algorithm treats bounds, linear constraints and nonlinear constraints separately. The Hessian of each QP subproblem is a positive-definite quasi-Newton approximation to the Hessian of the Lagrangian function. The steplength at each iteration is required to produce a sufficient decrease in an augmented Lagrangian merit function. Each QP subproblem is solved using a quadratic programming package with several features that improve the efficiency of an SQP algorithm. (Author)

471 citations
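The positive-definite quasi-Newton Hessian approximation described in the abstract is typically maintained with a BFGS-style update. A minimal plain-Python sketch of that update formula (an illustration of the general technique, not NPSOL's actual implementation) — the key property is the secant condition B·s = y:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(B, v):
    return [dot(row, v) for row in B]

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B:

        B' = B - (B s)(B s)^T / (s^T B s) + y y^T / (y^T s)

    If B is positive definite and y^T s > 0, then B' remains
    positive definite and satisfies the secant condition B' s = y."""
    Bs = matvec(B, s)
    sBs = dot(s, Bs)
    ys = dot(y, s)
    n = len(s)
    return [[B[i][j] - Bs[i] * Bs[j] / sBs + y[i] * y[j] / ys
             for j in range(n)] for i in range(n)]

# Quadratic model with true Hessian Q, so the gradient change is y = Q s.
Q = [[2.0, 0.5], [0.5, 1.0]]
B = [[1.0, 0.0], [0.0, 1.0]]   # start from the identity
s = [1.0, 0.0]                 # step taken in x
y = matvec(Q, s)               # change in gradient along s
B = bfgs_update(B, s, y)
print(matvec(B, s), y)         # secant condition: B s equals y
```

In an SQP method this updated B serves as the Hessian of the next QP subproblem, which is what keeps each subproblem convex and solvable.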


Journal ArticleDOI
TL;DR: This work reviews classical barrier-function methods for nonlinear programming based on applying a logarithmic transformation to inequality constraints and shows a “projected Newton barrier” method to be equivalent to Karmarkar's projective method for a particular choice of the barrier parameter.
Abstract: Interest in linear programming has been intensified recently by Karmarkar's publication in 1984 of an algorithm that is claimed to be much faster than the simplex method for practical problems. We review classical barrier-function methods for nonlinear programming based on applying a logarithmic transformation to inequality constraints. For the special case of linear programming, the transformed problem can be solved by a "projected Newton barrier" method. This method is shown to be equivalent to Karmarkar's projective method for a particular choice of the barrier parameter. We then present details of a specific barrier algorithm and its practical implementation. Numerical results are given for several non-trivial test problems, and the implications for future developments in linear programming are discussed.

431 citations
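The classical logarithmic barrier transformation reviewed in the abstract can be seen in one dimension: replace a constraint x ≥ 1 with a penalty term −μ·ln(x − 1) and drive μ → 0. A toy pure-Python sketch of that idea (not the projected Newton barrier method for linear programming described in the paper):

```python
def barrier_minimize(c, x0, mu0=1.0, shrink=0.1, n_outer=6):
    """Minimize c*x subject to x >= 1 via a logarithmic barrier.

    For a decreasing sequence of barrier parameters mu, apply
    safeguarded Newton steps to phi(x) = c*x - mu*log(x - 1).
    The unconstrained minimizer of phi is x = 1 + mu/c, which
    approaches the constrained solution x = 1 as mu shrinks."""
    x, mu = x0, mu0
    for _ in range(n_outer):
        for _ in range(100):
            t = x - 1.0              # distance to the boundary
            g = c - mu / t           # phi'(x)
            h = mu / t ** 2          # phi''(x), always positive
            x_new = x - g / h        # Newton step
            if x_new <= 1.0:         # safeguard: stay strictly interior
                x_new = 1.0 + 0.5 * t
            if abs(x_new - x) < 1e-12:
                x = x_new
                break
            x = x_new
        mu *= shrink
    return x

x = barrier_minimize(c=1.0, x0=2.0)
print(x)  # close to the constrained minimizer x = 1
```

Each outer pass solves the barrier subproblem for a smaller μ, starting from the previous solution — the same "follow the barrier trajectory" structure that the paper connects to Karmarkar's method in the linear programming case.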


01 Jan 1986
TL;DR: This report forms the user's guide for Version 4.0 of NPSOL, a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints.
Abstract: This report forms the user's guide for Version 4.0 of NPSOL, a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints. (NPSOL may also be used for unconstrained, bound-constrained and linearly constrained optimization.) The user must provide subroutines that define the objective and constraint functions and (optionally) their gradients. All matrices are treated as dense, and hence NPSOL is not intended for large sparse problems. NPSOL uses a sequential quadratic programming (SQP) algorithm, in which the search direction is the solution of a quadratic programming (QP) subproblem. The algorithm treats bounds, linear constraints and nonlinear constraints separately. The Hessian of each QP subproblem is a positive-definite quasi-Newton approximation to the Hessian of the Lagrangian function. The steplength at each iteration is required to produce a sufficient decrease in an augmented Lagrangian merit function. Each QP subproblem is solved using a quadratic programming package with several features that improve the efficiency of an SQP algorithm.

202 citations


01 Jan 1986
TL;DR: This report forms the user's guide for Version 1.0 of LSSOL, a set of Fortran 77 subroutines for linearly constrained linear least-squares and convex quadratic programming; two main features are its exploitation of convexity and treatment of singularity.
Abstract: This report forms the user's guide for Version 1.0 of LSSOL, a set of Fortran 77 subroutines for linearly constrained linear least-squares and convex quadratic programming. The method of LSSOL is of the two-phase, active-set type, and is related to the method used in the package SOL/QPSOL. Two main features of LSSOL are its exploitation of convexity and treatment of singularity. LSSOL may also be used for linear programming, and to find a feasible point with respect to a set of linear inequality constraints. LSSOL treats all matrices as dense, and hence is not intended for large sparse problems.

99 citations


01 Apr 1986
TL;DR: In this paper, a smooth augmented Lagrangian merit function is defined, in which the Lagrange multiplier estimate is treated as a separate variable, and inequality constraints are handled by means of non-negative slack variables that are included in the linesearch.
Abstract: Sequential quadratic programming (SQP) methods for nonlinearly constrained optimization typically use a merit function to enforce convergence from an arbitrary starting point. We define a smooth augmented Lagrangian merit function in which the Lagrange multiplier estimate is treated as a separate variable, and inequality constraints are handled by means of non-negative slack variables that are included in the linesearch. Global convergence is proved for an SQP algorithm that uses this merit function. It is also proved that steps of unity are accepted in a neighborhood of the solution when this merit function is used in a suitable superlinearly convergent algorithm. Finally, a selection of numerical results is presented to illustrate the performance of the associated SQP method. 36 refs., 1 tab.

93 citations
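For an equality-constrained problem min f(x) s.t. c(x) = 0, an augmented Lagrangian merit function has the form M(x, λ) = f(x) − λ·c(x) + (ρ/2)·c(x)². A minimal sketch on a toy problem (the paper's merit function additionally carries the multiplier estimate and non-negative slack variables for inequalities through the linesearch, which this omits):

```python
def merit(f, c, x, lam, rho):
    """Augmented Lagrangian merit: f(x) - lam*c(x) + (rho/2)*c(x)**2."""
    cx = c(x)
    return f(x) - lam * cx + 0.5 * rho * cx * cx

# Toy problem: minimize x1^2 + x2^2 subject to x1 + x2 = 1.
# KKT solution: x = (0.5, 0.5) with multiplier lam = 1.
f = lambda x: x[0] ** 2 + x[1] ** 2
c = lambda x: x[0] + x[1] - 1.0

lam, rho = 1.0, 10.0
m0 = merit(f, c, [0.0, 0.0], lam, rho)   # infeasible starting point
m1 = merit(f, c, [0.5, 0.5], lam, rho)   # the constrained solution
print(m0, m1)  # the merit value is lower at the solution
```

In an SQP linesearch, the steplength along the QP search direction is chosen to give a sufficient decrease in this merit value; treating λ as a separate variable, as the paper does, means the linesearch moves (x, λ) jointly.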


01 Apr 1986
TL;DR: In this article, the authors describe the application of a barrier transformation to the dual, and the development of sparse least-squares methods based on the LU factorization of the least-squares matrix or its transpose.
Abstract: Certain new approaches to linear programming have recently received considerable publicity because of the promise of substantial improvements in efficiency compared to the simplex method. This note briefly discusses several research directions in methods for solving linear programs using nonlinear problem transformations. In particular, we describe the application of a barrier transformation to the dual, and the development of sparse least-squares methods based on the LU factorization of the least-squares matrix or its transpose. (Author)

13 citations


Book ChapterDOI
01 Jan 1986
TL;DR: This paper describes some of the important issues of numerical analysis in implementing a sequential quadratic programming method for nonlinearly constrained optimization and the results of applying the method to two specific examples are analyzed in detail.
Abstract: This paper describes some of the important issues of numerical analysis in implementing a sequential quadratic programming method for nonlinearly constrained optimization. We consider the separate treatment of linear constraints, design of a specialized quadratic programming algorithm, and control of ill-conditioning. The results of applying the method to two specific examples are analyzed in detail.

6 citations


01 Jan 1986
TL;DR: This report forms the user's guide for Version 4.0 of NPSOL, a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints.
Abstract: This report forms the user's guide for Version 4.0 of NPSOL, a set of Fortran subroutines designed to minimize a smooth function subject to constraints, which may include simple bounds on the variables, linear constraints and smooth nonlinear constraints. (NPSOL may also be used for unconstrained, bound-constrained and linearly constrained optimization.) The user must provide subroutines that define the objective and constraint functions and (optionally) their gradients. All matrices are treated as dense, and hence NPSOL is not intended for large sparse problems. NPSOL uses a sequential quadratic programming (SQP) algorithm, in which the search direction is the solution of a quadratic programming (QP) subproblem. The algorithm treats bounds, linear constraints and nonlinear constraints separately. The Hessian of each QP subproblem is a positive-definite quasi-Newton approximation to the Hessian of the Lagrangian function. The steplength at each iteration is required to produce a sufficient decrease in an augmented Lagrangian merit function. Each QP subproblem is solved using a quadratic programming package with several features that improve the efficiency of an SQP algorithm. Keywords: Mathematical software; Nonlinear programming; Finite difference. (Author)

6 citations


01 Jan 1986
TL;DR: This report forms the user's guide for Version 1.0 of LSSOL, a set of Fortran 77 subroutines for linearly constrained linear least-squares and convex quadratic programming; two main features are its exploitation of convexity and treatment of singularity.
Abstract: This report forms the user's guide for Version 1.0 of LSSOL, a set of Fortran 77 subroutines for linearly constrained linear least-squares and convex quadratic programming. The method of LSSOL is of the two-phase, active-set type, and is related to the method used in the package SOL/QPSOL. Two main features of LSSOL are its exploitation of convexity and treatment of singularity. LSSOL may also be used for linear programming, and to find a feasible point with respect to a set of linear inequality constraints. LSSOL treats all matrices as dense, and hence is not intended for large sparse problems. Keywords: Algorithms; Parameters; Optimization; Linear programming; Mathematical software. (Author)

2 citations