
Journal ISSN: 0025-5610

Mathematical Programming 

About: Mathematical Programming is an academic journal that publishes mainly in the areas of linear programming and convex optimization. It has the ISSN identifier 0025-5610. Over its lifetime, 4,304 publications have been published, receiving 297,861 citations.


Papers
Journal Article (DOI)
TL;DR: A comprehensive description of the primal-dual interior-point algorithm with a filter line-search method for nonlinear programming is provided, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix.
Abstract: We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.

6,326 citations
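The filter mechanism described above can be illustrated in isolation. The following is a minimal sketch, not IPOPT's actual implementation: a trial point, summarized by its constraint violation theta and objective value f, is acceptable if it sufficiently improves on every pair stored in the filter, and dominated entries are pruned when a new pair is added. The function names and margin parameters `gamma_theta`, `gamma_f` are illustrative assumptions.

```python
def acceptable(filter_entries, theta, f, gamma_theta=1e-5, gamma_f=1e-5):
    """A trial point (theta, f) is acceptable to the filter if, for every
    stored pair, it sufficiently reduces either the constraint violation
    or the objective value (with small relative margins)."""
    return all(theta <= (1 - gamma_theta) * th or f <= fv - gamma_f * th
               for th, fv in filter_entries)

def add_to_filter(filter_entries, theta, f):
    """Insert a new (theta, f) pair, discarding entries it dominates."""
    kept = [(th, fv) for th, fv in filter_entries
            if not (theta <= th and f <= fv)]
    kept.append((theta, f))
    return kept
```

For example, after adding the pair (0.5, 2.0), a trial point with the same violation and objective is rejected, while one with much smaller violation is accepted even if its objective is worse.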

Journal Article (DOI)
TL;DR: Numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and is better able to use additional storage to accelerate convergence; the method's convergence properties are also studied, establishing global convergence on uniformly convex problems.
Abstract: We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems.

5,833 citations
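The core of L-BFGS is the two-loop recursion, which applies the limited-memory inverse Hessian approximation to the gradient without ever forming a matrix. The sketch below, a standard textbook formulation rather than the paper's exact code, also includes the simple diagonal scaling gamma = s^T y / y^T y that the abstract credits with a large speedup.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion for the L-BFGS search direction.
    s_list/y_list hold the stored step and gradient differences,
    most recent pair first. Returns the descent direction -H*grad."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(s_list, y_list, rhos):   # most recent first
        a = rho * np.dot(s, q)
        q -= a * y
        alphas.append(a)
    if s_list:
        # Simple scaling of the initial Hessian approximation.
        s0, y0 = s_list[0], y_list[0]
        q *= np.dot(s0, y0) / np.dot(y0, y0)
    for (s, y, rho), a in zip(reversed(list(zip(s_list, y_list, rhos))),
                              reversed(alphas)):  # oldest first
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q
```

With a single stored pair where s = y (consistent with an identity Hessian), the recursion reproduces plain steepest descent, which is a quick sanity check on the implementation.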

Journal Article (DOI)
TL;DR: It is shown that a "greedy" heuristic always produces a solution whose value is at least 1 − [(K − 1)/K]^K times the optimal value; this bound can be achieved for each K and has a limiting value of (e − 1)/e, where e is the base of the natural logarithm.
Abstract: Let N be a finite set and z be a real-valued function defined on the set of subsets of N that satisfies z(S) + z(T) ≥ z(S ∪ T) + z(S ∩ T) for all S, T ⊆ N. Such a function is called submodular. We consider the problem max_{S ⊆ N} {z(S) : |S| ≤ K, z submodular}. Several hard combinatorial optimization problems can be posed in this framework. For example, the problem of finding a maximum weight independent set in a matroid, when the elements of the matroid are colored and the elements of the independent set can have no more than K colors, is in this class. The uncapacitated location problem is a special case of this matroid optimization problem. We analyze greedy and local improvement heuristics and a linear programming relaxation for this problem. Our results are worst case bounds on the quality of the approximations. For example, when z(S) is nondecreasing and z(∅) = 0, we show that a "greedy" heuristic always produces a solution whose value is at least 1 − [(K − 1)/K]^K times the optimal value. This bound can be achieved for each K and has a limiting value of (e − 1)/e, where e is the base of the natural logarithm.

3,459 citations
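The greedy heuristic analyzed above can be demonstrated on maximum coverage, a classic instance of the problem: z(S) = size of the union of the chosen sets is nondecreasing and submodular with z(∅) = 0. This is an illustrative sketch on a made-up instance, not code from the paper.

```python
def greedy_max_coverage(sets, K):
    """Greedy heuristic for max |union of chosen sets| subject to
    choosing at most K sets. At each step, pick the set with the
    largest marginal gain in coverage. The guarantee from the paper:
    greedy value >= (1 - [(K-1)/K]^K) * optimum >= (1 - 1/e) * optimum."""
    chosen, covered = [], set()
    for _ in range(K):
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        if not (sets[best] - covered):
            break  # no set adds anything new
        chosen.append(best)
        covered |= sets[best]
    return chosen, len(covered)
```

On the toy instance below the greedy choice happens to be optimal; the worst-case instances achieving the 1 − [(K − 1)/K]^K bound are more carefully constructed.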

Journal Article (DOI)
TL;DR: It is shown that performance profiles combine the best features of other tools for performance evaluation to create a single tool for benchmarking and comparing optimization software.
Abstract: We propose performance profiles — distribution functions for a performance metric — as a tool for benchmarking and comparing optimization software. We show that performance profiles combine the best features of other tools for performance evaluation.

3,115 citations
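A performance profile is easy to compute once the benchmark data is in a matrix. In the sketch below (an illustration of the idea, with assumed variable names), `T[s, p]` is the cost solver s incurred on problem p, the performance ratio divides by the best cost any solver achieved on that problem, and rho_s(tau) is the fraction of problems solver s handles within a factor tau of the best.

```python
import numpy as np

def performance_profile(T, taus):
    """T[s, p]: cost (e.g. time) of solver s on problem p.
    Returns rho[s, j] = fraction of problems on which solver s's
    performance ratio T[s, p] / min_s' T[s', p] is at most taus[j].
    rho_s is a cumulative distribution function of the ratios."""
    best = T.min(axis=0)                 # best cost per problem
    ratios = T / best                    # performance ratios r_{s,p}
    return np.array([[np.mean(ratios[s] <= tau) for tau in taus]
                     for s in range(T.shape[0])])
```

Reading the result: rho_s(1) is the fraction of problems on which solver s is the (possibly tied) winner, and the value of rho_s for large tau measures overall robustness.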

Journal Article (DOI)
TL;DR: A new approach for constructing efficient schemes for non-smooth convex optimization is proposed, based on a special smoothing technique, which can be applied to functions with explicit max-structure, and can be considered as an alternative to black-box minimization.
Abstract: In this paper we propose a new approach for constructing efficient schemes for non-smooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. From the viewpoint of efficiency estimates, we manage to improve the traditional bounds on the number of iterations of the gradient schemes from O(1/ε²) to O(1/ε), keeping basically the complexity of each iteration unchanged.

2,665 citations
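A common concrete instance of the max-structure smoothing idea is replacing f(x) = max_i x_i by the log-sum-exp function f_mu(x) = mu·log(sum_i exp(x_i/mu)), which satisfies f(x) ≤ f_mu(x) ≤ f(x) + mu·log(n) and has a Lipschitz-continuous gradient (the softmax) with constant 1/mu. The sketch below uses this specific smoothing as an illustration; the paper's framework is more general.

```python
import numpy as np

def smoothed_max(x, mu):
    """Log-sum-exp smoothing of f(x) = max_i x_i with parameter mu > 0.
    Satisfies max(x) <= smoothed_max(x, mu) <= max(x) + mu * log(len(x))."""
    m = x.max()
    return m + mu * np.log(np.exp((x - m) / mu).sum())  # shift for stability

def smoothed_max_grad(x, mu):
    """Gradient of the smoothed max: the softmax of x / mu."""
    z = np.exp((x - x.max()) / mu)
    return z / z.sum()
```

Trading the uniform approximation error mu·log(n) against the gradient Lipschitz constant 1/mu is exactly what lets a smooth gradient scheme reach accuracy ε in O(1/ε) iterations instead of the O(1/ε²) of subgradient methods.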

Network Information
Related Journals (5)
arXiv: Optimization and Control (21.7K papers, 187.1K citations, 89% related)
Journal of Optimization Theory and Applications (7.1K papers, 176.4K citations, 88% related)
Annals of Operations Research (6.5K papers, 175.8K citations, 87% related)
Operations Research (6.3K papers, 460.8K citations, 87% related)
European Journal of Operational Research (19.2K papers, 1M citations, 86% related)
Performance Metrics
Number of papers from the journal in previous years:

Year   Papers
2021   232
2020   115
2019   119
2018   131
2017   106
2016   117