Journal ArticleDOI

Penalty and Smoothing Methods for Convex Semi-Infinite Programming

TL;DR: This paper introduces a unified framework, concerning Remez-type algorithms and integral methods coupled with penalty and smoothing methods, that not only subsumes well-known classical algorithms but also provides some new methods with interesting properties.
Abstract: In this paper we consider min-max convex semi-infinite programming. To solve these problems we introduce a unified framework concerning Remez-type algorithms and integral methods coupled with penalty and smoothing methods. This framework subsumes well-known classical algorithms, but also provides some new methods with interesting properties. Convergence of the primal and dual sequences is proved under minimal assumptions.
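The coupling of an integral method with a penalty that the abstract describes can be sketched in a minimal toy form (our own illustration; `integral_penalty`, the quadrature rule, and the toy instance are hypothetical, not the paper's framework):

```python
# Convex SIP toy:  minimize F(x)  s.t.  g(x, t) <= 0 for every t in T = [0, 1].
# An integral penalty replaces the infinitely many constraints by the single
# term rho * integral over T of max(0, g(x, t)) dt, approximated by quadrature.

def integral_penalty(F, g, x, rho, nodes=1000):
    """F(x) + rho * midpoint-rule approximation of the integrated violation."""
    h = 1.0 / nodes
    viol = sum(max(0.0, g(x, (k + 0.5) * h)) for k in range(nodes)) * h
    return F(x) + rho * viol

F = lambda x: x * x        # objective; unconstrained minimizer x = 0
g = lambda x, t: t - x     # feasibility for all t in [0, 1] means x >= 1

# The feasible point x = 1 pays no penalty; the infeasible x = 0 pays
# rho * integral of t dt = rho / 2, so increasing rho pushes minimizers
# of the penalized objective toward the constrained solution x = 1.
```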


Citations
Journal ArticleDOI
TL;DR: A central cutting-plane algorithm for solving convex min-max semi-infinite programming problems is presented, together with a smoothing algorithm based on the entropy function.
Abstract: In this paper, we present a central cutting plane algorithm for solving convex min-max semi-infinite programming problems. Because the objective function here is non-differentiable, we apply a smoothing technique to the considered problem and develop an algorithm based on the entropy function. It is shown that the global convergence of the proposed algorithm can be obtained under weaker conditions. Some numerical results are presented to show the potential of the proposed algorithm.
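In its simplest finite form, the entropy-function smoothing referred to here is the log-sum-exp approximation of the max; the following is a minimal sketch of the idea (our own illustration with hypothetical names, not the authors' implementation):

```python
import math

# Smooth the nonsmooth max of finitely many values v_1..v_n by the
# entropy (log-sum-exp) function
#   S_p(v) = (1/p) * log(sum_i exp(p * v_i)),
# which satisfies max(v) <= S_p(v) <= max(v) + log(n)/p, so the
# approximation error vanishes as the parameter p grows.

def entropy_max(values, p):
    m = max(values)  # shift the exponents for numerical stability
    return m + math.log(sum(math.exp(p * (v - m)) for v in values)) / p

vals = [0.3, -1.0, 0.29]
for p in (1.0, 10.0, 100.0):
    approx = entropy_max(vals, p)
    assert max(vals) <= approx <= max(vals) + math.log(len(vals)) / p
```

Replacing the max of the sampled constraint values by such a function turns the nonsmooth min-max problem into a smooth one, at the price of a controlled log(n)/p bias.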

4 citations

Book ChapterDOI
01 Jan 2014
TL;DR: The robust counterpart of an uncertain LSIO problem seldom enjoys the strong assumptions which are necessary to apply reduction or feasible point methods.
Abstract: For many finite optimization problems, numerical methods can be compared from the complexity point of view, i.e., computing upper bounds on the number of iterations, arithmetic operations, etc., necessary to get an optimal solution, or an \(\varepsilon\)-optimal solution, in terms of the size of the problem. This methodology can hardly be applied in LSIO because it is not evident how to define the size of the triplet \(\left (a,b,c\right )\) representing the data of a problem like (1.1), despite the seminal results on the complexity of the interior point constraint generation algorithm in [182, 192]. On the other hand, the robust counterpart of an uncertain LSIO problem seldom enjoys the strong assumptions which are necessary to apply reduction or feasible point methods.

4 citations

Journal ArticleDOI
TL;DR: This paper combines the CoMirror algorithm with inexact cut generation to create the SIP-CoM algorithm for solving semi-infinite programming (SIP) problems and proposes two specific random constraint sampling schemes to approximately solve the cut generation problem for generic SIP.
Abstract: The CoMirror algorithm, by Beck et al. (Oper Res Lett 38(6):493–498, 2010), is designed to solve convex optimization problems with one functional constraint. At each iteration, it performs a mirror-descent update using either the subgradient of the objective function or the subgradient of the constraint function, depending on whether or not the constraint violation is below some tolerance. In this paper, we combine the CoMirror algorithm with inexact cut generation to create the SIP-CoM algorithm for solving semi-infinite programming (SIP) problems. First, we provide general error bounds for SIP-CoM. Then, we propose two specific random constraint sampling schemes to approximately solve the cut generation problem for generic SIP. When the objective and constraint functions are generally convex, randomized SIP-CoM achieves an $${\mathcal {O}}(1/\sqrt{N})$$ convergence rate in expectation (in terms of the optimality gap and SIP constraint violation). When the objective and constraint functions are all strongly convex, this rate can be improved to $${\mathcal {O}}(1/N)$$ .
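A stripped-down Euclidean version of the CoMirror switching rule can illustrate the mechanism (our toy sketch under simplifying assumptions: the mirror map is the squared Euclidean norm, so each mirror step reduces to a plain subgradient step, `g` stands for a single aggregated cut, and we return the last iterate rather than an averaged one):

```python
import math

# min f(x)  s.t.  g(x) <= 0, both convex. At each iteration, step along a
# subgradient of the objective when the constraint violation is below the
# tolerance, and along a subgradient of the constraint otherwise.

def comirror_euclidean(f_grad, g, g_grad, x0, steps, tol, eta):
    x = list(x0)
    for k in range(steps):
        d = f_grad(x) if g(x) <= tol else g_grad(x)
        step = eta / math.sqrt(k + 1)          # diminishing step size
        x = [xi - step * di for xi, di in zip(x, d)]
    return x

# Toy instance: min x1^2 + x2^2  s.t.  1 - x1 <= 0  (solution (1, 0))
f_grad = lambda x: [2 * x[0], 2 * x[1]]
g = lambda x: 1.0 - x[0]
g_grad = lambda x: [-1.0, 0.0]

x = comirror_euclidean(f_grad, g, g_grad, [3.0, 2.0], 5000, 1e-2, 0.5)
# x is now close to the constrained minimizer (1, 0)
```

In the SIP setting of the paper, `g` would be produced by (inexact or randomly sampled) cut generation rather than given in closed form.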

3 citations

01 Jan 2013
TL;DR: In this paper, a procedure for generating families of convex semi-infinite test problems with known optimal solution and optimal value is described, motivated by the scarcity of convex semi-infinite test problems.
Abstract: Significant research activity has occurred in the area of convex semi-infinite optimization in recent years. Many new theoretical, algorithmic, and computational contributions have been obtained. Despite these numerous contributions, there still exists a lack of representative convex semi-infinite test problems. Test problems are of major importance for researchers interested in algorithmic development. This article is motivated by the scarcity of convex semi-infinite test problems and describes a procedure for generating families of convex semi-infinite test problems with known optimal solution and optimal value.

2 citations


Cites methods from "Penalty and Smoothing Methods for C..."

  • ...The implementations presented at the end of the paper have been inspired by the computational programming in [1] and are accurately described and implemented in [2]....


Book ChapterDOI
01 Jan 2014
TL;DR: In most LSIO applications part of the data are uncertain as a consequence of measurement errors or estimations, an uncertainty inherent to the data in fields such as environmental engineering, telecommunications, finance, spectrometry, health care, statistics, machine learning, and data envelopment analysis.
Abstract: In most LSIO applications part of the data, if not all of them, are uncertain as a consequence of measurement errors or estimations. This uncertainty is inherent to the data in fields such as environmental engineering, telecommunications, finance, spectrometry, health care, statistics, machine learning, or data envelopment analysis, just to mention some applications listed in Remark 1.3.3.

2 citations

References
01 Feb 1977

5,933 citations


"Penalty and Smoothing Methods for C..." refers background in this paper

  • ...We recall here some basic notions about asymptotic cones and functions (for more details see, for instance, the books of Auslender and Teboulle [4], Rockafellar [24])....



Journal ArticleDOI
TL;DR: A new approach for constructing efficient schemes for non-smooth convex optimization is proposed, based on a special smoothing technique, which can be applied to functions with explicit max-structure, and can be considered as an alternative to black-box minimization.
Abstract: In this paper we propose a new approach for constructing efficient schemes for non-smooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. From the viewpoint of efficiency estimates, we manage to improve the traditional bounds on the number of iterations of the gradient schemes from $O(1/\epsilon^{2})$ to $O(1/\epsilon)$, keeping basically the complexity of each iteration unchanged.

2,948 citations

Book
11 May 2000
Abstract: Basic notation.- Introduction.- Background material.- Optimality conditions.- Basic perturbation theory.- Second order analysis of the optimal value and optimal solutions.- Optimal Control.- References.

2,067 citations

Journal ArticleDOI

1,560 citations


"Penalty and Smoothing Methods for C..." refers methods in this paper

  • ...Applied to LSIP, especially the methods of Cheney and Goldstein [10] and Kelley [15] turn out to be identical to, or mere modifications of, the dual simplex method discussed above, so that they have similar properties and drawbacks....


  • ...Supposing that F is 1 (as is generally the case in ordinary CSIP), we can use cutting-plane methods of Cheney and Goldstein [10], Kelley [15], Veinott [31], or Elzinga and Moore [11], and their variants (see, e.g., Reemtsen and Görner [22] for more references)....


  • ...To avoid slow convergence, constraint dropping rules are again given under some conditions as strict convexity on F for Cheney and Goldstein [10] and Kelley [15]....



Journal ArticleDOI
TL;DR: A class of parametric smooth functions that approximate the fundamental plus function, (x)+ = max{0, x}, by twice integrating a probability density function is proposed, leading to classes of smooth parametric nonlinear-equation approximations of nonlinear and mixed complementarity problems (NCPs and MCPs).
Abstract: We propose a class of parametric smooth functions that approximate the fundamental plus function, (x)+=max{0, x}, by twice integrating a probability density function. This leads to classes of smooth parametric nonlinear equation approximations of nonlinear and mixed complementarity problems (NCPs and MCPs). For any solvable NCP or MCP, existence of an arbitrarily accurate solution to the smooth nonlinear equations, as well as to the NCP or MCP, is established for sufficiently large values of a smoothing parameter α. Newton-based algorithms are proposed for the smooth problem. For strongly monotone NCPs, global convergence and local quadratic convergence are established. For solvable monotone NCPs, each accumulation point of the proposed algorithms solves the smooth problem. Exact solutions of our smooth nonlinear equation for various values of the parameter α generate an interior path, which is different from the central path of interior point methods. Computational results for 52 test problems compare favorably with those for another Newton-based method. The smoothing technique is capable of solving efficiently the test problems solved by Dirkse and Ferris [6], Harker and Xiao [11], and Pang and Gabriel [28].

465 citations


"Penalty and Smoothing Methods for C..." refers background in this paper

  • ...In [9], Chen and Mangasarian provided a systematic way to generate elements of 1....
