Journal ArticleDOI

Sharp Lipschitz Constants for Basic Optimal Solutions and Basic Feasible Solutions of Linear Programs

Wu Li
- 01 Jan 1994
- Vol. 32, Iss. 1, pp. 140-153
TLDR
In this paper, the Lipschitz constants for basic optimal solutions and basic feasible solutions of linear programs with respect to right-hand side perturbations are given in terms of norms of pseudoinverses of submatrices of the matrices involved.
Abstract
The main purpose of this paper is to give Lipschitz constants for basic optimal solutions (or vertices of solution sets) and basic feasible solutions (or vertices of feasible sets) of linear programs with respect to right-hand side perturbations. The Lipschitz constants are given in terms of norms of pseudoinverses of submatrices of the matrices involved, and are sharp under very general assumptions. Two mathematical principles are involved in deriving the Lipschitz constants: (1) the local upper Lipschitz constant of a Hausdorff lower semicontinuous mapping is equal to the Lipschitz constant of the mapping, and (2) the Lipschitz constant of a finite-set-valued mapping can be inherited by its continuous submappings. Moreover, it is proved that any Lipschitz constant for basic feasible solutions can be used as a Lipschitz constant for basic optimal solutions, feasible solutions, and optimal solutions.
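To make the shape of such a result concrete, here is an illustrative statement (the notation and the pairing of vertices are assumptions for exposition, not the paper's exact formulation): if $v(b)$ and $v(b')$ are suitably matched basic feasible solutions of the systems $Ax \le b$ and $Ax \le b'$, then a bound of the type described takes the form

\[ \| v(b) - v(b') \| \;\le\; \kappa \, \| b - b' \|, \qquad \kappa \;=\; \max_{J} \| A_J^{\dagger} \|, \]

where $J$ runs over appropriate row index sets, $A_J$ is the corresponding submatrix of $A$, $A_J^{\dagger}$ denotes its Moore-Penrose pseudoinverse, and $\| A_J^{\dagger} \|$ is the operator norm induced by the chosen vector norms.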


Citations
Book

Implicit Functions and Solution Mappings

TL;DR: In this article, the authors provide a reference on the topic and a unified collection of results currently scattered throughout the literature, including implicit mappings defined by relations other than equations.
Book

Implicit Functions and Solution Mappings: A View from Variational Analysis

TL;DR: In this paper, the authors study functions defined implicitly by equations, derive regularity properties of set-valued solution mappings through generalized derivatives, and apply them in numerical variational analysis.

Iteration complexity of feasible descent methods for convex optimization

P.-W. Wang et al.
TL;DR: In this paper, the authors prove global linear convergence for a wide range of algorithms when applied to non-strongly convex problems such as support vector classification and regression.
Journal Article

Iteration complexity of feasible descent methods for convex optimization

TL;DR: Global linear convergence is proved for a wide range of algorithms applied to certain non-strongly convex problems, and an $O(\log(1/\epsilon))$ iteration complexity is established, the first such result for cyclic coordinate descent methods on dual problems of support vector classification and regression.
Journal ArticleDOI

A unified analysis of Hoffman's bound via Fenchel duality

TL;DR: Considers an extension of Hoffman's bound to a partially infinite-dimensional setting, in which the norm defining the distance is replaced by a positively homogeneous convex function and the linear system is replaced by a convex inclusion of the form $Ax - a \in K$.
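For reference, the classical inequality being generalized is Hoffman's error bound, stated here in its standard form as background rather than as taken from the cited paper: for a nonempty polyhedron $S = \{ x : Ax \le a \}$ there exists a constant $\kappa$, depending only on $A$ and the norms chosen, such that

\[ d(x, S) \;\le\; \kappa \, \| (Ax - a)_{+} \| \quad \text{for every } x, \]

where $(\cdot)_{+}$ denotes the componentwise positive part. The cited extension replaces the norm on the right by a positively homogeneous convex function and the system $Ax \le a$ by an inclusion $Ax - a \in K$ for a convex set $K$.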