Topic
Monotone polygon
About: Monotone polygon is a research topic. Over the lifetime, 14573 publications have been published within this topic receiving 270286 citations.
Papers published on a yearly basis
Papers
TL;DR: The theory of the proximal point algorithm for maximal monotone operators is applied to three algorithms for solving convex programs, one of which has not previously been formulated and is shown to have much the same convergence properties, but with some potential advantages.
Abstract: The theory of the proximal point algorithm for maximal monotone operators is applied to three algorithms for solving convex programs, one of which has not previously been formulated. Rate-of-convergence results for the “method of multipliers,” of the strong sort already known, are derived in a generalized form relevant also to problems beyond the compass of the standard second-order conditions for optimality. The new algorithm, the “proximal method of multipliers,” is shown to have much the same convergence properties, but with some potential advantages.
1,221 citations
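The proximal point iteration underlying this paper can be sketched in a few lines. This is a minimal illustration, not Rockafellar's method of multipliers: it applies the iteration x_{k+1} = argmin_x f(x) + (1/(2c))(x − x_k)^2 to the convex function f(x) = |x|, whose proximal map is the standard soft-thresholding formula; all function names are illustrative.

```python
# Minimal sketch of the proximal point iteration
#   x_{k+1} = argmin_x  f(x) + (1/(2c)) * (x - x_k)^2,
# illustrated on the convex function f(x) = |x|, whose proximal map is
# soft-thresholding. Illustrative only, not the paper's algorithm.

def prox_abs(v, c):
    """Proximal operator of f(x) = |x| with parameter c (soft-thresholding)."""
    if v > c:
        return v - c
    if v < -c:
        return v + c
    return 0.0

def proximal_point(x0, c=1.0, iters=50):
    """Run the proximal point iteration from x0."""
    x = x0
    for _ in range(iters):
        x = prox_abs(x, c)
    return x

print(proximal_point(10.0))  # reaches the minimizer 0.0 of |x|
```

With c = 1 each step moves the iterate one unit toward the minimizer and then stays there, so the scheme decreases f monotonically, the behavior the proximal framework guarantees for convex f.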
01 Jul 1984
TL;DR: In this book, the sets of trajectories of convex-valued differential inclusions and of differential inclusions with maximal monotone maps are characterized, alongside selection theorems, viability theory, and Liapunov functions.
Abstract: Table of contents:
0. Background Notes: 1. Continuous Partitions of Unity. 2. Absolutely Continuous Functions. 3. Some Compactness Theorems. 4. Weak Convergence and Asymptotic Center of Bounded Sequences. 5. Closed Convex Hulls and the Mean-Value Theorem. 6. Lower Semicontinuous Convex Functions and Projections of Best Approximation. 7. A Concise Introduction to Convex Analysis.
1. Set-Valued Maps: 1. Set-Valued Maps and Continuity Concepts. 2. Examples of Set-Valued Maps. 3. Continuity Properties of Maps with Closed Convex Graph. 4. Upper Hemicontinuous Maps and the Convergence Theorem. 5. Hausdorff Topology. 6. The Selection Problem. 7. The Minimal Selection. 8. Chebyshev Selection. 9. The Barycentric Selection. 10. Selection Theorems for Locally Selectionable Maps. 11. Michael's Selection Theorem. 12. The Approximate Selection Theorem and Kakutani's Fixed Point Theorem. 13. σ-Selectionable Maps. 14. Measurable Selections.
2. Existence of Solutions to Differential Inclusions: 1. Convex-Valued Differential Inclusions. 2. Qualitative Properties of the Set of Trajectories of Convex-Valued Differential Inclusions. 3. Nonconvex-Valued Differential Inclusions. 4. Differential Inclusions with Lipschitzean Maps and the Relaxation Theorem. 5. The Fixed-Point Approach. 6. The Lower Semicontinuous Case.
3. Differential Inclusions with Maximal Monotone Maps: 1. Maximal Monotone Maps. 2. Existence and Uniqueness of Solutions to Differential Inclusions with Maximal Monotone Maps. 3. Asymptotic Behavior of Trajectories and the Ergodic Theorem. 4. Gradient Inclusions. 5. Application: Gradient Methods for Constrained Minimization Problems.
4. Viability Theory: The Nonconvex Case: 1. Bouligand's Contingent Cone. 2. Viable and Monotone Trajectories. 3. Contingent Derivative of a Set-Valued Map. 4. The Time-Dependent Case. 5. A Continuous Version of Newton's Method. 6. A Viability Theorem for Continuous Maps with Nonconvex Images. 7. Differential Inclusions with Memory.
5. Viability Theory and Regulation of Controlled Systems: The Convex Case: 1. Tangent Cones and Normal Cones to Convex Sets. 2. Viability Implies the Existence of an Equilibrium. 3. Viability Implies the Existence of Periodic Trajectories. 4. Regulation of Controlled Systems Through Viability. 5. Walras Equilibria and Dynamical Price Decentralization. 6. Differential Variational Inequalities. 7. Rate Equations and Inclusions.
6. Liapunov Functions: 1. Upper Contingent Derivative of a Real-Valued Function. 2. Liapunov Functions and Existence of Equilibria. 3. Monotone Trajectories of a Differential Inclusion. 4. Construction of Liapunov Functions. 5. Stability and Asymptotic Behavior of Trajectories.
Comments.
1,156 citations
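Chapter 3 of the book links gradient inclusions x'(t) ∈ −∂f(x(t)), where ∂f is a maximal monotone subdifferential, to gradient methods for minimization. As a sketch of that link (an illustrative discretization, not the book's construction), a forward Euler step applied to the inclusion for f(x) = |x| yields the constant-step subgradient method:

```python
# Sketch: forward Euler discretization of the gradient inclusion
#   x'(t) in -df(x(t))  for  f(x) = |x|,
# which gives the subgradient step x_{k+1} = x_k - h * s_k, s_k in df(x_k).
# Illustrative only; names are hypothetical.

def subgradient_abs(x):
    """A selection from the subdifferential of f(x) = |x|."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def euler_trajectory(x0, h=0.1, steps=200):
    """Discrete trajectory of x' in -df(x) via forward Euler with stepsize h."""
    x = x0
    for _ in range(steps):
        x = x - h * subgradient_abs(x)
    return x

print(euler_trajectory(1.0))  # ends within h of the minimizer 0
```

The discrete trajectory stalls in an O(h) neighborhood of the minimizer, a standard feature of constant-step subgradient methods, while the continuous-time inclusion reaches 0 exactly.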
TL;DR: The Krasnoselskii–Mann (KM) approach to finding fixed points of nonexpansive operators on a Hilbert space is reviewed, and a wide variety of iterative procedures used in signal processing and image reconstruction and elsewhere are shown to be special cases of the KM iterative procedure.
Abstract: Let T be a (possibly nonlinear) continuous operator on a Hilbert space H. If, for some starting vector x, the orbit sequence {T^k x, k = 0, 1, ...} converges, then the limit z is a fixed point of T; that is, Tz = z. An operator N on a Hilbert space is nonexpansive (ne) if, for each x and y in H, ||Nx − Ny|| ≤ ||x − y||. Even when N has fixed points, the orbit sequence {N^k x} need not converge; consider the example N = −I, where I denotes the identity operator. However, for any γ ∈ (0, 1), the iterative procedure defined by x^{k+1} = (1 − γ) x^k + γ N x^k converges (weakly) to a fixed point of N whenever such points exist. This is the Krasnoselskii–Mann (KM) approach to finding fixed points of ne operators. A wide variety of iterative procedures used in signal processing and image reconstruction and elsewhere are special cases of the KM iterative procedure, for particular choices of the ne operator N. These include the Gerchberg–Papoulis method for bandlimited extrapolation, the SART algorithm of Anderson and Kak, the Landweber and projected Landweber algorithms, simultaneous and sequential methods for solving the convex feasibility problem, the ART and Cimmino methods for solving linear systems of equations, the CQ algorithm for solving the split feasibility problem, and Dolidze's procedure for the variational inequality problem for monotone operators.
1,100 citations
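The KM iteration above can be sketched directly, using the abstract's own example N = −I: this operator is nonexpansive with unique fixed point 0, its plain orbit x, −x, x, ... oscillates, yet the relaxed iteration converges. The function name and parameters are illustrative.

```python
# Sketch of the Krasnoselskii-Mann iteration
#   x_{k+1} = (1 - gamma) * x_k + gamma * N(x_k)
# for a nonexpansive operator N, with relaxation parameter gamma in (0, 1).

def km_iterate(N, x0, gamma=0.5, iters=60):
    """Relaxed fixed-point iteration for a nonexpansive operator N."""
    x = x0
    for _ in range(iters):
        x = (1 - gamma) * x + gamma * N(x)
    return x

N = lambda x: -x  # nonexpansive; unique fixed point 0; plain orbit oscillates
print(km_iterate(N, 1.0, gamma=0.25))  # tends to 0 (each step halves x)
```

With N = −I and gamma = 0.25, each KM step maps x to 0.5x, so the iterates shrink geometrically toward the fixed point, while the unrelaxed orbit {N^k x} never settles.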
TL;DR: It is proven that logarithmic negativity does not increase on average under a general positive partial transpose preserving operation (a set of operations that incorporates local operations and classical communication as a subset). Because logarithmic negativity is not a convex function, this result is surprising, as convexity is generally considered to describe the local physical process of losing information.
Abstract: It is proven that logarithmic negativity does not increase on average under a general positive partial transpose preserving operation (a set of operations that incorporates local operations and classical communication as a subset) and, in the process, a further proof is provided that the negativity does not increase on average under the same set of operations. Given that the logarithmic negativity is not a convex function, this result is surprising, as it is generally considered that convexity describes the local physical process of losing information. The role of convexity and, in particular, its relation (or lack thereof) to physical processes is discussed, and the importance of continuity in this context is stressed.
1,087 citations
TL;DR: In this paper, a stepsize procedure that maintains monotone decrease of an exact penalty function is proposed, globalizing the convergence of Newton and quasi-Newton methods for nonlinear programming by adopting the damped Newton concept from unconstrained optimization.
Abstract: Recently developed Newton and quasi-Newton methods for nonlinear programming possess only local convergence properties. Adopting the concept of the damped Newton method in unconstrained optimization, we propose a stepsize procedure to maintain monotone decrease of an exact penalty function. In so doing, the convergence of the method is globalized. Keywords: nonlinear programming, global convergence, exact penalty function.
1,077 citations
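The damped-Newton idea in this abstract (take the Newton direction, then backtrack the stepsize until a merit function decreases monotonically) can be sketched for a one-dimensional unconstrained problem. This is an illustration only: the merit used here is f itself, whereas the paper decreases an exact penalty function for constrained problems, and all names are hypothetical.

```python
# Sketch of a damped Newton step with backtracking line search: shrink the
# stepsize until the merit function decreases monotonically. Merit here is
# f itself (unconstrained toy case), not the paper's exact penalty function.

def damped_newton(f, grad, hess, x0, iters=20, beta=0.5):
    """Damped Newton method for a one-dimensional function."""
    x = x0
    for _ in range(iters):
        d = -grad(x) / hess(x)  # Newton direction
        t = 1.0
        # Backtrack until monotone decrease of the merit (the t > 1e-12
        # guard prevents stalling once no strict decrease is possible).
        while t > 1e-12 and f(x + t * d) > f(x):
            t *= beta
        x = x + t * d
    return x

# Example: f(x) = x**4 - 2*x**2 has local minimizers at x = +/-1.
f = lambda x: x**4 - 2*x**2
g = lambda x: 4*x**3 - 4*x
H = lambda x: 12*x**2 - 4
print(damped_newton(f, g, H, 2.0))  # converges to 1.0
```

Starting from x = 2, every full Newton step already decreases f, so the backtracking loop is inactive and the iteration inherits Newton's fast local convergence; the line search matters when a full step would increase the merit.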