scispace - formally typeset
Topic

Strongly monotone

About: Strongly monotone is a research topic. Over its lifetime, 1,656 publications on this topic have received 37,043 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, the proximal point algorithm is investigated in a more general form, where the requirement of exact minimization at each iteration is weakened and the subdifferential $\partial f$ is replaced by an arbitrary maximal monotone operator T.
Abstract: For the problem of minimizing a lower semicontinuous proper convex function f on a Hilbert space, the proximal point algorithm in exact form generates a sequence $\{ z^k \} $ by taking $z^{k + 1} $ to be the minimizer of $f(z) + ({1 / {2c_k }})\| {z - z^k } \|^2 $, where $c_k > 0$. This algorithm is of interest for several reasons, but especially because of its role in certain computational methods based on duality, such as the Hestenes-Powell method of multipliers in nonlinear programming. It is investigated here in a more general form where the requirement for exact minimization at each iteration is weakened, and the subdifferential $\partial f$ is replaced by an arbitrary maximal monotone operator T. Convergence is established under several criteria amenable to implementation. The rate of convergence is shown to be “typically” linear with an arbitrarily good modulus if $c_k $ stays large enough, in fact superlinear if $c_k \to \infty $. The case of $T = \partial f$ is treated in extra detail. Applicati...
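The exact iteration can be sketched numerically. The following Python snippet uses $f(z) = |z|$ as an illustrative objective (a choice made here for concreteness, not taken from the paper), since its proximal step has the closed-form soft-thresholding solution:

```python
import numpy as np

def prox_abs(z, c):
    """Proximal step for f(z) = |z| with parameter c: soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - c, 0.0)

def proximal_point(z0, c=1.0, iters=20):
    """Exact proximal point iteration: z_{k+1} minimizes f(z) + (1/(2c))|z - z_k|^2."""
    z = z0
    for _ in range(iters):
        z = prox_abs(z, c)
    return z

print(proximal_point(5.0))  # → 0.0, the minimizer of |z|
```

With $c_k \equiv 1$ the iterate moves a fixed distance toward the minimizer each step and reaches it exactly here; larger $c_k$ gives faster progress, consistent with the "arbitrarily good modulus" claim above.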

3,238 citations

Journal ArticleDOI
TL;DR: A modification to the forward-backward splitting method for finding a zero of the sum of two maximal monotone mappings is proposed, under which the method converges assuming only the forward mapping is (Lipschitz) continuous on some closed convex subset of its domain.
Abstract: We consider the forward-backward splitting method for finding a zero of the sum of two maximal monotone mappings. This method is known to converge when the inverse of the forward mapping is strongly monotone. We propose a modification to this method, in the spirit of the extragradient method for monotone variational inequalities, under which the method converges assuming only the forward mapping is (Lipschitz) continuous on some closed convex subset of its domain. The modification entails an additional forward step and a projection step at each iteration. Applications of the modified method to decomposition in convex programming and monotone variational inequalities are discussed.
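A minimal numerical sketch of the modified (extragradient-style) iteration follows, with illustrative operators chosen here rather than taken from the paper: the backward operator is $A = I$, whose resolvent is $J_{\lambda A}(z) = z/(1+\lambda)$, and the forward map is $B(x) = Mx$ with $M$ skew-symmetric, which is monotone and Lipschitz continuous but not strongly monotone and not a gradient:

```python
import numpy as np

# Skew-symmetric M: B(x) = Mx is monotone and 1-Lipschitz, but <Mx, x> = 0,
# so B is not strongly monotone; the sole zero of A + B is x = 0.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
B = lambda x: M @ x

def tseng_step(x, lam=0.5):
    y = (x - lam * B(x)) / (1.0 + lam)   # forward-backward step with A = I
    return y - lam * (B(y) - B(x))       # additional forward (correction) step

x = np.array([1.0, 1.0])
for _ in range(30):
    x = tseng_step(x)
print(np.linalg.norm(x))  # shrinks toward 0, the zero of A + B
```

The correction step is what permits convergence here despite the forward map being merely monotone and Lipschitz, mirroring the abstract's weakened assumption.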

935 citations

Journal ArticleDOI
TL;DR: In this article, it was shown that for any two monotone operators $T_1$ and $T_2$ from $X$ to $X^*$, the operator $T_1 + T_2$ is again monotone, and conditions under which the sum of maximal monotone operators is maximal are investigated.
Abstract: is called the effective domain of $T$, and $T$ is said to be locally bounded at a point $x \in D(T)$ if there exists a neighborhood $U$ of $x$ such that the set $(1.4)\ T(U) = \bigcup \{ T(u) \mid u \in U \}$ is a bounded subset of $X^*$. It is apparent that, given any two monotone operators $T_1$ and $T_2$ from $X$ to $X^*$, the operator $T_1 + T_2$ is again monotone, where $(1.5)\ (T_1 + T_2)(x) = T_1(x) + T_2(x) = \{ x_1^* + x_2^* \mid x_1^* \in T_1(x),\ x_2^* \in T_2(x) \}$. If $T_1$ and $T_2$ are maximal, it does not necessarily follow, however, that $T_1 + T_2$ is maximal; some sort of condition is needed, since for example the graph of $T_1 + T_2$ can even be empty (as happens when $D(T_1) \cap D(T_2) = \emptyset$). The problem of determining conditions under which $T_1 + T_2$ is maximal turns out to be of fundamental importance in the theory of monotone operators. Results in this direction have been proved by Lescarret [9] and Browder [5], [6], [7]. The strongest result which is known at present is:
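The elementary monotonicity of the sum in (1.5) can be spot-checked numerically. The sketch below (a sanity check, not a proof) samples pairs of points on the real line with two illustrative monotone maps and verifies the defining inequality for their sum:

```python
import numpy as np

# Spot-check that the sum of two monotone maps on R is monotone:
# ((T1+T2)(x) - (T1+T2)(y)) * (x - y) >= 0 for all sampled pairs.
T1 = lambda x: x ** 3        # nondecreasing, hence monotone on R
T2 = lambda x: np.exp(x)     # nondecreasing, hence monotone on R
S = lambda x: T1(x) + T2(x)

rng = np.random.default_rng(0)
pairs = rng.normal(size=(1000, 2))
gaps = (S(pairs[:, 0]) - S(pairs[:, 1])) * (pairs[:, 0] - pairs[:, 1])
print(bool(np.all(gaps >= 0)))  # → True
```

Maximality of the sum is the delicate part, as the abstract explains; simple monotonicity, by contrast, always survives addition.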

922 citations

Journal ArticleDOI
TL;DR: In this paper, it was shown that when H is a real Hilbert space and f: H→R is a differentiable convex function whose minimal value is achieved, then each solution trajectory t→x(t) of this system weakly converges towards a solution of ∇f(x)=0.
Abstract: The ‘heavy ball with friction’ dynamical system $\ddot{x} + \gamma \dot{x} + \nabla f(x) = 0$ is a nonlinear oscillator with damping ($\gamma > 0$). It has been recently proved that when $H$ is a real Hilbert space and $f: H \to \mathbb{R}$ is a differentiable convex function whose minimal value is achieved, then each solution trajectory $t \mapsto x(t)$ of this system weakly converges towards a solution of $\nabla f(x) = 0$. We prove a similar result in the discrete setting for a general maximal monotone operator $A$ by considering the following iterative method: $x_{k+1} - x_k - \alpha_k (x_k - x_{k-1}) + \lambda_k A(x_{k+1}) \ni 0$, giving conditions on the parameters $\lambda_k$ and $\alpha_k$ in order to ensure weak convergence toward a solution of $0 \in A(x)$ and extending classical convergence results concerning the standard proximal method.
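Solving the inclusion for $x_{k+1}$ gives $x_{k+1} = J_{\lambda_k A}\big(x_k + \alpha_k (x_k - x_{k-1})\big)$: a resolvent step applied to an inertially extrapolated point. The sketch below illustrates this with the simplest choice $A(x) = x$ (an assumption made here for illustration), whose resolvent is $J_{\lambda A}(z) = z/(1+\lambda)$, and constant parameters:

```python
def inertial_prox(x0, alpha=0.3, lam=1.0, iters=50):
    """Inertial proximal iteration x_{k+1} = J_{lam A}(x_k + alpha*(x_k - x_{k-1}))
    for the illustrative operator A(x) = x, so J_{lam A}(z) = z / (1 + lam).
    The unique solution of 0 in A(x) is x = 0."""
    x_prev, x = x0, x0
    for _ in range(iters):
        z = x + alpha * (x - x_prev)   # inertial (momentum) extrapolation
        x_prev, x = x, z / (1.0 + lam) # resolvent (backward) step
    return x

print(inertial_prox(4.0))  # converges toward 0, the zero of A
```

Taking $\alpha_k \equiv 0$ recovers the standard proximal method; the extrapolation term is the discrete analogue of the heavy ball's velocity.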

585 citations


Network Information
Related Topics (5)
Uniqueness
40.1K papers, 670K citations
83% related
Bounded function
77.2K papers, 1.3M citations
80% related
Ordinary differential equation
33.1K papers, 590.4K citations
80% related
Stochastic partial differential equation
21.1K papers, 707.2K citations
79% related
Rate of convergence
31.2K papers, 795.3K citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    41
2021    62
2020    39
2019    46
2018    42