
Rate of convergence

About: Rate of convergence is a research topic. Over the lifetime, 31,257 publications have been published within this topic, receiving 795,334 citations. The topic is also known as: convergence rate.


Papers
Journal ArticleDOI
TL;DR: This paper analyzes the block coordinate gradient projection method, in which each iteration performs a gradient projection step with respect to one block of variables taken in cyclic order, and establishes a global sublinear rate of convergence.
Abstract: In this paper we study smooth convex programming problems where the decision variables vector is split into several blocks of variables. We analyze the block coordinate gradient projection method in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order. Global sublinear rate of convergence of this method is established and it is shown that it can be accelerated when the problem is unconstrained. In the unconstrained setting we also prove a sublinear rate of convergence result for the so-called alternating minimization method when the number of blocks is two. When the objective function is also assumed to be strongly convex, linear rate of convergence is established.
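A minimal sketch of the cyclic block coordinate gradient projection step the abstract describes, applied to an illustrative constrained least-squares problem (the specific problem, the two-block split, and the 1/L step sizes are assumptions for the demo, not the paper's setting):

```python
import numpy as np

def block_cgp(A, b, n_blocks=2, iters=500):
    """Cyclic block coordinate gradient projection for
    min 0.5*||Ax - b||^2  s.t.  x >= 0  (illustrative convex problem)."""
    n = A.shape[1]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    for _ in range(iters):
        for idx in blocks:                          # blocks taken in cyclic order
            g = A.T @ (A @ x - b)                   # gradient; only the block part is used
            L = np.linalg.norm(A[:, idx], 2) ** 2   # block Lipschitz constant
            # gradient projection step on this block (projection onto x >= 0)
            x[idx] = np.maximum(x[idx] - g[idx] / L, 0.0)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 6))
b = rng.standard_normal(20)
x = block_cgp(A, b)
```

Each block step with step size 1/L is a descent step, so the objective is monotonically nonincreasing; the sublinear (and, under strong convexity, linear) rates in the abstract quantify how fast.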

576 citations

Book ChapterDOI
19 Sep 2016
TL;DR: This paper shows that the Polyak-Łojasiewicz (PL) inequality is actually weaker than the main conditions that have been explored to show linear convergence rates without strong convexity over the last 25 years.
Abstract: In 1963, Polyak proposed a simple condition that is sufficient to show a global linear convergence rate for gradient descent. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong convexity or even convexity. In this work, we show that this much-older Polyak-Łojasiewicz (PL) inequality is actually weaker than the main conditions that have been explored to show linear convergence rates without strong convexity over the last 25 years. We also use the PL inequality to give new analyses of coordinate descent and stochastic gradient for many non-strongly-convex and some non-convex functions. We further propose a generalization that applies to proximal-gradient methods for non-smooth optimization, leading to simple proofs of linear convergence for support vector machines and L1-regularized least squares without additional assumptions.
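A small numerical illustration of the PL phenomenon: rank-deficient least squares is convex but not strongly convex, yet it satisfies the PL inequality ||∇f(w)||² ≥ 2μ(f(w) − f*) with μ the smallest nonzero eigenvalue of AᵀA, so plain gradient descent still converges linearly in function value (the problem sizes and constants here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
# Rank-deficient design: A has rank 2 but 5 columns, so A^T A is singular
# (no strong convexity), yet the PL inequality holds.
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 5))
b = rng.standard_normal(30)

def f(w):
    return 0.5 * np.linalg.norm(A @ w - b) ** 2

def grad(w):
    return A.T @ (A @ w - b)

eigs = np.linalg.eigvalsh(A.T @ A)
L = eigs[-1]                              # smoothness constant
mu = min(e for e in eigs if e > 1e-10)    # PL constant: smallest NONZERO eigenvalue

w = np.zeros(5)
f_star = f(np.linalg.pinv(A) @ b)         # optimal value of the least-squares problem
gaps = []
for _ in range(200):
    w -= grad(w) / L                      # gradient descent with step 1/L
    gaps.append(f(w) - f_star)
```

The PL analysis predicts f(w_k) − f* ≤ (1 − μ/L)^k (f(w_0) − f*), a linear (geometric) rate despite the singular Hessian.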

576 citations

Journal ArticleDOI
TL;DR: The final convergence result shows clearly how the regularity of the solution and the grading of the mesh affect the order of convergence of the difference scheme, so one can choose an optimal mesh grading.
Abstract: A reaction-diffusion problem with a Caputo time derivative of order $\alpha\in (0,1)$ is considered. The solution of such a problem is shown in general to have a weak singularity near the initial time $t=0$, and sharp pointwise bounds on certain derivatives of this solution are derived. A new analysis of a standard finite difference method for the problem is given, taking into account this initial singularity. This analysis encompasses both uniform meshes and meshes that are graded in time, and includes new stability and consistency bounds. The final convergence result shows clearly how the regularity of the solution and the grading of the mesh affect the order of convergence of the difference scheme, so one can choose an optimal mesh grading. Numerical results are presented that confirm the sharpness of the error analysis.
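The graded temporal meshes the abstract refers to concentrate mesh points near the initial-time singularity at t = 0 via t_j = T(j/N)^r. A minimal sketch (the grading exponent r = (2 − α)/α shown here is the value commonly used for L1-type schemes for Caputo problems, and is an assumption of this demo rather than a quote from the paper):

```python
import numpy as np

def graded_mesh(T, N, r):
    """Temporal mesh t_j = T * (j/N)^r, graded toward t = 0 where the
    solution of the Caputo-derivative problem has a weak singularity."""
    j = np.arange(N + 1)
    return T * (j / N) ** r

alpha = 0.4                       # fractional order in (0, 1), illustrative
r_opt = (2 - alpha) / alpha       # grading often used for L1-type schemes (assumption)
t = graded_mesh(1.0, 8, r_opt)
steps = np.diff(t)                # time steps: small near t=0, large near t=T
```

With r = 1 the mesh is uniform; larger r trades accuracy near t = T for resolution of the initial layer, which is exactly the trade-off the paper's convergence result quantifies.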

573 citations

Journal ArticleDOI
TL;DR: In this paper, it was shown that an implicit variant of Euler-Maruyama converges if the diffusion coefficient is globally Lipschitz, but the drift coefficient satisfies only a one-sided Lipschitz condition.
Abstract: Traditional finite-time convergence theory for numerical methods applied to stochastic differential equations (SDEs) requires a global Lipschitz assumption on the drift and diffusion coefficients. In practice, many important SDE models satisfy only a local Lipschitz property and, since Brownian paths can make arbitrarily large excursions, the global Lipschitz-based theory is not directly relevant. In this work we prove strong convergence results under less restrictive conditions. First, we give a convergence result for Euler--Maruyama requiring only that the SDE is locally Lipschitz and that the pth moments of the exact and numerical solution are bounded for some p >2. As an application of this general theory we show that an implicit variant of Euler--Maruyama converges if the diffusion coefficient is globally Lipschitz, but the drift coefficient satisfies only a one-sided Lipschitz condition; this is achieved by showing that the implicit method has bounded moments and may be viewed as an Euler--Maruyama approximation to a perturbed SDE of the same form. Second, we show that the optimal rate of convergence can be recovered if the drift coefficient is also assumed to behave like a polynomial.
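A sketch of a drift-implicit Euler-Maruyama step for a model SDE in the setting the abstract describes: the double-well drift b(x) = x − x³ is only one-sided Lipschitz (the −x³ term), while the constant diffusion is globally Lipschitz. The particular SDE and the Newton inner solve are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def implicit_em(x0, T, N, rng):
    """Drift-implicit Euler--Maruyama for dX = (X - X^3) dt + dW.
    Each step solves  y = x + h*(y - y^3) + dW  for y by Newton's method;
    the implicit treatment of the one-sided Lipschitz drift keeps moments bounded."""
    h = T / N
    x = x0
    for _ in range(N):
        dW = rng.standard_normal() * np.sqrt(h)
        rhs = x + dW
        y = x                                  # Newton solve for the implicit step
        for _ in range(50):
            F = y - h * (y - y ** 3) - rhs     # residual of the implicit equation
            dF = 1.0 - h * (1.0 - 3.0 * y ** 2)  # derivative; >= 1 - h > 0 for small h
            y -= F / dF
        x = y
    return x

rng = np.random.default_rng(2)
xT = implicit_em(1.0, T=1.0, N=100, rng=rng)
```

The explicit Euler-Maruyama scheme can blow up for such superlinearly growing drifts; the implicit step is well defined because the per-step equation is strictly monotone in y when h < 1.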

570 citations

Journal ArticleDOI
TL;DR: It is shown that the fractional Crank-Nicolson method based on the shifted Grünwald formula is unconditionally stable, and numerical results are compared with the exact analytical solution to verify its order of convergence.
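The shifted Grünwald formula approximates a fractional derivative of order α by a weighted sum of shifted function values, with weights g_k = (−1)^k C(α, k). A sketch of the standard recurrence for these weights (the value of α is illustrative; the full scheme would combine these weights with a Crank-Nicolson time average):

```python
import numpy as np

def grunwald_weights(alpha, n):
    """Gruenwald-Letnikov weights g_k = (-1)^k * binom(alpha, k),
    computed by the standard recurrence g_k = (1 - (alpha+1)/k) * g_{k-1}."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = (1.0 - (alpha + 1.0) / k) * g[k - 1]
    return g

alpha = 1.8                    # fractional order in (1, 2), illustrative
g = grunwald_weights(alpha, 100)
```

Useful sanity checks: g_0 = 1, g_1 = −α, and the weights sum to 0 in the limit (since (1 − 1)^α = 0), which is why the approximation annihilates constants.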

557 citations


Network Information
Related Topics (5)
- Partial differential equation: 70.8K papers, 1.6M citations, 89% related
- Markov chain: 51.9K papers, 1.3M citations, 88% related
- Optimization problem: 96.4K papers, 2.1M citations, 88% related
- Differential equation: 88K papers, 2M citations, 88% related
- Nonlinear system: 208.1K papers, 4M citations, 88% related
Performance Metrics
No. of papers in the topic in previous years:
2024: 1
2023: 693
2022: 1,530
2021: 2,129
2020: 2,036
2019: 1,995