Rate of convergence

About: Rate of convergence is a research topic. Over the lifetime, 31,257 publications have been published within this topic, receiving 795,334 citations. The topic is also known as: convergence rate.


Papers
Journal ArticleDOI
TL;DR: The result is a bandwidth selector whose asymptotic rate of convergence, n^{-1/2} as the sample size n → ∞, is extremely fast by nonparametric standards.
Abstract: A bandwidth selection method is proposed for kernel density estimation. It is based on the straightforward idea of plugging estimates into the usual asymptotic representation for the optimal bandwidth, with two important modifications. The result is a bandwidth selector with an asymptotic rate of convergence of n^{-1/2} (where n denotes the sample size), which is extremely fast by nonparametric standards. Comparisons are given with other bandwidth selection methods, and small-sample performance is investigated.
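The plug-in idea can be illustrated with a crude variant: the normal-reference rule, which estimates the unknown curvature of the density by pretending it is Gaussian and plugs that into the asymptotically optimal bandwidth formula. A minimal sketch, not the paper's two-stage selector; the names `silverman_bandwidth` and `kde` are illustrative:

```python
import numpy as np

def silverman_bandwidth(x):
    # Normal-reference ("rule of thumb") plug-in bandwidth: the unknown
    # roughness of f'' is estimated by assuming f is Gaussian.
    n = len(x)
    sigma = np.std(x, ddof=1)
    return 1.06 * sigma * n ** (-1 / 5)

def kde(x, grid, h):
    # Gaussian-kernel density estimate evaluated at each point of `grid`.
    u = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
x = rng.normal(size=500)

h = silverman_bandwidth(x)
grid = np.linspace(-4, 4, 201)
f_hat = kde(x, grid, h)
```

More refined plug-in selectors replace the Gaussian assumption with a pilot estimate of the density's curvature, which is what drives the faster n^{-1/2} relative rate discussed in the abstract.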

282 citations

Journal ArticleDOI
TL;DR: An extension of Newton's method to unconstrained multiobjective optimization (multicriteria optimization) that is locally superlinearly convergent to optimal points; the convergence analysis uses a Kantorovich-like technique.
Abstract: We propose an extension of Newton's method for unconstrained multiobjective optimization (multicriteria optimization). This method does not use a priori chosen weighting factors or any other form of a priori ranking or ordering information for the different objective functions. Newton's direction at each iterate is obtained by minimizing the max-ordering scalarization of the variations of the quadratic approximations of the objective functions. The objective functions are assumed to be twice continuously differentiable and locally strongly convex. Under these hypotheses, the method, as in the classical case, is locally superlinearly convergent to optimal points. Again as in the scalar case, if the second derivatives are Lipschitz continuous, the rate of convergence is quadratic. Our convergence analysis uses a Kantorovich-like technique. As a byproduct, existence of optima is obtained under semilocal assumptions.
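In the scalar (single-objective) case the abstract refers to, Newton's method with an exact Hessian shows the same fast local convergence. A minimal sketch on a strongly convex test function; the function and names are illustrative, not from the paper:

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-12, max_iter=50):
    # Classical Newton iteration: x_{k+1} = x_k - H(x_k)^{-1} g(x_k).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Strongly convex test function f(x) = sum_i (x_i^4 + x_i^2), minimized at 0.
grad = lambda x: 4 * x**3 + 2 * x
hess = lambda x: np.diag(12 * x**2 + 2)

x_star = newton(grad, hess, [1.0, -0.5])
```

The multiobjective method in the paper replaces the single Newton system with a max-ordering scalarization over the objectives' quadratic models, but the local behavior near an optimum is analogous.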

282 citations

Journal ArticleDOI
TL;DR: This work constructs a stable high-order finite difference scheme for the compressible Navier-Stokes equations that satisfies an energy estimate, and demonstrates the theoretical third-, fourth-, and fifth-order convergence rates for a viscous shock, where the analytic solution is known.
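A standard way to confirm such rates numerically is to compare errors on two grid resolutions: if the error behaves like e(h) ≈ C·h^p, then p ≈ log(e_coarse/e_fine) / log(h_coarse/h_fine). A small sketch with synthetic errors, not the paper's data:

```python
import numpy as np

def observed_order(h_coarse, h_fine, err_coarse, err_fine):
    # Estimated convergence order from errors at two grid spacings,
    # assuming e(h) ~ C * h^p.
    return np.log(err_coarse / err_fine) / np.log(h_coarse / h_fine)

# Synthetic errors for a nominally fourth-order scheme: e(h) = 2 * h^4.
h1, h2 = 0.1, 0.05
e1, e2 = 2.0 * h1**4, 2.0 * h2**4

p = observed_order(h1, h2, e1, e2)
```

In practice e(h) contains lower-order terms as well, so the observed order approaches the theoretical one only as the grids are refined.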

281 citations

Proceedings Article
06 Aug 2017
TL;DR: In this article, the authors show that perturbed gradient descent can escape saddle points almost for free, in a number of iterations which depends only poly-logarithmically on dimension.
Abstract: This paper shows that a perturbed form of gradient descent converges to a second-order stationary point in a number of iterations which depends only poly-logarithmically on dimension (i.e., it is almost "dimension-free"). The convergence rate of this procedure matches the well-known convergence rate of gradient descent to first-order stationary points, up to log factors. When all saddle points are non-degenerate, all second-order stationary points are local minima, so our result shows that perturbed gradient descent can escape saddle points almost for free. Our results can be directly applied to many machine learning applications, including deep learning. As a concrete example, we show that our results can be used directly to establish sharp global convergence rates for matrix factorization. Our results rely on a novel characterization of the geometry around saddle points, which may be of independent interest to the non-convex optimization community.
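The mechanism can be sketched in a few lines: run gradient descent, and whenever the gradient is small (a candidate stationary point), add a small random perturbation; at a strict saddle, the perturbation pushes the iterate onto an escape direction. A toy sketch in which the step size, thresholds, and test function are illustrative, not the paper's tuned parameters:

```python
import numpy as np

def perturbed_gd(grad, x0, eta=0.05, r=1e-3, g_thresh=1e-4, n_iter=2000, seed=0):
    # Gradient descent that injects a small random perturbation whenever
    # the gradient is small, so strict saddle points do not trap it.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < g_thresh:
            x = x + r * rng.normal(size=x.shape)  # near-stationary: perturb
        else:
            x = x - eta * g
    return x

# f(x, y) = (x^2 - 1)^2 + y^2 has a strict saddle at (0, 0)
# and minima at (+1, 0) and (-1, 0).
def grad(v):
    x, y = v
    return np.array([4 * x**3 - 4 * x, 2 * y])

x_final = perturbed_gd(grad, x0=[0.0, 0.0])
```

Started exactly at (0, 0), plain gradient descent never moves, since the gradient there is zero; the random perturbation is what breaks the symmetry and lets the iterate slide toward one of the two minima.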

280 citations

Journal ArticleDOI
TL;DR: In this paper, Liu et al. extended the smoothed finite element method (SFEM) to a more general case, where the problem domain can be discretized by a set of polygons, each with an arbitrary number of sides.

280 citations


Network Information
Related Topics (5)
Partial differential equation: 70.8K papers, 1.6M citations, 89% related
Markov chain: 51.9K papers, 1.3M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 88% related
Differential equation: 88K papers, 2M citations, 88% related
Nonlinear system: 208.1K papers, 4M citations, 88% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2024: 1
2023: 693
2022: 1,530
2021: 2,129
2020: 2,036
2019: 1,995