Topic

Rate of convergence

About: Rate of convergence is a research topic. Over its lifetime, 31,257 publications have been published within this topic, receiving 795,334 citations. The topic is also known as: convergence rate.
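As a quick illustration of what the topic measures (not from the page): the order of convergence p of a sequence with errors e_{k+1} ≈ C·e_k^p can be estimated from three successive errors. A minimal Python sketch, using Newton's method on x² = 2 as the test case:

```python
import math

def estimated_order(errors):
    """Estimate the order p from three consecutive errors e_{k+1} ~ C * e_k**p."""
    e0, e1, e2 = errors[-3:]
    return math.log(e2 / e1) / math.log(e1 / e0)

# Newton's method for f(x) = x**2 - 2 converges quadratically to sqrt(2).
x, root, errs = 1.0, math.sqrt(2.0), []
for _ in range(4):
    x = x - (x * x - 2.0) / (2.0 * x)  # Newton step
    errs.append(abs(x - root))

print(estimated_order(errs))  # ~2.0, i.e. quadratic convergence
```

For a linearly convergent method the same estimate would come out near 1.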


Papers
Journal ArticleDOI
TL;DR: This paper analyzes several new methods for solving optimization problems with the objective function formed as a sum of two terms: one smooth and given by a black-box oracle, and another a simple general convex function with known structure.
Abstract: In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure. Despite the absence of good properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the first part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (with convergence rate $$O\left({1 \over k}\right)$$ ), and an accelerated multistep version with convergence rate $$O\left({1 \over k^2}\right)$$ , where $$k$$ is the iteration counter. For nonconvex problems with this structure, we prove convergence to a point from which there is no descent direction. In contrast, we show that for general nonsmooth, nonconvex problems, even resolving the question of whether a descent direction exists from a point is NP-hard. For all methods, we suggest some efficient “line search” procedures and show that the additional computational work necessary for estimating the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We also present the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.

1,444 citations
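The scheme the abstract describes can be sketched compactly. Below is a minimal, illustrative implementation (the problem data and names are assumptions, not the paper's code) of an accelerated multistep method for one common composite objective, 1/2·||Ax - b||² + lam·||x||₁: the smooth term enters through its gradient, the simple term through its proximal operator (soft-thresholding for the l1 norm).

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def accelerated_proximal_gradient(A, b, lam, steps=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(steps):
        grad = A.T @ (A @ y - b)           # gradient of the smooth term at y
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # multistep extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((40, 100)), rng.standard_normal(40)
print(accelerated_proximal_gradient(A, b, lam=0.1)[:5])
```

Dropping the extrapolation (always setting y = x_new) recovers the plain O(1/k) gradient variant; the extrapolated version attains the O(1/k²) rate quoted above.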

Journal ArticleDOI
TL;DR: This paper presents a novel algorithm to accelerate differential evolution (DE): the proposed opposition-based DE (ODE) employs opposition-based learning (OBL) for population initialization and for generation jumping; results confirm that the ODE outperforms the original DE and FADE in terms of convergence speed and solution accuracy.
Abstract: Evolutionary algorithms (EAs) are well-known optimization approaches for dealing with nonlinear and complex problems. However, these population-based algorithms are computationally expensive due to the slow nature of the evolutionary process. This paper presents a novel algorithm to accelerate differential evolution (DE). The proposed opposition-based DE (ODE) employs opposition-based learning (OBL) for population initialization and also for generation jumping. In this work, opposite numbers have been utilized to improve the convergence rate of DE. A comprehensive set of 58 complex benchmark functions including a wide range of dimensions is employed for experimental verification. The influence of dimensionality, population size, jumping rate, and various mutation strategies is also investigated. Additionally, the contribution of opposite numbers is empirically verified. We also provide a comparison of ODE to fuzzy adaptive DE (FADE). Experimental results confirm that the ODE outperforms the original DE and FADE in terms of convergence speed and solution accuracy.

1,419 citations
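Both uses of opposition described above, population initialization and generation jumping, rest on the opposite point x̄ = a + b - x over a box [a, b]. A minimal sketch of the initialization step (illustrative, assuming a minimization benchmark; not the authors' code):

```python
import numpy as np

def obl_init(fitness, lo, hi, pop_size, dim, rng):
    """Opposition-based initialization: sample a population, form the
    opposite of each candidate, and keep the fitter half overall."""
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    opposite = lo + hi - pop                      # opposite points in [lo, hi]
    both = np.vstack([pop, opposite])
    scores = np.apply_along_axis(fitness, 1, both)
    return both[np.argsort(scores)[:pop_size]]    # best pop_size of the 2*pop_size

sphere = lambda x: float(np.sum(x * x))           # toy benchmark (minimized at 0)
rng = np.random.default_rng(1)
init = obl_init(sphere, -5.0, 5.0, pop_size=20, dim=10, rng=rng)
print(init.shape)  # (20, 10)
```

Generation jumping applies the same opposite-point construction to the current population, triggered at the given jumping rate.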

Posted Content
TL;DR: This paper analyzes several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known.
Abstract: In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known. Despite the bad properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the good part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (with convergence rate $$O\left({1 \over k}\right)$$ ), and an accelerated multistep version with convergence rate $$O\left({1 \over k^2}\right)$$ , where $$k$$ is the iteration counter. For all methods, we suggest some efficient “line search” procedures and show that the additional computational work necessary for estimating the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We also present the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.

1,338 citations

Journal ArticleDOI
TL;DR: In this article, the Navier-Stokes equations are modified by the addition of the continuum forcing $$C\nabla\phi$$, where $$C$$ is the composition variable and $$\phi$$ is $$C$$'s chemical potential.

1,263 citations
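A minimal sketch of the forcing term described above (illustrative, not the paper's solver; the chemical potential is assumed to take the usual Cahn-Hilliard double-well form):

```python
import numpy as np

def phase_field_forcing(C, dx, eps=0.05):
    """Continuum forcing C * grad(phi) for a composition field C on a
    periodic 2-D grid, with phi = C**3 - C - eps**2 * laplacian(C)
    (an assumed double-well chemical potential)."""
    lap = (np.roll(C, -1, 0) - 2 * C + np.roll(C, 1, 0)) / dx**2 \
        + (np.roll(C, -1, 1) - 2 * C + np.roll(C, 1, 1)) / dx**2
    phi = C**3 - C - eps**2 * lap
    gy, gx = np.gradient(phi, dx)     # grad(phi) on the grid
    return C * gx, C * gy             # forcing, added to the momentum equation

C = np.tanh(np.linspace(-1, 1, 64))[None, :] * np.ones((64, 1))  # diffuse interface
fx, fy = phase_field_forcing(C, dx=2.0 / 63)
print(fx.shape, fy.shape)  # (64, 64) (64, 64)
```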

Book
01 Jan 1997
TL;DR: A monograph on stochastic approximation and recursive algorithms, covering applications to learning, state-dependent noise, queueing, signal processing, and adaptive control, together with convergence with probability one, weak convergence methods, rates of convergence, averaging of the iterates, and distributed/decentralized and asynchronous algorithms.
Abstract: Contents: applications and issues; application to learning, state-dependent noise, and queueing; applications to signal processing and adaptive control; mathematical background; convergence with probability one: martingale-difference noise; convergence with probability one: correlated noise; weak convergence: introduction; weak convergence methods for general algorithms; applications: proofs of convergence; rate of convergence; averaging of the iterates; distributed/decentralized and asynchronous algorithms.

1,172 citations
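The core recursion the book analyzes can be shown in a few lines. A minimal sketch (toy problem, not from the book) of the stochastic approximation x_{k+1} = x_k - a_k (g(x_k) + noise) with slowly decreasing steps a_k ~ k^{-0.6}, together with averaging of the iterates, whose rate-of-convergence benefits are among the topics listed above:

```python
import numpy as np

def robbins_monro(g, x0, steps, a=1.0, rng=None):
    """Root-finding from noisy evaluations of g, with step a_k = a/(k+1)**0.6
    and a running (Polyak-Ruppert style) average of the iterates."""
    rng = rng or np.random.default_rng(0)
    x, avg = x0, 0.0
    for k in range(steps):
        noisy = g(x) + rng.standard_normal()   # martingale-difference noise
        x = x - a / (k + 1) ** 0.6 * noisy     # slowly decreasing step size
        avg += (x - avg) / (k + 1)             # running average of the iterates
    return x, avg

# Find the root of g(x) = x - 3 from noisy evaluations.
last, averaged = robbins_monro(lambda x: x - 3.0, x0=0.0, steps=20000)
print(last, averaged)  # the averaged iterate is typically the closer to 3
```

With steps decreasing this slowly, the last iterate keeps fluctuating, while the averaged iterate recovers the faster rate; this is the iterate-averaging effect the book's rate-of-convergence chapters analyze.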


Network Information
Related Topics (5)
Partial differential equation: 70.8K papers, 1.6M citations, 89% related
Markov chain: 51.9K papers, 1.3M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 88% related
Differential equation: 88K papers, 2M citations, 88% related
Nonlinear system: 208.1K papers, 4M citations, 88% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2024    1
2023    693
2022    1,530
2021    2,129
2020    2,036
2019    1,995