scispace - formally typeset
Topic

Rate of convergence

About: Rate of convergence is a research topic. Over its lifetime, 31,257 publications have been published within this topic, receiving 795,334 citations. The topic is also known as: convergence rate.


Papers
Journal ArticleDOI
TL;DR: In this paper, it is shown that consistency between the tangent operator and the integration algorithm employed in the solution of the incremental problem plays a crucial role in preserving the quadratic rate of asymptotic convergence of iterative solution schemes based upon Newton's method.

1,702 citations
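The quadratic convergence this paper is concerned with can be seen in a minimal numerical sketch (not from the paper; a simple one-dimensional problem is assumed): with a consistent derivative, Newton's method roughly squares the error at each step.

```python
import math

def newton_sqrt2(x0=1.5, iters=5):
    """Newton's method on f(x) = x**2 - 2, whose root is sqrt(2).
    With the exact (consistent) derivative f'(x) = 2x, the error
    roughly squares at each step: e_{k+1} ~ C * e_k**2."""
    x = x0
    errors = []
    for _ in range(iters):
        x = x - (x * x - 2.0) / (2.0 * x)  # Newton update
        errors.append(abs(x - math.sqrt(2.0)))
    return x, errors

x, errs = newton_sqrt2()
# errors shrink roughly like 2.5e-3, 2.1e-6, 1.6e-12, ...
```

An inconsistent derivative (e.g. a finite-difference or frozen tangent) would instead degrade this to linear convergence, which is the point the paper makes for incremental problems.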

Journal ArticleDOI
Jushan Bai
TL;DR: In this paper, the authors developed an inferential theory for factor models of large dimensions and derived the rate of convergence and the limiting distributions of the estimated factors, factor loadings, and common components.
Abstract: This paper develops an inferential theory for factor models of large dimensions. The principal components estimator is considered because it is easy to compute and is asymptotically equivalent to the maximum likelihood estimator (if normality is assumed). We derive the rate of convergence and the limiting distributions of the estimated factors, factor loadings, and common components. The theory is developed within the framework of large cross sections (N) and a large time dimension (T), to which classical factor analysis does not apply. We show that the estimated common components are asymptotically normal with a convergence rate equal to the minimum of the square roots of N and T. The estimated factors and their loadings are generally normal, although not always so. The convergence rate of the estimated factors and factor loadings can be faster than that of the estimated common components. These results are obtained under general conditions that allow for correlations and heteroskedasticities in both dimensions. Stronger results are obtained when the idiosyncratic errors are serially uncorrelated and homoskedastic. A necessary and sufficient condition for consistency is derived for large N but fixed T.

1,599 citations
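The principal components estimator the paper studies can be sketched on simulated data (a minimal sketch; the dimensions, noise level, and the normalization F'F/T = I are choices made here, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, r = 100, 120, 2                    # cross-section, time span, number of factors

F = rng.standard_normal((T, r))          # true factors
Lam = rng.standard_normal((N, r))        # true factor loadings
X = F @ Lam.T + 0.1 * rng.standard_normal((T, N))   # observed panel = common + idiosyncratic

# Principal components: top-r eigenvectors of X X', scaled so F_hat' F_hat / T = I_r.
eigval, eigvec = np.linalg.eigh(X @ X.T)
F_hat = np.sqrt(T) * eigvec[:, -r:]      # estimated factors (up to rotation)
Lam_hat = X.T @ F_hat / T                # loadings by cross-sectional regression

common_hat = F_hat @ Lam_hat.T           # estimated common component (rotation-free)
rel_err = np.linalg.norm(common_hat - F @ Lam.T) / np.linalg.norm(F @ Lam.T)
```

Note that the factors and loadings are only identified up to a rotation, whereas the common component F Λ' is identified directly, which is why the error is measured on it.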

Journal ArticleDOI
TL;DR: The main purpose of this paper is to provide an algorithm with a restart procedure that takes account of the objective function automatically and to study a multiplying factor that occurs in the definition of the search direction of each iteration.
Abstract: The conjugate gradient method is particularly useful for minimizing functions of very many variables because it does not require the storage of any matrices. However the rate of convergence of the algorithm is only linear unless the iterative procedure is "restarted" occasionally. At present it is usual to restart every n or (n + 1) iterations, where n is the number of variables, but it is known that the frequency of restarts should depend on the objective function. Therefore the main purpose of this paper is to provide an algorithm with a restart procedure that takes account of the objective function automatically. Another purpose is to study a multiplying factor that occurs in the definition of the search direction of each iteration. Various expressions for this factor have been proposed and often it does not matter which one is used. However, some reasons are now given in favour of one of these expressions. Several numerical examples are reported in support of the conclusions of this paper.

1,588 citations
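The fixed restart schedule the paper improves upon, restarting every n iterations, can be sketched for a quadratic objective (a simplified sketch with exact line search and the Fletcher-Reeves multiplying factor; Powell's automatic restart test is not reproduced here):

```python
import numpy as np

def cg_quadratic(A, b, x0, restart_every=None, tol=1e-10):
    """Conjugate gradient for f(x) = 0.5 x'Ax - b'x with periodic restarts.
    The classical scheme resets the search direction to steepest descent
    every n iterations; Powell's paper replaces this fixed schedule with
    an automatic test based on the objective function."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    restart_every = restart_every or n
    g = A @ x - b                          # gradient of the quadratic
    d = -g
    for k in range(10 * n):
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (d @ A @ d)      # exact line search for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        if (k + 1) % restart_every == 0:
            d = -g_new                     # restart: steepest-descent direction
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves factor
            d = -g_new + beta * d
        g = g_new
    return x

A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = cg_quadratic(A, b, np.zeros(3))   # solves Ax = b, i.e. x = [1, 0.1, 0.01]
```

For a quadratic the method terminates in at most n steps regardless of restarts; the restart frequency only matters for general nonlinear objectives, where conjugacy degrades, which is exactly the regime the paper addresses.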

Journal ArticleDOI
TL;DR: It is shown that stabilization of the "unconstrained" system is sufficient to solve the stated problem and guarantees a uniform ultimate boundedness property for the transformed output error and the uniform boundedness of all other signals in the closed loop.
Abstract: A novel robust adaptive controller for multi-input multi-output (MIMO) feedback linearizable nonlinear systems possessing unknown nonlinearities, capable of guaranteeing a prescribed performance, is developed in this paper. By prescribed performance we mean that the tracking error should converge to an arbitrarily small residual set, with convergence rate no less than a prespecified value, exhibiting a maximum overshoot less than a sufficiently small prespecified constant. Visualizing the prescribed performance characteristics as tracking error constraints, the key idea is to transform the "constrained" system into an equivalent "unconstrained" one, via an appropriately defined output error transformation. It is shown that stabilization of the "unconstrained" system is sufficient to solve the stated problem. Besides guaranteeing a uniform ultimate boundedness property for the transformed output error and the uniform boundedness of all other signals in the closed loop, the proposed robust adaptive controller is smooth with easily selected parameter values and successfully bypasses the loss of controllability issue. Simulation results on a two-link robot clarify and verify the approach.

1,475 citations
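The error transformation idea can be sketched as follows (assumptions made here: an exponentially decaying symmetric performance bound and a tanh-style transformation; the paper's actual functions and constants may differ):

```python
import math

def performance_funnel(t, rho0=1.0, rho_inf=0.05, l=2.0):
    """Exponentially decaying performance bound rho(t).
    Requiring |e(t)| < rho(t) encodes the convergence rate (l),
    the steady-state error bound (rho_inf), and the maximum
    overshoot (rho0) as a single time-varying constraint."""
    return (rho0 - rho_inf) * math.exp(-l * t) + rho_inf

def transformed_error(e, rho):
    """Map the constrained error e in (-rho, rho) to an unconstrained
    variable via the inverse of tanh. Keeping this transformed error
    bounded keeps e strictly inside the funnel, which is why
    stabilizing the 'unconstrained' system suffices."""
    return math.atanh(e / rho)
```

The transformed error blows up as e approaches the funnel boundary, so any controller that keeps it bounded automatically enforces the prescribed performance.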

Journal ArticleDOI
TL;DR: Surprisingly enough, for certain classes of objective functions, the proposed methods for solving huge-scale optimization problems are better than the standard worst-case bounds for deterministic algorithms.
Abstract: In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method and its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.

1,454 citations
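A random-partial-update step of the kind described can be sketched on a small quadratic (a generic randomized coordinate descent sketch, not the paper's exact method):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_coordinate_descent(A, b, x0, iters=5000):
    """Randomized coordinate descent for f(x) = 0.5 x'Ax - b'x.
    Each step updates one randomly chosen coordinate by exact
    one-dimensional minimization, so no full gradient is ever
    formed; for huge-scale problems this per-step cost can be
    made far cheaper than a full-dimensional vector operation."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        i = rng.integers(n)
        g_i = A[i] @ x - b[i]      # i-th partial derivative only
        x[i] -= g_i / A[i, i]      # exact minimization along coordinate i
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])    # symmetric positive definite
b = np.array([1.0, 2.0, 3.0])
x = random_coordinate_descent(A, b, np.zeros(3))
# x approaches the solution of Ax = b
```

The convergence rate estimates in the paper are in expectation over the random coordinate choices, which is how they can beat deterministic worst-case bounds on favourable problem classes.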


Network Information
Related Topics (5)

Topic                           Papers    Citations   Relatedness
Partial differential equation   70.8K     1.6M        89%
Markov chain                    51.9K     1.3M        88%
Optimization problem            96.4K     2.1M        88%
Differential equation           88K       2M          88%
Nonlinear system                208.1K    4M          88%
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2024    1
2023    693
2022    1,530
2021    2,129
2020    2,036
2019    1,995