scispace - formally typeset
Topic

Recursive least squares filter

About: Recursive least squares filter is a research topic. Over its lifetime, 8,907 publications on this topic have received 191,933 citations.


Papers
Journal ArticleDOI
TL;DR: Convergence theorems for parameter estimation by RLS algorithms are given, and conditions are derived under which the parameter estimates converge consistently to the true parameters when the noise variance is time-varying and the condition number is unbounded.

47 citations
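The exponentially weighted RLS recursion analyzed in papers like this one follows a standard pattern; a minimal NumPy sketch (illustrative only, not the paper's code — the signal, forgetting factor, and filter length are made up for the example) is:

```python
import numpy as np

def rls_identify(x, d, n_taps, lam=0.99, delta=100.0):
    """Exponentially weighted RLS: estimate FIR taps w so that w @ u ≈ d[n]."""
    w = np.zeros(n_taps)             # weight estimate
    P = delta * np.eye(n_taps)       # inverse correlation matrix estimate
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # regressor, newest sample first
        k = P @ u / (lam + u @ P @ u)       # gain vector
        e = d[n] - w @ u                    # a priori estimation error
        w = w + k * e                       # weight update
        P = (P - np.outer(k, u @ P)) / lam  # inverse-correlation update
    return w

# Identify a known 3-tap FIR system from noisy input/output data
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.2])
x = rng.standard_normal(2000)
d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = rls_identify(x, d, 3)       # converges close to true_w
```

The forgetting factor `lam` trades tracking speed against estimation variance; `lam = 1` recovers ordinary (growing-window) least squares.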

Journal ArticleDOI
TL;DR: An algorithm is given for solving linear least squares systems of algebraic equations subject to simple bounds on the unknowns and (more general) linear equality and inequality constraints.
Abstract: An algorithm is given for solving linear least squares systems of algebraic equations subject to simple bounds on the unknowns and (more general) linear equality and inequality constraints. The method used is a penalty function approach wherein the linear constraints are (effectively) heavily weighted. The resulting system is then solved as an ordinary bounded least squares system, except for some important numerical and algorithmic details. This report is a revision of an earlier work; it reflects some hard-won experience gained while using the resulting software to solve nonlinear constrained least squares problems.

47 citations
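The penalty idea the abstract describes — heavily weighting the linear constraints and then solving an ordinary least squares system — can be sketched as follows (a toy illustration with invented data, not the paper's algorithm, which additionally handles bounds and inequality constraints):

```python
import numpy as np

def penalized_lsq(A, b, C, d, mu=1e8):
    """Approximate min ||A x - b|| subject to C x = d by a penalty approach:
    stack the constraints, scaled by sqrt(mu), onto the data rows."""
    Aw = np.vstack([A, np.sqrt(mu) * C])        # heavily weighted constraint rows
    bw = np.concatenate([b, np.sqrt(mu) * d])
    x, *_ = np.linalg.lstsq(Aw, bw, rcond=None)  # solve as ordinary least squares
    return x

# Toy problem: fit two parameters subject to x1 + x2 = 1
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([0.2, 0.9, 1.0])
C = np.array([[1.0, 1.0]])
d = np.array([1.0])
x = penalized_lsq(A, b, C, d)   # constrained minimizer is (0.15, 0.85)
```

As `mu` grows the solution approaches the exactly constrained minimizer, at the cost of a worsening condition number — which is exactly why the paper's "important numerical and algorithmic details" matter in practice.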

01 Jan 2010
TL;DR: In this paper, the authors present a Matlab toolbox that solves basic problems related to the total least squares (TLS) method in modeling; TLS is also known as the errors-in-variables method or orthogonal regression.
Abstract: This paper deals with a mathematical method known as total least squares (TLS), orthogonal regression, or the errors-in-variables method. The method can be used for modeling static as well as dynamic processes, and a wide area of further applications lies in signal and image processing. We also present a Matlab toolbox which can solve basic problems related to the total least squares method in modeling, together with several illustrative examples.

One could hardly name another method used as frequently as the least squares method; at the same time, it is difficult to name another method accompanied by such strong and long-lasting controversy, or one that is as simple and yet as artificial. The story of the birth of the least squares method is well covered in the literature and can be summarized as follows [4, 6, 12, 17]. The priority in publication definitely belongs to A. M. Legendre (1805), who also gave the method its famous name, but C. F. Gauss (1809) claimed that he knew and used the method much earlier, about 10 years before Legendre's publication. Gauss's arguments for his priority were not perfect at all: his diaries with computations claimed to be made by the least squares method were lost, and when H. C. Schumacher suggested repeating those lost computations, Gauss totally rejected the idea, saying that such attempts would only suggest he could not be trusted.

47 citations
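The classical SVD construction of the total least squares solution, which such a toolbox would build on, can be sketched as follows (a generic textbook formulation, not code from the toolbox; the line-fit data are invented for illustration):

```python
import numpy as np

def tls_fit(A, b):
    """Total least squares for A x ≈ b: allow minimal perturbations of BOTH
    A and b. Classical SVD solution via the right singular vector of the
    smallest singular value of the augmented matrix [A | b]."""
    n = A.shape[1]
    Z = np.column_stack([A, b])     # augmented system
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]                      # singular vector for smallest singular value
    return -v[:n] / v[n]            # normalize so the solution has the form x

# Line fit y ≈ 2 t + 0.5 where the regressor t is itself noisy
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
A = np.column_stack([t + 0.01 * rng.standard_normal(200), np.ones(200)])
b = 2.0 * t + 0.5 + 0.01 * rng.standard_normal(200)
x = tls_fit(A, b)   # close to (2.0, 0.5)
```

Unlike ordinary least squares, which attributes all error to `b`, TLS minimizes the orthogonal distance to the fitted subspace — hence the name "orthogonal regression".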

Proceedings ArticleDOI
10 Dec 2002
TL;DR: This paper provides a theoretical analysis of the LMS algorithm in which the length mismatch between the adaptive filter and the unknown filter is taken into account, and introduces a new variable-length LMS algorithm.
Abstract: This paper addresses the problem of finding the optimum length for the adaptive least mean square (LMS) filter. In almost all papers published in this field, the length of the adaptive filter is held constant and the coefficient values are modified so that the output mean squared error (MSE) is minimized. There are, however, practical applications where information about the length of the optimum Wiener solution is needed. In system identification, for example, one needs not only an accurate approximation of the coefficient values but also the number of coefficients of the unknown system. Here we provide a theoretical analysis of the LMS algorithm in which the length mismatch between the adaptive filter and the unknown filter is taken into account. Based on this analysis, a new variable-length LMS algorithm is introduced.

47 citations
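The standard LMS update the paper analyzes, together with the length-mismatch situation it studies, can be illustrated in a few lines (a minimal sketch with made-up signals, not the paper's variable-length algorithm):

```python
import numpy as np

def lms_identify(x, d, n_taps, mu=0.01):
    """LMS adaptive filter: stochastic-gradient steps w <- w + mu * e * u
    toward the Wiener solution for the chosen filter length."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # regressor, newest sample first
        e = d[n] - w @ u                   # instantaneous error
        w += mu * e * u                    # gradient step
    return w

# Deliberate length mismatch: a 2-tap adaptive filter tracking a 3-tap system
rng = np.random.default_rng(2)
true_w = np.array([0.8, 0.4, 0.1])
x = rng.standard_normal(5000)              # white input
d = np.convolve(x, true_w)[:len(x)]
w_short = lms_identify(x, d, 2)   # under-modelled: an MSE floor remains
w_full = lms_identify(x, d, 3)    # full length: converges near true_w
```

With white input, the under-modelled filter still converges to the first two Wiener taps, but the unmodelled tap acts as irreducible noise — the residual-MSE effect that motivates estimating the optimum length rather than fixing it in advance.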

Journal ArticleDOI
TL;DR: A class of iterative methods for solving nonlinear least squares problems is presented and a convergence theorem with error estimations is proved and the theorem is specialized for the case of the Gauss-Newton method.
Abstract: A class of iterative methods for solving nonlinear least squares problems is presented, and a convergence theorem with error estimates is proved. The theorem is then specialized to the case of the Gauss-Newton method, and an algorithm for automatically checking the conditions for convergence is included. Some numerical examples are discussed.

47 citations
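The Gauss-Newton iteration, the special case treated by the theorem, repeatedly solves a linearized least squares subproblem; a minimal sketch (generic textbook form with an invented exponential-fit example, not the paper's algorithm) is:

```python
import numpy as np

def gauss_newton(r, jac, x0, tol=1e-10, max_iter=50):
    """Gauss-Newton for min ||r(x)||^2: at each step solve the linearized
    least squares problem J(x) dx = -r(x) and take the full step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx, J = r(x), jac(x)
        dx, *_ = np.linalg.lstsq(J, -rx, rcond=None)  # linearized subproblem
        x = x + dx
        if np.linalg.norm(dx) < tol:                  # step-size stopping rule
            break
    return x

# Fit y = a * exp(b * t) to exact data generated with a = 2, b = -1
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(-1.0 * t)
r = lambda p: p[0] * np.exp(p[1] * t) - y                     # residual vector
jac = lambda p: np.column_stack([np.exp(p[1] * t),            # dr/da
                                 p[0] * t * np.exp(p[1] * t)])  # dr/db
p = gauss_newton(r, jac, [1.0, 0.0])   # converges to (2, -1)
```

Gauss-Newton drops the second-order term of the Hessian, so it converges fast on small-residual problems like this one but can fail when residuals are large — exactly the kind of condition a convergence theorem with error estimates makes precise.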


Network Information
Related Topics (5)
- Control theory: 299.6K papers, 3.1M citations (88% related)
- Optimization problem: 96.4K papers, 2.1M citations (88% related)
- Wireless sensor network: 142K papers, 2.4M citations (85% related)
- Wireless: 133.4K papers, 1.9M citations (85% related)
- Feature extraction: 111.8K papers, 2.1M citations (85% related)
Performance Metrics
No. of papers in the topic in previous years:

Year   Papers
2023   56
2022   104
2021   172
2020   228
2019   234
2018   237