Topic

Recursive least squares filter

About: Recursive least squares filter is a research topic. Over the lifetime, 8,907 publications have been published within this topic, receiving 191,933 citations.


Papers
Journal ArticleDOI
TL;DR: This note deals with the recursive parameter identification of Hammerstein systems with discontinuous nonlinearities, i.e., two-segment piecewise-linear characteristics with dead zones and preloads.
Abstract: This note deals with the recursive parameter identification of Hammerstein systems with discontinuous nonlinearities, i.e., two-segment piecewise-linear characteristics with dead zones and preloads. A special form of the Hammerstein model with this type of nonlinearity is incorporated into the recursive least squares identification scheme, supplemented with the estimation of the model's internal variables. The proposed method is illustrated by examples.
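As a rough illustration of the kind of recursion such a scheme is built on (not the specific model form or internal-variable estimator of this note), the Python sketch below shows a standard exponentially weighted RLS step together with a two-segment piecewise-linear input characteristic; all names and values are hypothetical.

import numpy as np

def rls_step(theta, P, phi, y, lam=0.99):
    # One exponentially weighted recursive least squares update.
    # theta: current parameter estimate, P: inverse correlation matrix,
    # phi: regressor vector, y: measured output, lam: forgetting factor.
    P_phi = P @ phi
    gain = P_phi / (lam + phi @ P_phi)
    err = y - phi @ theta            # a priori prediction error
    theta = theta + gain * err
    P = (P - np.outer(gain, P_phi)) / lam
    return theta, P

def two_segment(u, slope_pos, slope_neg):
    # Two-segment piecewise-linear static nonlinearity of the Hammerstein
    # block: different slopes for positive and negative inputs.
    return slope_pos * u if u >= 0.0 else slope_neg * u

In a Hammerstein setting, the regressor phi would be built from (estimates of) the nonlinearity output two_segment(u) rather than from the raw input u; that construction is simplified away here.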

179 citations

Journal ArticleDOI
TL;DR: By incorporating a simple online vector quantization method, a recursive algorithm is derived to update the solution, namely the quantized kernel recursive least squares algorithm.
Abstract: In a recent paper, we developed a novel quantized kernel least mean square algorithm, in which the input space is quantized (partitioned into smaller regions) and the network size is upper bounded by the quantization codebook size (the number of regions). In this paper, we propose the quantized kernel least squares regression and derive the optimal solution. By incorporating a simple online vector quantization method, we derive a recursive algorithm to update the solution, namely the quantized kernel recursive least squares algorithm. The good performance of the new algorithm is demonstrated by Monte Carlo simulations.
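A minimal sketch of the online vector quantization step described above, assuming a simple Euclidean distance threshold eps; the kernel RLS recursion that updates the coefficients over the resulting codebook is not reproduced here, and all names are illustrative.

import numpy as np

def quantize_online(x, codebook, eps):
    # Online vector quantization: map the new input x to its nearest
    # codeword if it lies within eps of one, otherwise add x itself as a
    # new codeword.  Returns (index of the region, updated codebook).
    x = np.asarray(x, dtype=float)
    if not codebook:
        codebook.append(x)
        return 0, codebook
    dists = [np.linalg.norm(x - c) for c in codebook]
    j = int(np.argmin(dists))
    if dists[j] <= eps:
        return j, codebook               # reuse an existing region
    codebook.append(x)                   # grow the network by one unit
    return len(codebook) - 1, codebook

Because new codewords are added only when an input falls outside every existing region, the network size stays bounded by the codebook size, which is the property the abstract highlights.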

178 citations

Journal ArticleDOI
TL;DR: A new control mechanism for the variable forgetting factor (VFF) of the recursive least squares (RLS) adaptive algorithm is presented; it is a gradient-based method whose gradient is derived from an improved mean square error analysis of RLS.
Abstract: In this paper, a new control mechanism for the variable forgetting factor (VFF) of the recursive least squares (RLS) adaptive algorithm is presented. The control algorithm is basically a gradient-based method whose gradient is derived from an improved mean square error analysis of RLS. The new mean square error analysis exploits the correlation of the inverse correlation matrix with itself, which yields improved theoretical results, especially for the transient and steady-state mean square error. It is shown that the theoretical analysis is close to simulation results for different forgetting factors and different model orders. The analysis yields a dynamic equation of the mean square error that can be used to derive a dynamic equation of its gradient, which in turn controls the forgetting factor. The dynamic equation can produce a positive gradient when the error is large and a negative gradient when the error is in the steady state. Compared with other variable forgetting factor algorithms, the new control algorithm gives fast tracking and a small mean square model error for different signal-to-noise ratios (SNRs).
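The sketch below illustrates the general idea of a variable forgetting factor in RLS; the paper's gradient of the analysed mean square error is replaced by a simple, assumed error-power rule, so the control law, names, and constants are illustrative rather than the authors' method.

import numpy as np

def vff_rls_step(theta, P, lam, phi, y, mu=0.005,
                 noise_floor=1e-3, lam_min=0.90, lam_max=0.999):
    # RLS step with a variable forgetting factor.  The forgetting factor is
    # pushed down when the squared a priori error exceeds an assumed noise
    # floor (fast tracking) and back up toward lam_max when the error is
    # small (low steady-state misadjustment).  This error-power rule stands
    # in for the paper's MSE-gradient control law.
    P_phi = P @ phi
    gain = P_phi / (lam + phi @ P_phi)
    err = y - phi @ theta
    theta = theta + gain * err
    P = (P - np.outer(gain, P_phi)) / lam
    lam = float(np.clip(lam + mu * (noise_floor - err**2), lam_min, lam_max))
    return theta, P, lam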

178 citations

Book
01 Apr 1995
TL;DR: Chapter listing: Transmission Systems; Theory of Adaptive Transversal (FIR) Filters; Implementation Considerations; Tracking the Time-Variations; Adaptive Recursive (IIR) Filters; The Case of Independent Input Vectors.
Abstract: Transmission Systems; Theory of Adaptive Transversal (FIR) Filters; Implementation Considerations; Tracking the Time-Variations; Adaptive Recursive (IIR) Filters; The Case of Independent Input Vectors.

177 citations

Journal ArticleDOI
TL;DR: It is shown here, however, that for an important class of nonstationary problems, the misadjustment of conventional LMS is the same as that of orthogonalized LMS, which in the stationary case is shown to perform essentially as an exact least squares algorithm.
Abstract: A fundamental relationship exists between the quality of an adaptive solution and the amount of data used in obtaining it. Quality is defined here in terms of "misadjustment," the ratio of the excess mean square error (mse) in an adaptive solution to the minimum possible mse. The higher the misadjustment, the lower the quality. The quality of the exact least squares solution is compared with the quality of the solutions obtained by the orthogonalized and the conventional least mean square (LMS) algorithms with stationary and nonstationary input data. When adapting with noisy observations, a filter trained with a finite data sample using an exact least squares algorithm will have a misadjustment given by $M = \frac{n}{N} = \frac{\text{number of weights}}{\text{number of training samples}}$. If the same adaptive filter were trained with a steady flow of data using an ideal "orthogonalized LMS" algorithm, the misadjustment would be $M = \frac{n}{4\tau_{\mathrm{mse}}}$, where $\tau_{\mathrm{mse}}$ is the time constant of the mse learning curve. Thus, for a given time constant $\tau_{\mathrm{mse}}$ of the learning process, the ideal orthogonalized LMS algorithm will have about as low a misadjustment as can be achieved, since this algorithm performs essentially as an exact least squares algorithm with exponential data weighting. It is well known that when rapid convergence with stationary data is required, exact least squares algorithms can in certain cases outperform the conventional Widrow-Hoff LMS algorithm. It is shown here, however, that for an important class of nonstationary problems, the misadjustment of conventional LMS is the same as that of orthogonalized LMS, which in the stationary case is shown to perform essentially as an exact least squares algorithm.
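For a quick sense of scale, a small computation with hypothetical numbers (16 weights, 800 training samples, an mse time constant of 200 samples) gives the same 2% misadjustment from both expressions:

# Worked example with hypothetical numbers: a 16-weight adaptive filter.
n_weights = 16
n_samples = 800      # training samples for the exact least squares case
tau_mse   = 200      # mse learning-curve time constant for orthogonalized LMS

m_exact_ls = n_weights / n_samples        # M = n / N            -> 0.02 (2%)
m_orth_lms = n_weights / (4 * tau_mse)    # M = n / (4*tau_mse)  -> 0.02 (2%)
print(m_exact_ls, m_orth_lms)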

175 citations


Network Information
Related Topics (5)
Control theory: 299.6K papers, 3.1M citations, 88% related
Optimization problem: 96.4K papers, 2.1M citations, 88% related
Wireless sensor network: 142K papers, 2.4M citations, 85% related
Wireless: 133.4K papers, 1.9M citations, 85% related
Feature extraction: 111.8K papers, 2.1M citations, 85% related
Performance Metrics
Number of papers in this topic in previous years:

Year    Papers
2023    56
2022    104
2021    172
2020    228
2019    234
2018    237