
Recursive least squares filter

About: Recursive least squares filter is a research topic. Over the lifetime, 8907 publications have been published within this topic receiving 191933 citations.
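To make the topic concrete, here is a minimal sketch of the standard exponentially weighted RLS adaptive filter, applied to a hypothetical system-identification example (the 4-tap filter `h`, signal lengths, and parameter values are illustrative choices, not drawn from any paper above):

```python
import numpy as np

def rls_filter(x, d, order=4, lam=0.99, delta=100.0):
    """Textbook exponentially weighted RLS adaptive filter (a sketch).

    x: input signal, d: desired signal, lam: forgetting factor,
    delta: initial scaling of the inverse-correlation matrix P.
    Returns the final weight vector and the a priori error sequence.
    """
    w = np.zeros(order)
    P = delta * np.eye(order)             # inverse correlation estimate
    e = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]  # regressor, newest sample first
        k = P @ u / (lam + u @ P @ u)     # gain vector
        e[n] = d[n] - w @ u               # a priori error
        w = w + k * e[n]                  # weight update
        P = (P - np.outer(k, u @ P)) / lam
    return w, e

# Identify a known 4-tap FIR system from clean data
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)]
w, e = rls_filter(x, d, order=4)
```

With noiseless data the weights converge to the true taps; the forgetting factor `lam` trades tracking speed against steady-state accuracy.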


Papers
Journal ArticleDOI
TL;DR: Two recursive least squares parameter estimation algorithms are proposed using the data filtering technique and the auxiliary model identification idea. They identify the parameters of the system models and the noise models interactively, and can generate more accurate parameter estimates than the auxiliary-model-based recursive least squares algorithms.

137 citations

Journal ArticleDOI
TL;DR: This paper shows how the least squares lattice algorithms, recently introduced by Morf and Lee, can be adapted to equalizer adjustment; the resulting algorithm has a number of desirable features which should prove useful in many applications.
Abstract: In many applications of adaptive data equalization, rapid initial convergence of the adaptive equalizer is of paramount importance. Apparently, the fastest known equalizer adaptation algorithm is based on a recursive least squares estimation algorithm. In this paper we show how the least squares lattice algorithms, recently introduced by Morf and Lee, can be adapted to the equalizer adjustment algorithm. The resulting algorithm, although computationally more complex than certain other equalizer algorithms (including the fast Kalman algorithm), has a number of desirable features which should prove useful in many applications.

136 citations

Proceedings ArticleDOI
01 Dec 2007
TL;DR: New and improved algorithms for the least-squares NNMA problem are presented which are not only theoretically well-founded, but also overcome many of the deficiencies of other methods, and use non-diagonal gradient scaling to obtain rapid convergence.
Abstract: Nonnegative Matrix Approximation is an effective matrix decomposition technique that has proven to be useful for a wide variety of applications ranging from document analysis and image processing to bioinformatics. There exist a few algorithms for nonnegative matrix approximation (NNMA), for example, Lee & Seung’s multiplicative updates, alternating least squares, and certain gradient descent based procedures. All of these procedures suffer from either slow convergence, numerical instabilities, or at worst, theoretical unsoundness. In this paper we present new and improved algorithms for the least-squares NNMA problem, which are not only theoretically well-founded, but also overcome many of the deficiencies of other methods. In particular, we use non-diagonal gradient scaling to obtain rapid convergence. Our methods provide numerical results superior to both Lee & Seung’s method as well as to the alternating least squares (ALS) heuristic, which is known to work well in some situations but has no theoretical guarantees (Berry et al. 2006). Our approach extends naturally to include regularization and box-constraints, without sacrificing convergence guarantees. We present experimental results on both synthetic and real-world datasets to demonstrate the superiority of our methods, in terms of better approximations as well as efficiency.

136 citations
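For reference, the Lee & Seung multiplicative-update baseline that the abstract compares against can be sketched as follows; this is the standard Frobenius-norm update, not the paper's non-diagonal gradient-scaled method, and the matrix sizes and iteration count are arbitrary illustrative choices:

```python
import numpy as np

def nnma_multiplicative(A, r, iters=300, eps=1e-9, seed=0):
    """Lee & Seung multiplicative updates for the Frobenius-norm NNMA
    problem A ~= W @ H. Baseline sketch only; the paper above proposes
    faster gradient-scaled alternatives."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + eps)  # update H with W fixed
        W *= (A @ H.T) / (W @ H @ H.T + eps)  # update W with H fixed
    return W, H

rng = np.random.default_rng(1)
A = rng.random((20, 15))                      # nonnegative data matrix
W, H = nnma_multiplicative(A, r=5)
rel_err = np.linalg.norm(A - W @ H) / np.linalg.norm(A)
```

The multiplicative form keeps `W` and `H` elementwise nonnegative by construction, which is the property the scaled-gradient methods must preserve by other means.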

Journal ArticleDOI
TL;DR: This new recursive least-squares (RLS) estimation algorithm has a computational complexity similar to the conventional RLS algorithm, but is more robust to roundoff errors and has a highly modular structure, suitable for VLSI implementation.
Abstract: This paper presents a recursive form of the modified Gram-Schmidt algorithm (RMGS). This new recursive least-squares (RLS) estimation algorithm has a computational complexity similar to the conventional RLS algorithm, but is more robust to roundoff errors and has a highly modular structure, suitable for VLSI implementation. Its properties and features are discussed and compared to other LS estimation algorithms.

135 citations
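The batch modified Gram-Schmidt factorization underlying RMGS can be sketched as follows; note this is the ordinary batch QR step, shown only to illustrate the orthogonalization the paper makes recursive, not the paper's per-sample RMGS algorithm itself:

```python
import numpy as np

def mgs_qr(A):
    """Batch modified Gram-Schmidt QR factorization, A = Q @ R.
    Illustrative only: the RMGS algorithm above is a recursive,
    sample-by-sample variant of this orthogonalization."""
    A = np.array(A, dtype=float)  # work on a copy
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(A[:, j])
        Q[:, j] = A[:, j] / R[j, j]
        # Immediately deflate the remaining columns (the "modified" step,
        # which is what improves robustness to roundoff errors).
        for k in range(j + 1, n):
            R[j, k] = Q[:, j] @ A[:, k]
            A[:, k] -= R[j, k] * Q[:, j]
    return Q, R

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))
Q, R = mgs_qr(A)
```

Deflating the trailing columns as soon as each `q_j` is formed, rather than projecting against all previous vectors at once, is the source of the improved roundoff behavior the abstract mentions.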

Journal ArticleDOI
TL;DR: Simulations under different scenarios demonstrate that this technique for implementing a quadratic inequality constraint with recursive least squares (RLS) updating achieves better interference suppression than both the RLS beamformer with no quadratic constraint and the RLS beamformer using the scaled projection technique, as well as faster convergence than LMS beamformers.
Abstract: Quadratic constraints on the weight vector of an adaptive linearly constrained minimum power (LCMP) beamformer can improve robustness to pointing errors and to random perturbations in sensor parameters. We propose a technique for implementing a quadratic inequality constraint with recursive least squares (RLS) updating. A variable diagonal loading term is added at each step, where the amount of loading has a closed-form solution. Simulations under different scenarios demonstrate that this algorithm has better interference suppression than both the RLS beamformer with no quadratic constraint and the RLS beamformer using the scaled projection technique, as well as faster convergence than LMS beamformers.

135 citations
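The diagonal-loading idea can be sketched as below with a fixed loading term; the paper's contribution is computing a variable loading amount in closed form at each RLS update, which this simplified stand-in does not reproduce (array size, steering vector, and loading value are illustrative assumptions):

```python
import numpy as np

def loaded_mvdr_weights(R, s, loading=1e-2):
    """Minimum-power distortionless beamformer weights with diagonal
    loading: w = (R + loading*I)^(-1) s / (s^H (R + loading*I)^(-1) s).
    Fixed-loading sketch; the paper instead solves for a variable
    loading term at each recursive update."""
    Rl = R + loading * np.eye(R.shape[0])
    v = np.linalg.solve(Rl, s)
    return v / (s.conj() @ v)

rng = np.random.default_rng(3)
N = 8
s = np.ones(N)                      # assumed broadside steering vector
X = rng.standard_normal((200, N))   # sensor snapshots (noise-only here)
R = X.T @ X / len(X)                # sample covariance
w = loaded_mvdr_weights(R, s)
```

The loading term inflates the noise floor of the covariance estimate, which is what makes the weights robust to pointing errors and sensor perturbations, at the cost of slightly reduced interference nulling.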


Network Information
Related Topics (5)
Control theory — 299.6K papers, 3.1M citations, 88% related
Optimization problem — 96.4K papers, 2.1M citations, 88% related
Wireless sensor network — 142K papers, 2.4M citations, 85% related
Wireless — 133.4K papers, 1.9M citations, 85% related
Feature extraction — 111.8K papers, 2.1M citations, 85% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 56
2022: 104
2021: 172
2020: 228
2019: 234
2018: 237