Topic
Recursive least squares filter
About: Recursive least squares filter is a research topic. Over its lifetime, 8907 publications on this topic have received 191933 citations.
Papers
TL;DR: Fast transversal and lattice least squares algorithms for adaptive multichannel filtering and system identification are developed and can be viewed as fast realizations of the recursive prediction error algorithm.
Abstract: Fast transversal and lattice least squares algorithms for adaptive multichannel filtering and system identification are developed. Models with different orders for input and output channels are allowed. Four topics are considered: multichannel FIR filtering, rational IIR filtering, ARX multichannel system identification, and general linear system identification possessing a certain shift invariance structure. The resulting algorithms can be viewed as fast realizations of the recursive prediction error algorithm. Computational complexity is then reduced by an order of magnitude as compared to standard recursive least squares and stochastic Gauss-Newton methods. The proposed transversal and lattice algorithms rely on suitable order step-up-step-down updating procedures for the computation of the Kalman gain. Stabilizing feedback for the control of numerical errors together with long-run simulations are included.
50 citations
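For reference, the standard exponentially weighted RLS recursion that such fast algorithms accelerate can be sketched as follows. This is a minimal NumPy sketch of textbook RLS, not the paper's fast transversal/lattice algorithms; the filter order, forgetting factor `lam`, and initialization constant `delta` are illustrative assumptions:

```python
import numpy as np

def rls(x, d, order=4, lam=0.99, delta=100.0):
    """Standard exponentially weighted RLS, O(order^2) per sample.

    Fast transversal/lattice algorithms reduce this cost by an order
    of magnitude; this baseline just shows the quantities they update,
    including the Kalman gain.
    """
    w = np.zeros(order)
    P = delta * np.eye(order)            # inverse correlation matrix
    y = np.zeros(len(d))
    for n in range(order - 1, len(d)):
        u = x[n - order + 1:n + 1][::-1] # regressor, most recent first
        k = P @ u / (lam + u @ P @ u)    # Kalman gain
        y[n] = w @ u                     # a priori output
        e = d[n] - y[n]                  # a priori error
        w = w + k * e                    # weight update
        P = (P - np.outer(k, u @ P)) / lam
    return w, y
```

With a noiseless FIR plant and persistent excitation, the weights converge to the plant coefficients.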
TL;DR: A number of new nonlinear algorithms are proposed, including nonlinear RLS filters that can be used independently as predictors or as interference identifiers so that the ACM or the DDK filter can be applied; simulations show they outperform conventional algorithms.
Abstract: It has been shown that the narrowband (NB) interference suppression capability of a direct-sequence (DS) spread spectrum system can be enhanced considerably by processing the received signal via a prediction error filter. The conventional approach to this problem makes use of a linear filter. However, the binary DS signal, that acts as noise in the prediction process, is highly non-Gaussian. Thus, linear filtering is not optimal. Vijayan and Poor (1990) first proposed using a nonlinear approximate conditional mean (ACM) filter of the Masreliez (1975) type and obtained significant results. This paper proposes a number of new nonlinear algorithms. Our work consists of three parts. (1) We develop a decision-directed Kalman (DDK) filter, that has the same performance as the ACM filter but a simpler structure. (2) Using the nonlinear function in the ACM and the DDK filters, we develop other nonlinear least mean square (LMS) filters with improved performance. (3) We further use the nonlinear functions to develop nonlinear recursive least squares (RLS) filters that can be used independently as predictors or as interference identifiers so that the ACM or the DDK filter can be applied. Simulations show that our nonlinear algorithms outperform conventional ones.
50 citations
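The prediction-error structure the paper builds on can be illustrated with a toy nonlinear LMS predictor. This is a sketch only: it uses `tanh` as a stand-in soft-decision nonlinearity, whereas the ACM/DDK nonlinearities of the paper depend on the noise statistics and differ in detail; the filter order and step size are illustrative:

```python
import numpy as np

def nonlinear_lms_predictor(r, order=8, mu=0.01):
    """Toy prediction-error filter for narrowband interference
    suppression in a DS spread-spectrum receiver.

    The filter predicts the predictable (narrowband) part of r; the
    prediction error then contains mostly the DS signal. A tanh()
    nonlinearity (an assumed stand-in, not the paper's ACM/DDK
    function) limits the influence of the non-Gaussian DS component
    on the adaptation.
    """
    w = np.zeros(order)
    e = np.zeros(len(r))
    for n in range(order, len(r)):
        u = r[n - order:n][::-1]
        e[n] = r[n] - w @ u                # prediction error
        w = w + mu * np.tanh(e[n]) * u     # nonlinear error feedback
    return e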
14 Mar 2010
TL;DR: A closed-form, recursive solution of the filter weights under this criterion is shown to yield a simple weighted-least-squares-like formulation that is much faster than gradient-based training and more accurate than the RLS algorithm in cases where the error pdf is non-Gaussian and heavy-tailed.
Abstract: This paper presents a closed-form recursive solution for training adaptive filters using the Maximum Correntropy Criterion (MCC). Correntropy has been recently proposed as a robust similarity measure between two random variables or signals when the pdfs involved are heavy-tailed and non-Gaussian. Maximizing the cross-correntropy between the output of an adaptive filter and the desired response leads to the Maximum Correntropy Criterion for adaptive systems training. We show that a closed-form, recursive solution of the filter weights using this criterion yields a simple weighted-least-squares-like formulation. Our simulations show that training the filter weights using this recursive solution is much faster than gradient-based training, and more accurate than the RLS algorithm in cases where the error pdf is non-Gaussian and heavy-tailed.
50 citations
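The weighted-least-squares structure that correntropy maximization induces can be sketched in batch form. This is a hypothetical batch fixed-point sketch, not the paper's recursive (sample-by-sample) solution; the kernel width `sigma` and iteration count are assumed parameters:

```python
import numpy as np

def mcc_weighted_ls(X, d, sigma=1.0, iters=20):
    """Fixed-point weighted least squares under the Maximum
    Correntropy Criterion: each residual e gets a Gaussian weight
    exp(-e^2 / 2 sigma^2), so heavy-tailed outliers are down-weighted.
    """
    w = np.linalg.lstsq(X, d, rcond=None)[0]   # ordinary LS start
    for _ in range(iters):
        e = d - X @ w
        g = np.exp(-e**2 / (2 * sigma**2))     # correntropy weights
        W = X.T * g                            # X^T diag(g)
        w = np.linalg.solve(W @ X, W @ d)      # weighted normal eqs.
    return w
```

On regression data contaminated with impulsive outliers, this estimate stays close to the true weights while ordinary least squares degrades, which mirrors the robustness claim in the abstract.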
TL;DR: The purpose of this paper is to introduce a novel Gram-Schmidt orthogonalization predictor realization, and to present an adaptive algorithm to update its coefficients (weights), along with corresponding results obtained via some existing adaptive predictor algorithms.
Abstract: The purpose of this paper is to introduce a novel Gram-Schmidt orthogonalization predictor realization, and also to present an adaptive algorithm to update its coefficients (weights). Experimental results pertaining to this algorithm are included, along with corresponding results obtained via some existing adaptive predictor algorithms.
50 citations
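The orthogonalization step underlying such a predictor can be sketched with classical Gram-Schmidt. This block only shows the orthogonalization of the (delayed-input) data columns, which is what lets each predictor weight be adapted independently; it is not the paper's adaptive coefficient-update algorithm:

```python
import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt: return Q with orthonormal columns
    spanning the same subspace as the columns of X."""
    Q = np.zeros(X.shape)
    for j in range(X.shape[1]):
        v = X[:, j].astype(float)
        for i in range(j):
            # subtract the projection onto each earlier direction
            v = v - (Q[:, i] @ X[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q
```

In the orthogonal basis the input correlation matrix is the identity, so the least-squares weights decouple.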
TL;DR: This paper discusses in detail a recently proposed kernel-based version of the recursive least-squares (RLS) algorithm for fast adaptive nonlinear filtering that combines a sliding-window approach with conventional ridge regression to improve generalization.
Abstract: In this paper we discuss in detail a recently proposed kernel-based version of the recursive least-squares (RLS) algorithm for fast adaptive nonlinear filtering. Unlike other previous approaches, the studied method combines a sliding-window approach (to fix the dimensions of the kernel matrix) with conventional ridge regression (to improve generalization). The resulting kernel RLS algorithm is applied to several nonlinear system identification problems. Experiments show that the proposed algorithm is able to operate in a time-varying environment and to adjust to abrupt changes in either the linear filter or the nonlinearity.
49 citations
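The sliding-window idea described in the abstract can be sketched directly. This is a naive sketch that re-solves kernel ridge regression on the window at every step (the actual kernel RLS algorithm updates the inverse kernel matrix incrementally); the Gaussian kernel, window length, kernel width, and ridge constant are all illustrative assumptions:

```python
import numpy as np

def gauss_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

class SlidingWindowKRR:
    """Sliding-window kernel ridge regression: keep only the last
    `window` samples so the kernel matrix has a fixed size, and
    regularize with a ridge term to improve generalization.
    """
    def __init__(self, window=50, sigma=1.0, ridge=1e-2):
        self.window, self.sigma, self.ridge = window, sigma, ridge
        self.X, self.y = [], []

    def update(self, x, y):
        self.X.append(np.asarray(x, float))
        self.y.append(float(y))
        if len(self.X) > self.window:        # drop the oldest sample
            self.X.pop(0); self.y.pop(0)
        X = np.array(self.X)
        K = gauss_kernel(X, X, self.sigma)
        self.alpha = np.linalg.solve(K + self.ridge * np.eye(len(X)),
                                     np.array(self.y))

    def predict(self, x):
        X = np.array(self.X)
        k = gauss_kernel(np.atleast_2d(np.asarray(x, float)),
                         X, self.sigma)[0]
        return k @ self.alpha
```

Because only the most recent samples are retained, the model naturally forgets old data, which is what allows tracking in a time-varying environment.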