Least mean squares filter
About: Least mean squares filter is a research topic. Over its lifetime, 6482 publications on this topic have received 103804 citations. The topic is also known as: Least Mean Square Filter.
01 Jan 1986
TL;DR: The author develops the family of least-mean-square and recursive least-squares (RLS) adaptive filters, presenting the Kalman filter as the unifying basis for RLS filters.
Abstract: Background and Overview. 1. Stochastic Processes and Models. 2. Wiener Filters. 3. Linear Prediction. 4. Method of Steepest Descent. 5. Least-Mean-Square Adaptive Filters. 6. Normalized Least-Mean-Square Adaptive Filters. 7. Transform-Domain and Sub-Band Adaptive Filters. 8. Method of Least Squares. 9. Recursive Least-Square Adaptive Filters. 10. Kalman Filters as the Unifying Bases for RLS Filters. 11. Square-Root Adaptive Filters. 12. Order-Recursive Adaptive Filters. 13. Finite-Precision Effects. 14. Tracking of Time-Varying Systems. 15. Adaptive Filters Using Infinite-Duration Impulse Response Structures. 16. Blind Deconvolution. 17. Back-Propagation Learning. Epilogue. Appendix A. Complex Variables. Appendix B. Differentiation with Respect to a Vector. Appendix C. Method of Lagrange Multipliers. Appendix D. Estimation Theory. Appendix E. Eigenanalysis. Appendix F. Rotations and Reflections. Appendix G. Complex Wishart Distribution. Glossary. Abbreviations. Principal Symbols. Bibliography. Index.
01 Jan 1985
TL;DR: This chapter discusses Adaptive Arrays and Adaptive Beamforming, as well as other Adaptive Algorithms and Structures, and discusses the Z-Transform in Adaptive Signal Processing.
Abstract: GENERAL INTRODUCTION. Adaptive Systems. The Adaptive Linear Combiner. THEORY OF ADAPTATION WITH STATIONARY SIGNALS. Properties of the Quadratic Performance Surface. Searching the Performance Surface. Gradient Estimation and Its Effects on Adaptation. ADAPTIVE ALGORITHMS AND STRUCTURES. The LMS Algorithm. The Z-Transform in Adaptive Signal Processing. Other Adaptive Algorithms and Structures. Adaptive Lattice Filters. APPLICATIONS. Adaptive Modeling and System Identification. Inverse Adaptive Modeling, Deconvolution, and Equalization. Adaptive Control Systems. Adaptive Interference Cancelling. Introduction to Adaptive Arrays and Adaptive Beamforming. Analysis of Adaptive Beamformers.
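The LMS algorithm listed in the contents above is the core update behind this entire topic: a filter whose weights are nudged along the instantaneous gradient estimate, w ← w + μ·e·u. A minimal sketch in a system-identification setting (the step size, tap count, and the "unknown" system below are illustrative choices, not from any of the listed papers):

```python
import numpy as np

def lms_filter(x, d, num_taps=4, mu=0.05):
    """Basic LMS adaptive filter.

    x: input signal, d: desired signal, mu: step size.
    Returns the final weight vector and the error signal.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # most recent sample first
        y = w @ u                            # filter output
        e[n] = d[n] - y                      # instantaneous error
        w += mu * e[n] * u                   # LMS weight update
    return w, e

# Identify a hypothetical 4-tap FIR system from its input/output signals.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])          # "unknown" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]               # desired = system output
w, e = lms_filter(x, d, num_taps=4, mu=0.05)
```

After convergence, `w` approximates `h` and the error signal decays toward zero; stability requires μ to be small relative to the inverse of the input power times the filter length.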
01 Jan 1989
TL;DR: The author explains the Wiener solution for optimum steady-state processing and the techniques used to implement it, including the method of steepest descent, the LMS algorithm, and direct inversion of the sample covariance matrix.
Abstract: 1 Introduction.- 1.1 Introduction.- 1.2 Organization of the Book.- 1.3 Notations and Preliminaries.- 2 Detection of Multiple Signals.- 2.1 Signals and Noise.- 2.2 Conventional Techniques.- 2.2.1 Beamformer.- 2.2.2 Capon's Minimum Variance Estimator.- 2.2.3 Linear Prediction Method.- 2.3 Eigenvector-Based Techniques.- 2.3.1 Completely Coherent Case.- 2.3.2 Symmetric Array Scheme: Coherent Sources in a Correlated Scene.- 2.3.3 Spatial Smoothing Schemes: Direction Finding in a Coherent Environment.- 2.4 Augmentation and Other Techniques.- 2.4.1 Augmentation Technique.- 2.4.2 ESPRIT, TLS-ESPRIT and GEESE.- 2.4.3 Direction Finding Using First Order Statistics.- Appendix 2.A Coherent and Correlated Signal Scene.- Appendix 2.B Program Listings.- Problems.- References.- 3 Performance Analysis.- 3.1 Introduction.- 3.2 The Maximum Likelihood Estimate of the Covariance Matrix and Some Related Distributions.- 3.3 Performance Analysis of Covariance Based Eigenvector Techniques: MUSIC and Spatial Smoothing Schemes.- 3.3.1 Asymptotic Distribution of Eigenparameters Associated with Smoothed Sample Covariance Matrices.- 3.3.2 Two-Source Case - Uncorrelated and Coherent Scene.- 3.4 Performance Evaluation of GEESE Scheme.- 3.4.1 The Least Favorable Configuration (J = K).- 3.4.2 The Most Favorable Configuration (J = M - 1).- 3.5 Estimation of Number of Signals.- Appendix 3.A The Complex Wishart Distribution.- Appendix 3.B Equivalence of Eigenvectors.- Appendix 3.C Eigenparameters in a Two Source Case.- Problems.- References.- 4 Estimation of Multiple Signals.- 4.1 Introduction.- 4.2 Optimum Processing: Steady State Performance and the Wiener Solution.- 4.3 Implementation of the Wiener Solution.- 4.3.1 The Method of Steepest Descent.- 4.3.2 The Least Mean Square (LMS) Algorithm.- 4.3.3 Direct Implementation by Inversion of the Sample Covariance Matrix.- Problems.- References.
TL;DR: A diffusion-based distributed estimation algorithm is proposed that is cooperative and responds in real time to changes in the environment; closed-form expressions describing the network's performance in terms of mean-square-error quantities are derived.
Abstract: We formulate and study distributed estimation algorithms based on diffusion protocols to implement cooperation among individual adaptive nodes. The individual nodes are equipped with local learning abilities. They derive local estimates for the parameter of interest and share information with their neighbors only, giving rise to peer-to-peer protocols. The resulting algorithm is distributed, cooperative and able to respond in real time to changes in the environment. It improves performance in terms of transient and steady-state mean-square error, as compared with traditional noncooperative schemes. Closed-form expressions that describe the network performance in terms of mean-square error quantities are derived, presenting a very good match with simulations.
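The diffusion protocol described above alternates a local LMS adaptation with an information exchange among neighbors. A minimal adapt-then-combine sketch, in which each node averages the intermediate estimates in its neighborhood (the ring topology, uniform combination weights, step size, and noise level are illustrative assumptions):

```python
import numpy as np

# Adapt-then-combine diffusion LMS over a small ring network (sketch).
rng = np.random.default_rng(1)
num_nodes, num_taps, mu, T = 5, 3, 0.05, 4000
h = np.array([0.7, -0.4, 0.2])               # common parameter of interest

# Ring topology: each node combines with itself and its two neighbors.
neighbors = {k: [(k - 1) % num_nodes, k, (k + 1) % num_nodes]
             for k in range(num_nodes)}

W = np.zeros((num_nodes, num_taps))          # one local estimate per node
x = rng.standard_normal((num_nodes, T))      # per-node input signals
for n in range(num_taps - 1, T):
    psi = np.empty_like(W)
    for k in range(num_nodes):
        u = x[k, n - num_taps + 1:n + 1][::-1]
        d = h @ u + 0.01 * rng.standard_normal()   # noisy local measurement
        psi[k] = W[k] + mu * (d - W[k] @ u) * u    # adapt: local LMS step
    for k in range(num_nodes):
        W[k] = psi[neighbors[k]].mean(axis=0)      # combine: neighborhood avg
```

Every node ends up with an estimate of the common parameter `h`; the combine step is what yields the improved transient and steady-state mean-square error over noncooperative (adapt-only) operation.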
TL;DR: A least-mean-square adaptive filter with a variable step size is introduced, allowing the filter to track changes in the system while producing a small steady-state error.
Abstract: A least-mean-square (LMS) adaptive filter with a variable step size is introduced. The step size increases or decreases as the mean-square error increases or decreases, allowing the adaptive filter to track changes in the system as well as produce a small steady state error. The convergence and steady-state behavior of the algorithm are analyzed. The results reduce to well-known results when specialized to the constant-step-size case. Simulation results are presented to support the analysis and to compare the performance of the algorithm with the usual LMS algorithm and another variable-step-size algorithm. They show that its performance compares favorably with these existing algorithms.
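The idea above can be sketched as an LMS loop whose step size is driven by the squared error and clipped to a fixed range, so it is large while the error is large and shrinks near convergence. The specific update rule and parameter values below are illustrative, not necessarily those analyzed in the paper:

```python
import numpy as np

def vss_lms(x, d, num_taps, alpha=0.97, gamma=4.8e-4,
            mu_min=1e-4, mu_max=0.1):
    """Variable-step-size LMS: mu grows with the squared error and
    decays toward mu_min as the filter converges (illustrative rule)."""
    w = np.zeros(num_taps)
    mu = mu_max
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]
        e[n] = d[n] - w @ u
        w += mu * e[n] * u                   # standard LMS weight update
        # Step-size recursion: leak toward zero, boost on large errors.
        mu = float(np.clip(alpha * mu + gamma * e[n] ** 2, mu_min, mu_max))
    return w, e

rng = np.random.default_rng(2)
h = np.array([0.6, -0.2, 0.1])               # hypothetical unknown system
x = rng.standard_normal(6000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = vss_lms(x, d, num_taps=3)
```

Setting `mu_min == mu_max` recovers the constant-step-size LMS algorithm, which mirrors how the paper's analysis reduces to the well-known fixed-step results.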