Proceedings ArticleDOI
Thresholding-based online algorithms of complexity comparable to sparse LMS methods
Yannis Kopsinis, Konstantinos Slavakis, Sergios Theodoridis, Stephen McLaughlin +3 more
pp. 513–516
TLDR
A novel class of set-theoretic adaptive sparsity-promoting algorithms of linear computational complexity, in which sparsity is induced via generalized thresholding operators that correspond to nonconvex penalties such as those used in a number of sparse LMS-based schemes.
Abstract:
This paper deals with a novel class of set-theoretic adaptive sparsity-promoting algorithms of linear computational complexity. Sparsity is induced via generalized thresholding operators, which correspond to nonconvex penalties such as those used in a number of sparse LMS-based schemes. The results demonstrate the significant performance gain of our approach, at comparable computational cost.
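As a hedged illustration of the abstract's idea — an online update of linear (O(L)) per-step cost followed by a thresholding operator — here is a minimal sparse system-identification sketch. This is not the paper's set-theoretic (projection-based) algorithm; the soft rule, the signal model, and every parameter value below are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse unknown system: 64 taps, only 4 of them nonzero (illustrative).
h = np.zeros(64)
h[[3, 17, 40, 55]] = [1.0, -0.5, 0.8, 0.3]

def soft_threshold(w, t):
    # Soft rule as a stand-in for a generalized thresholding operator
    # (the GT family also covers rules tied to nonconvex penalties).
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def thresholded_online_filter(h, n_iter=20000, mu=0.01, t=2e-4):
    w = np.zeros_like(h)
    x_buf = np.zeros_like(h)                          # input delay line
    for _ in range(n_iter):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()
        d = h @ x_buf + 1e-3 * rng.standard_normal()  # noisy system output
        e = d - w @ x_buf                             # a-priori error
        w = soft_threshold(w + mu * e * x_buf, t)     # O(L) update + O(L) threshold
    return w

w_hat = thresholded_online_filter(h)
print(np.linalg.norm(w_hat - h))   # small steady-state misalignment
```

The thresholding step keeps the off-support taps pinned near zero, at the cost of a small shrinkage bias on the active taps.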
Citations
Journal ArticleDOI
Generalized Thresholding and Online Sparsity-Aware Learning in a Union of Subspaces
TL;DR: In this paper, a generalized thresholding (GT) operator, which relates to both convex and non-convex penalty functions, is introduced; it captures, in a unified way, the majority of well-known thresholding rules that promote sparsity.
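Two common members of the thresholding-rule family mentioned in this summary can be sketched directly (an illustrative comparison only; the specific GT operator of the cited work is more general than these two rules):

```python
import numpy as np

def soft_threshold(w, t):
    # Soft rule: the proximal operator of the convex l1 penalty t*|w|.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def hard_threshold(w, t):
    # Hard rule: keep-or-kill, associated with an l0-like nonconvex penalty.
    return np.where(np.abs(w) > t, w, 0.0)

w = np.array([-1.5, -0.2, 0.0, 0.3, 2.0])
print(soft_threshold(w, 0.5))   # zeros all |w| <= 0.5; survivors shrunk by 0.5
print(hard_threshold(w, 0.5))   # zeros all |w| <= 0.5; survivors left untouched
```

The practical difference: soft thresholding biases the retained coefficients toward zero, while hard thresholding leaves them unchanged but is discontinuous.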
Posted Content
Generalized Thresholding and Online Sparsity-Aware Learning in a Union of Subspaces
TL;DR: A generalized thresholding (GT) operator, which relates to both convex and non-convex penalty functions, is introduced, along with a novel family of partially quasi-nonexpansive mappings that serves as a functional-analytic tool for treating the GT operator.
Proceedings ArticleDOI
Sparsity-aware learning in the context of echo cancelation: A set theoretic estimation approach
TL;DR: The source of this malfunction is investigated, a solution complying with the set-theoretic philosophy is proposed, and the new algorithm is evaluated in realistic echo-cancellation scenarios and compared with state-of-the-art methods for echo cancellation such as the IPNLMS and IPAPA algorithms.
Journal ArticleDOI
Sparsity-Aware Adaptive Learning: A Set Theoretic Estimation Approach.
TL;DR: A recent family of schemes, which build upon convex analytic tools, can easily deal with the existence of a set of convex constraints and also bypass the need for differentiability of cost functions.
Posted Content
Generalized Thresholding Sparsity-Aware Online Learning in a Union of Subspaces
TL;DR: A new theory is developed that allows for the incorporation, in a unifying way, of different thresholding rules to promote sparsity, which may even be related to non-convex penalty functions.
References
Book
Compressed sensing
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
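The Basis Pursuit linear program mentioned in this summary can be sketched in a few lines (an illustrative toy problem; the dimensions, seed, and tolerance are arbitrary choices, and SciPy's general-purpose `linprog` stands in for a dedicated solver):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n = 30, 80                              # 30 measurements of an 80-dim signal
x_true = np.zeros(n)
x_true[[5, 22, 60]] = [1.0, -2.0, 0.5]     # 3-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                             # noiseless measurements

# Basis Pursuit: min ||x||_1 s.t. Ax = y, cast as an LP over x = u - v, u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print(np.allclose(x_hat, x_true, atol=1e-4))   # expected True when l1 recovery succeeds
```

With far fewer measurements than unknowns, the ℓ1 objective still picks out the sparse solution — the phenomenon the compressed-sensing result formalizes.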
Journal ArticleDOI
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that, with probability at least 1 − O(N^(−M)), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Journal ArticleDOI
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Jianqing Fan, Runze Li +1 more
TL;DR: In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known.
Book
Fundamentals of adaptive filtering
TL;DR: This book presents the fundamentals of adaptive filtering, covering the principal adaptive filter structures and algorithms together with their performance analysis.
Proceedings ArticleDOI
Sparse LMS for system identification
TL;DR: A new approach to adaptive system identification when the system model is sparse is proposed, which results in a zero-attracting LMS (ZA-LMS) and a reweighted zero-attracting LMS (RZA-LMS), and it is proved that ZA-LMS can achieve lower mean square error than the standard LMS.
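The ZA-LMS recursion described in this summary — a standard LMS update plus a sign-based "zero attractor" — can be sketched as follows (a toy system-identification setup; the filter length, step size `mu`, and attractor strength `rho` are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparse unknown system: 32 taps, 3 nonzero (illustrative).
h = np.zeros(32)
h[[2, 9, 25]] = [0.7, -1.0, 0.4]

def za_lms(h, n_iter=20000, mu=0.01, rho=1e-4):
    w = np.zeros_like(h)
    x_buf = np.zeros_like(h)                          # input delay line
    for _ in range(n_iter):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()
        d = h @ x_buf + 1e-3 * rng.standard_normal()  # noisy system output
        e = d - w @ x_buf
        # ZA-LMS: LMS term plus the l1-induced zero attractor.
        # RZA-LMS would replace sign(w) by sign(w) / (1 + eps * abs(w)).
        w = w + mu * e * x_buf - rho * np.sign(w)
    return w

w = za_lms(h)
print(np.linalg.norm(w - h))   # small steady-state misalignment
```

The `rho * sign(w)` term nudges every tap toward zero; inactive taps stay pinned near zero while active taps incur only a small bias, which is what yields the lower MSE on sparse systems.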