C-HiLasso: A Collaborative Hierarchical Sparse Modeling Framework
TL;DR
This work combines the sparsity-inducing property of the Lasso at the individual feature level with the block-sparsity property of the Group Lasso, where sparse groups of features are jointly encoded. The resulting hierarchically structured sparsity pattern yields the Hierarchical Lasso (HiLasso), which shows important practical advantages.
Abstract
Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is performed by solving an l1-regularized linear regression problem, commonly referred to as Lasso or Basis Pursuit. In this work we combine the sparsity-inducing property of the Lasso at the individual feature level with the block-sparsity property of the Group Lasso, where sparse groups of features are jointly encoded, obtaining a hierarchically structured sparsity pattern. This results in the Hierarchical Lasso (HiLasso), which shows important practical advantages. We then extend this approach to the collaborative case, where a set of simultaneously coded signals share the same sparsity pattern at the higher (group) level, but not necessarily at the lower (inside the group) level, obtaining the collaborative HiLasso model (C-HiLasso). Such signals then share the same active groups, or classes, but not necessarily the same active set. This model is well suited for applications such as source identification and separation. An efficient optimization procedure, which guarantees convergence to the global optimum, is developed for these new models. The presentation of the framework and the optimization approach is complemented by experimental examples and theoretical results regarding recovery guarantees.
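To make the combined penalty concrete, here is a minimal, self-contained Python sketch (an illustration under stated assumptions, not the authors' implementation): it minimizes $\frac{1}{2}\|x - Da\|_2^2 + \lambda_1\|a\|_1 + \lambda_2\sum_g \|a_g\|_2$ with a plain ISTA-style proximal gradient loop, using the fact that for this hierarchical penalty the proximal operator is an elementwise soft-threshold followed by groupwise shrinkage. The dictionary D, the group partition, the weights lam1/lam2, and the iteration count are arbitrary illustrative choices.

import numpy as np

def prox_hilasso(a, groups, lam1, lam2, t):
    # Proximal operator of t*(lam1*||a||_1 + lam2*sum_g ||a_g||_2):
    # elementwise soft-threshold, then shrink each group toward zero.
    a = np.sign(a) * np.maximum(np.abs(a) - t * lam1, 0.0)
    for g in groups:
        ng = np.linalg.norm(a[g])
        a[g] = 0.0 if ng == 0 else max(0.0, 1.0 - t * lam2 / ng) * a[g]
    return a

def hilasso_ista(x, D, groups, lam1=0.1, lam2=0.1, iters=500):
    # Minimize 0.5*||x - D a||^2 + lam1*||a||_1 + lam2*sum_g ||a_g||_2.
    a = np.zeros(D.shape[1])
    t = 1.0 / np.linalg.norm(D, 2) ** 2   # step size from the Lipschitz constant of the gradient
    for _ in range(iters):
        grad = D.T @ (D @ a - x)
        a = prox_hilasso(a - t * grad, groups, lam1, lam2, t)
    return a

# Toy usage: a random dictionary with two groups of five atoms each; the
# synthetic signal uses two atoms from the first group only, so the recovered
# code should be group-sparse and sparse inside the active group.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 10))
groups = [np.arange(0, 5), np.arange(5, 10)]
x = 2.0 * D[:, 1] + 1.5 * D[:, 3]
a_hat = hilasso_ista(x, D, groups)

In the collaborative C-HiLasso setting, the group term is applied jointly to each group's block of the coefficient matrix of all signals, which encourages the signals to share active groups while the l1 term keeps individual coefficients sparse.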
Citations
Journal Article
Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Jose M. Bioucas-Dias, Antonio Plaza, Nicolas Dobigeon, Mario Parente, Qian Du, Paul D. Gader, Jocelyn Chanussot +6 more
TL;DR: This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial to the present, including signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms.
Posted Content
Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Jose M. Bioucas-Dias, Antonio Plaza, Nicolas Dobigeon, Mario Parente, Qian Du, Paul D. Gader, Jocelyn Chanussot +6 more
TL;DR: An overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial to the present.
Journal Article
Structured Compressed Sensing: From Theory to Applications
Marco F. Duarte, Yonina C. Eldar +1 more
TL;DR: The prime focus is on bridging theory and practice, pinpointing the potential of structured CS strategies to emerge from the math to the hardware.
Posted Content
Optimization with Sparsity-Inducing Penalties
TL;DR: In this article, the authors present, from a general perspective, optimization tools and techniques dedicated to sparsity-inducing penalties, including proximal methods, block-coordinate descent, reweighted $\ell_2$-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions.
Book
Optimization with Sparsity-Inducing Penalties
TL;DR: This monograph covers proximal methods, block-coordinate descent, reweighted l2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view.
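As a small illustration of the reweighted-$\ell_2$ idea listed in these summaries (a sketch only, not code from the monograph; lam, eps, and the iteration count are arbitrary placeholders): each $|a_j|$ is replaced by a quadratic upper bound with weight $1/(|a_j| + \epsilon)$, so every iteration of the $\ell_1$-penalized regression reduces to a ridge-type linear solve.

import numpy as np

def lasso_irls(x, D, lam=0.1, iters=50, eps=1e-6):
    # Reweighted-l2 (IRLS) approximation of 0.5*||x - D a||^2 + lam*||a||_1,
    # using the bound |a_j| <= a_j**2 / (2*eta_j) + eta_j / 2 with eta_j = |a_j|.
    DtD, Dtx = D.T @ D, D.T @ x
    a = np.linalg.solve(DtD + lam * np.eye(D.shape[1]), Dtx)   # ridge warm start
    for _ in range(iters):
        w = 1.0 / (np.abs(a) + eps)                            # per-coefficient weights
        a = np.linalg.solve(DtD + lam * np.diag(w), Dtx)       # weighted ridge solve
    return a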
References
Journal Article
Regression Shrinkage and Selection via the Lasso
TL;DR: A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
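In symbols (generic notation, not copied from the paper), the constrained form described in that summary is $\hat{\beta} = \arg\min_{\beta} \sum_i \big(y_i - \sum_j x_{ij}\beta_j\big)^2$ subject to $\sum_j |\beta_j| \le t$, where the bound $t \ge 0$ controls the amount of shrinkage; for a suitable Lagrange multiplier this is equivalent to the l1-penalized form used throughout the work above.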
Book
Compressed sensing
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Journal Article
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that, with probability at least $1 - O(N^{-M})$, f can be reconstructed exactly as the solution to the $\ell_1$ minimization problem.
Journal Article
Atomic Decomposition by Basis Pursuit
TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
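In symbols (again generic notation), BP selects the decomposition of a signal $s$ over a dictionary $\Phi$ by solving $\min_{a} \|a\|_1$ subject to $\Phi a = s$; the noisy variant, Basis Pursuit Denoising, relaxes the equality constraint into a quadratic residual penalty, giving the Lasso-type objective referred to in the abstract above.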
Journal Article
Robust Face Recognition via Sparse Representation
TL;DR: This work considers the problem of automatically recognizing human faces from frontal views with varying expression and illumination, as well as occlusion and disguise, and proposes a general classification algorithm for (image-based) object recognition based on a sparse representation computed by $\ell_1$-minimization.