Open Access Journal ArticleDOI

Simultaneously Sparse and Low-Rank Abundance Matrix Estimation for Hyperspectral Image Unmixing

TL;DR
Two novel unmixing algorithms are introduced in an attempt to exploit both spatial correlation and sparse representation of pixels lying in the homogeneous regions of hyperspectral images and are illustrated in experiments conducted both on simulated and real data.
Abstract
In a plethora of applications dealing with inverse problems, e.g., image processing, social networks, compressive sensing, and biological data processing, the signal of interest is known to be structured in several ways at the same time. This premise has recently guided research into the innovative and meaningful idea of imposing multiple constraints on the unknown parameters involved in the problem under study. For instance, when dealing with problems whose unknown parameters form sparse and low-rank matrices, the adoption of suitably combined constraints imposing sparsity and low rankness is expected to yield substantially enhanced estimation results. In this paper, we address the spectral unmixing problem in hyperspectral images. Specifically, two novel unmixing algorithms are introduced in an attempt to exploit both spatial correlation and sparse representation of pixels lying in the homogeneous regions of hyperspectral images. To this end, a novel mixed penalty term is first defined consisting of the sum of the weighted $\ell_{1}$ norm and the weighted nuclear norm of the abundance matrix corresponding to a small area of the image determined by a sliding square window. This penalty term is then used to regularize a conventional quadratic cost function and impose simultaneous sparsity and low rankness on the abundance matrix. The resulting regularized cost function is minimized by: 1) an incremental proximal sparse and low-rank unmixing algorithm; and 2) an algorithm based on the alternating direction method of multipliers (ADMM). The effectiveness of the proposed algorithms is illustrated in experiments conducted both on simulated and real data.
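The mixed penalty is built from two pieces that each admit a cheap proximal operator, which is what both minimizers exploit. Below is a minimal numpy sketch of those two operators under stated assumptions: the window layout (endmembers as rows, window pixels as columns), the reweighting rules, the step size, and the function names are illustrative placeholders, not the paper's exact scheme.

```python
import numpy as np

def prox_weighted_l1(X, W, t):
    """Entrywise weighted soft-thresholding: the proximal operator of
    t * sum(W * |X|)."""
    return np.sign(X) * np.maximum(np.abs(X) - t * W, 0.0)

def prox_weighted_nuclear(X, w, t):
    """Weighted singular value thresholding: the proximal operator of
    t * sum(w[i] * sigma_i(X)), valid when w is nondecreasing so that
    larger singular values are penalized less."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - t * w, 0.0)) @ Vt

# Toy abundance matrix for one sliding window: rows index endmembers,
# columns index the window's pixels (an assumed layout).
rng = np.random.default_rng(0)
A = np.abs(rng.normal(size=(5, 9)))

# Reweighting in the spirit of the weighted l1 / weighted nuclear norms:
W = 1.0 / (np.abs(A) + 1e-3)                           # entrywise weights
w = 1.0 / (np.linalg.svd(A, compute_uv=False) + 1e-3)  # singular value weights

# One incremental pass applies the two operators back to back; the joint
# prox of the summed penalty is what the paper's algorithms approximate.
A_sparse_lowrank = prox_weighted_nuclear(prox_weighted_l1(A, W, 0.1), w, 0.1)
```

Applying the two operators in sequence is the incremental-proximal view; the ADMM variant instead splits the two terms into separate variables coupled by consensus constraints.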


Citations
Proceedings ArticleDOI

Superpixel Based Low-Rank Sparse Unmixing for Hyperspectral Remote Sensing Image

TL;DR: In this article, a superpixel-based low-rank sparse unmixing (SpLRSU) algorithm is proposed, which encourages local spatial consistency and spatial continuity of the image.
Proceedings ArticleDOI

Enhancing Reweighted Low-Rank Representation for Hyperspectral Image Unmixing

TL;DR: Wang et al. propose a weighted nuclear norm regularization that enhances the sparsity of the abundance matrix's singular values; it takes information from all singular values into account rather than from particular singular values only.
Proceedings ArticleDOI

A Bayesian Model for Joint Unmixing and Robust Classification of Hyperspectral Images

TL;DR: A new hierarchical Bayesian model that performs supervised classification and spectral unmixing simultaneously, so that each analysis benefits from the other.
Journal ArticleDOI

Two-step iterative row-sparsity hyperspectral unmixing via low-rank constraint

TL;DR: A novel algorithm called two-step iterative row-sparsity hyperspectral unmixing via a low-rank constraint (TRSUnLR) is proposed, which introduces a row-hard-threshold function to handle the ℓ2,0 norm directly.
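TRSUnLR's exact operator is not reproduced here, but generic row hard thresholding for an ℓ2,0 row-sparsity constraint can be sketched as follows; the function name and the top-k selection rule are assumptions:

```python
import numpy as np

def row_hard_threshold(X, k):
    """Keep the k rows of X with the largest l2 norms and zero the rest,
    i.e. project onto the set of matrices with at most k nonzero rows
    (an l2,0 row-sparsity constraint)."""
    norms = np.linalg.norm(X, axis=1)
    keep = np.argsort(norms)[-k:]   # indices of the k largest row norms
    out = np.zeros_like(X)
    out[keep] = X[keep]
    return out
```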
Journal ArticleDOI

Low-Rank and Spectral-Spatial Sparse Unmixing for Hyperspectral Remote Sensing Imagery

TL;DR: In this paper, spatial weights are incorporated into the collaborative sparse regularization term to enhance the spatial continuity of the image, while a global low-rank constraint is employed to preserve its low-dimensional spatial structure.
References
Book

Matrix computations

Gene H. Golub, Charles F. Van Loan
Book

Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers

TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
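As a concrete instance of the splitting the monograph advocates, here is a minimal scaled-form ADMM for the lasso; the fixed penalty parameter rho and iteration count stand in for the stopping criteria and parameter updates Boyd et al. actually recommend:

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=100):
    """Minimal scaled-form ADMM for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    using the splitting x = z."""
    n = A.shape[1]
    x = z = u = np.zeros(n)
    AtA = A.T @ A + rho * np.eye(n)   # cached for the repeated x-update
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))                    # quadratic step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # l1 prox
        u = u + x - z                                                    # dual update
    return z
```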
Journal ArticleDOI

The adaptive lasso and its oracle properties

TL;DR: A new version of the lasso is proposed, called the adaptive lasso, where adaptive weights are used for penalizing different coefficients in the ℓ1 penalty, and the nonnegative garotte is shown to be consistent for variable selection.
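The adaptive weighting rule itself is simple enough to state as a one-function sketch; gamma and the zero-guard eps below are assumed tuning choices:

```python
import numpy as np

def adaptive_lasso_weights(beta_pilot, gamma=1.0, eps=1e-8):
    """Adaptive-lasso weights w_j = 1 / |beta_pilot_j|**gamma built from a
    consistent pilot estimate (e.g., OLS); coefficients the pilot deems
    large receive a lighter l1 penalty. eps guards against division by zero."""
    return 1.0 / (np.abs(beta_pilot) + eps) ** gamma
```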
Journal ArticleDOI

Enhancing Sparsity by Reweighted ℓ1 Minimization

TL;DR: A novel method for sparse signal recovery that in many situations outperforms ℓ1 minimization in the sense that substantially fewer measurements are needed for exact recovery.
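A sketch of the reweighting loop, assuming a user-supplied solver solve_weighted_l1(w) for the inner weighted problem (w=None meaning uniform weights); the iteration count and eps follow the paper's spirit but are illustrative:

```python
import numpy as np

def reweighted_l1(solve_weighted_l1, iters=4, eps=1e-2):
    """Candes-Wakin-Boyd style reweighting: each pass solves a weighted l1
    problem, then sets w_i = 1/(|x_i| + eps) so that small entries are
    penalized harder and pushed toward zero on the next pass."""
    x = solve_weighted_l1(None)   # first pass: uniform weights
    for _ in range(iters - 1):
        x = solve_weighted_l1(1.0 / (np.abs(x) + eps))
    return x
```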
Book

Proximal Algorithms

TL;DR: The many different interpretations of proximal operators and algorithms are discussed, their connections to many other topics in optimization and applied mathematics are described, some popular algorithms are surveyed, and a large number of examples of proximal operators that commonly arise in practice are provided.
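A minimal instance of the proximal gradient template surveyed there, using soft-thresholding (the ℓ1 prox) as a running example; the fixed step size and the toy problem are illustrative:

```python
import numpy as np

def proximal_gradient(grad_f, prox_g, x0, step, iters=200):
    """Basic proximal gradient iteration x <- prox_{step*g}(x - step*grad_f(x)).
    grad_f and prox_g are user-supplied callables."""
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Example: min_x 0.5*||x - y||^2 + lam*||x||_1, solved with the l1 prox.
y, lam = np.array([3.0, -0.2, 0.7]), 0.5
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_hat = proximal_gradient(lambda x: x - y, soft, np.zeros_like(y), step=1.0, iters=50)
# x_hat == [2.5, 0.0, 0.2]: entries below lam are zeroed, the rest shrunk.
```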