Simultaneously Sparse and Low-Rank Abundance Matrix Estimation for Hyperspectral Image Unmixing
Citations
132 citations
Cites background from "Simultaneously Sparse and Low-Rank ..."
...observation, the works [220]–[223], [231] consider the re-...
[...]
90 citations
Cites methods from "Simultaneously Sparse and Low-Rank ..."
...In this section, we quantitatively and visually evaluate the unmixing performance of the proposed SULoRA on a synthetic dataset presented in [14] and two real datasets over the areas of Urban and MUUFL Gulfport Campus, in comparison with eight classical and state-of-the-art methods, including FCLSU, PCLSU, SPCLSU, SUnSAL, SSUnSAL (scaled SUnSAL), SLRU (sparse and low-rank unmixing) [38], PLMM and ELMM....
[...]
79 citations
Cites background or methods from "Simultaneously Sparse and Low-Rank ..."
...straint becomes a powerful unmixing scheme, leading to many state-of-the-art algorithms (see [8], [10], [12], [18]–[21] and references therein)....
[...]
...The reweighting strategy is widely used for many practical problems (see [12], [29], [61])....
[...]
...Though theoretical convergence analysis is hard to establish, a series of research works has numerically shown the remarkable performance of the reweighted ℓ1 in [12] and...
[...]
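The reweighting idea referenced in these excerpts can be sketched in a few lines: each pass re-derives per-entry weights from the current estimate, so small entries are penalized more heavily and the weighted ℓ1 norm better approximates the ℓ0 pseudo-norm. A minimal sketch (the function name and eps value are illustrative, not taken from the cited works):

```python
import numpy as np

def reweighted_l1_weights(x, eps=1e-3):
    """One reweighting step: small entries receive large weights,
    so the weighted l1 norm better mimics the l0 pseudo-norm.
    eps prevents division by zero for exactly-zero entries."""
    return 1.0 / (np.abs(x) + eps)

# One pass of iteratively reweighted l1 on a fixed estimate:
x = np.array([0.0, 0.5, 0.0, 0.2])
w = reweighted_l1_weights(x)
weighted_l1 = np.sum(w * np.abs(x))
```

In practice the weights are recomputed from each new estimate and plugged back into the weighted penalty, which is what makes the overall scheme nonconvex.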
...On the other hand, a low-rank constraint of the abundance matrix has been increasingly adopted for sparse unmixing, providing a new perspective for spatial correlation [12], [21], [26]–[28], as well as in other applications, such as compressive sensing [29] and tensor completion [30], [31]....
[...]
...[12] simultaneously impose single sparsity and low rankness on the abundance matrix for pixels lying in the homogeneous regions of HSIs....
[...]
68 citations
56 citations
References
[...]
34,729 citations
"Simultaneously Sparse and Low-Rank ..." refers background in this paper
...Concerning the computational complexity of IPSpLRU, the most complex step is the SVD of the abundance matrix Wt, which takes place in each iteration and is of the order of O(KN² + K³) [32]. When the endmembers' dictionary is ill-conditioned (which is a very usual situation in hyperspectral unmixing applications due to the high correlation of endmember signatures), convergence of Wt (fr...
[...]
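The per-iteration SVD mentioned above is what makes singular-value thresholding possible, i.e. the proximal step associated with the nuclear norm. A small sketch with illustrative sizes (K endmembers, N pixels); this is the standard SVT recipe, not necessarily IPSpLRU's exact update:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 5, 100                  # illustrative sizes: K endmembers, N pixels
W = rng.random((K, N))         # abundance matrix, one column per pixel

# Thin SVD of W -- the costly step performed at every iteration:
U, s, Vt = np.linalg.svd(W, full_matrices=False)

# Singular-value thresholding: the proximal operator of tau * ||.||_*
tau = 0.1
s_thr = np.maximum(s - tau, 0.0)
W_lr = (U * s_thr) @ Vt        # low-rank-promoting update of W
```

Shrinking the singular values toward zero is exactly how the nuclear-norm penalty pushes the abundance matrix toward low rank.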
17,433 citations
"Simultaneously Sparse and Low-Rank ..." refers background or methods in this paper
...holds for the primal and dual residuals, where ε = √((3N + L)K) ζ [13] (the relative tolerance ζ > 0 takes its value depending on the application and, in our experimental study, has been empirically set to 10⁻⁴), or the maximum number of iterations is reached....
[...]
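The stopping rule in this excerpt compares the primal and dual residual norms against a tolerance scaled by the problem dimensions. The grouping ε = √((3N + L)K) · ζ below is an assumption reconstructed from the garbled formula, following the usual ADMM convention of scaling a relative tolerance by the square root of the variable count:

```python
import numpy as np

def admm_tolerance(N, L, K, zeta=1e-4):
    """eps = sqrt((3N + L) * K) * zeta, where zeta is the relative
    tolerance (assumed grouping; the excerpt's formula is garbled)."""
    return np.sqrt((3 * N + L) * K) * zeta

def has_converged(primal_res, dual_res, eps):
    """Stop when both residual norms fall below the tolerance."""
    return np.linalg.norm(primal_res) <= eps and np.linalg.norm(dual_res) <= eps
```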
...To proceed, we utilize the auxiliary matrix variables Ω1, Ω2, Ω3, and Ω4 of proper dimensions (similar to [11] and [24]) and reformulate the original problem (P2) into its equivalent ADMM form [13], i....
[...]
...At the final step of the proposed method, the scaled Lagrange multipliers in Λ are sequentially updated by performing gradient ascent on the dual problem [13], as follows:...
[...]
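The multiplier update described above is the standard scaled-form ADMM dual step: gradient ascent on the dual amounts to adding the current constraint residual to the multiplier. A generic sketch for a constraint A X = Ω (the names A, X, Omega are placeholders, not the paper's exact splitting variables):

```python
import numpy as np

def dual_update(Lmbd, A, X, Omega):
    """Scaled ADMM multiplier step for the constraint A X = Omega:
    Lmbd <- Lmbd + (A X - Omega), i.e. a unit gradient-ascent step
    on the dual problem."""
    return Lmbd + (A @ X - Omega)
```

When the constraint is satisfied the residual vanishes and the multiplier stops moving, which is what ties the dual ascent to primal feasibility.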
...The first algorithm comes from the family of incremental proximal algorithms, which was recently presented and analyzed in [19], and makes use of the proximal operators of all the terms appearing in (P2), whereas the second algorithm exploits the splitting strategy of the ADMM philosophy [13]....
[...]
...Minimization of the resulting regularized cost function is performed by an alternating direction method of multipliers (ADMM) [13]....
[...]
6,765 citations
"Simultaneously Sparse and Low-Rank ..." refers background in this paper
...As is widely known [25], [26], [29], proper selection of these parameters is quite crucial for the accuracy of the estimations....
[...]
...It is thus empirically verified that the enhanced efficiency of the reweighted ℓ1 and nuclear norms, emphatically advocated in [25]–[27], is retained when using the sum of these two norms....
[...]
4,869 citations
"Simultaneously Sparse and Low-Rank ..." refers background in this paper
...As is widely known [25], [26], [29], proper selection of these parameters is quite crucial for the accuracy of the estimations....
[...]
...4 Nevertheless, numerous research works advocate the positive impact of these nonconvex weighted norms on the performance of general constrained estimation tasks [26], [27], [29], [35] as well as in hyperspectral unmixing [36], [37]....
[...]
...Additionally, the reweighted norm minimization problem is known to be inherently nonconvex [26], and its theoretical convergence analysis for these cases is difficult to establish....
[...]
[...]
3,627 citations
"Simultaneously Sparse and Low-Rank ..." refers background in this paper
...Let us first recall that the proximal operator of a function f(·) is defined as [31], [32]...
[...]
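The recalled definition is prox_f(v) = argmin_x { f(x) + (1/2)‖x − v‖²₂ }. For f = λ‖·‖₁ this minimizer has the familiar entrywise closed form below (soft-thresholding); this is a textbook example rather than the specific f terms used in the paper:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of f(x) = lam * ||x||_1:
    argmin_x lam * ||x||_1 + 0.5 * ||x - v||^2,
    computed entrywise by soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
```

Entries smaller than λ in magnitude are set exactly to zero, which is why proximal methods built on this operator produce genuinely sparse iterates.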