Expectation-Maximization Gaussian-Mixture Approximate Message Passing
Citations
Cites methods from "Expectation-Maximization Gaussian-M..."
...The parameters $\eta$ and $\sigma_L^2$ can be learned by the EM-GAMP algorithm [22] if they are unknown....
[...]
References
"Expectation-Maximization Gaussian-M..." refers background in this paper
...polynomial-complexity algorithms when x is sufficiently sparse and when A satisfies certain restricted isometry properties [4], or when A is large with i.i.d. random entries [5] as discussed below. Lasso [6] (or, equivalently, Basis Pursuit Denoising [7]) is a well-known approach to the sparse-signal recovery problem that solves the convex problem $\hat{x}_{\text{lasso}} = \arg\min_{\hat{x}} \|y - A\hat{x}\|_2^2 + \lambda_{\text{lasso}}\|\hat{x}\|_1$ (1), with $\lambda$...
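The convex problem (1) quoted in this excerpt can be solved by many methods. As a minimal illustration only (this is not the paper's FISTA or AMP machinery, and all function names here are our own), a plain proximal-gradient (ISTA) sketch in Python looks like:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(y, A, lam, n_iters=1000):
    """Approximately solve min_x ||y - A x||_2^2 + lam * ||x||_1 via ISTA.

    A sketch, not a tuned solver: fixed step size 1/L, where L is the
    Lipschitz constant of the gradient of the quadratic term.
    """
    L = 2.0 * np.linalg.norm(A, 2) ** 2      # 2 * (spectral norm)^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = 2.0 * A.T @ (A @ x - y)       # gradient of ||y - Ax||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

With a small regularization parameter and noiseless measurements of a sparse signal, the residual shrinks toward zero; FISTA (used in the paper's experiments) accelerates exactly this iteration.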
[...]
"Expectation-Maximization Gaussian-M..." refers methods or result in this paper
... tried SPGL1 but found performance degradations at small M. (Footnote 11: for FISTA, we used the regularization parameter $\lambda_{\text{FISTA}} = 10^{-5}$, which is consistent with the values used for the noiseless experiments in [26].)
[Figure residue removed; caption: Fig. 4. Empirical PTCs and LASSO theoretical ...; axes M/N and K/M with ticks 0.1–0.9; legend: EM-GM-AMP-MOS, EM-GM-AMP, EM-BG-AMP, genie GM-AMP, DMM-AMP, theoretical LASSO.] ...
[...]
...tions of a K-sparse BG signal and an i.i.d. $\mathcal{N}(0, M^{-1})$ matrix A. We then recovered x from the noiseless measurements using EM-GM-AMP-MOS, EM-GM-AMP, EM-BG-AMP, genie-GM-AMP, and the Lasso solver FISTA (footnotes 10, 11) [26]. Figure 6 shows that the PTCs of EM-GM-AMP-MOS and EM-GM-AMP are nearly identical, slightly better than those of EM-BG-AMP and genie-GM-AMP (especially at very small M), and much better than FISTA's. ...
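The trial setup this excerpt describes — a K-sparse Bernoulli-Gaussian signal measured noiselessly through an i.i.d. $\mathcal{N}(0, M^{-1})$ matrix — can be sketched as follows. The function names and the NMSE helper are ours, not the paper's code:

```python
import numpy as np

def gen_cs_problem(N, M, K, rng):
    """One trial of the phase-transition experiment described above:
    a K-sparse signal x with Gaussian nonzeros (Bernoulli-Gaussian),
    an i.i.d. N(0, 1/M) sensing matrix A, and noiseless y = A x."""
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)
    x[support] = rng.standard_normal(K)            # Gaussian nonzero values
    A = rng.standard_normal((M, N)) / np.sqrt(M)   # i.i.d. N(0, 1/M) entries
    return A @ x, A, x

def nmse_db(xhat, x):
    """Normalized MSE in dB; phase-transition studies typically declare a
    trial a 'success' when this falls below some fixed threshold."""
    return 10.0 * np.log10(np.sum((xhat - x) ** 2) / np.sum(x ** 2))
```

Sweeping M/N and K/M over a grid of such trials, and recording the empirical success rate of each recovery algorithm, produces the phase-transition curves compared in the excerpt.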
[...]
"Expectation-Maximization Gaussian-M..." refers background in this paper
...nd when A satisfies certain restricted isometry properties [4], or when A is large with i.i.d. zero-mean sub-Gaussian entries [5] as discussed below. LASSO [6] (or, equivalently, Basis Pursuit Denoising [7]) is a well-known approach to the sparse-signal recovery problem that solves the convex problem $\hat{x}_{\text{lasso}} = \arg\min_{\hat{x}} \|y - A\hat{x}\|_2^2 + \lambda_{\text{lasso}}\|\hat{x}\|_1$ (1), with $\lambda_{\text{lasso}}$ a tuning parameter that trades between th...
[...]
"Expectation-Maximization Gaussian-M..." refers methods in this paper
...tic unknowns, our proposed EM-GM-AMP algorithm can be classified as an “empirical-Bayesian” approach [16]. Compared with previously proposed empirical-Bayesian approaches to compressive sensing (e.g., [17]–[19]), ours has a more flexible signal model, and thus is able to better match a wide range of signal pdfs $p_X(\cdot)$, as we demonstrate through a detailed numerical study. In addition, the complexity scal...
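To illustrate the kind of flexible Gaussian-mixture signal model this excerpt refers to, here is a plain, sample-based EM fit of a one-dimensional Gaussian mixture. This is only a hedged sketch of the E/M updates: EM-GM-AMP itself learns the mixture parameters from compressed measurements via AMP's posterior messages, not from direct samples of x, and this function is our own illustration:

```python
import numpy as np

def em_gmm_1d(x, L=3, n_iters=50, seed=0):
    """Fit an L-component 1-D Gaussian mixture to samples x by standard EM."""
    rng = np.random.default_rng(seed)
    w = np.full(L, 1.0 / L)                         # mixture weights
    mu = rng.choice(x, size=L, replace=False).astype(float)
    var = np.full(L, np.var(x) + 1e-12)
    for _ in range(n_iters):
        # E-step: responsibilities gamma[n, l] = P(component l | x[n])
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2.0 * np.pi * var) + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)     # for numerical stability
        gamma = np.exp(logp)
        gamma /= gamma.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        Nl = gamma.sum(axis=0)
        w = Nl / len(x)
        mu = (gamma * x[:, None]).sum(axis=0) / Nl
        var = (gamma * (x[:, None] - mu) ** 2) .sum(axis=0) / Nl + 1e-12
    return w, mu, var
```

Because the mixture's weights, means, and variances are all learned, such a prior can approximate a wide range of signal pdfs $p_X(\cdot)$, which is the flexibility the excerpt contrasts with earlier empirical-Bayesian models.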
[...]