Open Access Journal ArticleDOI

Expectation-Maximization Gaussian-Mixture Approximate Message Passing

Jeremy Vila, Philip Schniter
01 Oct 2013 · Vol. 61, Iss. 19, pp. 4658-4672
TL;DR
An empirical-Bayesian technique is proposed that simultaneously learns the signal distribution while MMSE-recovering the signal, according to the learned distribution, using AMP; the non-zero distribution is modeled as a Gaussian mixture whose parameters are learned through expectation maximization, with AMP implementing the expectation step.
Abstract
When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal's non-zero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution were known a priori, then one could use computationally efficient approximate message passing (AMP) techniques for nearly minimum MSE (MMSE) recovery. In practice, however, the distribution is unknown, motivating the use of robust algorithms like LASSO, which is nearly minimax optimal, at the cost of significantly larger MSE for non-least-favorable distributions. As an alternative, we propose an empirical-Bayesian technique that simultaneously learns the signal distribution while MMSE-recovering the signal, according to the learned distribution, using AMP. In particular, we model the non-zero distribution as a Gaussian mixture and learn its parameters through expectation maximization, using AMP to implement the expectation step. Numerical experiments on a wide range of signal classes confirm the state-of-the-art performance of our approach, in both reconstruction error and runtime, in the high-dimensional regime, for most (but not all) sensing operators.
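To make the recovery step above concrete: under a Bernoulli-Gaussian-mixture signal prior of the form p(x) = (1 - λ)δ(x) + λ Σ_k ω_k N(x; θ_k, φ_k), the per-coefficient MMSE estimate that AMP evaluates from a Gaussian pseudo-measurement r = x + N(0, s) has a closed form. The sketch below is a minimal illustration of that scalar denoiser only; the function name gm_denoise and its parameters are assumptions of this summary, not the authors' EM-GM-AMP code, and the surrounding AMP and EM iterations are omitted.

```python
import numpy as np

def gm_denoise(r, s, lam, omega, theta, phi):
    """Scalar MMSE estimate of x from r = x + N(0, s), assuming the prior
        p(x) = (1 - lam) * delta(x) + lam * sum_k omega[k] * N(x; theta[k], phi[k]).
    Illustrative sketch only; not the authors' EM-GM-AMP implementation."""
    r = np.atleast_1d(np.asarray(r, dtype=float))[:, None]            # shape (N, 1)
    omega, theta, phi = (np.asarray(a, dtype=float)[None, :]          # shape (1, K)
                         for a in (omega, theta, phi))

    # Evidence of the zero (spike) component: r ~ N(0, s)
    ev_spike = (1.0 - lam) * np.exp(-r**2 / (2 * s)) / np.sqrt(2 * np.pi * s)
    # Evidence of each active Gaussian component: r ~ N(theta_k, phi_k + s)
    var_k = phi + s
    ev_k = lam * omega * np.exp(-(r - theta)**2 / (2 * var_k)) / np.sqrt(2 * np.pi * var_k)

    # Posterior responsibilities of the active components, and their posterior means
    norm = ev_spike + ev_k.sum(axis=1, keepdims=True)
    resp_k = ev_k / norm
    mean_k = (r * phi + theta * s) / var_k

    # The spike contributes mean 0, so the MMSE estimate is the weighted active mean
    return (resp_k * mean_k).sum(axis=1)

# Tiny usage example with made-up parameters
x_hat = gm_denoise(r=[0.1, 2.5, -3.0], s=0.25,
                   lam=0.1, omega=[0.5, 0.5], theta=[0.0, 0.0], phi=[1.0, 9.0])
```

In the approach described above, the posterior quantities produced by such a denoiser (means, variances, and component responsibilities) would in turn feed the EM updates of λ, ω, θ, and φ.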


Citations
Proceedings ArticleDOI

Correlation learning on joint support recovery for more sources than measurements

TL;DR: This work first analyzes the MSBL algorithm by Wipf and Rao and provides its limit in the M ≤ D regime, and further develops Bayesian methods for learning the correlation structures both temporally and spatially.
Dissertation

Algorithms for 3D time-of-flight imaging

Jonathan Mei
TL;DR: This thesis describes the design and implementation of two novel frameworks and processing schemes for 3D imaging based on time-of-flight (TOF) principles, which incorporate both accurate probabilistic modeling of the measurement process and underlying scene depth-map sparsity to accurately extend the unambiguous depth range of the camera.
Proceedings ArticleDOI

YAMPA: Yet Another Matching Pursuit Algorithm for compressive sensing

TL;DR: This paper formulates an iterative algorithm, termed yet another matching pursuit algorithm (YAMPA), for recovery of sparse signals from compressive measurements, and shows that YAMPA uniformly outperforms other pursuit algorithms for the case of thresholding parameters chosen in a clairvoyant fashion.
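The summary above does not spell out YAMPA's iteration; for orientation, the sketch below shows a generic orthogonal matching pursuit (OMP) loop, a standard member of the matching-pursuit family that such algorithms are typically compared against. The function omp, its arguments, and its stopping rule are illustrative assumptions and do not reproduce YAMPA's specific thresholding strategy.

```python
import numpy as np

def omp(A, y, k, tol=1e-6):
    """Generic orthogonal matching pursuit: recover a k-sparse x from y ≈ A x.
    Illustrative baseline only; this is not YAMPA's update rule."""
    n = A.shape[1]
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit of the coefficients on the current support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x

# Example: recover a 3-sparse vector from 30 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100)) / np.sqrt(30)
x_true = np.zeros(100)
x_true[[5, 17, 42]] = [1.0, -2.0, 1.5]
x_hat = omp(A, A @ x_true, k=3)
```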
Journal ArticleDOI

Joint Activity and Channel Estimation for Extra-Large MIMO Systems

TL;DR: In this article, a bilinear Bayesian inference algorithm is proposed to jointly estimate the associated channel coefficients, user activity patterns, and sub-array activity patterns in XL-MIMO systems.
Journal ArticleDOI

Influence of multi-walled carbon nanotubes on the fracture response and phase distribution of metakaolin-based potassium geopolymers

TL;DR: In this paper, the effects of multi-walled carbon nanotubes (MWCNTs) on the chemistry, microstructure, phase distribution, and fracture response of potassium-based metakaolin geopolymers at the microscopic scale were investigated.
References
Journal ArticleDOI

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
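Written out, the constrained form described above is the following program (response y, design matrix X, coefficients β, and a user-chosen bound t; the symbols are assumed here rather than quoted from the entry):

```latex
\hat{\beta}^{\text{lasso}}
  \;=\; \arg\min_{\beta}\; \big\| y - X\beta \big\|_2^2
  \qquad \text{subject to} \qquad
  \sum_{j} |\beta_j| \;\le\; t .
```

The ℓ1 constraint is what drives some coefficients exactly to zero, which is the selection behavior the summary refers to.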
Journal ArticleDOI

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

TL;DR: A new fast iterative shrinkage-thresholding algorithm (FISTA) is proposed that preserves the computational simplicity of ISTA but achieves a global rate of convergence that is proven to be significantly better, both theoretically and practically.
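For reference, the accelerated recursion this entry describes, for minimizing F(x) = f(x) + g(x) with f smooth (Lipschitz-gradient constant L) and g handled by a proximal step, is commonly written as below; the symbols are stated here as assumptions rather than quoted from the entry:

```latex
x_k \;=\; \operatorname{prox}_{g/L}\!\Big( y_k - \tfrac{1}{L}\,\nabla f(y_k) \Big),
\qquad
t_{k+1} \;=\; \frac{1 + \sqrt{1 + 4 t_k^2}}{2},
\qquad
y_{k+1} \;=\; x_k + \frac{t_k - 1}{t_{k+1}}\,\big( x_k - x_{k-1} \big),
```

with t_1 = 1 and y_1 = x_0. The momentum step defining y_{k+1} is what improves ISTA's O(1/k) objective-error rate to O(1/k^2).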
Journal ArticleDOI

Atomic Decomposition by Basis Pursuit

TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
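Written out, the BP principle described above is the convex program (dictionary Φ, signal s, coefficient vector α; notation assumed here):

```latex
\min_{\alpha}\; \|\alpha\|_1
\qquad \text{subject to} \qquad
\Phi\,\alpha \;=\; s ,
```

and in the noisy setting the equality constraint is typically relaxed to a bound of the form ‖Φα − s‖₂ ≤ ε (basis pursuit denoising).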
Journal ArticleDOI

Sparse Bayesian Learning and the Relevance Vector Machine

TL;DR: It is demonstrated that by exploiting a probabilistic Bayesian learning framework, the 'relevance vector machine' (RVM) can derive accurate prediction models which typically utilise dramatically fewer basis functions than a comparable SVM while offering a number of additional advantages.
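The sparsity mechanism behind the RVM summarized above is a zero-mean Gaussian prior with a separate precision hyperparameter per weight, tuned by maximizing the marginal likelihood (type-II maximum likelihood); in the notation assumed here (weights w, targets t, noise variance σ²):

```latex
p(w \mid \alpha) \;=\; \prod_i \mathcal{N}\!\big( w_i \,\big|\, 0,\; \alpha_i^{-1} \big),
\qquad
\hat{\alpha}, \hat{\sigma}^2 \;=\; \arg\max_{\alpha,\, \sigma^2}
  \int p(t \mid w, \sigma^2)\, p(w \mid \alpha)\, dw ,
```

where weights whose optimized precision α_i diverges are pruned, leaving only a few "relevant" basis functions.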