Journal ArticleDOI

Bayesian Compressive Sensing Using Laplace Priors

TLDR
This paper models the components of the compressive sensing (CS) problem, i.e., the signal acquisition process, the unknown signal coefficients, and the model parameters for the signal and noise, within the Bayesian framework, and develops a constructive (greedy) algorithm designed for fast reconstruction in practical settings.
Abstract
In this paper, we model the components of the compressive sensing (CS) problem, i.e., the signal acquisition process, the unknown signal coefficients, and the model parameters for the signal and noise, within the Bayesian framework. We utilize a hierarchical form of the Laplace prior to model the sparsity of the unknown signal. We describe the relationship among a number of sparsity priors proposed in the literature, and show the advantages of the proposed model, including its high degree of sparsity. Moreover, we show that some of the existing models are special cases of the proposed model. Using our model, we develop a constructive (greedy) algorithm designed for fast reconstruction in practical settings. Unlike most existing CS reconstruction methods, the proposed algorithm is fully automated, i.e., the unknown signal coefficients and all necessary parameters are estimated solely from the observation, and, therefore, no user intervention is needed. Additionally, the proposed algorithm provides estimates of the uncertainty of the reconstructions. We provide experimental results with synthetic 1-D signals and images, and compare with state-of-the-art CS reconstruction algorithms, demonstrating the superior performance of the proposed approach.
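As a rough sketch of the kind of hierarchy the abstract describes (the notation below, with coefficients w, measurement matrix Φ, noise precision β, and hyperparameters γ and λ, is assumed here and need not match the paper's), the observation model and the Laplace prior built as a Gaussian scale mixture can be written as

\[ y = \Phi w + n, \qquad n \sim \mathcal{N}(0, \beta^{-1} I), \]
\[ p(w_i \mid \gamma_i) = \mathcal{N}(w_i \mid 0, \gamma_i), \qquad p(\gamma_i \mid \lambda) = \frac{\lambda}{2} \exp\!\left(-\frac{\lambda \gamma_i}{2}\right), \]

and marginalizing over each \gamma_i yields a Laplace prior on the coefficients,

\[ p(w_i \mid \lambda) = \int_0^\infty \mathcal{N}(w_i \mid 0, \gamma_i)\, p(\gamma_i \mid \lambda)\, d\gamma_i = \frac{\sqrt{\lambda}}{2} \exp\!\left(-\sqrt{\lambda}\,\lvert w_i \rvert\right). \]

Hyperpriors on \lambda and on the noise precision \beta would complete a fully automated hierarchy of this kind.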


Citations
Journal ArticleDOI

Inducing sparsity via the horseshoe prior in imaging problems

Yiqiu Dong, +1 more
24 May 2023
TL;DR: In this article, a hierarchical Bayesian framework with a Maximum A Posteriori (MAP) estimation approach is proposed to solve the problem of image reconstruction under sparsity constraints. The unknown is modeled as a conditionally Gaussian random variable with an unknown variance, and the expected behavior of the variance is encoded in a half-Cauchy hyperprior.
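For context, the standard horseshoe construction (introduced by Carvalho, Polson, and Scott; whether the cited work also uses a global scale \tau is an assumption here) is

\[ x_i \mid \lambda_i, \tau \sim \mathcal{N}(0, \lambda_i^2 \tau^2), \qquad \lambda_i \sim \mathcal{C}^{+}(0, 1), \]

so each coefficient is conditionally Gaussian with an unknown variance; the half-Cauchy scale induces a marginal prior on x_i with an infinite spike at the origin (aggressive shrinkage of small, noise-like coefficients) and Cauchy-like tails (large coefficients are left nearly unshrunk).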
Posted Content

Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-Differentiable Priors.

TL;DR: In this article, piecewise-deterministic Markov processes (PDMPs) are used for exact posterior inference from distributions that are differentiable almost everywhere; the proposed PDMP-based samplers place no assumptions on the shape of the prior and consequently have a much broader scope of application.
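For orientation only, here is the generic construction such samplers build on (this sketches the bouncy particle sampler as one standard example; it is not the specific sampler of the cited work): to target \pi(x) \propto \exp(-U(x)), a particle moves along straight lines, \dot{x}(t) = v, and its velocity is reflected at the events of an inhomogeneous Poisson process,

\[ \lambda(x, v) = \max\bigl(0, \langle v, \nabla U(x) \rangle\bigr), \qquad v \leftarrow v - 2\,\frac{\langle v, \nabla U(x) \rangle}{\lVert \nabla U(x) \rVert^{2}}\,\nabla U(x), \]

which needs \nabla U only where it exists (almost everywhere) rather than a globally differentiable log-density.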
Dissertation

Fusión bayesiana de pares de imágenes con diferente exposición (Bayesian fusion of image pairs with different exposures)

TL;DR: In this work, the authors propose a robust, automatic method for calibrating a pair of images with different exposures, for the subsequent application of the restoration process.
References
Book

Pattern Recognition and Machine Learning

TL;DR: Topics studied include probability distributions, linear models for regression, linear models for classification, neural networks, graphical models, mixture models and EM, sampling methods, continuous latent variables, and sequential data.
Journal ArticleDOI

Pattern Recognition and Machine Learning

Radford M. Neal
01 Aug 2007
TL;DR: This book covers a broad range of topics for regular factorial designs, presents all of the material in a very mathematical fashion, and will surely become an invaluable resource for researchers and graduate students doing research in the design of factorial experiments.
Book

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients can be extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Journal ArticleDOI

Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information

TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that, with probability at least 1 − O(N^{−M}), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
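Roughly, the recovery statement behind that TL;DR has the following form (reproduced from memory of the paper's main theorem, so the exact constants and conditions should be checked against the original): if f is supported on a set T and its discrete Fourier transform is observed on a random set \Omega of frequencies, then the solution of

\[ \min_{g} \lVert g \rVert_{\ell_1} \quad \text{subject to} \quad \hat{g}(\omega) = \hat{f}(\omega) \ \text{for all } \omega \in \Omega \]

equals f exactly with probability at least 1 - O(N^{-M}), provided |T| \le C_M \, |\Omega| / \log N for a constant C_M depending only on the exponent M.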
Journal ArticleDOI

Atomic Decomposition by Basis Pursuit

TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
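As a minimal, self-contained sketch of how the BP problem min ||x||_1 subject to Phi x = y reduces to a linear program (the function name, solver choice, and problem sizes below are illustrative assumptions, not code from any of the papers listed here):

import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    # min ||x||_1 s.t. Phi x = y, via auxiliary variables t with |x_i| <= t_i
    # and the stacked LP variable z = [x, t].
    m, n = Phi.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum(t)
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0 and -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([Phi, np.zeros((m, n))])       # Phi x = y (t does not appear)
    bounds = [(None, None)] * n + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:n]

# Toy usage: recover a 5-sparse signal from 40 random measurements.
rng = np.random.default_rng(0)
n, m, k = 128, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = basis_pursuit(Phi, Phi @ x_true)
print("max reconstruction error:", np.max(np.abs(x_hat - x_true)))

A dense LP like this is only practical for small problems; at image scale, specialized solvers or greedy methods (such as the fully automated Bayesian algorithm of the main paper above) are used instead.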