Journal ArticleDOI

Approximate maximum likelihood hyperparameter estimation for Gibbs priors

TLDR
An approximate ML estimator that is computed simultaneously with a maximum a posteriori (MAP) image estimate and the results of a Monte Carlo study that examines the bias and variance of this estimator when applied to image restoration are presented.
Abstract
The parameters of the prior, the hyperparameters, play an important role in Bayesian image estimation. Of particular importance for the case of Gibbs priors is the global hyperparameter, β, which multiplies the Hamiltonian. Here we consider maximum likelihood (ML) estimation of β from incomplete data, i.e., problems in which the image, which is drawn from a Gibbs prior, is observed indirectly through some degradation or blurring process. Important applications include image restoration and image reconstruction from projections. Exact ML estimation of β from incomplete data is intractable for most image processing problems. Here we present an approximate ML estimator that is computed simultaneously with a maximum a posteriori (MAP) image estimate. The algorithm is based on a mean field approximation technique through which multidimensional Gibbs distributions are approximated by a separable function equal to a product of one-dimensional (1-D) densities. We show how this approach can be used to simplify the ML estimation problem. We also show how the Gibbs-Bogoliubov-Feynman (GBF) bound can be used to optimize the approximation for a restricted class of problems. We present the results of a Monte Carlo study that examines the bias and variance of this estimator when applied to image restoration.
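The flavor of the scheme can be illustrated in the one special case where everything is tractable: a Gaussian Gibbs prior with quadratic Hamiltonian H(x) = xᵀAx, for which E[H] = n/(2β). This is a toy sketch of the alternating MAP/β idea only, not the paper's mean-field or GBF algorithm; all names and parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy sketch (illustrative, not the paper's algorithm): Gaussian Gibbs prior
# p(x) ∝ exp(-beta * H(x)) with quadratic Hamiltonian H(x) = x^T A x.
# The Gaussian identity E[H] = n/(2*beta) suggests the simple update
# beta <- n / (2 * H(x_MAP)), alternated with MAP estimation of the image.
rng = np.random.default_rng(0)
n = 200
D = np.diff(np.eye(n), axis=0)          # first-difference operator
A = np.eye(n) + D.T @ D                 # positive definite -> proper prior

beta_true, sigma = 2.0, 0.05
# Draw an "image" from the prior N(0, (2*beta_true*A)^{-1}) and observe it
# through additive Gaussian noise (a trivial degradation process).
x = np.linalg.cholesky(np.linalg.inv(2 * beta_true * A)) @ rng.standard_normal(n)
y = x + sigma * rng.standard_normal(n)

beta = 0.5                              # deliberately wrong initial guess
for _ in range(30):
    # MAP image estimate for the current beta (closed form in the Gaussian case)
    x_map = np.linalg.solve(2 * beta * sigma**2 * A + np.eye(n), y)
    # Approximate ML update: match H(x_map) to its prior expectation n/(2*beta)
    beta = n / (2 * (x_map @ A @ x_map))

print(f"beta_true = {beta_true}, beta_hat = {beta:.3f}")
```

Plugging a point estimate in place of the unobserved image generally biases the resulting β estimate; quantifying that bias and variance for the mean-field estimator is exactly what the paper's Monte Carlo study addresses.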


Citations
Journal ArticleDOI

Gradient-based iterative image reconstruction scheme for time-resolved optical tomography

TL;DR: Numerical studies suggest that intraventricular hemorrhages can be detected using the GIIR technique, even in the presence of a heterogeneous background.
Book ChapterDOI

Statistical Image Reconstruction Methods for Transmission Tomography

TL;DR: In this article, the authors discuss algorithms for reconstructing attenuation images from low-count transmission scans, in which the mean number of photons per ray is small enough that traditional filtered-backprojection (FBP) images, or even methods based on the Gaussian approximation to the distribution of the Poisson measurements (or their logarithm), are inadequate.
Journal ArticleDOI

Statistical approaches in quantitative positron emission tomography

TL;DR: Recent progress in developing statistical approaches to image estimation that can overcome limitations in direct modeling of the detector system or of the inherent statistical fluctuations in the data is reviewed.
Journal ArticleDOI

Spatiotemporal reconstruction of list-mode PET data

TL;DR: A method for computing a continuous time estimate of tracer density using list-mode positron emission tomography data using an inhomogeneous Poisson process whose rate function can be represented using a cubic B-spline basis is described.
References
Journal ArticleDOI

Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

TL;DR: An analogy between images and statistical mechanics systems is drawn: the analogous annealing operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, leading to a highly parallel "relaxation" algorithm for MAP estimation.
Book

Spline models for observational data

Grace Wahba
TL;DR: In this paper, a theory and practice for the estimation of functions from noisy data on functionals is developed; convergence properties, data-based smoothing parameter selection, confidence intervals, and numerical methods appropriate to a range of problems within this framework are established.
Book

Linear and nonlinear programming

TL;DR: A standard text on linear and nonlinear programming, covering unconstrained and constrained optimization and establishing convergence properties for standard and accelerated steepest-descent methods.
Journal ArticleDOI

On the statistical analysis of dirty pictures

TL;DR: In this paper, the authors propose an iterative method for scene reconstruction, assuming the local characteristics of the original scene can be represented by a non-degenerate Markov random field (MRF), with the reconstruction estimated according to standard criteria.