Proceedings Article

Bayesian imaging using Good's roughness measure - implementation on a massively parallel processor

TL;DR
A constrained maximum-likelihood estimator is derived by incorporating a rotationally invariant roughness penalty proposed by I.J. Good (1981) into the likelihood functional, which leads to a set of nonlinear differential equations whose solution is a spline smoothing of the data.
Abstract
A constrained maximum-likelihood estimator is derived by incorporating a rotationally invariant roughness penalty proposed by I.J. Good (1981) into the likelihood functional. This leads to a set of nonlinear differential equations whose solution is a spline smoothing of the data. The nonlinear partial differential equations are mapped onto a grid via finite differences, and it is shown that the resulting computations possess a high degree of parallelism as well as locality in the data passage, which allows an efficient implementation on a 48-by-48 mesh-connected array of NCR GAPP processors. The smooth reconstruction of the intensity functions of Poisson point processes is demonstrated in two dimensions.
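To make the construction concrete, here is a minimal sketch (not the authors' code) of the penalized-likelihood idea for directly observed Poisson counts on a 2-D grid. It assumes the penalty in the form 4*beta*sum|grad sqrt(lambda)|^2, discretizes the Laplacian with the unit-spacing 5-point stencil, and iterates a Jacobi-style per-pixel update of u = sqrt(lambda). The function names and the fixed-point scheme are illustrative assumptions, not the paper's actual discretization.

import numpy as np

def neighbor_sum(u):
    # Sum of the four nearest neighbors (5-point stencil, replicated edges).
    up    = np.pad(u, ((1, 0), (0, 0)), mode="edge")[:-1, :]
    down  = np.pad(u, ((0, 1), (0, 0)), mode="edge")[1:, :]
    left  = np.pad(u, ((0, 0), (1, 0)), mode="edge")[:, :-1]
    right = np.pad(u, ((0, 0), (0, 1)), mode="edge")[:, 1:]
    return up + down + left + right

def goods_roughness_smooth(counts, beta=2.0, n_iter=200):
    # Hypothetical sketch: maximize sum(2n log u - u^2) - 4*beta*sum|grad u|^2
    # over u = sqrt(lambda). Setting each pixel's derivative to zero with a
    # unit-spacing 5-point Laplacian gives the per-pixel quadratic
    #   (1 + 16*beta) u^2 - 4*beta*s*u - n = 0,   s = neighbor sum,
    # whose positive root is the Jacobi update used below.
    n = counts.astype(float)
    u = np.sqrt(n + 1.0)                      # initial guess for sqrt(lambda)
    c = 4.0 * beta
    a = 1.0 + 4.0 * c
    for _ in range(n_iter):
        s = neighbor_sum(u)
        u = (c * s + np.sqrt((c * s) ** 2 + 4.0 * a * n)) / (2.0 * a)
    return u ** 2                             # reconstructed intensity lambda

# Example: smooth a noisy 48x48 Poisson image (matching the array size above).
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0.0, np.pi, 48))
truth = 50.0 + 30.0 * np.outer(x, x)
estimate = goods_roughness_smooth(rng.poisson(truth), beta=2.0)

Because each pixel's update reads only its four neighbors, one Jacobi sweep corresponds naturally to one SIMD step on a mesh-connected processor array, which is the locality property the abstract emphasizes.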


Citations
Journal Article

A smoothed EM approach to indirect estimation problems, with particular reference to stereology and emission tomography

TL;DR: In this paper, the authors modify the maximum-likelihood EM approach by introducing a simple smoothing step at each EM iteration; the resulting algorithm converges in relatively few iterations to good estimates of g that do not depend on the choice of starting configuration.
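
As a hedged illustration of the smoothed-EM (EMS) idea summarized above, the sketch below interleaves a standard ML-EM update for a Poisson linear model with a simple smoothing pass. The function name, the 1-D setting, and the three-tap kernel are assumptions chosen for brevity, not the cited authors' formulation.

import numpy as np

def ems_estimate(counts, P, n_iter=50, kernel=(0.25, 0.5, 0.25)):
    # counts[i]: observed Poisson counts in detector bin i.
    # P[i, j]:   probability that an event from source cell j lands in bin i.
    sens = P.sum(axis=0)                                   # per-cell sensitivity
    lam = np.full(P.shape[1], counts.sum() / P.shape[1])   # flat starting image
    k = np.asarray(kernel, dtype=float)
    for _ in range(n_iter):
        expected = P @ lam                                 # predicted bin means
        ratio = counts / np.maximum(expected, 1e-12)
        lam = (lam / np.maximum(sens, 1e-12)) * (P.T @ ratio)  # ML-EM step
        lam = np.convolve(lam, k, mode="same")             # the added smoothing step
    return lam

With a kernel that sums to one, the smoothing pass damps the high-frequency noise that plain ML-EM accumulates over iterations, which is the stabilizing effect the summary refers to.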
Journal Article

Maximum likelihood SPECT in clinical computation times using mesh-connected parallel computers

TL;DR: The authors show that for SPECT imaging on 64×64 image grids, the single-instruction, multiple-data (SIMD) distributed array processor containing 64² processors performs the expectation-maximization (EM) algorithm with Good's smoothing at a rate of 1 iteration per 1.5 s, which promises fully Bayesian reconstructions for emission tomography, including regularization, in clinical computation times on the order of 1 min/slice.
Journal Article

3-D maximum a posteriori estimation for single photon emission computed tomography on massively-parallel computers

TL;DR: A fully three-dimensional (3-D) implementation of the maximum a posteriori (MAP) method for single photon emission computed tomography (SPECT) is demonstrated; the 3-D reconstruction exhibits a major increase in resolution when compared to a series of separate 2-D slice reconstructions.
Journal Article

Maximum a posteriori estimation for SPECT using regularization techniques on massively parallel computers

TL;DR: Single photon emission computed tomography (SPECT) reconstructions were performed using maximum a posteriori (penalized likelihood) estimation via the expectation maximization algorithm on a massively parallel single-instruction multiple-data computer.
Proceedings Article

Maximum a posteriori estimation for SPECT using regularization techniques on massively-parallel computers

TL;DR: Single photon emission computed tomography (SPECT) reconstructions were performed using maximum a posteriori (penalized likelihood) estimation via the expectation maximization algorithm on a massively parallel single-instruction multiple-data computer.
References
Journal Article

Nonparametric roughness penalties for probability densities

I. J. Good, 01 Aug 1971
TL;DR: A method is presented that should help overcome the difficulty of deciding whether “bumps” in an estimated density are genuinely present in the population.
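
For context, Good and Gaskins' first roughness penalty for a one-dimensional density f can be written as below; this is the standard form, and the rotationally invariant two-dimensional analogue used in the headline paper replaces the derivative with a gradient.

% Good's first roughness penalty; beta > 0 trades fidelity for smoothness.
\Phi(f) = \int \frac{f'(x)^2}{f(x)}\,dx
        = 4 \int \left(\frac{d}{dx}\sqrt{f(x)}\right)^{2} dx,
\qquad
\hat{f} = \arg\max_{f}\; \sum_{i} \log f(x_i) \;-\; \beta\,\Phi(f).

Substituting u = sqrt(f) turns the penalty into a quadratic functional of u, which is what underlies the spline-smoothing interpretation in the abstract above.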
Journal Article

The Use of Sieves to Stabilize Images Produced with the EM Algorithm for Emission Tomography

TL;DR: It is shown how Grenander's method of sieves can be used with the EM algorithm to remove the instability and thereby decrease the 'noise' artifact introduced into the images, with little or no increase in computational complexity.
Journal Article

The role of likelihood and entropy in incomplete-data problems: Applications to estimating point-process intensities and Toeplitz constrained covariances

TL;DR: This paper concludes that the density maximizing entropy is identical to the conditional density of the complete data given the incomplete data, and derives a recursive algorithm for generating Toeplitz-constrained maximum-likelihood estimators which, at each iteration, evaluates conditional-mean estimates of the lag products based on the previous estimate of the covariance.