
Showing papers by "Richard M. Leahy published in 1995"


Proceedings ArticleDOI
09 May 1995
TL;DR: The authors propose a new class of adaptive algorithms for ANC based on the minimization of a fractional lower order moment, p < 2, and observe that superior performance is obtained by choosing p ≈ α, where α < 2 is a parameter reflecting the degree of impulsiveness of the noise.
Abstract: Describes a new class of algorithms for active noise control (ANC) for use in environments in which impulsive noise is present. The well-known filtered-X and filtered-U ANC algorithms are designed to minimize the variance of a measured error signal. For impulsive noise, which can be modeled using non-Gaussian stable processes, these standard approaches are not appropriate since the second order moments do not exist. The authors propose a new class of adaptive algorithms for ANC that are based on the minimization of a fractional lower order moment, p < 2. By studying the effect of p on the convergence behavior of adaptive algorithms, they observe that superior performance is obtained by choosing p ≈ α, where α < 2 is a parameter reflecting the degree of impulsiveness of the noise. Applications of this approach to noise cancellation in a duct are presented.
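The abstract does not give the update rule, but the standard way to adaptively minimize a fractional lower order moment E|e|^p is the least mean p-norm (LMP) recursion sketched below. This is a generic system-identification sketch, not the authors' filtered-X/filtered-U ANC algorithms (the secondary-path filtering is omitted), and Student-t noise stands in for an alpha-stable process.

```python
import numpy as np

def lmp_identify(x, d, n_taps, mu, p):
    """Adaptive FIR identification minimizing the p-th order moment E|e|^p.

    For p = 2 this reduces to ordinary LMS; p < 2 de-emphasizes large
    (impulsive) errors, which is the idea behind the fractional lower
    order moment approach.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # current input tap vector
        e = d[n] - w @ u                    # a priori error
        # stochastic gradient of |e|^p: p * |e|^(p-1) * sign(e)
        w += mu * p * np.abs(e) ** (p - 1) * np.sign(e) * u
    return w

rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2])              # unknown system to identify
x = rng.standard_normal(5000)
# heavy-tailed measurement noise (Student-t as a stand-in for alpha-stable)
noise = 0.01 * rng.standard_t(df=1.5, size=5000)
d = np.convolve(x, h)[:5000] + noise
w_hat = lmp_identify(x, d, n_taps=3, mu=0.01, p=1.2)
```

For p = 2 the update is the usual LMS recursion; choosing p nearer the noise's characteristic exponent damps the influence of outliers on the weight updates, which is the effect the abstract reports as p ≈ α.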

91 citations


Book ChapterDOI
21 Oct 1995
TL;DR: A Bayesian formulation of the inverse problem in which a Gibbs prior is constructed to reflect the sparse focal nature of the current sources; its performance is compared with several weighted minimum norm methods.
Abstract: The authors describe a new approach to imaging neuronal currents from measurements of the magnetoencephalogram (MEG) associated with sensory, motor, or cognitive brain activation. Previous approaches to this problem have concentrated on the use of weighted minimum norm inverse methods. While these methods ensure a unique solution, they do not introduce information specific to the MEG inverse problem, often producing overly smoothed solutions and exhibiting severe sensitivity to noise. The authors describe a Bayesian formulation of the inverse problem in which a Gibbs prior is constructed to reflect the sparse focal nature of the current sources. The authors demonstrate the method with simulated and experimental phantom data, comparing its performance with several weighted minimum norm methods.

71 citations


Proceedings ArticleDOI
23 Oct 1995
TL;DR: An approximate ML estimator for the hyperparameters of a Gibbs prior which can be computed simultaneously with a maximum a posteriori (MAP) image estimate is described.
Abstract: We describe an approximate ML estimator for the hyperparameters of a Gibbs prior which can be computed simultaneously with a maximum a posteriori (MAP) image estimate. The algorithm is based on a mean field approximation technique through which multidimensional Gibbs distributions are approximated by a separable function equal to a product of one dimensional densities. We show how this approach can be used to simplify the ML estimation problem. We also show how the Gibbs-Bogoliubov-Feynman bound can be used to optimize the approximation for a restricted class of problems.
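As a minimal illustration of the mean-field idea invoked above — approximating a coupled Gibbs distribution by a product of independent one-dimensional densities — here is the textbook self-consistency equation for an Ising model with coordination number z. This is a generic example, not the image prior or the Gibbs-Bogoliubov-Feynman machinery of the paper.

```python
import numpy as np

def mean_field_magnetization(beta, J=1.0, z=4, iters=200):
    """Solve the mean-field self-consistency equation m = tanh(beta*J*z*m)
    by fixed-point iteration.

    The mean-field step: replace the coupled Gibbs distribution over all
    spins by a product of independent one-spin densities, each feeling the
    *average* field J*z*m of its z neighbors instead of their actual values.
    """
    m = 0.5                                  # positive start breaks symmetry
    for _ in range(iters):
        m = np.tanh(beta * J * z * m)
    return m

# Below the mean-field critical coupling (beta*J*z < 1) the only fixed
# point is m = 0; above it, a nonzero magnetization appears.
m_low  = mean_field_magnetization(beta=0.1)  # beta*J*z = 0.4
m_high = mean_field_magnetization(beta=1.0)  # beta*J*z = 4.0
```

The same separability trick is what makes otherwise intractable expectations under the Gibbs prior computable in closed form, which is how the abstract's ML hyperparameter estimate becomes tractable.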

67 citations


Proceedings ArticleDOI
09 May 1995
TL;DR: The authors present the matrix kernels for the general boundary element model (BEM) and for MEG spherical models and show how these kernels are easily interchanged in a linear algebraic framework that includes sensor specifics such as orientation and gradiometer configuration.
Abstract: The most widely used model for electroencephalography (EEG) and magnetoencephalography (MEG) assumes a quasi-static approximation of Maxwell's equations and a piecewise homogeneous conductor model. Both models contain an incremental field element that linearly relates an incremental source element (current dipole) to the field or voltage at a distant point. The explicit form of the field element is dependent on the head modeling assumptions and sensor configuration. Proper characterization of this incremental element is crucial to the inverse problem. The field element can be partitioned into the product of a vector dependent on sensor characteristics and a matrix kernel dependent only on head modeling assumptions. The authors present the matrix kernels for the general boundary element model (BEM) and for MEG spherical models. They show how these kernels are easily interchanged in a linear algebraic framework that includes sensor specifics such as orientation and gradiometer configuration. They then describe how this kernel is easily applied to "gain" or "transfer" matrices used in multiple dipole and source imaging models.
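The "sensor vector times kernel matrix" partition described above can be sketched with the simplest possible kernel: the Biot-Savart field of a current dipole in an infinite homogeneous medium. The paper's BEM and spherical-head kernels differ in detail but slot into the same b = nᵀ K q factorization; the geometry and dipole values below are made up for the demo.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def cross_matrix(v):
    """Matrix [v]_x such that cross_matrix(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def dipole_kernel(r_sensor, r_dipole):
    """3x3 kernel K with B = K @ q for a current dipole q in an infinite
    homogeneous medium (Biot-Savart law):
        B = mu0/(4 pi) * (q x d) / |d|^3,  d = r_sensor - r_dipole.
    Head-modeling assumptions (sphere, BEM) would change only this kernel.
    """
    d = np.asarray(r_sensor, float) - np.asarray(r_dipole, float)
    # q x d = -[d]_x @ q, so K = -mu0 / (4 pi |d|^3) * [d]_x
    return -MU0 / (4 * np.pi * np.linalg.norm(d) ** 3) * cross_matrix(d)

# The measurement factors as (sensor vector) @ (kernel matrix) @ (source):
r_s = np.array([0.0, 0.0, 0.12])        # sensor position (m)
r_q = np.array([0.0, 0.02, 0.07])       # dipole position (m)
n = np.array([0.0, 0.0, 1.0])           # magnetometer orientation
q = np.array([1e-8, 0.0, 0.0])          # dipole moment (A*m)
b = n @ dipole_kernel(r_s, r_q) @ q     # measured field component (T)
```

Stacking one such row nᵀK(r_q) per sensor and per candidate source yields exactly the "gain" or "transfer" matrix the abstract refers to; swapping head models means swapping K while the sensor-specific vector n (or a gradiometer combination of several such vectors) is unchanged.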

30 citations


Journal ArticleDOI
TL;DR: In this article, a nonlinear regression model was developed to estimate the displacement field associated with permanent deformations of 3D composite objects with complex internal structure, for fields satisfying the small displacement gradient approximation of continuum mechanics.
Abstract: We present a new method for computing the internal displacement fields associated with permanent deformations of 3D composite objects with complex internal structure for fields satisfying the small displacement gradient approximation of continuum mechanics. We compute the displacement fields from a sequence of 3D X-ray computed tomography (CT) images. By assuming that the intensity of the tomographic images represents a conserved property which is incompressible, we develop a constrained nonlinear regression model for estimation of the displacement field. Successive linear approximation is then employed and each linear subsidiary problem is solved using variational calculus. We approximate the resulting Euler-Lagrange equations by a finite set of linear equations obtained through finite differencing. We solve these equations using a conjugate gradient algorithm in a multiresolution framework. We validate our method using pairs of synthetic images of plane shear flow. Finally, we determine the 3D displacement field in the interior of a cylindrical asphalt/aggregate core loaded to a state of permanent deformation.

24 citations


Proceedings ArticleDOI
09 May 1995
TL;DR: The authors examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum norm based reconstruction algorithms.
Abstract: In neuromagnetic source reconstruction, a functional map of neural activity is constructed from noninvasive magnetoencephalographic (MEG) measurements. The overall reconstruction problem is under-determined, so some form of source modeling must be applied. The authors review the two main classes of reconstruction techniques-parametric current dipole models and nonparametric distributed source reconstructions. Current dipole reconstructions use a physically plausible source model, but are limited to cases in which the neural currents are expected to be highly sparse and localized. Distributed source reconstructions can be applied to a wider variety of cases, but must incorporate an implicit source model in order to arrive at a single reconstruction. The authors examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum norm based reconstruction algorithms. They conclude with a brief discussion of alternative non-Gaussian approaches.
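The Gaussian reading of minimum norm referred to above can be made concrete: the MAP estimate under an i.i.d. zero-mean Gaussian source prior with Gaussian noise is the Tikhonov solution, which tends to the classical minimum norm solution as the regularization vanishes. A small numerical check, using a random matrix as a stand-in for the lead field:

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.standard_normal((10, 50))   # underdetermined: 10 sensors, 50 sources
b = rng.standard_normal(10)         # measurement vector

# Minimum norm: among all x with L x = b, pick the smallest ||x||_2.
x_mn = L.T @ np.linalg.solve(L @ L.T, b)

# MAP with an i.i.d. zero-mean Gaussian prior on x and Gaussian noise:
#   argmin ||L x - b||^2 + lam * ||x||^2   (Tikhonov / ridge form)
lam = 1e-10
x_map = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(10), b)
```

For small lam the two solutions coincide, which is the sense in which minimum norm reconstruction carries an implicit Gaussian source model — the "nonphysical" assumption the abstract highlights, since focal neural currents are better described by sparse, non-Gaussian priors.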

7 citations


Proceedings ArticleDOI
21 Oct 1995
TL;DR: The authors' protocol uses a rapidly converging algorithm to compute a MAP estimate of the 3D PET image and includes novel methods for accurate attenuation and scatter modeling and a Gibbs prior whose hyperparameters are estimated simultaneously with the PET image so that no critical user specified parameters are required.
Abstract: The authors examine the behavior of a Bayesian protocol for reconstruction of PET images in terms of the quantitative accuracy of the resulting Region-of-Interest (ROI) statistics. The authors' protocol uses a rapidly converging algorithm to compute a MAP estimate of the 3D PET image and includes novel methods for accurate attenuation and scatter modeling. The image is modeled using a Gibbs prior whose hyperparameters are estimated simultaneously with the PET image so that no critical user-specified parameters are required. The authors computed ROI statistics over reconstructions of multiple frames of a computer-generated brain phantom and an experimental chest phantom scanned using a Siemens/CTI ECAT931 scanner. Results indicate significant improvements in both bias and variance when compared to a standard filtered backprojection (FBP) based protocol.
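The likelihood half of such a protocol can be illustrated with classical ML-EM for Poisson emission data. The Gibbs prior, the simultaneous hyperparameter estimation, and the paper's rapidly converging MAP algorithm are not reproduced here, and the system matrix below is a random toy rather than a real scanner geometry.

```python
import numpy as np

def mlem(A, y, iters=500):
    """Classical ML-EM iteration for Poisson emission tomography:
        x_{k+1} = x_k / (A^T 1) * A^T (y / (A x_k))
    which monotonically increases the Poisson log-likelihood. A Bayesian
    MAP protocol would add a Gibbs prior term on top of this likelihood
    model; that part is not reproduced here.
    """
    x = np.ones(A.shape[1])                 # uniform nonnegative start
    sens = A.T @ np.ones(A.shape[0])        # sensitivity image A^T 1
    for _ in range(iters):
        proj = A @ x                        # forward projection
        x = x / sens * (A.T @ (y / proj))   # multiplicative EM update
    return x

rng = np.random.default_rng(2)
A = rng.uniform(0.1, 1.0, size=(40, 8))     # toy system matrix, all positive
x_true = np.array([1.0, 4.0, 2.0, 0.5, 3.0, 1.5, 2.5, 0.8])
y = A @ x_true                              # noiseless data for the demo
x_hat = mlem(A, y)
```

The multiplicative update keeps the image nonnegative automatically, one reason EM-type algorithms are the usual starting point for the MAP reconstructions compared against FBP in the abstract.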

4 citations