Author

Eric L. Miller

Bio: Eric L. Miller is an academic researcher from Tufts University. He has contributed to research on topics including inverse problems and iterative reconstruction, has an h-index of 54, and has co-authored 399 publications receiving 9,583 citations. His previous affiliations include Verizon Communications and the University of Maryland, Baltimore County.


Papers
Journal Article • DOI
TL;DR: The basic idea of DOT is introduced, the history of optical methods in medicine is reviewed, and the tissue's optical properties, the modes of operation for DOT, and the challenges that the development of DOT must overcome are detailed.
Abstract: Diffuse optical tomography (DOT) is an emerging medical imaging modality in which tissue is illuminated by near-infrared light from an array of sources, the multiply-scattered light which emerges is observed with an array of detectors, and then a model of the propagation physics is used to infer the localized optical properties of the illuminated tissue. The three primary absorbers at these wavelengths, water and both oxygenated and deoxygenated hemoglobin, all have relatively weak absorption. This fortuitous fact provides a spectral window through which we can attempt to localize absorption (primarily by the two forms of hemoglobin) and scattering in the tissue. The most important current applications of DOT are detecting tumors in the breast and imaging the brain. We introduce the basic idea of DOT and review the history of optical methods in medicine as relevant to the development of DOT. We then detail the concept of DOT, including a review of the tissue's optical properties, modes of operation for DOT, and the challenges which the development of DOT must overcome. We describe the basics of modelling the DOT forward problem and some critical issues among the numerous implementations that have been investigated for the DOT inverse problem, with an emphasis on signal processing. We conclude with some specific results as examples of the current state of DOT research.
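The abstract describes the DOT inverse problem only at a high level. As a rough, hedged illustration of the kind of computation involved, the sketch below solves a linearized version of it: recovering a small absorption perturbation from simulated boundary measurements through a sensitivity (Jacobian) matrix with Tikhonov regularization. The matrix sizes, the random Jacobian, and all variable names are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Hypothetical sizes: n_meas source-detector pairs, n_vox voxels in the
# reconstruction grid. In a real DOT code the Jacobian would come from a
# diffusion-equation forward solver; here it is random purely for illustration.
rng = np.random.default_rng(0)
n_meas, n_vox = 64, 400

J = rng.normal(size=(n_meas, n_vox))          # sensitivity (Jacobian) matrix
delta_mua_true = np.zeros(n_vox)
delta_mua_true[180:200] = 0.01                # small absorption perturbation
y = J @ delta_mua_true
y = y + 0.01 * np.linalg.norm(y) / np.sqrt(n_meas) * rng.normal(size=n_meas)

# The linearized inverse problem is ill-posed, so solve the Tikhonov-regularized
# least-squares problem  min ||J x - y||^2 + lam ||x||^2  in closed form.
lam = 1e-1
delta_mua_hat = np.linalg.solve(J.T @ J + lam * np.eye(n_vox), J.T @ y)

print("relative reconstruction error:",
      np.linalg.norm(delta_mua_hat - delta_mua_true) / np.linalg.norm(delta_mua_true))
```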

770 citations

Journal Article • DOI
TL;DR: The generalized tensor nuclear norm can be used as a standalone regularization technique for the energy-selective (spectral) computed tomography problem, and when combined with total variation regularization it enhances the regularization capabilities, especially in low-energy images where the effects of noise are most prominent.
Abstract: The development of energy selective, photon counting X-ray detectors allows for a wide range of new possibilities in the area of computed tomographic image formation. Under the assumption of perfect energy resolution, here we propose a tensor-based iterative algorithm that simultaneously reconstructs the X-ray attenuation distribution for each energy. We use a multilinear image model rather than a more standard stacked vector representation in order to develop novel tensor-based regularizers. In particular, we model the multispectral unknown as a three-way tensor where the first two dimensions are space and the third dimension is energy. This approach allows for the design of tensor nuclear norm regularizers, which, like their 2D counterpart, are convex functions of the multispectral unknown. The solution to the resulting convex optimization problem is obtained using an alternating direction method of multipliers approach. Simulation results show that the generalized tensor nuclear norm can be used as a standalone regularization technique for the energy selective (spectral) computed tomography problem, and when combined with total variation regularization it enhances the regularization capabilities, especially in low-energy images where the effects of noise are most prominent.
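The paper defines its own generalized tensor nuclear norm, which is not reproduced here. As a hedged sketch of how such a regularizer typically enters an ADMM iteration, the snippet below applies singular value thresholding (the proximal operator of the matrix nuclear norm) to a mode unfolding of a three-way space-space-energy tensor. The function names, tensor sizes, and threshold value are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * (matrix nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_unfolding_nuclear(X, tau, mode):
    """Apply SVT to one mode unfolding of a 3-way tensor and fold back."""
    Xm = np.moveaxis(X, mode, 0)
    Ym = svt(Xm.reshape(Xm.shape[0], -1), tau).reshape(Xm.shape)
    return np.moveaxis(Ym, 0, mode)

# Toy multispectral volume: two spatial dimensions, third dimension = energy.
rng = np.random.default_rng(1)
X = rng.normal(size=(32, 32, 8))
X_denoised = prox_unfolding_nuclear(X, tau=2.0, mode=2)   # shrink along energy
```

Inside an ADMM loop this proximal step would alternate with a data-fidelity update tied to the tomographic forward operator, which is omitted here.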

323 citations

Journal Article • DOI
TL;DR: These pilicides target key virulence factors in pathogenic bacteria and represent a promising proof of concept for developing drugs that function by targeting virulence factors.
Abstract: A chemical synthesis platform with broad applications and flexibility was rationally designed to inhibit biogenesis of adhesive pili assembled by the chaperone–usher pathway in Gram-negative pathogens. The activity of a family of bicyclic 2-pyridones, termed pilicides, was evaluated in two different pilus biogenesis systems in uropathogenic Escherichia coli. Hemagglutination mediated by either type 1 or P pili, adherence to bladder cells, and biofilm formation mediated by type 1 pili were all reduced by ≈90% in laboratory and clinical E. coli strains. The structure of the pilicide bound to the P pilus chaperone PapD revealed that the pilicide bound to the surface of the chaperone known to interact with the usher, the outer-membrane assembly platform where pili are assembled. Point mutations in the pilicide-binding site dramatically reduced pilus formation but did not block the ability of PapD to bind subunits and mediate their folding. Surface plasmon resonance experiments confirmed that the pilicide interfered with the binding of chaperone–subunit complexes to the usher. These pilicides thus target key virulence factors in pathogenic bacteria and represent a promising proof of concept for developing drugs that function by targeting virulence factors.

292 citations

Journal Article • DOI
TL;DR: In this article, a two-step shape reconstruction method for electromagnetic (EM) tomography is presented which uses adjoint fields and level sets, and the main application is the imaging and monitoring of pollutant plumes in environmental cleanup sites based on cross-borehole EM data.
Abstract: A two-step shape reconstruction method for electromagnetic (EM) tomography is presented which uses adjoint fields and level sets. The inhomogeneous background permittivity distribution and the values of the permittivities in some penetrable obstacles are assumed to be known, and the number, sizes, shapes, and locations of these obstacles have to be reconstructed given noisy limited-view EM data. The main application we address in the paper is the imaging and monitoring of pollutant plumes in environmental cleanup sites based on cross-borehole EM data. The first step of the reconstruction scheme makes use of an inverse scattering solver which recovers equivalent scattering sources for a number of experiments, and then calculates from these an approximation for the permittivity distribution in the medium. The second step uses this result as an initial guess for solving the shape reconstruction problem. A key point in this second step is the fusion of the `level set technique' for representing the shapes of the reconstructed obstacles, and an `adjoint field technique' for solving the nonlinear inverse problem. In each step, a forward and an adjoint Helmholtz problem are solved based on the permittivity distribution which corresponds to the latest best guess for the representing level set function. A correction for this level set function is then calculated directly by combining the results of these two runs. Numerical experiments are presented which show that the derived method is able to recover one or more objects with nontrivial shapes given noisy cross-borehole EM data.
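The key update in the second step combines a forward and an adjoint field into a correction for the level set function. The sketch below is a rough illustration rather than the paper's algorithm: it shows how a level set function can encode a piecewise-constant permittivity map and how a forward/adjoint field product (supplied by Helmholtz solvers not shown here) could drive one descent step. The grid, permittivity values, step size, dummy fields, and function names are all assumptions.

```python
import numpy as np

# Toy grid and an initial level set function: a circle of radius 0.3.
n = 64
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
phi = np.sqrt(x**2 + y**2) - 0.3              # phi < 0 inside the obstacle

eps_bg, eps_obj = 1.0, 2.5                    # assumed background/obstacle permittivity

def permittivity(phi):
    """Map the level set function to a piecewise-constant permittivity image."""
    return np.where(phi < 0, eps_obj, eps_bg)

def level_set_step(phi, u_fwd, u_adj, dt=0.1):
    """One descent step: up to problem-dependent constants, the shape
    sensitivity is the pointwise product of forward and adjoint fields,
    and adding it to phi moves the zero level set (the obstacle boundary)."""
    velocity = -np.real(u_fwd * np.conj(u_adj))
    return phi + dt * velocity

# Dummy complex fields standing in for the forward/adjoint Helmholtz solutions;
# a real implementation would obtain these from the two PDE solves per iteration.
u_fwd = np.exp(1j * 4 * np.pi * x) * np.exp(-(x**2 + y**2))
u_adj = np.exp(-1j * 4 * np.pi * y) * np.exp(-(x**2 + y**2))

phi = level_set_step(phi, u_fwd, u_adj)
eps = permittivity(phi)                       # updated permittivity image
```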

260 citations

Journal Article • DOI
TL;DR: In this paper, the selection of multiple regularization parameters is considered in a generalized L-curve framework, and a minimum distance function (MDF) is developed for approximating the regularization parameter corresponding to the generalized corner of the L-hypersurface.
Abstract: The selection of multiple regularization parameters is considered in a generalized L-curve framework. Multiple-dimensional extensions of the L-curve for selecting multiple regularization parameters are introduced, and a minimum distance function (MDF) is developed for approximating the regularization parameters corresponding to the generalized corner of the L-hypersurface. For the single-parameter (i.e. L-curve) case, it is shown through a model that the regularization parameters minimizing the MDF essentially maximize the curvature of the L-curve. Furthermore, for both the single- and multiple-parameter cases the MDF approach leads to a simple fixed-point iterative algorithm for computing regularization parameters. Examples indicate that the algorithm converges rapidly, thereby making the problem of computing parameters according to the generalized corner of the L-hypersurface computationally tractable.
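As a hedged illustration of the minimum-distance idea in the single-parameter (L-curve) case, the sketch below evaluates a Tikhonov L-curve over a grid of regularization parameters via the SVD and selects the parameter whose log-log point lies closest to a reference corner point. This grid search merely stands in for the paper's fixed-point iteration, and the choice of reference point, the test problem, and the grid are all assumptions.

```python
import numpy as np

# Small synthetic Tikhonov test problem (all sizes and noise levels assumed).
rng = np.random.default_rng(2)
A = rng.normal(size=(80, 60))
x_true = rng.normal(size=60)
b = A @ x_true + 0.05 * rng.normal(size=80)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
Utb = U.T @ b

def lcurve_point(lam):
    """Log residual norm and log solution norm of the Tikhonov solution,
    computed cheaply via the SVD filter factors s / (s^2 + lam)."""
    x = Vt.T @ ((s / (s**2 + lam)) * Utb)
    return np.log(np.linalg.norm(A @ x - b)), np.log(np.linalg.norm(x))

lams = np.logspace(-6, 2, 200)
pts = np.array([lcurve_point(l) for l in lams])

# Minimum-distance idea: pick the lambda whose L-curve point lies closest,
# in the log-log plane, to a reference "corner origin"; here that origin is
# simply the component-wise minimum over the sampled curve (an assumption).
origin = pts.min(axis=0)
lam_star = lams[np.argmin(np.sum((pts - origin) ** 2, axis=1))]
print("selected regularization parameter:", lam_star)
```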

253 citations


Cited by
Book
01 Jan 1998
TL;DR: A book-length tour of wavelet analysis, covering the Fourier transform, time-frequency analysis, frames, wavelet bases, wavelet packet and local cosine bases, approximation, estimation, and transform coding.
Abstract: Introduction to a Transient World. Fourier Kingdom. Discrete Revolution. Time Meets Frequency. Frames. Wavelet Zoom. Wavelet Bases. Wavelet Packet and Local Cosine Bases. An Approximation Tour. Estimations are Approximations. Transform Coding. Appendix A: Mathematical Complements. Appendix B: Software Toolboxes.

17,693 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: A textbook covering probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, continuous latent variables, sequential data, and combining models, in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

01 Apr 2003
TL;DR: The EnKF has a large user group, and numerous publications have discussed its applications and theoretical aspects; this paper reviews those results and also presents new ideas and alternative interpretations which further explain the success of the EnKF.
Abstract: The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
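To make the EnKF concrete, the sketch below implements a minimal stochastic (perturbed-observation) analysis step with a linear observation operator. It is a generic textbook formulation, not the paper's optimized implementation or its subroutines, and the toy dimensions and variable names are assumptions.

```python
import numpy as np

def enkf_analysis(X, H, d, R, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.

    X : (n_state, n_ens) forecast ensemble
    H : (n_obs, n_state) linear observation operator
    d : (n_obs,) observation vector
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                        # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    D = d[:, None] + rng.multivariate_normal(         # perturbed observations
        np.zeros(len(d)), R, size=n_ens).T
    return X + K @ (D - H @ X)                        # updated (analysis) ensemble

# Toy use: 10-dimensional state, 3 observed components, 20 ensemble members.
rng = np.random.default_rng(3)
X = rng.normal(size=(10, 20))
H = np.zeros((3, 10))
H[0, 0] = H[1, 4] = H[2, 9] = 1.0
R = 0.1 * np.eye(3)
d = rng.normal(size=3)
Xa = enkf_analysis(X, H, d, R, rng)
```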

2,975 citations