Author

Lei Tian

Bio: Lei Tian is an academic researcher from Boston University. The author has contributed to research in the topics of phase retrieval and phase (waves). The author has an h-index of 34 and has co-authored 192 publications receiving 5027 citations. Previous affiliations of Lei Tian include the Massachusetts Institute of Technology and the University of California, Berkeley.


Papers
Journal Article · DOI
TL;DR: A multiplexed illumination strategy in which multiple randomly selected LEDs are turned on for each image so that the total number of images can be significantly reduced, without sacrificing image quality.
Abstract: Fourier Ptychography is a new computational microscopy technique that achieves gigapixel images with both wide field of view and high resolution in both phase and amplitude. The hardware setup involves a simple replacement of the microscope's illumination unit with a programmable LED array, allowing one to flexibly pattern illumination angles without any moving parts. In previous work, a series of low-resolution images was taken by sequentially turning on each single LED in the array, and the data were then combined to recover a bandwidth much higher than the one allowed by the original imaging system. Here, we demonstrate a multiplexed illumination strategy in which multiple randomly selected LEDs are turned on for each image. Since each LED corresponds to a different area of Fourier space, the total number of images can be significantly reduced, without sacrificing image quality. We demonstrate this method experimentally in a modified commercial microscope. Compared to sequential scanning, our multiplexed strategy achieves similar results with approximately an order of magnitude reduction in both acquisition time and data capture requirements.
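To make the geometry concrete, here is a minimal numpy sketch of the forward model described above: each LED shifts a different region of the object's Fourier spectrum through the objective pupil, and LEDs that are on simultaneously add incoherently in intensity. The function name, grid conventions, and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def multiplexed_fpm_image(obj, pupil, led_kxy, pattern, n_lr):
    """Simulate one multiplexed low-resolution measurement.

    obj     : high-resolution complex object (N x N)
    pupil   : pupil function sampled on the low-resolution grid (n_lr x n_lr)
    led_kxy : per-LED spatial-frequency offsets, in pixels of the object spectrum
    pattern : indices of the LEDs switched on for this exposure
    n_lr    : side length of the low-resolution camera image
    """
    N = obj.shape[0]
    spectrum = np.fft.fftshift(np.fft.fft2(obj))       # centred object spectrum
    c = N // 2
    img = np.zeros((n_lr, n_lr))
    for i in pattern:                                   # LEDs are mutually incoherent
        kx, ky = led_kxy[i]
        sub = spectrum[c + ky - n_lr // 2: c + ky + n_lr // 2,
                       c + kx - n_lr // 2: c + kx + n_lr // 2] * pupil
        field = np.fft.ifft2(np.fft.ifftshift(sub))     # low-res field for this LED
        img += np.abs(field) ** 2                       # intensities add across LEDs
    return img
```

In the sequential scheme, pattern would hold a single LED index per exposure; the multiplexed scheme draws several indices at random, so far fewer exposures are needed to cover the same region of Fourier space.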

510 citations

Journal Article · DOI
20 Feb 2015
TL;DR: In this article, Fourier ptychography is used to estimate the 3D complex transmittance function of the sample at multiple depths, without any weak- or single-scattering approximations.
Abstract: Realizing high resolution across large volumes is challenging for 3D imaging techniques with high-speed acquisition. Here, we describe a new method for 3D intensity and phase recovery from 4D light field measurements, achieving enhanced resolution via Fourier ptychography. Starting from geometric optics light field refocusing, we incorporate phase retrieval and correct diffraction artifacts. Further, we incorporate dark-field images to achieve lateral resolution beyond the diffraction limit of the objective (5× larger NA) and axial resolution better than the depth of field, using a low-magnification objective with a large field of view. Our iterative reconstruction algorithm uses a multislice coherent model to estimate the 3D complex transmittance function of the sample at multiple depths, without any weak or single-scattering approximations. Data are captured by an LED array microscope with computational illumination, which enables rapid scanning of angles for fast acquisition. We demonstrate the method with thick biological samples in a modified commercial microscope, indicating the technique’s versatility for a wide range of applications.
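The multislice coherent model mentioned above can be sketched in a few lines of numpy: the illumination is repeatedly multiplied by a thin transmittance slice and then propagated to the next depth with the angular-spectrum kernel. The grid parameters, names, and the handling of evanescent components below are simplifying assumptions for illustration, not the paper's reconstruction code.

```python
import numpy as np

def angular_spectrum_propagate(field, dz, wavelength, dx):
    """Propagate a 2-D complex field over a distance dz (angular-spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))

def multislice_exit_field(incident, slices, dz, wavelength, dx):
    """Multiply-and-propagate through a stack of thin transmittance slices."""
    field = incident
    for t in slices:                  # t: complex transmittance of one thin slice
        field = field * t             # interaction with the slice
        field = angular_spectrum_propagate(field, dz, wavelength, dx)
    return field
```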

403 citations

Journal Article · DOI
20 Oct 2018
TL;DR: In this article, the authors propose a statistical “one-to-all” deep learning (DL) technique that encapsulates a wide range of statistical variations so that the model is resilient to speckle decorrelations.
Abstract: Imaging through scattering is an important yet challenging problem. Tremendous progress has been made by exploiting the deterministic input–output “transmission matrix” for a fixed medium. However, this “one-to-one” mapping is highly susceptible to speckle decorrelations – small perturbations to the scattering medium lead to model errors and severe degradation of the imaging performance. Our goal here is to develop a new framework that is highly scalable to both medium perturbations and measurement requirements. To do so, we propose a statistical “one-to-all” deep learning (DL) technique that encapsulates a wide range of statistical variations for the model to be resilient to speckle decorrelations. Specifically, we develop a convolutional neural network (CNN) that is able to learn the statistical information contained in the speckle intensity patterns captured on a set of diffusers having the same macroscopic parameter. We then show for the first time, to the best of our knowledge, that the trained CNN is able to generalize and make high-quality object predictions through an entirely different set of diffusers of the same class. Our work paves the way to a highly scalable DL approach for imaging through scattering media.
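As a rough illustration of the “one-to-all” training protocol, the PyTorch sketch below pools speckle/object pairs recorded through several training diffusers and fits a small CNN; generalization is then tested on speckles from diffusers that never appear in training. The architecture, loss, and names are placeholder assumptions, far simpler than the network used in the paper.

```python
import torch
from torch import nn

class SpeckleNet(nn.Module):
    """Toy stand-in for the paper's CNN; maps a speckle image to an object estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

def train_one_to_all(model, train_loader, epochs=10, lr=1e-3):
    """Fit on speckle/object pairs pooled over several *training* diffusers;
    generalization is then evaluated on diffusers never seen during training."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for speckle, target in train_loader:   # tensors shaped (batch, 1, H, W)
            opt.zero_grad()
            loss = loss_fn(model(speckle), target)
            loss.backward()
            opt.step()
    return model
```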

369 citations

Journal Article · DOI
TL;DR: A method for improving the accuracy of phase retrieval based on the Transport of Intensity equation is demonstrated by using intensity measurements at multiple planes to estimate and remove the artifacts due to higher order axial derivatives.
Abstract: We demonstrate a method for improving the accuracy of phase retrieval based on the Transport of Intensity equation by using intensity measurements at multiple planes to estimate and remove the artifacts due to higher order axial derivatives. We suggest two similar methods of higher order correction, and demonstrate their ability for accurate phase retrieval well beyond the ‘linear’ range of defocus that TIE imaging traditionally requires. Computation is fast and efficient, and sensitivity to noise is reduced by using many images.
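One simple way to realize the higher-order correction is sketched below in numpy, under the usual assumption of a slowly varying background intensity: fit the through-focus intensity stack with a per-pixel polynomial, keep only the linear coefficient as the axial derivative (which removes the higher-order terms that bias a two-plane finite difference), and solve the TIE in Fourier space. The function names and the regularization are illustrative, not the authors' exact method.

```python
import numpy as np

def axial_derivative_highorder(stack, zs, order=3):
    """Estimate dI/dz at focus from a through-focus intensity stack.

    stack : (n_z, ny, nx) intensities; zs : defocus distances centred on focus.
    Keeping only the linear coefficient of a per-pixel polynomial fit suppresses
    the higher-order axial-derivative terms that bias a two-plane finite difference.
    """
    n_z, ny, nx = stack.shape
    coeffs = np.polyfit(zs, stack.reshape(n_z, -1), order)   # highest degree first
    return coeffs[-2].reshape(ny, nx)                        # linear term = dI/dz

def tie_phase(dIdz, I0, wavelength, dx, reg=1e-6):
    """Fourier-space TIE solve, assuming a slowly varying background intensity I0."""
    k = 2 * np.pi / wavelength
    n = dIdz.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    lap = -4 * np.pi**2 * (FX**2 + FY**2)        # Fourier symbol of the Laplacian
    rhs = -k * dIdz / I0
    return np.real(np.fft.ifft2(np.fft.fft2(rhs) / (lap - reg)))
```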

325 citations

Journal Article · DOI
TL;DR: In this article, the authors compare and classify multiple Fourier ptychography inverse algorithms in terms of experimental robustness and find that the main sources of error are noise, aberrations and mis-calibration (i.e. model mis-match).
Abstract: Fourier ptychography is a new computational microscopy technique that provides gigapixel-scale intensity and phase images with both wide field-of-view and high resolution. By capturing a stack of low-resolution images under different illumination angles, an inverse algorithm can be used to computationally reconstruct the high-resolution complex field. Here, we compare and classify multiple proposed inverse algorithms in terms of experimental robustness. We find that the main sources of error are noise, aberrations and mis-calibration (i.e. model mis-match). Using simulations and experiments, we demonstrate that the choice of cost function plays a critical role, with amplitude-based cost functions performing better than intensity-based ones. The reason for this is that Fourier ptychography datasets consist of images from both brightfield and darkfield illumination, representing a large range of measured intensities. Both noise (e.g. Poisson noise) and model mis-match errors are shown to scale with intensity. Hence, algorithms that use an appropriate cost function will be more tolerant to both noise and model mis-match. Given these insights, we propose a global Newton’s method algorithm which is robust and accurate. Finally, we discuss the impact of procedures for algorithmic correction of aberrations and mis-calibration.
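The distinction between the two families of cost functions can be written down directly. The numpy snippet below is a generic illustration (not the paper's solver) of the amplitude-based and intensity-based data terms, together with the amplitude-replacement projection used in Gerchberg-Saxton-style Fourier ptychography updates.

```python
import numpy as np

def amplitude_cost(measured_intensity, estimated_field):
    """Amplitude-based data term: residuals compare square roots of intensities."""
    return np.sum((np.sqrt(measured_intensity) - np.abs(estimated_field)) ** 2)

def intensity_cost(measured_intensity, estimated_field):
    """Intensity-based data term: residuals (and their errors) scale with intensity."""
    return np.sum((measured_intensity - np.abs(estimated_field) ** 2) ** 2)

def amplitude_projection(measured_intensity, estimated_field, eps=1e-12):
    """Gerchberg-Saxton-style step: keep the estimated phase, replace the modulus
    with the measured amplitude."""
    return np.sqrt(measured_intensity) * estimated_field / (np.abs(estimated_field) + eps)
```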

280 citations


Cited by
01 Jan 2016
TL;DR: This entry corresponds to the classic reference Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light, a standard treatment of the electromagnetic theory of light propagation, interference, and diffraction.

2,213 citations

Journal Article
J. Walkup
TL;DR: Development of this more comprehensive model of the behavior of light draws upon the use of tools traditionally available to the electrical engineer, such as linear system theory and the theory of stochastic processes.
Abstract: Course Description: This is an advanced course in which we explore the field of Statistical Optics. Topics covered include such subjects as the statistical properties of natural (thermal) and laser light, spatial and temporal coherence, effects of partial coherence on optical imaging instruments, effects on imaging due to randomly inhomogeneous media, and a statistical treatment of the detection of light. Development of this more comprehensive model of the behavior of light draws upon the use of tools traditionally available to the electrical engineer, such as linear system theory and the theory of stochastic processes.

1,364 citations

Journal Article · DOI
TL;DR: In this article, a nonconvex formulation of the phase retrieval problem is proposed together with a concrete solution algorithm; the main contribution is a rigorous guarantee that the algorithm exactly retrieves the phase information from a nearly minimal number of random measurements.
Abstract: We study the problem of recovering the phase from magnitude measurements; specifically, we wish to reconstruct a complex-valued signal $\boldsymbol{x} \in \mathbb{C}^{n}$ about which we have phaseless samples of the form $y_{r} = \left|\langle \boldsymbol{a}_{r}, \boldsymbol{x} \rangle\right|^{2}$, $r = 1, \ldots, m$ (knowledge of the phase of these samples would yield a linear system). This paper develops a nonconvex formulation of the phase retrieval problem as well as a concrete solution algorithm. In a nutshell, this algorithm starts with a careful initialization obtained by means of a spectral method, and then refines this initial estimate by iteratively applying novel update rules, which have low computational complexity, much like in a gradient descent scheme. The main contribution is that this algorithm is shown to rigorously allow the exact retrieval of phase information from a nearly minimal number of random measurements. Indeed, the sequence of successive iterates provably converges to the solution at a geometric rate so that the proposed scheme is efficient both in terms of computational and data resources. In theory, a variation on this scheme leads to a near-linear time algorithm for a physically realizable model based on coded diffraction patterns. We illustrate the effectiveness of our methods with various experiments on image data. Underlying our analysis are insights for the analysis of nonconvex optimization schemes that may have implications for computational problems beyond phase retrieval.
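For concreteness, here is a compact numpy sketch of a Wirtinger-flow-style solver in the setting described: a spectral initialization from the measurement-weighted matrix followed by gradient-like refinement. The fixed step size and iteration count are simplifications of the paper's schedule, and the interface is an assumption for illustration.

```python
import numpy as np

def wirtinger_flow(A, y, n_iter=500, mu=0.2):
    """Phase retrieval from y_r = |<a_r, x>|^2, with rows of A playing the role of a_r^H.

    A : (m, n) complex sensing matrix; y : (m,) nonnegative intensity measurements.
    """
    m, n = A.shape
    # Spectral initialization: leading eigenvector of (1/m) * sum_r y_r a_r a_r^H
    Y = (A.conj().T * y) @ A / m
    _, V = np.linalg.eigh(Y)
    z = V[:, -1] * np.sqrt(n * y.sum() / np.sum(np.abs(A) ** 2))
    norm0_sq = np.linalg.norm(z) ** 2
    for _ in range(n_iter):
        Az = A @ z
        grad = A.conj().T @ ((np.abs(Az) ** 2 - y) * Az) / m   # Wirtinger gradient
        z = z - (mu / norm0_sq) * grad
    return z   # recovered up to a global phase factor
```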

1,096 citations

Journal Article · DOI
TL;DR: The goal is to describe the current state of the art in this area, identify challenges, and suggest future directions and areas where signal processing methods can have a large impact on optical imaging and on the world of imaging at large.
Abstract: The problem of phase retrieval, i.e., the recovery of a function given the magnitude of its Fourier transform, arises in various fields of science and engineering, including electron microscopy, crystallography, astronomy, and optical imaging. Exploring phase retrieval in optical settings, specifically when the light originates from a laser, is natural since optical detection devices [e.g., charge-coupled device (CCD) cameras, photosensitive films, and the human eye] cannot measure the phase of a light wave. This is because, generally, optical measurement devices that rely on converting photons to electrons (current) do not allow for direct recording of the phase: the electromagnetic field oscillates at rates of ~10^15 Hz, which no electronic measurement device can follow. Indeed, optical measurement/detection systems measure the photon flux, which is proportional to the magnitude squared of the field, not the phase. Consequently, measuring the phase of optical waves (electromagnetic fields oscillating at 10^15 Hz and higher) involves additional complexity, typically by requiring interference with another known field, in the process of holography.
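The point that square-law detectors discard phase can be checked in a few lines of numpy: two fields with the same amplitude but different phase produce identical intensity measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
amplitude = rng.random(8)
phase_a, phase_b = rng.random(8), rng.random(8)          # two different phase profiles
field_a = amplitude * np.exp(2j * np.pi * phase_a)
field_b = amplitude * np.exp(2j * np.pi * phase_b)
# A square-law detector records the photon flux ~ |field|^2, so the two fields
# are indistinguishable even though their phases differ:
assert np.allclose(np.abs(field_a) ** 2, np.abs(field_b) ** 2)
```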

869 citations

Journal Article · DOI
TL;DR: This study reviews recent advances in UQ methods used in deep learning, investigates the application of these methods in reinforcement learning (RL), and outlines a few important applications of UQ methods.
Abstract: Uncertainty quantification (UQ) plays a pivotal role in the reduction of uncertainties during both optimization and decision-making processes. It can be applied to solve a variety of real-world problems in science and engineering. Bayesian approximation and ensemble learning techniques are the two most widely used UQ methods in the literature. In this regard, researchers have proposed different UQ methods and examined their performance in a variety of applications such as computer vision (e.g., self-driving cars and object detection), image processing (e.g., image restoration), medical image analysis (e.g., medical image classification and segmentation), natural language processing (e.g., text classification, social media texts, and recidivism risk scoring), bioinformatics, etc. This study reviews recent advances in UQ methods used in deep learning. Moreover, we also investigate the application of these methods in reinforcement learning (RL). Then, we outline a few important applications of UQ methods. Finally, we briefly highlight the fundamental research challenges faced by UQ methods and discuss future research directions in this field.
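As a minimal illustration of the ensemble-learning branch of UQ mentioned above, the numpy sketch below treats the spread of predictions across independently trained models as an uncertainty estimate; the interface is hypothetical and much simpler than the methods surveyed.

```python
import numpy as np

def ensemble_predict(models, x):
    """Deep-ensemble style uncertainty: run every independently trained model on
    the same input; the spread of their predictions serves as a simple UQ signal."""
    preds = np.stack([m(x) for m in models])   # models: callables returning arrays
    return preds.mean(axis=0), preds.std(axis=0)
```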

809 citations