Strategies for reducing speckle noise in digital holography
TL;DR: A broad discussion of the noise issue in DH is provided, covering the best-performing noise reduction approaches proposed so far, together with quantitative comparisons among these approaches.
Abstract: Digital holography (DH) has emerged as one of the most effective coherent imaging technologies. The technological developments of digital sensors and optical elements have made DH the primary approach in several research fields, from quantitative phase imaging to optical metrology and 3D display technologies, to name a few. Like many other digital imaging techniques, DH must cope with the issue of speckle artifacts, due to the coherent nature of the required light sources. Despite the complexity of the recently proposed de-speckling methods, many have not yet attained the required level of effectiveness. That is, a universal denoising strategy for completely suppressing holographic noise has not yet been established. Thus the removal of speckle noise from holographic images represents a bottleneck for the entire optics and photonics scientific community. This review article provides a broad discussion about the noise issue in DH, with the aim of covering the best-performing noise reduction approaches that have been proposed so far. Quantitative comparisons among these approaches will be presented.
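The speckle artifacts discussed in the abstract have well-known first-order statistics: for fully developed speckle the intensity is exponentially distributed, so the speckle contrast (standard deviation over mean) equals 1, and incoherently averaging N independent speckle realizations reduces it by 1/sqrt(N). The following sketch illustrates these two facts numerically; it is an illustrative simulation, not code from the review.

```python
import math
import random

def speckle_contrast(intensities):
    """Speckle contrast C = sigma / mean of the intensity values."""
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((x - mean) ** 2 for x in intensities) / n
    return math.sqrt(var) / mean

# Fully developed speckle: intensity is exponentially distributed,
# so the theoretical contrast is exactly 1 (noise as strong as the signal).
rng = random.Random(0)
speckle = [rng.expovariate(1.0) for _ in range(100_000)]
print(f"single-shot contrast ~ {speckle_contrast(speckle):.3f}")  # near 1.0

# Incoherent averaging of N = 16 independent speckle patterns lowers the
# contrast by 1/sqrt(N) -- the principle behind the multi-look
# de-speckling strategies surveyed in reviews such as this one.
averaged = [sum(rng.expovariate(1.0) for _ in range(16)) / 16
            for _ in range(100_000)]
print(f"16-look contrast ~ {speckle_contrast(averaged):.3f}")  # near 0.25
```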
Citations
01 Jan 2006
TL;DR: Digital holography is a technique that permits digital capture of holograms and subsequent processing on a digital computer; its applications cover three-dimensional (3-D) imaging as well as several associated problems.
Abstract: Digital holography is a technique that permits digital capture of holograms and subsequent processing on a digital computer. This paper reviews various applications of this technique. The presented applications cover three-dimensional (3-D) imaging as well as several associated problems. For the case of 3-D imaging, optical and digital methods to reconstruct and visualize the recorded objects are described. In addition, techniques to compress and encrypt 3-D information in the form of digital holograms are presented. Lastly, 3-D pattern recognition applications of digital holography are discussed. The described techniques constitute a comprehensive approach to 3-D imaging and processing.
179 citations
TL;DR: In a discussion of the topic, Yair Rivenson, Yichen Wu, and Aydogan Ozcan explain how once “trained” with appropriate datasets, neural networks can learn to reconstruct images with added benefits such as improved phase recovery and extended depth of field as well as enhanced spatial resolution and superior signal-to-noise ratio.
Abstract: Recent advances in deep learning have given rise to a new paradigm of holographic image reconstruction and phase recovery techniques with real-time performance. Through data-driven approaches, these emerging techniques have overcome some of the challenges associated with existing holographic image reconstruction methods while also minimizing the hardware requirements of holography. These recent advances open up a myriad of new opportunities for the use of coherent imaging systems in biomedical and engineering research and related applications.
175 citations
TL;DR: Deep-learning-enabled optical metrology is a data-driven approach that has already provided numerous alternative solutions to many challenging problems in this field, with better performance, as discussed by the authors.
Abstract: With the advances in scientific foundations and technological implementations, optical metrology has become a versatile problem-solving backbone in manufacturing, fundamental research, and engineering applications, such as quality control, nondestructive testing, experimental mechanics, and biomedicine. In recent years, deep learning, a subfield of machine learning, has emerged as a powerful tool to address problems by learning from data, largely driven by the availability of massive datasets, enhanced computational power, fast data storage, and novel training algorithms for deep neural networks. It is currently attracting extensive attention for its utilization in the field of optical metrology. Unlike the traditional "physics-based" approach, deep-learning-enabled optical metrology is a "data-driven" approach, which has already provided numerous alternative solutions to many challenging problems in this field with better performance. In this review, we present an overview of the current status and the latest progress of deep-learning technologies in the field of optical metrology. We first briefly introduce both traditional image-processing algorithms in optical metrology and the basic concepts of deep learning, followed by a comprehensive review of its applications in various optical metrology tasks, such as fringe denoising, phase retrieval, phase unwrapping, subset correlation, and error compensation. The open challenges faced by the current deep-learning approach in optical metrology are then discussed. Finally, the directions for future research are outlined.
165 citations
TL;DR: Clear images of multiple cells were obtained with subcellular resolution and good image fidelity, provided that the object dimension was smaller than the maximum scanning range of the speckle pattern.
Abstract: Using optical speckle scanning microscopy [1], we demonstrate that clear images of multiple cells can be obtained through biological scattering tissue, with subcellular resolution and good image quality, as long as the size of the imaging target is smaller than the scanning range of the illuminating speckle pattern.
73 citations
References
01 Jan 1998
TL;DR: A textbook tour of wavelet signal processing, from the Fourier and discrete transforms through wavelet bases, wavelet packets, and local cosine bases, to approximation, estimation, and transform coding.
Abstract: Introduction to a Transient World. Fourier Kingdom. Discrete Revolution. Time Meets Frequency. Frames. Wavelet Zoom. Wavelet Bases. Wavelet Packet and Local Cosine Bases. An Approximation Tour. Estimations are Approximations. Transform Coding. Appendix A: Mathematical Complements. Appendix B: Software Toolboxes.
17,693 citations
"Strategies for reducing speckle noi..." refers background in this paper
...Their mathematical descriptions are omitted for brevity; readers can refer to the Supplementary Information file for a more detailed description, as well as the highlighted original papers in which they were proposed.(37,59,120-146)...
TL;DR: A new definition of scale-space is suggested, and a class of algorithms used to realize a diffusion process is introduced; the diffusion coefficient is chosen to vary spatially so as to encourage intraregion smoothing rather than interregion smoothing.
Abstract: A new definition of scale-space is suggested, and a class of algorithms used to realize a diffusion process is introduced. The diffusion coefficient is chosen to vary spatially in such a way as to encourage intraregion smoothing rather than interregion smoothing. It is shown that the 'no new maxima should be generated at coarse scales' property of conventional scale-space is preserved. As the region boundaries in the approach remain sharp, a high-quality edge detector which successfully exploits global information is obtained. Experimental results are shown on a number of images. Parallel hardware implementations are made feasible because the algorithm involves elementary, local operations replicated over the image.
12,560 citations
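The Perona-Malik idea in the abstract above — smoothing modulated by an edge-stopping function of the local gradient — can be sketched in one dimension. This is an illustrative toy (the function name, parameters k, lam, and the edge-stopping form g = 1/(1 + (grad/k)^2) are one of the choices proposed in that line of work, applied here to a 1-D signal rather than an image):

```python
import random

def perona_malik_1d(signal, k=0.5, lam=0.2, iterations=50):
    """1-D sketch of Perona-Malik anisotropic diffusion.

    The coefficient g(grad) = 1 / (1 + (grad/k)^2) is close to 1 where the
    signal is flat (strong smoothing) and small across sharp edges (little
    smoothing), so noise is removed while edges stay sharp.
    """
    u = list(signal)
    for _ in range(iterations):
        new = u[:]
        for i in range(1, len(u) - 1):
            gp = u[i + 1] - u[i]          # forward gradient
            gm = u[i - 1] - u[i]          # backward gradient
            cp = 1.0 / (1.0 + (gp / k) ** 2)   # edge-stopping weights
            cm = 1.0 / (1.0 + (gm / k) ** 2)
            new[i] = u[i] + lam * (cp * gp + cm * gm)
        u = new
    return u

# A noisy step edge: flat at 0, jumps to 1 halfway.
rng = random.Random(1)
noisy = [(0.0 if i < 32 else 1.0) + rng.gauss(0, 0.05) for i in range(64)]
smoothed = perona_malik_1d(noisy)
# The flat regions are smoothed out, but the 0 -> 1 jump survives.
print(f"edge height after diffusion ~ {smoothed[40] - smoothed[20]:.2f}")
```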
TL;DR: The authors prove two results about this type of estimator that are unprecedented in several ways: with high probability f̂*_n is at least as smooth as f, in any of a wide variety of smoothness measures.
Abstract: Donoho and Johnstone (1994) proposed a method for reconstructing an unknown function f on [0,1] from noisy data d_i = f(t_i) + σ z_i, i = 0, …, n−1, t_i = i/n, where the z_i are independent and identically distributed standard Gaussian random variables. The reconstruction f̂*_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d toward 0 by an amount σ·√(2 log(n)/n). The authors prove two results about this type of estimator. [Smooth]: with high probability f̂*_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: the estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. The present proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
9,359 citations
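The core operation in the abstract above — translating every empirical wavelet coefficient toward 0 by a fixed amount — is soft thresholding. A minimal sketch (the helper names are mine; the normalized threshold σ·√(2 log(n)/n) is the one quoted in the abstract, applying to wavelet coefficients of the sampled function):

```python
import math

def soft_threshold(x, t):
    """Soft thresholding: shrink x toward 0 by t, killing |x| <= t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def universal_threshold(sigma, n):
    """Normalized universal threshold sigma * sqrt(2 log(n) / n),
    as quoted in the Donoho-Johnstone abstract above."""
    return sigma * math.sqrt(2.0 * math.log(n) / n)

# Small (likely pure-noise) coefficients are zeroed, large ones shrunk:
coeffs = [2.0, -0.05, 0.8, 0.01, -1.3, 0.02]
t = 0.1
print([round(soft_threshold(c, t), 2) for c in coeffs])
```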
TL;DR: An algorithm based on an enhanced sparse representation in transform domain based on a specially developed collaborative Wiener filtering achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.
Abstract: We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2D image fragments (e.g., blocks) into 3D data arrays which we call "groups." Collaborative filtering is a special procedure developed to deal with these 3D groups. We realize it using three successive steps: 3D transformation of a group, shrinkage of the transform spectrum, and inverse 3D transformation. The result is a 3D estimate that consists of the jointly filtered grouped image blocks. By attenuating the noise, the collaborative filtering reveals even the finest details shared by grouped blocks and, at the same time, it preserves the essential unique features of each individual block. The filtered blocks are then returned to their original positions. Because these blocks are overlapping, for each pixel, we obtain many different estimates which need to be combined. Aggregation is a particular averaging procedure which is exploited to take advantage of this redundancy. A significant improvement is obtained by a specially developed collaborative Wiener filtering. An algorithm based on this novel denoising strategy and its efficient implementation are presented in full detail; an extension to color-image denoising is also developed. The experimental results demonstrate that this computationally scalable algorithm achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.
7,912 citations
20 Jun 2005
TL;DR: A new measure, the method noise, is proposed, to evaluate and compare the performance of digital image denoising methods, and a new algorithm, the nonlocal means (NL-means), based on a nonlocal averaging of all pixels in the image is proposed.
Abstract: We propose a new measure, the method noise, to evaluate and compare the performance of digital image denoising methods. We first compute and analyze this method noise for a wide class of denoising algorithms, namely the local smoothing filters. Second, we propose a new algorithm, the nonlocal means (NL-means), based on a nonlocal averaging of all pixels in the image. Finally, we present some experiments comparing the NL-means algorithm and the local smoothing filters.
6,804 citations
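The NL-means principle described above — averaging pixels weighted by the similarity of the patches around them, not by spatial proximity — can be sketched on a 1-D signal. This is a toy illustration (the function name and the parameters patch and h are mine, not from the paper, and the O(n²) loop is far from an efficient implementation):

```python
import math
import random

def nl_means_1d(signal, patch=2, h=0.3):
    """Toy 1-D non-local means: each sample becomes a weighted average of
    ALL samples, weighted by similarity of the patches around them."""
    n = len(signal)

    def patch_at(i):
        # Patch of 2*patch+1 samples centered at i, clamped at the borders.
        return [signal[min(max(j, 0), n - 1)]
                for j in range(i - patch, i + patch + 1)]

    out = []
    for i in range(n):
        pi = patch_at(i)
        weights, total = [], 0.0
        for j in range(n):
            pj = patch_at(j)
            d2 = sum((a - b) ** 2 for a, b in zip(pi, pj)) / len(pi)
            w = math.exp(-d2 / (h * h))   # patch-similarity weight
            weights.append(w)
            total += w
        out.append(sum(w * s for w, s in zip(weights, signal)) / total)
    return out

# Noisy step signal: samples on each side of the step have similar patches,
# so they average with each other but not across the edge.
rng = random.Random(2)
clean = [0.0] * 30 + [1.0] * 30
noisy = [c + rng.gauss(0, 0.1) for c in clean]
denoised = nl_means_1d(noisy)
mse_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean)) / len(clean)
mse_denoised = sum((a - b) ** 2 for a, b in zip(denoised, clean)) / len(clean)
print(f"MSE before: {mse_noisy:.4f}  after: {mse_denoised:.4f}")
```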
"Strategies for reducing speckle noi..." refers background in this paper
...Their mathematical descriptions are omitted for brevity; readers can refer to the Supplementary Information file for a more detailed description, as well as the highlighted original papers in which they were proposed.(37,59,120-146)...