Journal ArticleDOI

Compressed sensing electron tomography.

01 Aug 2013 · Ultramicroscopy (North-Holland) · Vol. 131, pp. 70–91
TL;DR: The CS-ET approach enables more reliable quantitative analysis of the reconstructions as well as novel 3D studies from extremely limited data, and robust reconstruction is shown to be possible from far fewer projections than are normally used.
About: This article was published in Ultramicroscopy on 2013-08-01 and has received 230 citations to date. The article focuses on the topic: Compressed sensing.
Citations
Journal ArticleDOI
03 Oct 2013 · Nature
TL;DR: 3D images related to LSPRs of an individual silver nanocube can be reconstructed through the application of electron energy-loss spectrum imaging, mapping the excitation across a range of orientations, with a novel combination of non-negative matrix factorization, compressed sensing and electron tomography.
Abstract: The remarkable optical properties of metal nanoparticles are governed by the excitation of localized surface plasmon resonances (LSPRs). The sensitivity of each LSPR mode, whose spatial distribution and resonant energy depend on the nanoparticle structure, composition and environment, has given rise to many potential photonic, optoelectronic, catalytic, photovoltaic, and gas- and bio-sensing applications. However, the precise interplay between the three-dimensional (3D) nanoparticle structure and the LSPRs is not always fully understood and a spectrally sensitive 3D imaging technique is needed to visualize the excitation on the nanometre scale. Here we show that 3D images related to LSPRs of an individual silver nanocube can be reconstructed through the application of electron energy-loss spectrum imaging, mapping the excitation across a range of orientations, with a novel combination of non-negative matrix factorization, compressed sensing and electron tomography. Our results extend the idea of substrate-mediated hybridization of dipolar and quadrupolar modes predicted by theory, simulations, and electron and optical spectroscopy, and provide experimental evidence of higher-energy mode hybridization. This work represents an advance both in the understanding of the optical response of noble-metal nanoparticles and in the probing, analysis and visualization of LSPRs.

445 citations

Journal ArticleDOI
TL;DR: The broad framework of understanding and predicting the structure of nanoparticles via diverse Wulff constructions, either thermodynamic, local minima or kinetic has been exceedingly successful, however, the field is still developing and there remain many unknowns and new avenues for research.
Abstract: Nanoparticles can be beautiful, as in stained glass windows, or they can be ugly, as in wear and corrosion debris from implants. We estimate that there will be about 70,000 papers in 2015 with nanoparticles as a keyword, but only one in thirteen uses the nanoparticle shape as an additional keyword and research focus, and only one in two hundred has thermodynamics. Methods for synthesizing nanoparticles have exploded over the last decade, but our understanding of how and why they take their forms has not progressed as fast. This topical review attempts to take a critical snapshot of the current understanding, focusing more on methods to predict than on a purely synthetic or descriptive approach. We look at models and themes which are largely independent of the exact synthetic method, whether it is deposition, gas-phase condensation, solution-based or hydrothermal synthesis. Some elements are old, dating back to the beginning of the 20th century; some of the pioneering models developed then are still relevant today. Others are newer, a merging of older concepts such as kinetic-Wulff constructions with methods to understand minimum-energy shapes for particles with twins. Overall we find that while there are still many unknowns, the broad framework of understanding and predicting the structure of nanoparticles via diverse Wulff constructions, whether thermodynamic, local-minimum or kinetic, has been exceedingly successful. However, the field is still developing, and there remain many unknowns and new avenues for research, a few of which are suggested towards the end of the review.

221 citations

Journal ArticleDOI
23 Sep 2016 · Science
TL;DR: The combination of AET and atom-tracing algorithms has enabled the determination of the coordinates of individual atoms and point defects in materials with a 3D precision, allowing direct measurements of 3D atomic displacements and the full strain tensor.
Abstract: Crystallography has been fundamental to the development of many fields of science over the last century. However, much of our modern science and technology relies on materials with defects and disorders, and their three-dimensional (3D) atomic structures are not accessible to crystallography. One method capable of addressing this major challenge is atomic electron tomography. By combining advanced electron microscopes and detectors with powerful data analysis and tomographic reconstruction algorithms, it is now possible to determine the 3D atomic structure of crystal defects such as grain boundaries, stacking faults, dislocations, and point defects, as well as to precisely localize the 3D coordinates of individual atoms in materials without assuming crystallinity. Here we review the recent advances and the interdisciplinary science enabled by this methodology. We also outline further research needed for atomic electron tomography to address long-standing unresolved problems in the physical sciences.

197 citations


Cites background from "Compressed sensing electron tomography":

  • ...If the sparse domain can be found, then the 3D structure can be reconstructed from a very small number of images (89)....


  • ...It is not straightforward to choose these parameters, especially in the presence of noise and experimental errors (89)....

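The principle in the first quote, that a signal sparse in some domain can be reconstructed from far fewer measurements than unknowns, can be sketched numerically. This toy uses a random Gaussian measurement matrix and an ISTA (iterative soft-thresholding) solver; the array sizes and regularisation weight are illustrative assumptions, not values from either paper:

```python
import numpy as np

# Toy compressed sensing: recover a k-sparse signal of length m from
# only n << m random linear measurements, via l1-regularised least squares.
rng = np.random.default_rng(0)
m, n, k = 200, 40, 3                       # signal length, measurements, sparsity
x_true = np.zeros(m)
x_true[rng.choice(m, size=k, replace=False)] = [2.0, -1.5, 3.0]
A = rng.standard_normal((n, m)) / np.sqrt(n)
y = A @ x_true                             # the n measurements

lam = 0.01                                 # sparsity weight (hand-tuned here)
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant of the gradient
x = np.zeros(m)
for _ in range(5000):
    x = x - step * (A.T @ (A @ x - y))                         # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)   # soft threshold

err = np.max(np.abs(x - x_true))
```

With 40 measurements for 200 unknowns, the 3-sparse signal is recovered to high accuracy; the same idea underlies CS-ET, where the "measurements" are tilt-series projections.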

Journal ArticleDOI
Abstract: Funding acknowledgements only: NSFC of China [21133010, 21203215, 51221264, 21261160487]; MOST [2011CBA00504]; Strategic Priority Research Program of the Chinese Academy of Sciences [XDA09030103]

186 citations

Journal ArticleDOI
TL;DR: The common basis for 3D characterization is discussed, difficulties and solutions regarding both hard and soft materials research are specified, and it is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated.
Abstract: Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated.

148 citations

References
Journal ArticleDOI

37,017 citations


"Compressed sensing electron tomography" cites this work for a result:

  • ...Similar results were also observed in these simulations using Otsu’s method [59], which seeks the optimal separation between nanoparticles and background based on minimising the intraclass variance in the image histogram, and should be effective when there is a clear distinction between the intensity of the nanoparticles and that of the background....

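The quoted passage uses Otsu's method, which picks the histogram threshold that minimises the intra-class variance (equivalently, maximises the between-class variance). A minimal NumPy sketch; the bimodal test data are invented for illustration:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the intensity threshold maximising between-class variance."""
    hist, edges = np.histogram(image, bins=bins)
    p = hist.astype(float) / hist.sum()            # bin probabilities
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                              # background class weight
    w1 = 1.0 - w0                                  # foreground class weight
    cum_mean = np.cumsum(p * centers)              # unnormalised class-0 mean
    mu_total = cum_mean[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_total * w0 - cum_mean) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0           # empty-class bins
    return centers[np.argmax(between)]

# Bimodal test "image": dark background around 0.2, bright particles around 0.8
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(0.2, 0.05, 5000), rng.normal(0.8, 0.05, 1000)])
t = otsu_threshold(img)
```

As the quote notes, this works well exactly when the nanoparticle and background intensities are clearly separated, i.e. when the histogram is genuinely bimodal.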

Book
D.L. Donoho
01 Jan 2004
TL;DR: It is possible to design n = O(N log m) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients can be extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Abstract: Suppose x is an unknown vector in ℝ^m (a digital image or signal); we plan to measure n general linear functionals of x and then reconstruct. If x is known to be compressible by transform coding with a known transform, and we reconstruct via the nonlinear procedure defined here, the number of measurements n can be dramatically smaller than the size m. Thus, certain natural classes of images with m pixels need only n = O(m^(1/4) log^(5/2)(m)) nonadaptive nonpixel samples for faithful recovery, as opposed to the usual m pixel samples. More specifically, suppose x has a sparse representation in some orthonormal basis (e.g., wavelet, Fourier) or tight frame (e.g., curvelet, Gabor), so the coefficients belong to an ℓ^p ball for 0 < p ≤ 1.

18,609 citations

Book
01 Jan 1998
TL;DR: An introduction to a Transient World and an Approximation Tour of Wavelet Packet and Local Cosine Bases.
Abstract: Introduction to a Transient World. Fourier Kingdom. Discrete Revolution. Time Meets Frequency. Frames. Wavelet Zoom. Wavelet Bases. Wavelet Packet and Local Cosine Bases. An Approximation Tour. Estimations are Approximations. Transform Coding. Appendix A: Mathematical Complements. Appendix B: Software Toolboxes.

17,693 citations

Journal ArticleDOI
TL;DR: In this article, a constrained optimization type of numerical algorithm for removing noise from images is presented, where the total variation of the image is minimized subject to constraints involving the statistics of the noise.

15,225 citations


"Compressed sensing electron tomography" cites this work for background:

  • ...Finite-differences sparsity, evaluated as the ℓ1-norm of the spatial gradients of the image, is often referred to as the total variation (TV)-norm [38]....

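The TV-norm described in the quote, the ℓ1-norm of an image's spatial gradients, can be computed in a few lines. This sketch uses the anisotropic variant (sum of absolute horizontal and vertical differences); the test images are illustrative:

```python
import numpy as np

def tv_norm(img):
    """Anisotropic TV: l1-norm of horizontal and vertical finite differences."""
    dx = np.abs(np.diff(img, axis=1)).sum()   # horizontal gradients
    dy = np.abs(np.diff(img, axis=0)).sum()   # vertical gradients
    return dx + dy

# A piecewise-constant image has a small TV-norm regardless of pixel values,
# which is why TV minimisation favours flat regions with sharp edges.
flat = np.ones((8, 8))
step = np.zeros((8, 8))
step[:, 4:] = 1.0
print(tv_norm(flat))   # 0.0
print(tv_norm(step))   # 8.0 (one unit jump in each of the 8 rows)
```

Minimising this quantity subject to the projection data is what makes TV-regularised CS-ET favour piecewise-constant reconstructions.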

Journal ArticleDOI
TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that, with probability at least 1 − O(N^(−M)), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Abstract: This paper considers the model problem of reconstructing an object from incomplete frequency samples. Consider a discrete-time signal f ∈ ℂ^N and a randomly chosen set of frequencies Ω. Is it possible to reconstruct f from the partial knowledge of its Fourier coefficients on the set Ω? A typical result of this paper is as follows. Suppose that f is a superposition of |T| spikes f(t) = Σ_{τ∈T} f(τ)δ(t − τ) obeying |T| ≤ C_M · (log N)^(−1) · |Ω| for some constant C_M > 0. We do not know the locations of the spikes nor their amplitudes. Then with probability at least 1 − O(N^(−M)), f can be reconstructed exactly as the solution to the ℓ1 minimization problem. In short, exact recovery may be obtained by solving a convex optimization problem. We give numerical values for C_M which depend on the desired probability of success. Our result may be interpreted as a novel kind of nonlinear sampling theorem. In effect, it says that any signal made out of |T| spikes may be recovered by convex programming from almost every set of frequencies of size O(|T| · log N). Moreover, this is nearly optimal in the sense that any method succeeding with probability 1 − O(N^(−M)) would in general require a number of frequency samples at least proportional to |T| · log N. The methodology extends to a variety of other situations and higher dimensions. For example, we show how one can reconstruct a piecewise-constant (one- or two-dimensional) object from incomplete frequency samples, provided that the number of jumps (discontinuities) obeys the condition above, by minimizing other convex functionals such as the total variation of f.

14,587 citations
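The "typical result" of this last paper, exact recovery of a spike train from a random subset of its Fourier coefficients by ℓ1 minimization, can be reproduced in miniature. The sizes, the iterative soft-thresholding solver, and the regularisation weight are illustrative choices, not the paper's:

```python
import numpy as np

# Recover a sparse spike train from a random subset of DFT frequencies.
rng = np.random.default_rng(2)
N, n_spikes, n_freq = 128, 4, 40
f_true = np.zeros(N)
f_true[rng.choice(N, size=n_spikes, replace=False)] = rng.uniform(1.0, 3.0, n_spikes)

omega = rng.choice(N, size=n_freq, replace=False)       # observed frequencies
F = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N) / np.sqrt(N)
A = np.vstack([F.real, F.imag])                         # real-valued measurements
y = A @ f_true

# l1 recovery by iterative soft-thresholding; the rows of F are orthonormal,
# so a unit step size is safe.
lam, x = 0.005, np.zeros(N)
for _ in range(8000):
    x = x - A.T @ (A @ x - y)                           # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)   # soft threshold

err = np.max(np.abs(x - f_true))
```

Here |Ω| = 40 sampled frequencies out of 128 suffice to recover the 4 spikes, consistent with the |T| ≲ |Ω| / log N condition in the abstract; this Fourier-sampling setting is exactly the geometry of tomographic tilt series under the projection-slice theorem.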