Journal ArticleDOI

Big data and deep data in scanning and electron microscopies: deriving functionality from multidimensional data sets.

TL;DR: Several recent applications of big and deep data analysis methods are reviewed, showing how multidimensional structural and functional data can be visualized, compressed, and translated into physically and chemically relevant information.
Abstract: The development of electron and scanning probe microscopies in the second half of the twentieth century has produced spectacular images of the internal structure and composition of matter with nanometer, molecular, and atomic resolution. Largely, this progress was enabled by computer-assisted methods of microscope operation, data acquisition, and analysis. Advances in imaging technology in the beginning of the twenty-first century have opened the proverbial floodgates on the availability of high-veracity information on structure and functionality. From the hardware perspective, high-resolution imaging methods now routinely resolve atomic positions with approximately picometer precision, allowing for quantitative measurements of individual bond lengths and angles. Similarly, functional imaging often leads to multidimensional data sets containing partial or full information on properties of interest, acquired as a function of multiple parameters (time, temperature, or other external stimuli). Here, we review several recent applications of the big and deep data analysis methods to visualize, compress, and translate this multidimensional structural and functional data into physically and chemically relevant information.
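For a concrete sense of how such multidimensional data sets are compressed in practice, below is a minimal sketch applying principal component analysis to a synthetic spectral-image array (a common first step with hyperspectral microscopy data). The array shape, component count, and random data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: compressing a 3-D spectral-imaging data set (x, y, spectrum)
# with PCA. The shapes and component count are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
nx, ny, n_channels = 64, 64, 256               # pixels and spectral channels
data = rng.normal(size=(nx, ny, n_channels))   # stand-in for measured spectra

# Flatten the spatial dimensions: each pixel becomes one observation.
X = data.reshape(nx * ny, n_channels)

pca = PCA(n_components=4)                 # keep a few dominant components
scores = pca.fit_transform(X)             # per-pixel component scores
loading_maps = scores.reshape(nx, ny, 4)  # reshape back for visualization

print(pca.explained_variance_ratio_)      # signal fraction per component
```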


Citations
Journal ArticleDOI
TL;DR: In this paper, a review of recent advances in the measurement and modeling of thermophysical properties at the nanoscale (from the solid state to colloids) is presented, including thermal conductivity, dynamic viscosity, specific heat capacity, and density.

322 citations

Journal ArticleDOI
TL;DR: New opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest are discussed.
Abstract: Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

295 citations

Journal ArticleDOI
18 Jun 2020-ACS Nano
TL;DR: The technological challenges and opportunities of current bio/chemical sensors and analytical tools are reviewed by critically analyzing the bottlenecks that have hindered the deployment of advanced sensing technologies during pandemics, and holistic insights are offered into the rapid translation of sensing technologies, policies, ethical issues, and technology adoption.
Abstract: Biosensors and nanoscale analytical tools have shown huge growth in the literature in the past 20 years, with a large number of reports on the topic of 'ultrasensitive', 'cost-effective', and 'early detection' tools with a potential of 'mass-production' cited on the Web of Science. Yet none of these tools are commercially available in the market or practically viable for mass production and use in pandemic diseases such as coronavirus disease 2019 (COVID-19). In this context, we review the technological challenges and opportunities of current bio/chemical sensors and analytical tools by critically analyzing the bottlenecks which have hindered the implementation of advanced sensing technologies in pandemic diseases. We also describe COVID-19 in brief by comparing it with other pandemic strains such as those of severe acute respiratory syndrome (SARS) and Middle East respiratory syndrome (MERS) for the identification of features that enable biosensing. Moreover, we discuss visualization and characterization tools that can potentially be used not only for sensing applications but also to assist in speeding up the drug discovery and vaccine development process. Furthermore, we discuss the emerging monitoring mechanism, namely wastewater-based epidemiology, for early warning of the outbreak, focusing on sensors for rapid and on-site analysis of SARS-CoV-2 in sewage. To conclude, we provide holistic insights into challenges associated with the quick translation of sensing technologies, policies, ethical issues, technology adoption, and an overall outlook on the role of sensing technologies in pandemics.

277 citations

References
Book
16 Jul 1998
TL;DR: Thorough, well-organized, and completely up to date, this book examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks.
Abstract: From the Publisher: This book represents the most comprehensive treatment available of neural networks from an engineering perspective. Thorough, well-organized, and completely up to date, it examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks. Written in a concise and fluid manner, by a foremost engineering textbook author, to make the material more accessible, this book is ideal for professional engineers and graduate students entering this exciting field. Computer experiments, problems, worked examples, a bibliography, photographs, and illustrations reinforce key concepts.
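As a rough illustration of the back-propagation learning the book covers, here is a minimal NumPy sketch training a one-hidden-layer network on XOR. The architecture, learning rate, and iteration count are illustrative choices, not drawn from the book.

```python
# A minimal back-propagation sketch: one hidden sigmoid layer learning XOR.
# All sizes and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)              # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)   # output error (squared loss)
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated back to hidden layer
    W2 -= 0.5 * (h.T @ d_out)             # gradient-descent updates
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h)
    b1 -= 0.5 * d_h.sum(axis=0)

print(out.ravel().round(2))               # should approach [0, 1, 1, 0]
```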

29,130 citations

01 Jan 1967
TL;DR: The k-means procedure partitions an N-dimensional population into k sets on the basis of a sample; the k-means concept generalizes the ordinary sample mean, and the resulting partitions are shown to be reasonably efficient in the sense of within-class variance.
Abstract: The main purpose of this paper is to describe a process for partitioning an N-dimensional population into k sets on the basis of a sample. The process, which is called 'k-means,' appears to give partitions which are reasonably efficient in the sense of within-class variance. That is, if p is the probability mass function for the population, $S = \{S_1, S_2, \ldots, S_k\}$ is a partition of $E^N$, and $u_i$, $i = 1, 2, \ldots, k$, is the conditional mean of p over the set $S_i$, then $W^2(S) = \sum_{i=1}^{k} \int_{S_i} \lvert z - u_i \rvert^2 \, dp(z)$ tends to be low for the partitions S generated by the method. We say 'tends to be low,' primarily because of intuitive considerations, corroborated to some extent by mathematical analysis and practical computational experience. Also, the k-means procedure is easily programmed and is computationally economical, so that it is feasible to process very large samples on a digital computer. Possible applications include methods for similarity grouping, nonlinear prediction, approximating multivariate distributions, and nonparametric tests for independence among several variables. In addition to suggesting practical classification methods, the study of k-means has proved to be theoretically interesting. The k-means concept represents a generalization of the ordinary sample mean, and one is naturally led to study the pertinent asymptotic behavior, the object being to establish some sort of law of large numbers for the k-means. This problem is sufficiently interesting, in fact, for us to devote a good portion of this paper to it. The k-means are defined in section 2.1, and the main results which have been obtained on the asymptotic behavior are given there. The rest of section 2 is devoted to the proofs of these results. Section 3 describes several specific possible applications, and reports some preliminary results from computer experiments conducted to explore the possibilities inherent in the k-means idea. The extension to general metric spaces is indicated briefly in section 4. The original point of departure for the work described here was a series of problems in optimal classification (MacQueen [9]) which represented special
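A minimal NumPy sketch of the k-means iteration described above, alternating nearest-mean assignment with mean updates so that the within-class variance $W^2(S)$ decreases. The synthetic 2-D data and the choice of k are illustrative assumptions.

```python
# Lloyd-style k-means sketch mirroring the paper's objective:
# partition points so sum_i sum_{z in S_i} |z - u_i|^2 is low.
import numpy as np

rng = np.random.default_rng(0)
# Three synthetic 2-D clusters (illustrative stand-in data).
Z = np.vstack([rng.normal(loc=c, size=(100, 2)) for c in (-3, 0, 3)])
k = 3
centers = Z[rng.choice(len(Z), size=k, replace=False)]  # init from data points

for _ in range(50):
    # Assign each point to its nearest current mean.
    labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(-1), axis=1)
    # Recompute each mean u_i over its set S_i.
    new_centers = np.array([Z[labels == i].mean(axis=0) for i in range(k)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

W2 = sum(((Z[labels == i] - centers[i]) ** 2).sum() for i in range(k))
print(centers, W2)   # final means and within-class variance
```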

24,320 citations


"Big data and deep data in scanning ..." refers background in this paper

  • ...where μi is the mean of points in Si, is minimized [49,50]....


Journal ArticleDOI
22 Dec 2000-Science
TL;DR: Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
Abstract: Many areas of science depend on exploratory data analysis and visualization. The need to analyze large amounts of multivariate data raises the fundamental problem of dimensionality reduction: how to discover compact representations of high-dimensional data. Here, we introduce locally linear embedding (LLE), an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs. Unlike clustering methods for local dimensionality reduction, LLE maps its inputs into a single global coordinate system of lower dimensionality, and its optimizations do not involve local minima. By exploiting the local symmetries of linear reconstructions, LLE is able to learn the global structure of nonlinear manifolds, such as those generated by images of faces or documents of text.
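A brief sketch of LLE in practice, using scikit-learn's implementation on the classic "Swiss roll" manifold. The data set and parameter choices are illustrative, not taken from the paper.

```python
# LLE sketch: embed a 3-D nonlinear manifold into 2-D while
# preserving local neighborhoods. Parameters are illustrative.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, color = make_swiss_roll(n_samples=1500, random_state=0)

lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2)
X_2d = lle.fit_transform(X)     # one global 2-D coordinate pair per point

print(X_2d.shape)               # (1500, 2)
print(lle.reconstruction_error_)
```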

15,106 citations


Additional excerpts

  • ..., are less scalable [99-102]....


Journal ArticleDOI
22 Dec 2000-Science
TL;DR: An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.
Abstract: Scientists working with large volumes of high-dimensional data, such as global climate patterns, stellar spectra, or human gene distributions, regularly confront the problem of dimensionality reduction: finding meaningful low-dimensional structures hidden in their high-dimensional observations. The human brain confronts the same problem in everyday perception, extracting from its high-dimensional sensory inputs-30,000 auditory nerve fibers or 10(6) optic nerve fibers-a manageably small number of perceptually relevant features. Here we describe an approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set. Unlike classical techniques such as principal component analysis (PCA) and multidimensional scaling (MDS), our approach is capable of discovering the nonlinear degrees of freedom that underlie complex natural observations, such as human handwriting or images of a face under different viewing conditions. In contrast to previous algorithms for nonlinear dimensionality reduction, ours efficiently computes a globally optimal solution, and, for an important class of data manifolds, is guaranteed to converge asymptotically to the true structure.
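A companion sketch for the Isomap approach described above (neighborhood graph, geodesic distances, then classical MDS), again via scikit-learn; data and parameters are illustrative assumptions.

```python
# Isomap sketch: k-NN graph -> graph geodesic distances -> MDS embedding.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1500, random_state=0)

iso = Isomap(n_neighbors=10, n_components=2)
X_2d = iso.fit_transform(X)     # "unrolled" global 2-D coordinates

print(X_2d.shape)               # (1500, 2)
```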

13,652 citations


Additional excerpts

  • ..., are less scalable [99-102]....


Journal ArticleDOI

11,905 citations