Open Access Journal Article

SchNet - A deep learning architecture for molecules and materials.

TL;DR
SchNet, as presented in this paper, is a deep learning architecture specifically designed to model atomistic systems by making use of continuous-filter convolutional layers; the model learns chemically plausible embeddings of atom types across the periodic table.
Abstract
Deep learning has led to a paradigm shift in artificial intelligence, including web, text, and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics. Machine learning, in general, and deep learning, in particular, are ideally suited for representing quantum-mechanical interactions, enabling us to model nonlinear potential-energy surfaces or enhancing the exploration of chemical compound space. Here we present the deep learning architecture SchNet that is specifically designed to model atomistic systems by making use of continuous-filter convolutional layers. We demonstrate the capabilities of SchNet by accurately predicting a range of properties across chemical space for molecules and materials, where our model learns chemically plausible embeddings of atom types across the periodic table. Finally, we employ SchNet to predict potential-energy surfaces and energy-conserving force fields for molecular dynamics simulations of small molecules and perform an exemplary study on the quantum-mechanical properties of the C20 fullerene that would have been infeasible with regular ab initio molecular dynamics.
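The continuous-filter convolution at the core of the architecture can be illustrated with a short sketch. The following PyTorch snippet is a minimal, illustrative implementation assuming the ingredients described in the paper (a Gaussian expansion of interatomic distances, a small filter-generating network, and a shifted-softplus activation); the layer sizes, grid spacing, and the dense all-pairs forward pass are assumptions made for readability, not the authors' reference code.

```python
import torch
import torch.nn as nn


class ShiftedSoftplus(nn.Module):
    """Smooth activation ln(0.5*exp(x) + 0.5) used in SchNet-style networks."""
    def forward(self, x):
        return nn.functional.softplus(x) - 0.6931471805599453  # ln 2


class CFConv(nn.Module):
    """Sketch of a continuous-filter convolution over all atom pairs.

    Each atom feature vector x_j is multiplied element-wise with a filter
    W(d_ij) generated from the interatomic distance and summed over j.
    """
    def __init__(self, n_features=64, n_gaussians=300, cutoff=5.0, gamma=10.0):
        super().__init__()
        # radial basis: Gaussians centered on a uniform grid of distances
        self.register_buffer("centers", torch.linspace(0.0, cutoff, n_gaussians))
        self.gamma = gamma
        # filter-generating network mapping expanded distances to filter values
        self.filter_net = nn.Sequential(
            nn.Linear(n_gaussians, n_features), ShiftedSoftplus(),
            nn.Linear(n_features, n_features),
        )

    def forward(self, x, positions):
        # x: (n_atoms, n_features), positions: (n_atoms, 3)
        d = torch.cdist(positions, positions)                    # (n_atoms, n_atoms)
        rbf = torch.exp(-self.gamma * (d.unsqueeze(-1) - self.centers) ** 2)
        W = self.filter_net(rbf)                                 # (n_atoms, n_atoms, n_features)
        # continuous-filter convolution: x_i' = sum_j x_j * W(d_ij)
        # (cutoff and self-interaction handling omitted for brevity)
        return torch.einsum("jf,ijf->if", x, W)
```

In the full architecture, such convolutions sit inside interaction blocks together with atom-wise dense layers, starting from learned embeddings of the atom types and ending in atom-wise outputs that are pooled into the molecular property.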



Citations
Journal Article

Machine learning and the physical sciences

TL;DR: This article selectively reviews recent research at the interface between machine learning and the physical sciences, including conceptual developments in ML motivated by physical insights, applications of machine learning techniques to several domains in physics, and cross-fertilization between the two fields.
Journal Article

Recent advances and applications of machine learning in solid-state materials science

TL;DR: This review provides a comprehensive overview and analysis of the most recent research in machine learning principles, algorithms, descriptors, and databases in materials science, and proposes solutions and future research paths for various challenges in computational materials science.
Journal Article

Quantum-Chemical Insights from Deep Tensor Neural Networks

TL;DR: An efficient deep learning approach is developed that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems, and unifies concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate chemical space predictions.
Journal Article

Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals

TL;DR: This work develops, for the first time, universal MatErials Graph Network (MEGNet) models for accurate property prediction in both molecules and crystals and demonstrates the transfer learning of elemental embeddings from a property model trained on a larger data set to accelerate the training of property models with smaller amounts of data.
Journal Article

A Transdisciplinary Review of Deep Learning Research and Its Relevance for Water Resources Scientists

TL;DR: It is argued that DL can help address several major new and old challenges facing research in water sciences such as interdisciplinarity, data discoverability, hydrologic scaling, equifinality, and needs for parameter regionalization.
References
Journal Article

Generalized Gradient Approximation Made Simple

TL;DR: A simple derivation of a simple GGA is presented, in which all parameters (other than those in LSD) are fundamental constants, and only general features of the detailed construction underlying the Perdew-Wang 1991 (PW91) GGA are invoked.
Journal Article

Backpropagation applied to handwritten zip code recognition

TL;DR: This paper demonstrates how constraints from the task domain can be integrated into a backpropagation network through the architecture of the network; the approach is successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service.
Journal Article

Commentary: The Materials Project: A materials genome approach to accelerating materials innovation

TL;DR: The Materials Project (www.materialsproject.org) is a core program of the Materials Genome Initiative that uses high-throughput computing to uncover the properties of all known inorganic materials as discussed by the authors.
Journal Article

Accurate molecular van der Waals interactions from ground-state electron density and free-atom reference data

TL;DR: It is shown that the effective atomic C6 coefficients depend strongly on the bonding environment of an atom in a molecule; the van der Waals radii and the damping function in the C6R^-6 correction method for density-functional theory calculations are derived from the ground-state electron density and free-atom reference data.
Journal Article

On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation.

TL;DR: This work proposes a general solution to the problem of understanding classification decisions by pixel-wise decomposition of nonlinear classifiers, introducing a methodology that makes it possible to visualize the contributions of single pixels to predictions for kernel-based classifiers over Bag of Words features and for multilayered neural networks.
Frequently Asked Questions (16)
Q1. What is the effect of the atom embeddings on the properties of graphite?

Since the initial atom embeddings are obviously equivariant to the order of atoms, since atom-wise layers are applied independently to each atom, and since continuous-filter convolutions sum over all neighboring atoms, indexing equivariance is retained in the atom-wise representations.

Note that while the single-atom filters are circular due to the rotational invariance, the periodic filters become rotationally equivariant with respect to the orientation of the lattice, which still keeps the property prediction rotationally invariant. 

SchNet, the architecture presented in this paper, is a variant of deep tensor neural networks (DTNNs) that can learn a representation from first principles that adapts to the task and scale at hand, from property prediction across chemical compound space to force fields in the configurational space of single molecules.

The authors have presented the deep learning architecture SchNet, which can be applied to a variety of applications ranging from the prediction of chemical properties for diverse datasets of molecules and materials to highly accurate predictions of potential-energy surfaces and energy-conserving force fields. This gives rise to the possibility of encoding known quantum-chemical constraints and symmetries within the model without losing the flexibility of a neural network. This is crucial in order to accurately represent, e.g., the full potential-energy surface and, in particular, its anharmonic behavior. These encouraging results will guide future work, such as studies of larger molecules and periodic systems as well as further developments toward interpretable deep learning architectures to assist chemistry research.

In solids, such local chemical potentials could be used to understand the formation and distribution of defects, such as vacancies and interstitials.

The filter-generating network determines how interactions between atoms are modeled and can be used to constrain the model and include chemical knowledge.
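As a concrete, hypothetical example of such a constraint, the generated filters can be damped by a smooth cutoff function so that atoms beyond a chosen radius cannot contribute to the convolution; the Behler-style cosine form below is an illustrative choice, not necessarily the one used in the paper.

```python
import math
import torch

def cosine_cutoff(d, cutoff=5.0):
    """Smooth switching function: 1 at d = 0, decaying to 0 at the cutoff radius."""
    fc = 0.5 * (torch.cos(math.pi * d / cutoff) + 1.0)
    return fc * (d < cutoff)

# A generated filter W(d_ij) would simply be multiplied by cosine_cutoff(d_ij),
# encoding the chemical prior that interactions are local.
```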

Since there is a maximum number of atoms located within a given cutoff, the computational cost of a training step scales linearly with the system size if the authors precompute the indices of nearby atoms.
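A minimal sketch of that precomputation, assuming plain NumPy and non-periodic structures (the helper name is hypothetical): the neighbor indices are computed once per structure, so each training step only touches a bounded number of neighbors per atom.

```python
import numpy as np

def neighbor_indices(positions, cutoff=5.0):
    """Precompute, for each atom, the indices of all atoms within the cutoff."""
    positions = np.asarray(positions, dtype=float)        # (n_atoms, 3)
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)                  # (n_atoms, n_atoms)
    np.fill_diagonal(dist, np.inf)                        # exclude self-interaction
    return [np.flatnonzero(row < cutoff) for row in dist]

# Since the number of atoms that fit inside a fixed cutoff is bounded,
# the per-atom work in each training step stays constant and the total
# cost of a step grows linearly with the number of atoms.
```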

The model requires far fewer epochs to converge; e.g., with 110k training examples, the required number of epochs drops from 2400 with T = 2 to less than 750 with T = 6.

Due to the linearity of the convolution, the authors are therefore able to apply the periodic boundary conditions (PBCs) directly to the filter, accurately describing the atom interactions while keeping invariance to the choice of the unit cell.
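In equation form, the idea can be sketched roughly as follows (the notation here is illustrative and may differ from the paper): the filter applied to a pair of atoms in the unit cell is the sum of the single-atom filter over all lattice translations that fall within the cutoff, and the convolution then uses this periodic filter.

```latex
% periodic filter: sum the single-atom filter over lattice translations n
\tilde{W}^{l}(\mathbf{r}_j - \mathbf{r}_i)
  = \sum_{\mathbf{n}\,:\,\lVert \mathbf{r}_j + \mathbf{n} - \mathbf{r}_i \rVert < r_\mathrm{cut}}
    W^{l}(\mathbf{r}_j + \mathbf{n} - \mathbf{r}_i),
\qquad
\mathbf{x}_i^{\,l+1} = \sum_{j} \mathbf{x}_j^{\,l} \circ \tilde{W}^{l}(\mathbf{r}_j - \mathbf{r}_i)
```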

The authors use 20k C20 reference calculations as the training set, 4.5k examples for early stopping, and report the test error on the remaining data. 

The authors employ SchNet to predict formation energies for bulk crystals using 69 640 structures and reference calculations from the Materials Project (MP) repository. 

The authors find that the training is more stable when normalizing the filter response x_i^(l+1) by the number of atoms within the cutoff range.
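A hedged sketch of that normalization (all names are illustrative): after summing the filtered neighbor features, the result is divided by the number of atoms inside the cutoff.

```python
import torch

def normalized_cfconv_response(x_neighbors, filters, n_in_cutoff):
    """Sum filtered neighbor features and normalize by the neighbor count.

    x_neighbors: (n_atoms, max_neighbors, n_features) gathered neighbor features
    filters:     (n_atoms, max_neighbors, n_features) filter values W(d_ij)
    n_in_cutoff: (n_atoms,) number of atoms within the cutoff of each atom
    """
    response = (x_neighbors * filters).sum(dim=1)
    return response / n_in_cutoff.clamp(min=1).unsqueeze(-1).to(response.dtype)
```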

SchNet performs better while using the combined loss with energies and forces on 1000 reference calculations than training on energies of 50 000 examples. 
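A minimal sketch of such a combined objective, assuming a model that maps atom types and positions to a total energy (the call signature and the trade-off weight rho are placeholders): forces are obtained as the negative gradient of the predicted energy, which keeps the learned force field energy-conserving.

```python
import torch

def energy_force_loss(model, atom_types, positions, E_ref, F_ref, rho=0.01):
    """Combined squared-error loss on energies and forces (illustrative)."""
    positions = positions.clone().requires_grad_(True)
    E_pred = model(atom_types, positions)           # predicted total energy
    # forces as the negative gradient of the energy w.r.t. atomic positions;
    # create_graph=True so the force error can be backpropagated during training
    F_pred = -torch.autograd.grad(E_pred.sum(), positions, create_graph=True)[0]
    loss_energy = (E_pred - E_ref).pow(2).mean()
    loss_forces = (F_pred - F_ref).pow(2).mean()
    return rho * loss_energy + loss_forces
```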

While the authors have followed a data-driven approach in which only basic invariances are incorporated in the filters, careful design of the filter-generating network makes it possible to incorporate further chemical knowledge in the network.

The distributions of the errors of all predicted properties are shown in Appendix A. Extending SchNet with interpretable, property-specific output layers, e.g., for the dipole moment [57], is the subject of future work.

The data are randomly split into 60,000 training examples, a validation set of 4500 examples, and the remaining data as the test set.

Trending Questions (1)
What is the input to the SchNet model?

The inputs to the SchNet model are the atom types and positions of the atoms; within the continuous-filter convolutions, the filter-generating network operates on the distance between atom i and its neighbor j, i.e., the length of the vector pointing from atom i to atom j, which keeps the prediction rotationally invariant.
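In symbols, the quantity that enters the filter-generating network is the interatomic distance, which can additionally be expanded on a grid of Gaussians; the expansion below is illustrative.

```latex
% distance from the vector pointing from atom i to its neighbor j,
% followed by an (illustrative) Gaussian radial-basis expansion
d_{ij} = \lVert \mathbf{r}_j - \mathbf{r}_i \rVert,
\qquad
e_k(d_{ij}) = \exp\!\bigl(-\gamma\,(d_{ij}-\mu_k)^2\bigr),
\quad k = 1,\dots,K
```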