Open Access Posted Content

Rotation Invariant Graph Neural Networks using Spin Convolutions

TLDR
In this paper, the angular information between sets of neighboring atoms in a graph neural network is modeled using a per-edge local coordinate frame and a spin convolution over the remaining rotational degree of freedom.
Abstract
Progress towards the energy breakthroughs needed to combat climate change can be significantly accelerated through the efficient simulation of atomic systems. Simulation techniques based on first principles, such as Density Functional Theory (DFT), are limited in their practical use due to their high computational expense. Machine learning approaches have the potential to approximate DFT in a computationally efficient manner, which could dramatically increase the impact of computational simulations on real-world problems. Approximating DFT poses several challenges. These include accurately modeling the subtle changes in the relative positions and angles between atoms, and enforcing constraints such as rotation invariance or energy conservation. We introduce a novel approach to modeling angular information between sets of neighboring atoms in a graph neural network. Rotation invariance is achieved for the network's edge messages through the use of a per-edge local coordinate frame and a novel spin convolution over the remaining degree of freedom. Two model variants are proposed for the applications of structure relaxation and molecular dynamics. State-of-the-art results are demonstrated on the large-scale Open Catalyst 2020 dataset. Comparisons are also performed on the MD17 and QM9 datasets.
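The core idea in the abstract — rotation invariance via a per-edge local coordinate frame, with one rotational degree of freedom left for the spin convolution — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the function names (`edge_frame`, `local_coords`) are illustrative, and the learned spin convolution itself is only alluded to in the comments. The sketch shows that, once neighbors are expressed in a frame aligned with the edge, their radial and axial coordinates are exactly invariant under global rotations and translations, while only a common azimuthal offset remains.

```python
import numpy as np

def edge_frame(pos_s, pos_t):
    """Orthonormal frame whose z-axis points along the edge s -> t.

    Fixing the z-axis removes two of the three rotational degrees of
    freedom; the choice of x/y axes (via an arbitrary reference vector)
    leaves one free rotation about the edge axis.
    """
    z = pos_t - pos_s
    z = z / np.linalg.norm(z)
    # Any reference vector not parallel to z completes the frame.
    ref = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.stack([x, y, z])  # rows are the frame axes

def local_coords(pos_s, pos_t, neighbors):
    """Cylindrical coordinates of neighbors in the per-edge frame.

    Radius r and height h are fully rotation/translation invariant.
    The azimuth phi is invariant only up to a common offset -- the one
    remaining degree of freedom that a spin convolution (a convolution
    over this angle) would integrate out in a full model.
    """
    F = edge_frame(pos_s, pos_t)
    local = (neighbors - pos_s) @ F.T  # project into the edge frame
    r = np.hypot(local[:, 0], local[:, 1])
    h = local[:, 2]
    phi = np.arctan2(local[:, 1], local[:, 0])
    return r, h, phi
```

Applying any rigid motion (random rotation plus translation) to all atoms leaves `r` and `h` unchanged and shifts every `phi` by the same constant, which is why per-edge features built from this frame can be made rotation invariant.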


Citations
Posted Content

GemNet: Universal Directional Graph Neural Networks for Molecules

TL;DR: GemNet as discussed by the authors uses two-hop message passing to predict molecular interactions and achieves state-of-the-art performance on the COLL and MD17 molecular dynamics datasets.
Posted Content

Crystal Diffusion Variational Autoencoder for Periodic Material Generation

TL;DR: In this paper, the authors propose a crystal diffusion variational autoencoder (CDVAE) that captures the physical inductive bias of material stability by learning from the data distribution of stable materials.
Posted Content

3D-Transformer: Molecular Representation with Transformer in 3D Space

TL;DR: 3D-Transformer as mentioned in this paper proposes a multi-scale self-attention module that exploits local fine-grained patterns with increasing contextual scales to deal with the non-uniformity of interatomic distances.
Posted Content

Frame Averaging for Invariant and Equivariant Network Design.

TL;DR: Frame Averaging (FA) as discussed by the authors is a general purpose and systematic framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
References
Journal Article

Quantum chemistry structures and properties of 134 kilo molecules

TL;DR: This data set provides quantum chemical properties for a relevant, consistent, and comprehensive chemical space of small organic molecules that may serve the benchmarking of existing methods, development of new methods, such as hybrid quantum mechanics/machine learning, and systematic identification of structure-property relationships.
Journal Article

Graph Neural Networks: A Review of Methods and Applications

TL;DR: In this paper, the authors propose a general design pipeline for GNN models and discuss the variants of each component, systematically categorize the applications, and propose four open problems for future research.
Journal Article

Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties.

TL;DR: A crystal graph convolutional neural networks framework to directly learn material properties from the connection of atoms in the crystal, providing a universal and interpretable representation of crystalline materials.
Journal Article

SchNet - A deep learning architecture for molecules and materials.

TL;DR: SchNet as mentioned in this paper is a deep learning architecture specifically designed to model atomistic systems by making use of continuous-filter convolutional layers, where the model learns chemically plausible embeddings of atom types across the periodic table.
Journal Article

Quantum-chemical insights from deep tensor neural networks.

TL;DR: In this article, a deep tensor neural network is used to predict atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure.