Open Access Posted Content
Rotation Invariant Graph Neural Networks using Spin Convolutions
Muhammed Shuaibi, Adeesh Kolluru, Abhishek Das, Aditya Grover, Anuroop Sriram, Zachary W. Ulissi, C. Lawrence Zitnick +6 more
TL;DR
In this paper, the angular information between sets of neighboring atoms in a graph neural network is modeled using a per-edge local coordinate frame and a spin convolution over the remaining degree of freedom.
Abstract
Progress towards the energy breakthroughs needed to combat climate change can be significantly accelerated through the efficient simulation of atomic systems. Simulation techniques based on first principles, such as Density Functional Theory (DFT), are limited in their practical use due to their high computational expense. Machine learning approaches have the potential to approximate DFT in a computationally efficient manner, which could dramatically increase the impact of computational simulations on real-world problems. Approximating DFT poses several challenges. These include accurately modeling the subtle changes in the relative positions and angles between atoms, and enforcing constraints such as rotation invariance or energy conservation. We introduce a novel approach to modeling angular information between sets of neighboring atoms in a graph neural network. Rotation invariance is achieved for the network's edge messages through the use of a per-edge local coordinate frame and a novel spin convolution over the remaining degree of freedom. Two model variants are proposed for the applications of structure relaxation and molecular dynamics. State-of-the-art results are demonstrated on the large-scale Open Catalyst 2020 dataset. Comparisons are also performed on the MD17 and QM9 datasets.
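The per-edge local coordinate frame mentioned in the abstract can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the authors' implementation: the function and variable names are invented here. It builds an orthonormal frame whose first axis is the edge direction; coordinates projected onto that axis, and all interatomic distances, are unchanged by a rigid rotation of the system, while the rotation about the edge axis remains as a free degree of freedom (the one the paper's spin convolution operates over).

```python
import numpy as np

def edge_local_frame(r_st):
    """Build an orthonormal frame whose first axis is the edge vector r_st.

    Hypothetical sketch: the choice of 'helper' fixes the remaining
    rotational degree of freedom arbitrarily; SpinConv instead handles
    it with a spin convolution.
    """
    x = r_st / np.linalg.norm(r_st)
    # Pick any vector not parallel to x to complete an orthonormal basis.
    helper = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(x, helper)) > 0.99:
        helper = np.array([0.0, 1.0, 0.0])
    y = np.cross(helper, x)
    y /= np.linalg.norm(y)
    z = np.cross(x, y)
    return np.stack([x, y, z])  # rows are the frame axes

# Express all atoms in the local frame of edge 0 -> 1.
rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 3))            # 5 atoms in 3D
R = edge_local_frame(pos[1] - pos[0])
local = (pos - pos[0]) @ R.T             # neighbors in the edge frame
```

Projections onto the edge axis (`local[:, 0]`) and all distances are rotation invariant by construction; the other two coordinates rotate with the arbitrary helper choice, which is exactly the remaining degree of freedom the paper addresses.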
Citations
Posted Content
GemNet: Universal Directional Graph Neural Networks for Molecules
TL;DR: GemNet uses two-hop message passing to predict molecular interactions and achieves state-of-the-art performance on the COLL and MD17 molecular dynamics datasets.
Posted Content
Crystal Diffusion Variational Autoencoder for Periodic Material Generation
TL;DR: In this paper, the authors propose a crystal diffusion variational autoencoder (CDVAE) that captures the physical inductive bias of material stability by learning from the data distribution of stable materials.
Posted Content
3D-Transformer: Molecular Representation with Transformer in 3D Space
TL;DR: 3D-Transformer proposes a multi-scale self-attention module that exploits local fine-grained patterns with increasing contextual scales to handle the non-uniformity of interatomic distances.
Posted Content
Frame Averaging for Invariant and Equivariant Network Design.
TL;DR: Frame Averaging (FA) is a general-purpose, systematic framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
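The frame-averaging idea summarized above can be sketched concretely: instead of averaging a function over the whole rotation group, average it over a small input-dependent frame, e.g. the sign-flipped PCA axes of a point cloud. The following NumPy sketch is illustrative only (names and details are invented, not the paper's code); it makes an arbitrary function of a point cloud invariant to rotations and reflections.

```python
import numpy as np
from itertools import product

def pca_frame(points):
    """Return the 8 orthogonal frames given by PCA axes with all sign flips.

    Averaging over sign flips also absorbs the sign ambiguity of the
    numerically computed eigenvectors.
    """
    centered = points - points.mean(axis=0)
    _, vecs = np.linalg.eigh(centered.T @ centered)  # columns = PCA axes
    return [vecs * np.array(signs) for signs in product([1.0, -1.0], repeat=3)]

def frame_average(f, points):
    """Make f O(3)-invariant by averaging it over the frame elements."""
    centered = points - points.mean(axis=0)
    return np.mean([f(centered @ F) for F in pca_frame(points)], axis=0)

# A deliberately non-invariant function becomes invariant after averaging.
rng = np.random.default_rng(1)
pts = rng.normal(size=(6, 3))
f = lambda P: np.sum(np.maximum(P[:, 0], 0.0) * P[:, 1] ** 2)
inv = frame_average(f, pts)
```

Because the PCA axes rotate with the input (for generic, non-degenerate point clouds), the set of frame-projected clouds is unchanged under a rotation, so the average is invariant at the cost of only 8 evaluations rather than an integral over SO(3).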
References
Journal Article
Towards exact molecular dynamics simulations with machine-learned force fields.
Stefan Chmiela, Huziel E. Sauceda, Klaus-Robert Müller, Alexandre Tkatchenko +5 more
TL;DR: A flexible machine-learned force field with high accuracy for molecular dynamics simulations of flexible molecules with up to a few dozen atoms is developed, and insights into the dynamical behavior of these molecules are provided.
Posted Content
Tackling Climate Change with Machine Learning
David Rolnick, Priya L. Donti, Lynn H. Kaack, K. Kochanski, Alexandre Lacoste, Kris Sankaran, Andrew S. Ross, Nikola Milojevic-Dupont, Natasha Jaques, Anna Waldman-Brown, Alexandra Luccioni, Tegan Maharaj, Evan D. Sherwin, S. Karthik Mukkavilli, Konrad P. Kording, Carla P. Gomes, Andrew Y. Ng, Demis Hassabis, John Platt, Felix Creutzig, Jennifer Chayes, Yoshua Bengio +21 more
TL;DR: High-impact problems, from smart grids to disaster management, where existing gaps can be filled by ML in collaboration with other fields are identified as part of the global effort against climate change.
Journal Article
Mixture of experts: a literature survey
Saeed Masoudnia, Reza Ebrahimpour +1 more
TL;DR: A categorisation of the mixture-of-experts (ME) literature based on how the problem space is partitioned is presented: the first group, the mixture of implicitly localised experts (MILE), partitions the space through a tacit competitive process between the experts, while the second, the mixture of explicitly localised experts (MELE), uses pre-specified clusters.
Posted Content
SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
Simon Batzner, Tess Smidt, Lixin Sun, Jonathan P. Mailoa, Mordechai Kornbluth, Nicola Molinari, Boris Kozinsky +6 more
TL;DR: The NequIP method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency, challenging the widely held belief that deep neural networks require massive training sets.
Proceedings Article
SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
Kristof T. Schütt, Pieter-Jan Kindermans, Huziel E. Sauceda, Stefan Chmiela, Alexandre Tkatchenko, Klaus-Robert Müller +5 more
TL;DR: This work proposes continuous-filter convolutional layers that model local correlations without requiring the data to lie on a grid, and obtains a joint model for the total energy and interatomic forces that follows fundamental quantum-chemical principles.
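The continuous-filter convolution summarized above can be sketched as follows: because atoms sit at arbitrary (off-grid) positions, the filter applied between two atoms is generated from their continuous distance by a small network, rather than looked up at discrete grid offsets. This NumPy sketch is illustrative only (weights and function names are placeholders, not SchNet's actual architecture); since it depends on positions only through distances, its output is rotation invariant.

```python
import numpy as np

def rbf_expand(d, centers, gamma=10.0):
    """Expand a scalar distance into radial basis features."""
    return np.exp(-gamma * (d - centers) ** 2)

def cfconv(x, pos, W1, W2, centers):
    """One continuous-filter convolution layer (hypothetical sketch).

    For each atom pair, a filter is generated from the interatomic
    distance and applied to the neighbor's features element-wise.
    """
    out = np.zeros_like(x)
    n = x.shape[0]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(pos[i] - pos[j])
            filt = np.tanh(rbf_expand(d, centers) @ W1) @ W2  # filter from distance
            out[i] += x[j] * filt  # modulate neighbor features, sum over neighbors
    return out

# Toy setup: 4 atoms, 8 features, 16 radial basis functions.
rng = np.random.default_rng(2)
n, F, K, H = 4, 8, 16, 8
x = rng.normal(size=(n, F))
pos = rng.normal(size=(n, 3))
centers = np.linspace(0.0, 3.0, K)
W1, W2 = rng.normal(size=(K, H)), rng.normal(size=(H, F))
out = cfconv(x, pos, W1, W2, centers)
```

A real implementation would vectorize over a neighbor list with a distance cutoff; the double loop here just keeps the filter-generating idea visible.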
Related Papers (5)
Machine Learning Adaptive Basis Sets for Efficient Large Scale Density Functional Theory Simulation.
Ole Schütt, Joost VandeVondele +1 more