Simon Batzner
Researcher at Harvard University
Publications - 17
Citations - 874
Simon Batzner is an academic researcher at Harvard University. He has contributed to research in the topics of computer science and artificial neural networks. He has an h-index of 6 and has co-authored 9 publications receiving 180 citations. Previous affiliations of Simon Batzner include the Massachusetts Institute of Technology.
Papers
Posted Content
SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
Simon Batzner, Tess Smidt, Lixin Sun, Jonathan P. Mailoa, Mordechai Kornbluth, Nicola Molinari, Boris Kozinsky +6 more
TL;DR: The NequIP method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency, challenging the widely held belief that deep neural networks require massive training sets.
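The core symmetry property behind methods like NequIP can be illustrated with a toy check (this is an illustrative sketch in NumPy, not the NequIP implementation): if the predicted energy depends only on interatomic distances, it is invariant under rotations, and the forces derived from it transform covariantly. The `pair_energy` potential below is a hypothetical stand-in for a learned model.

```python
import numpy as np

def pair_energy(pos, k=1.0, r0=1.0):
    # Toy invariant energy: depends only on interatomic distances.
    n = len(pos)
    e = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(pos[i] - pos[j])
            e += 0.5 * k * (d - r0) ** 2
    return e

def forces(pos, eps=1e-5):
    # Numerical forces F = -dE/dx via central differences.
    f = np.zeros_like(pos)
    for i in range(pos.shape[0]):
        for a in range(3):
            p = pos.copy(); p[i, a] += eps
            m = pos.copy(); m[i, a] -= eps
            f[i, a] = -(pair_energy(p) - pair_energy(m)) / (2 * eps)
    return f

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
pos = rng.normal(size=(4, 3))

# Energy is invariant; forces rotate with the frame: F(pos @ Q.T) = F(pos) @ Q.T
assert np.isclose(pair_energy(pos @ Q.T), pair_energy(pos))
assert np.allclose(forces(pos @ Q.T), forces(pos) @ Q.T, atol=1e-6)
```

An equivariant network such as NequIP builds this transformation behavior into every layer, rather than relying on invariant inputs alone, which is one ingredient in its data efficiency.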
Journal ArticleDOI
On-the-fly active learning of interpretable Bayesian force fields for atomistic rare events
Jonathan Vandermause, Steven B. Torrisi, Simon Batzner, Yu Xie, Lixin Sun, Alexie M. Kolpak, Boris Kozinsky +8 more
TL;DR: In this paper, an adaptive Bayesian inference method for automating the training of interpretable, low-dimensional, and multi-element interatomic force fields using structures drawn on the fly from molecular dynamics simulations is presented.
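The on-the-fly active-learning loop described above can be caricatured in a few lines (a toy sketch, not the paper's actual Bayesian force-field method): a cheap surrogate with an uncertainty estimate drives the dynamics, and the expensive reference calculation is invoked only when the uncertainty exceeds a threshold. The `dft_oracle` function, the 1-D "dynamics", and the nearest-neighbour uncertainty proxy are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

def dft_oracle(x):
    # Stand-in for an expensive ab-initio call (hypothetical 1-D energy surface).
    return np.sin(3 * x) + 0.5 * x ** 2

# Training set starts with a single labelled configuration.
X = [0.0]
Y = [dft_oracle(0.0)]

def predict(x):
    # Nearest-neighbour surrogate; distance to the data doubles as uncertainty.
    d = np.abs(np.array(X) - x)
    i = int(np.argmin(d))
    return Y[i], d[i]

threshold = 0.3
x, calls = 0.0, 0
for step in range(200):
    x += rng.normal(scale=0.1)      # crude stand-in for MD time evolution
    y_hat, sigma = predict(x)
    if sigma > threshold:           # uncertain -> query oracle, grow training set
        X.append(x)
        Y.append(dft_oracle(x))
        calls += 1

print(f"oracle calls: {calls} out of 200 steps")
```

The point of the design is that the expensive oracle is called far less often than once per step; the real method replaces the nearest-neighbour proxy with a principled Bayesian (Gaussian-process) uncertainty.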
Journal ArticleDOI
E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials
Boris Kozinsky, Simon Batzner +1 more
TL;DR: NequIP is an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations; it achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials while exhibiting remarkable data efficiency.
Journal ArticleDOI
Learning local equivariant representations for large-scale atomistic dynamics
Albert Musaelian, Simon Batzner, Anders Johansson, Lixin Sun, Cameron J. Owen, Mordechai Kornbluth, Boris Kozinsky +6 more
TL;DR: Allegro represents a many-body potential using iterated tensor products of learned equivariant representations, without atom-centered message passing, and achieves state-of-the-art performance on QM9 and revMD17.
Journal ArticleDOI
A fast neural network approach for direct covariant forces prediction in complex multi-element extended systems
Jonathan P. Mailoa, Mordechai Kornbluth, Simon Batzner, Georgy Samsonidze, Stephen T. Lam, Jonathan Vandermause, Chris Ablitt, Nicola Molinari, Boris Kozinsky +13 more
TL;DR: A staggered NNFF architecture is shown that exploits both rotation-invariant and rotation-covariant features to directly predict atomic force vectors without using spatial derivatives; it can also directly predict other covariant vector outputs from local environments, in domains beyond computational materials science.