Learning the solution operator of parametric partial differential equations with physics-informed DeepONets.
TLDR
DeepONets, as discussed by the authors, is a deep learning framework for learning the solution operator of arbitrary PDEs, even in the absence of any paired input-output training data. The framework rapidly predicts the solutions of various types of parametric PDEs, up to three orders of magnitude faster than conventional PDE solvers.

Abstract
Partial differential equations (PDEs) play a central role in the mathematical analysis and modeling of complex dynamic processes across all corners of science and engineering. Their solution often requires laborious analytical or computational tools, associated with a cost that is markedly amplified when different scenarios need to be investigated, for example, corresponding to different initial or boundary conditions, different inputs, etc. In this work, we introduce physics-informed DeepONets, a deep learning framework for learning the solution operator of arbitrary PDEs, even in the absence of any paired input-output training data. We illustrate the effectiveness of the proposed framework in rapidly predicting the solution of various types of parametric PDEs up to three orders of magnitude faster compared to conventional PDE solvers, setting a previously unexplored paradigm for modeling and simulation of nonlinear and nonequilibrium processes in science and engineering.
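The DeepONet architecture underlying this work encodes an input function (sampled at fixed sensor locations) with a branch network and the query coordinate with a trunk network, combining the two by a dot product. Below is a minimal NumPy sketch of that forward pass; the network sizes, initialization, and sensor count are illustrative assumptions, and the physics-informed training loss (a PDE residual penalty) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes):
    """Random weights for a small MLP (illustrative initialization)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    """Forward pass with tanh activations on hidden layers."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# A DeepONet maps an input function u (sampled at m fixed sensor
# locations) and a query coordinate y to the solution value G(u)(y).
m, p = 50, 32                      # sensors, latent width (assumed sizes)
branch = mlp_params([m, 64, p])    # encodes the input function samples
trunk  = mlp_params([1, 64, p])    # encodes the query coordinate

def deeponet(u_sensors, y):
    b = mlp(branch, u_sensors)         # (p,) branch features
    t = mlp(trunk, np.atleast_1d(y))   # (p,) trunk features
    return float(b @ t)                # dot product -> G(u)(y)

u = np.sin(np.linspace(0, np.pi, m))   # example input function samples
pred = deeponet(u, 0.5)                # scalar prediction at y = 0.5
```

In the physics-informed variant, the weights are trained by penalizing the PDE residual of `deeponet` at collocation points rather than by fitting paired input-output data.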
Citations
Journal Article
MIONet: Learning multiple-input operators via tensor product
Pengzhan Jin, Shuai Meng, Lu Lu +2 more
TL;DR: A universal approximation theorem of continuous multiple-input operators is proved and a novel neural operator, MIONet, is proposed, which can learn solution operators involving systems governed by ordinary and partial differential equations.
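MIONet generalizes the single-branch DeepONet to operators with several input functions: each input gets its own branch network, and the branch and trunk outputs are combined by an element-wise (tensor-style) product before summing. A hedged sketch, with random linear maps standing in for trained networks and all sizes assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 16                                # latent width (assumed)

# Stand-ins for trained branch/trunk networks: random linear maps.
B1 = rng.standard_normal((20, p))     # branch for input function u1 (20 sensors)
B2 = rng.standard_normal((30, p))     # branch for input function u2 (30 sensors)
T  = rng.standard_normal((1, p))      # trunk for the query coordinate y

def mionet(u1, u2, y):
    """G(u1, u2)(y) ~ sum_k b1_k * b2_k * t_k (product of latent features)."""
    b1 = u1 @ B1                      # (p,) features of first input function
    b2 = u2 @ B2                      # (p,) features of second input function
    t = np.atleast_1d(y) @ T          # (p,) features of the query point
    return float(np.sum(b1 * b2 * t)) # element-wise product, then sum

out = mionet(rng.standard_normal(20), rng.standard_normal(30), 0.3)
```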
Journal Article
Interfacing Finite Elements with Deep Neural Operators for Fast Multiscale Modeling of Mechanics Problems
TL;DR: In this paper, the authors explore the idea of multiscale modeling with machine learning and employ DeepONet, a neural operator, as an efficient surrogate of the expensive solver.
Journal Article
Learning two-phase microstructure evolution using neural operators and autoencoder architectures
TL;DR: This paper integrates a convolutional autoencoder architecture with a deep neural operator (DeepONet) to learn the dynamic evolution of a two-phase mixture and accelerate time-to-solution in predicting the microstructure evolution.
Journal Article
Transfer learning based physics-informed neural networks for solving inverse problems in engineering structures under different loading scenarios
TL;DR: In this paper, a multi-task learning method using uncertainty weighting was proposed to improve the training efficiency and accuracy of PINNs for inverse problems in linear elasticity and hyperelasticity.
Journal Article
Reliable extrapolation of deep neural operators informed by physics or sparse observations
TL;DR: In this paper, the authors investigate the extrapolation behavior of DeepONets by quantifying the 2-Wasserstein distance between two function spaces and propose a new bias-variance trade-off strategy for extrapolation with respect to model capacity.
References
Posted Content
Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba +1 more
TL;DR: In this article, the authors introduce Adam, a method for first-order gradient-based optimization of stochastic objective functions based on adaptive estimates of lower-order moments.
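The Adam update maintains exponentially decayed estimates of the gradient's first and second moments, applies a bias correction, and scales the step accordingly. A minimal sketch of one update step, using the defaults from the paper (the toy objective and learning rate below are illustrative choices):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: adaptive estimates of first and second moments."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1 - beta1**t)                # bias correction (t starts at 1)
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = x^2 (gradient 2x), starting from x = 1.0.
theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.01)
# theta is now close to the minimizer x = 0
```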
Journal Article
Matplotlib: A 2D Graphics Environment
TL;DR: Matplotlib is a 2D graphics package for Python, used for application development, interactive scripting, and publication-quality image generation across user interfaces and operating systems.
Journal Article
Array programming with NumPy
Charles R. Harris, K. Jarrod Millman, Stefan van der Walt, Ralf Gommers, Pauli Virtanen, David Cournapeau, Eric Wieser, Julian Taylor, Sebastian Berg, Nathaniel J. Smith, Robert Kern, Matti Picus, Stephan Hoyer, Marten H. van Kerkwijk, Matthew Brett, Allan Haldane, Jaime Fernández del Río, Mark Wiebe, Pearu Peterson, Pierre Gérard-Marchant, Kevin Sheppard, Tyler Reddy, Warren Weckesser, Hameer Abbasi, Christoph Gohlke, Travis E. Oliphant +28 more
TL;DR: In this paper, the authors review how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data, and discuss their evolution into a flexible interoperability layer between increasingly specialized computational libraries.
Journal Article
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
TL;DR: In this article, the authors introduce physics-informed neural networks, which are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations.
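The physics-informed loss at the heart of PINNs penalizes the PDE residual at collocation points plus the mismatch with boundary/initial conditions. A hedged sketch for the toy ODE u'(x) = -u(x), u(0) = 1; the tiny network and its random weights are illustrative, and the derivative is taken by central differences here for simplicity, whereas PINNs normally use automatic differentiation.

```python
import numpy as np

def net(params, x):
    """Tiny MLP surrogate u_theta(x) with one tanh hidden layer."""
    W1, b1, W2, b2 = params
    return np.tanh(x[:, None] @ W1 + b1) @ W2 + b2

def pinn_loss(params, x, dx=1e-4):
    """Physics-informed loss: mean squared ODE residual + condition term."""
    u = net(params, x)[:, 0]
    du = (net(params, x + dx)[:, 0] - net(params, x - dx)[:, 0]) / (2 * dx)
    residual = du + u                                # u' + u = 0 at collocation points
    ic = net(params, np.array([0.0]))[0, 0] - 1.0    # initial condition u(0) = 1
    return np.mean(residual**2) + ic**2

rng = np.random.default_rng(2)
params = (rng.standard_normal((1, 16)) * 0.5, np.zeros(16),
          rng.standard_normal((16, 1)) * 0.5, np.zeros(1))
loss = pinn_loss(params, np.linspace(0.0, 1.0, 32))   # loss for untrained weights
```

Training would minimize this loss over `params` with a gradient-based optimizer such as Adam.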
Related Papers (5)
Physics-informed learning of governing equations from scarce data.
A unique transformation from ordinary differential equations to reaction networks.
Sylvain Soliman, Monika Heiner +1 more