Ameya D. Jagtap

Researcher at Brown University

Publications: 30
Citations: 1754

Ameya D. Jagtap is an academic researcher at Brown University. He has contributed to research on topics including artificial neural networks and nonlinear systems, has an h-index of 7, and has co-authored 21 publications receiving 503 citations. His previous affiliations include the Tata Institute of Fundamental Research and the TIFR Centre for Applicable Mathematics.

Papers
Journal Article

Physics-informed neural networks for high-speed flows

TL;DR: In this article, physics-informed neural networks (PINNs) are used to approximate the Euler equations that model high-speed aerodynamic flows in one- and two-dimensional domains.
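The PINN loss structure underlying this line of work can be sketched minimally. The following is not the paper's Euler-equation setup: it is a toy illustration for the ODE u'(x) = u(x) with exact solution exp(x), where a cubic polynomial stands in for the neural network and central finite differences stand in for the automatic differentiation a real PINN would use. All names and constants here are invented for illustration.

```python
import numpy as np

def model(w, x):
    """Stand-in 'network': u(x) ~ w0 + w1*x + w2*x^2 + w3*x^3."""
    return w[0] + w[1] * x + w[2] * x**2 + w[3] * x**3

def pinn_loss(w, x_data, u_data, x_col, h=1e-4):
    """PINN-style loss: data mismatch + PDE residual at collocation points."""
    data_term = np.mean((model(w, x_data) - u_data) ** 2)
    # Residual of the toy equation u'(x) - u(x) = 0, via central differences:
    du = (model(w, x_col + h) - model(w, x_col - h)) / (2 * h)
    pde_term = np.mean((du - model(w, x_col)) ** 2)
    return data_term + pde_term

x = np.linspace(0.0, 1.0, 20)
w_taylor = np.array([1.0, 1.0, 0.5, 1.0 / 6.0])  # Taylor coefficients of exp(x)
loss = pinn_loss(w_taylor, x, np.exp(x), x)      # small but nonzero: cubic truncation
```

In a real PINN the polynomial is replaced by a deep network and the finite differences by exact automatic differentiation, but the loss remains this same two-term sum.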
Journal Article

Adaptive activation functions accelerate convergence in deep and physics-informed neural networks

TL;DR: It is proved theoretically that, with the proposed method, gradient descent algorithms are not attracted to suboptimal critical points or local minima, and the proposed adaptive activation functions are shown to accelerate minimization of the loss in standard deep-learning benchmarks, both with and without data augmentation.
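The core idea of an adaptive activation is to insert a trainable slope parameter a into the nonlinearity, e.g. tanh(n·a·x) with a fixed scale factor n, and optimize a jointly with the loss. The sketch below, a minimal numpy illustration with an invented fitting problem, trains only the slope parameter by gradient descent; the target data has an effective slope of n·a = 5, so a should move from its small initial value toward 0.5.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                        # fixed scale factor
x = rng.uniform(-1.0, 1.0, 200)
target = np.tanh(5.0 * x)     # data whose effective slope is n*a = 5

def act(x, a):
    """Adaptive activation tanh(n*a*x); 'a' is the trainable slope."""
    return np.tanh(n * a * x)

def mse(a):
    return np.mean((act(x, a) - target) ** 2)

a = 0.1                       # small initial slope
mse_init = mse(a)
lr = 0.02
for _ in range(500):
    t = act(x, a)
    # d/da tanh(n*a*x) = n*x*(1 - tanh^2), chained through the squared error:
    grad = np.mean(2.0 * (t - target) * n * x * (1.0 - t**2))
    a -= lr * grad
mse_final = mse(a)
```

In the papers, one such parameter per layer (or per neuron, in the locally adaptive variant) is added to a full network; this toy shows only why making the slope trainable lets optimization recover a sharper activation than the initialization.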
Journal Article

Conservative physics-informed neural networks on discrete domains for conservation laws: Applications to forward and inverse problems

TL;DR: In cPINN, locally adaptive activation functions are used, which train the model faster than fixed activations, and the method lends itself naturally to parallelized computation, since each sub-domain can be assigned to a different computational node.
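The conservative coupling in cPINN augments each subdomain's residual loss with interface terms that penalize jumps in the solution and in its flux across subdomain boundaries. A minimal sketch, with simple quadratics standing in for the per-subdomain networks and a toy residual u'(x) = 1 (the function names and the toy PDE are illustrative, not the paper's setup):

```python
import numpy as np

def u(theta, x):
    """Stand-in 'subdomain network': quadratic with theta = (c0, c1, c2)."""
    return theta[0] + theta[1] * x + theta[2] * x**2

def du_dx(theta, x):
    return theta[1] + 2.0 * theta[2] * x

def cpinn_loss(theta1, theta2, x1, x2, x_int=0.5):
    """Two-subdomain cPINN-style loss: residuals + interface continuity."""
    # Toy PDE residual u'(x) - 1 = 0 in each subdomain:
    r1 = np.mean((du_dx(theta1, x1) - 1.0) ** 2)
    r2 = np.mean((du_dx(theta2, x2) - 1.0) ** 2)
    # Interface terms: continuity of the solution and of its flux u':
    cont = (u(theta1, x_int) - u(theta2, x_int)) ** 2
    flux = (du_dx(theta1, x_int) - du_dx(theta2, x_int)) ** 2
    return r1 + r2 + cont + flux

x1 = np.linspace(0.0, 0.5, 10)   # collocation points, subdomain 1
x2 = np.linspace(0.5, 1.0, 10)   # collocation points, subdomain 2
matched = cpinn_loss((0.0, 1.0, 0.0), (0.0, 1.0, 0.0), x1, x2)  # consistent: loss 0
```

Because each residual term depends only on its own subdomain's parameters and collocation points, the expensive parts of the loss can be evaluated on separate nodes, with only the interface values exchanged.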
Journal Article

Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks.

TL;DR: It is proved that, under practical conditions on the initialization and learning rate, gradient descent in the proposed method is not attracted to sub-optimal critical points or local minima, and that the gradient dynamics of the proposed method cannot be achieved by the base method with any (adaptive) learning rate.
Journal Article

Parallel physics-informed neural networks via domain decomposition

TL;DR: In this article, a distributed framework for physics-informed neural networks (PINNs) is developed, based on two recent extensions, conservative PINNs (cPINNs) and extended PINNs (XPINNs), which employ domain decomposition in space and in space-time, respectively.
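The parallelism these frameworks exploit comes from the fact that each subdomain's parameter update depends only on local data (plus periodically exchanged interface values). A schematic sketch using a thread pool and a toy per-subdomain residual u'(x) = 1; real deployments would use MPI and GPUs, and all names here are illustrative:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def train_subdomain(theta, xs, lr=0.1, steps=50):
    """Independent gradient descent for one subdomain's stand-in model
    u(x) = theta[0] + theta[1]*x on the toy residual (u' - 1)^2.
    xs mirrors the per-subdomain collocation points a real cPINN/XPINN
    would use; the toy residual here does not depend on them."""
    for _ in range(steps):
        grad = 2.0 * (theta[1] - 1.0)          # d/d(theta[1]) of (theta[1]-1)^2
        theta = (theta[0], theta[1] - lr * grad)
    return theta

# Two subdomains trained concurrently; in a full cPINN/XPINN iteration,
# interface values would be exchanged between such independent updates.
domains = [np.linspace(0.0, 0.5, 20), np.linspace(0.5, 1.0, 20)]
thetas = [(0.0, 0.0), (1.0, 0.0)]
with ThreadPoolExecutor(max_workers=2) as ex:
    thetas = list(ex.map(train_subdomain, thetas, domains))
# Each subdomain slope theta[1] converges to 1 independently of the other.
```

The design point is that the communication volume scales with the interface size, not the subdomain size, which is what makes the weak scaling reported for such frameworks possible.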