scispace - formally typeset
Ameya D. Jagtap

Researcher at Brown University

Publications: 30
Citations: 1754

Ameya D. Jagtap is an academic researcher from Brown University. The author has contributed to research in topics: Artificial neural network & Nonlinear system. The author has an h-index of 7 and has co-authored 21 publications receiving 503 citations. Previous affiliations of Ameya D. Jagtap include Tata Institute of Fundamental Research & TIFR Centre for Applicable Mathematics.

Papers
Posted Content

Method of Relaxed Streamline Upwinding for Hyperbolic Conservation Laws

TL;DR: In this paper, a new finite-element-based Method of Relaxed Streamline Upwinding is proposed to solve hyperbolic conservation laws. The proposed scheme is based on a relaxation system that replaces the hyperbolic conservation law by a semi-linear system with a stiff source term, also called the relaxation term.
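The relaxation idea in the summary can be made concrete. A standard form of such a system (the Jin–Xin relaxation, which the TL;DR appears to describe; the paper's exact formulation may differ) replaces a scalar conservation law with a semi-linear system whose stiff source term drives v toward the flux f(u):

```latex
% Scalar hyperbolic conservation law:
u_t + f(u)_x = 0
% is replaced by the semi-linear relaxation system
u_t + v_x = 0, \qquad
v_t + a\,u_x = \frac{1}{\epsilon}\bigl(f(u) - v\bigr),
% where \epsilon > 0 is the (small) relaxation parameter, making the
% source term stiff, and a > 0 satisfies the subcharacteristic
% condition a \ge f'(u)^2 so that the limit \epsilon \to 0 recovers
% the original conservation law.
```

The advantage is that the left-hand side of the relaxation system is linear, so upwinding can be applied without characteristic decompositions or Riemann solvers.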
Posted Content

Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions

Abstract: We propose a new type of neural network, Kronecker neural networks (KNNs), that form a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which provides an efficient way of constructing a very wide network while keeping the number of parameters low. Our theoretical analysis reveals that under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks. This is also empirically verified through a set of computational examples. Furthermore, under certain technical assumptions, we establish global convergence of gradient descent for KNNs. As a specific case, we propose the Rowdy activation function, which is designed to remove any saturation region by injecting sinusoidal fluctuations with trainable parameters. The proposed Rowdy activation function can be employed in any neural network architecture, such as feed-forward, recurrent, and convolutional neural networks. The effectiveness of KNNs with Rowdy activation is demonstrated through various computational experiments, including function approximation using feed-forward neural networks, solution inference of partial differential equations using physics-informed neural networks, and standard deep learning benchmark problems using convolutional and fully-connected neural networks.
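The Rowdy idea in the abstract (adding trainable sinusoidal fluctuations on top of a base activation so its gradient never saturates) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the base activation (`tanh`), the frequency scaling `n`, and the per-term amplitudes `alphas` are assumptions for illustration.

```python
import numpy as np

def rowdy_activation(x, alphas, n=10):
    """Hypothetical sketch of a Rowdy-style activation.

    Starts from a tanh base and adds sinusoidal terms with trainable
    amplitudes `alphas`; the oscillatory terms keep the derivative
    from vanishing where tanh saturates (large |x|).
    """
    out = np.tanh(x)
    for k, a_k in enumerate(alphas, start=1):
        # Each term injects a fluctuation at frequency k * n,
        # scaled by its trainable amplitude a_k.
        out = out + a_k * n * np.sin(k * n * x)
    return out
```

With all amplitudes at zero the sketch reduces to plain tanh; during training the amplitudes would be optimized jointly with the network weights, which is what makes the activation adaptive.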
Journal ArticleDOI

Higher order spectral element scheme for two- and three-dimensional Cahn–Hilliard equation

TL;DR: In this article, a semi-implicit, unconditionally gradient-stable time-stepping scheme is used for the numerical simulation of the Cahn–Hilliard equation in two and three dimensions.
Posted Content

Explicit and Implicit Kinetic Streamlined-Upwind Petrov Galerkin Method for Hyperbolic Partial Differential Equations

TL;DR: In this paper, a novel explicit and implicit Kinetic Streamlined-Upwind Petrov Galerkin (KSUPG) scheme is presented for hyperbolic equations such as the Burgers equation and the compressible Euler equations.