SciSpace (formerly Typeset)

Sifan Wang

Researcher at University of Pennsylvania

Publications -  21
Citations -  2625

Sifan Wang is an academic researcher from the University of Pennsylvania. The author has contributed to research topics including computer science and artificial neural networks, has an h-index of 8, and has co-authored 13 publications receiving 391 citations.

Papers
Journal ArticleDOI

Physics-informed machine learning

TL;DR: This review surveys prevailing trends in embedding physics into machine learning, presents current capabilities and limitations, and discusses diverse applications of physics-informed learning to both forward and inverse problems, including discovering hidden physics and tackling high-dimensional problems.
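
To make the embedding concrete, here is a minimal sketch of a physics-informed loss in JAX, assuming a toy 1D Poisson problem u_xx = -pi^2 sin(pi x); the problem, the network size, and the names init_params, mlp, pde_residual, and physics_informed_loss are illustrative choices, not code from the paper.

import jax
import jax.numpy as jnp

def init_params(key, sizes=(1, 32, 32, 1)):
    # Random weights and zero biases for a small fully connected network.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(0.1 * jax.random.normal(k, (n_out, n_in)), jnp.zeros(n_out))
            for k, n_in, n_out in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    # Scalar-in, scalar-out network u(x).
    h = jnp.array([x])
    for w, b in params[:-1]:
        h = jnp.tanh(w @ h + b)
    w, b = params[-1]
    return (w @ h + b)[0]

def pde_residual(params, x):
    # Residual of the assumed PDE u_xx(x) = -pi^2 sin(pi x), via nested autodiff.
    u_xx = jax.grad(jax.grad(mlp, argnums=1), argnums=1)(params, x)
    return u_xx + jnp.pi ** 2 * jnp.sin(jnp.pi * x)

def physics_informed_loss(params, x_data, u_data, x_colloc):
    # Data/boundary mismatch plus the mean squared PDE residual at collocation points.
    u_pred = jax.vmap(lambda x: mlp(params, x))(x_data)
    mse_data = jnp.mean((u_pred - u_data) ** 2)
    mse_pde = jnp.mean(jax.vmap(lambda x: pde_residual(params, x))(x_colloc) ** 2)
    return mse_data + mse_pde

The single composite objective is what distinguishes physics-informed training from purely data-driven fitting: the PDE residual acts as a physics-based regularizer evaluated on unlabeled collocation points.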
Posted Content

When and why PINNs fail to train: A neural tangent kernel perspective

TL;DR: A novel gradient descent algorithm is proposed that uses the eigenvalues of the neural tangent kernel (NTK) to adaptively calibrate the convergence rate of the total training error, and a series of numerical experiments verifies the correctness of the theory and the practical effectiveness of the proposed algorithms.
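
One concrete way to realize this kind of calibration, sketched below under the assumptions of a scalar-output network apply_fn(params, x) and small point batches (ntk_trace and ntk_weights are illustrative names, not the authors' code), is to compare the traces of the empirical NTK blocks, i.e. the sums of their eigenvalues, for the boundary and residual terms.

import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def ntk_trace(apply_fn, params, xs):
    # Trace of the empirical NTK block J @ J.T, where J is the Jacobian of the
    # batched network outputs with respect to the flattened parameters.
    flat, unravel = ravel_pytree(params)
    outputs = lambda p: jax.vmap(lambda x: apply_fn(unravel(p), x))(xs)
    J = jax.jacrev(outputs)(flat)          # shape (num_points, num_params)
    return jnp.sum(J * J)                  # equals trace(J @ J.T)

def ntk_weights(apply_fn, residual_fn, params, x_boundary, x_colloc):
    # Weights proportional to the total NTK trace over each block's trace, so the
    # slowly converging (small-eigenvalue) component receives a larger weight.
    tr_b = ntk_trace(apply_fn, params, x_boundary)
    tr_r = ntk_trace(residual_fn, params, x_colloc)
    total = tr_b + tr_r
    return total / tr_b, total / tr_r

Computing the full NTK is expensive for large networks, so trace-based summaries like the one above are a common shortcut for balancing how fast each loss term is learned.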
Journal ArticleDOI

Physics-Informed Neural Networks for Heat Transfer Problems

TL;DR: In this paper, physics-informed neural networks (PINNs) are applied to various prototype heat transfer problems, targeting in particular realistic conditions that are not readily tackled with traditional computational methods.
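
For illustration, here is a sketch of the residual such a PINN would minimize for the 1D heat equation u_t = alpha * u_xx, assuming some scalar-valued model net(params, t, x); heat_residual and residual_loss are hypothetical names and the diffusivity value is arbitrary, so this is not code from the paper.

import jax
import jax.numpy as jnp

def heat_residual(net, params, t, x, alpha=0.01):
    # r(t, x) = u_t - alpha * u_xx for u = net(params, t, x); the PINN is trained
    # so that r vanishes at collocation points while u also matches the initial
    # and boundary data.
    u_t = jax.grad(net, argnums=1)(params, t, x)
    u_xx = jax.grad(jax.grad(net, argnums=2), argnums=2)(params, t, x)
    return u_t - alpha * u_xx

def residual_loss(net, params, t_colloc, x_colloc, alpha=0.01):
    # Mean squared residual over a batch of collocation points (t_i, x_i).
    r = jax.vmap(lambda t, x: heat_residual(net, params, t, x, alpha))(t_colloc, x_colloc)
    return jnp.mean(r ** 2)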
Posted Content

Understanding and mitigating gradient pathologies in physics-informed neural networks

TL;DR: This work reviews recent advances in scientific machine learning, with a specific focus on the effectiveness of physics-informed neural networks in predicting the outcomes of physical systems and discovering hidden physics from noisy data, and proposes a novel neural network architecture that is more resilient to gradient pathologies.
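
A minimal sketch of one such mitigation, assuming the PDE-residual loss and the boundary/data loss are each closures over the parameters (balance_weight, balanced_loss, and the moving-average rate are illustrative choices, not the authors' implementation): the weight on the data term is adapted so its gradients stay on the same scale as the residual gradients.

import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def balance_weight(residual_loss, data_loss, params, lam, rate=0.1):
    # Compare gradient magnitudes of the two loss terms and nudge the weight on
    # the data/boundary term so both terms drive parameter updates at a similar
    # scale; the moving-average rate is an illustrative value.
    g_res, _ = ravel_pytree(jax.grad(residual_loss)(params))
    g_dat, _ = ravel_pytree(jax.grad(data_loss)(params))
    lam_hat = jnp.max(jnp.abs(g_res)) / jnp.mean(jnp.abs(g_dat))
    return (1.0 - rate) * lam + rate * lam_hat

def balanced_loss(residual_loss, data_loss, params, lam):
    # The weighted objective minimized at each optimization step.
    return residual_loss(params) + lam * data_loss(params)

Rebalancing the terms this way prevents one loss component from dominating the gradient signal, which is one of the pathologies the paper describes.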
Journal ArticleDOI

Understanding and Mitigating Gradient Flow Pathologies in Physics-Informed Neural Networks

TL;DR: The widespread use of neural networks across different scientific domains often involves constraining them to satisfy certain symmetries, conservation laws, or other domain knowledge, as discussed by the authors, and such constraints ...