Machine learning strategies for systems with invariance properties
TLDR
This paper specifically addresses physical systems that possess symmetry or invariance properties, and shows that in both cases studied, embedding the invariance property into the input features yields higher performance at significantly reduced computational training cost.
About
This article was published in the Journal of Computational Physics on 2016-08-01 and is currently open access. It has received 327 citations to date. The article focuses on the topics: Online machine learning & Computational learning theory.
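The invariance-embedding idea summarized above can be sketched in a few lines — a minimal, hypothetical example (not the paper's actual feature set), assuming a scalar target that is invariant under rotations of a 3×3 input tensor: rather than feeding raw tensor components to a model, one feeds rotation-invariant features such as the tensor's trace invariants, so any model trained on them inherits the invariance by construction.

```python
import numpy as np

def rotation_invariant_features(S):
    """Map a 3x3 tensor to three rotation-invariant features.

    tr(S), tr(S@S), and tr(S@S@S) are unchanged under
    S -> Q @ S @ Q.T for any orthogonal (rotation) matrix Q,
    so a model trained on them is rotation-invariant by construction.
    """
    return np.array([np.trace(S),
                     np.trace(S @ S),
                     np.trace(S @ S @ S)])

# Check invariance under a random rotation (Q from a QR factorization).
rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
rotated = Q @ S @ Q.T
assert np.allclose(rotation_invariant_features(S),
                   rotation_invariant_features(rotated))
```

The alternative the paper contrasts this with — training on raw components and hoping the model learns the symmetry from data — typically needs far more training data to reach the same accuracy.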
Citations
Journal ArticleDOI
Reynolds averaged turbulence modelling using deep neural networks with embedded invariance
TL;DR: This paper presents a method of using deep neural networks to learn a model for the Reynolds stress anisotropy tensor from high-fidelity simulation data, and proposes a novel neural network architecture that uses a multiplicative layer with an invariant tensor basis to embed Galilean invariance into the predicted anisotropy tensor.
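The multiplicative output layer described in this TL;DR can be sketched as follows — a toy numpy illustration of the general construction, not the authors' actual architecture: scalar coefficients (which, in the full network, would be predicted from invariant inputs) multiply a basis of tensors, so the combined output transforms like a tensor while the learned coefficients remain invariant.

```python
import numpy as np

def multiplicative_layer(g, T):
    """Combine scalar coefficients g (shape [n]) with basis
    tensors T (shape [n, 3, 3]) into a single predicted tensor:
    b = sum_n g[n] * T[n].

    If the g[n] depend only on invariants and the T[n] transform
    like tensors, the output b inherits the correct transformation
    behavior automatically.
    """
    return np.einsum('n,nij->ij', g, T)

# Toy example: a two-tensor basis with fixed coefficients.
T = np.stack([np.eye(3), np.diag([1.0, -1.0, 0.0])])
g = np.array([0.5, 2.0])
b = multiplicative_layer(g, T)
# b = 0.5*I + 2.0*diag(1, -1, 0) = diag(2.5, -1.5, 0.5)
```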
Journal ArticleDOI
Machine Learning for Fluid Mechanics
TL;DR: This article provides an overview of machine learning for fluid mechanics, addressing the strengths and limitations of these methods from the perspective of scientific inquiry that considers data as an inherent part of modeling, experimentation, and simulation.
Journal ArticleDOI
Turbulence Modeling in the Age of Data
TL;DR: This paper reviews recent developments in bounding uncertainties in RANS models via physical constraints, in adopting statistical inference to characterize model coefficients and estimate discrepancy, and in using machine learning to improve turbulence models.
Journal ArticleDOI
Physics-informed machine learning approach for reconstructing Reynolds stress modeling discrepancies based on DNS data
TL;DR: This paper proposes a physics-informed machine learning framework that improves the predictive capabilities of RANS models by leveraging existing direct numerical simulation databases, showing that the discrepancies in Reynolds-averaged Navier-Stokes (RANS) modeled Reynolds stresses can be explained by mean flow features.
Journal ArticleDOI
Explainable Machine Learning for Scientific Insights and Discoveries
TL;DR: This article surveys recent scientific works that incorporate machine learning, and the ways in which explainable machine learning is used in combination with domain knowledge from the application areas.
References
Journal ArticleDOI
Random Forests
TL;DR: Internal estimates monitor error, strength, and correlation; these are used to show the response to increasing the number of features used in the forest, and are also applicable to regression.
Journal Article
Scikit-learn: Machine Learning in Python
Fabian Pedregosa,Gaël Varoquaux,Alexandre Gramfort,Vincent Michel,Bertrand Thirion,Olivier Grisel,Mathieu Blondel,Peter Prettenhofer,Ron Weiss,Vincent Dubourg,Jake Vanderplas,Alexandre Passos,David Cournapeau,Matthieu Brucher,Matthieu Perrot,Edouard Duchesnay +15 more
TL;DR: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems, focusing on bringing machine learning to non-specialists using a general-purpose high-level language.
Book
Neural networks for pattern recognition
TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Book ChapterDOI
Neural Networks for Pattern Recognition
Suresh Kothari,Heekuck Oh +1 more
TL;DR: The chapter discusses two important directions of research to improve learning algorithms: dynamic node generation, which is used by the cascade correlation algorithm; and designing learning algorithms where the choice of parameters is not an issue.
Related Papers (5)
Reynolds averaged turbulence modelling using deep neural networks with embedded invariance
A paradigm for data-driven predictive modeling using field inversion and machine learning
Eric J. Parish,Karthik Duraisamy +1 more