Journal ArticleDOI

Progress Variable Variance and Filtered Rate Modelling Using Convolutional Neural Networks and Flamelet Methods

TLDR
A purely data-driven modelling approach using deep convolutional neural networks is discussed in the context of Large Eddy Simulation (LES) of turbulent premixed flames, and demonstrated successfully for both the sub-grid scale progress variable variance and the filtered reaction rate.
Abstract
A purely data-driven modelling approach using deep convolutional neural networks is discussed in the context of Large Eddy Simulation (LES) of turbulent premixed flames. The assessment of the method is conducted a priori using direct numerical simulation data. The network has been trained to perform deconvolution on the filtered density and the filtered density-progress variable product, and by doing so obtain estimates of the unfiltered progress variable field. A filtered function of the progress variable can then be approximated on the LES mesh using the deconvoluted field. This new strategy for tackling turbulent combustion modelling is demonstrated successfully, in combination with flamelet methods, for both the sub-grid scale progress variable variance and the filtered reaction rate, two fundamental ingredients of premixed turbulent combustion modelling.
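The a priori workflow described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration only, not the authors' implementation: the architecture, layer widths, the DeconvCNN name and the top-hat re-filter are all assumptions. A small 3-D CNN takes the filtered density and the filtered density-progress variable product, returns estimates of the unfiltered density and progress variable, and the sub-grid variance is then approximated by re-filtering products of the deconvolved fields on the LES mesh.

```python
# Minimal a priori sketch (assumed architecture and names, not the authors' code):
# a 3-D CNN deconvolves the filtered density and filtered rho*c to estimate the
# unfiltered fields; re-filtering the deconvolved fields gives the SGS variance.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeconvCNN(nn.Module):
    """Maps (filtered rho, filtered rho*c) -> estimates of unfiltered rho and c."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(2, width, 3, padding=1), nn.ReLU(),
            nn.Conv3d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv3d(width, 2, 3, padding=1),
        )

    def forward(self, rho_bar, rho_c_bar):
        out = self.net(torch.cat([rho_bar, rho_c_bar], dim=1))
        rho_star = F.softplus(out[:, :1])      # density estimate, kept positive
        c_star = torch.sigmoid(out[:, 1:])     # progress variable estimate in [0, 1]
        return rho_star, c_star

def box_filter(field, width=3):
    """Top-hat filter standing in for the LES filter on the mesh."""
    return F.avg_pool3d(field, kernel_size=width, stride=1,
                        padding=width // 2, count_include_pad=False)

# Favre variance from the deconvolved fields:
#   var_sgs = filt(rho* c*^2) / filt(rho) - (filt(rho c) / filt(rho))^2
model = DeconvCNN()
rho_bar = torch.rand(1, 1, 32, 32, 32) + 1.0        # toy filtered density
rho_c_bar = rho_bar * torch.rand(1, 1, 32, 32, 32)  # toy filtered rho*c
rho_star, c_star = model(rho_bar, rho_c_bar)
c_tilde = rho_c_bar / rho_bar                       # Favre-filtered progress variable
var_sgs = box_filter(rho_star * c_star**2) / rho_bar - c_tilde**2
# The filtered reaction rate would be approximated the same way, by filtering a
# flamelet rate evaluated at c_star instead of c_star**2.
```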


Citations
Journal ArticleDOI

Criteria to switch from tabulation to neural networks in computational combustion

TL;DR: In this article, scaling laws for the computational cost of look-up tables and of neural networks, including the effect of network structure, are proposed; however, the authors do not consider how to evaluate the performance of neural networks against tabulation.
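As a generic illustration of why such switching criteria matter (the expressions below are back-of-the-envelope estimates, not the scaling laws derived in the article): the memory of a structured look-up table grows exponentially with the number of control variables, while the parameter count of a dense network grows only with the sum of products of successive layer widths.

```python
# Back-of-the-envelope cost comparison (illustrative only, not the article's laws).
def table_entries(points_per_dim: int, n_dims: int) -> int:
    """Entries stored by a structured look-up table."""
    return points_per_dim ** n_dims

def mlp_parameters(widths):
    """Weights plus biases of a fully connected network with the given layer widths."""
    return sum(w_in * w_out + w_out for w_in, w_out in zip(widths, widths[1:]))

print(table_entries(100, 4))            # 100_000_000 entries for a 4-D table
print(mlp_parameters([4, 64, 64, 1]))   # 4_545 parameters for a small MLP
```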
Book ChapterDOI

From discrete and iterative deconvolution operators to machine learning for premixed turbulent combustion modeling.

TL;DR: The analysis confirms the potential of deconvolution to approximate the unclosed non-linear terms and the SGS fluxes and the introduction of machine learning in turbulent combustion modeling is illustrated in the context of convolutional neural networks.
Book ChapterDOI

Deep Convolutional Neural Networks for Subgrid-Scale Flame Wrinkling Modeling

TL;DR: In this paper, a deep convolutional neural network called a U-Net is trained to predict the total flame surface density from the resolved progress variable, and the network outperforms classical dynamic models.
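A compact illustration of such an encoder-decoder is given below; the depth, channel widths and the TinyUNet name are assumptions for the sketch and do not reproduce the cited network, but the structure (downsampling path, upsampling path and skip connection) is the defining feature of a U-Net.

```python
# Compact U-Net-style sketch (depths and widths are assumptions): maps a resolved
# progress-variable field to a flame surface density field of the same shape.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv3d(c_in, c_out, 3, padding=1), nn.ReLU(),
        nn.Conv3d(c_out, c_out, 3, padding=1), nn.ReLU(),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = conv_block(1, 16)
        self.down = nn.MaxPool3d(2)
        self.mid = conv_block(16, 32)
        self.up = nn.ConvTranspose3d(32, 16, 2, stride=2)
        self.dec = conv_block(32, 16)            # 32 = 16 (skip) + 16 (upsampled)
        self.out = nn.Conv3d(16, 1, 1)

    def forward(self, c_resolved):
        e = self.enc(c_resolved)                 # skip-connection features
        m = self.mid(self.down(e))
        d = self.dec(torch.cat([e, self.up(m)], dim=1))
        return self.out(d).relu()                # flame surface density is non-negative

sigma = TinyUNet()(torch.rand(1, 1, 32, 32, 32))  # same spatial shape as the input
```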
Journal ArticleDOI

A priori analysis on deep learning of subgrid-scale parameterizations for Kraichnan turbulence

TL;DR: In this article, the authors investigate different data-driven parameterizations for large eddy simulation of two-dimensional turbulence in the a priori setting, which utilize resolved flow field variables on the coarser grid to estimate the subgrid-scale stresses.
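In the a priori setting referred to here, the exact sub-grid stresses are obtained by filtering DNS fields and serve as the target that a data-driven parameterization must reproduce; a minimal illustration (the box filter and field names are assumptions) is:

```python
# A priori sub-grid stress from resolved DNS data (illustrative only):
#   tau_uv = filt(u*v) - filt(u)*filt(v)
import numpy as np
from scipy.ndimage import uniform_filter

def sgs_stress(u, v, width=8):
    """Exact sub-grid stress component for a box filter of the given width."""
    filt = lambda f: uniform_filter(f, size=width, mode="wrap")  # periodic box filter
    return filt(u * v) - filt(u) * filt(v)

# Toy 2-D periodic snapshot standing in for a DNS velocity field
rng = np.random.default_rng(0)
u, v = rng.random((2, 256, 256))
tau_uv = sgs_stress(u, v)   # the target a data-driven SGS model learns to predict
```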
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A large, deep convolutional neural network achieved state-of-the-art performance on ImageNet classification as discussed by the authors; it consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax.
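Written out, the layer layout summarized above looks roughly as follows (a sketch using the standard AlexNet channel counts; the exact hyperparameters are not given in the summary):

```python
# Sketch of the five-conv / three-FC layout described in the summary
# (channel counts follow the standard AlexNet configuration).
import torch
import torch.nn as nn

alexnet_like = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2), nn.ReLU(),
    nn.MaxPool2d(3, stride=2),
    nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool2d(3, stride=2),
    nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(3, stride=2),
    nn.Flatten(),
    nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 1000),        # the 1000-way softmax is applied in the loss
)

logits = alexnet_like(torch.rand(1, 3, 224, 224))   # (1, 1000) class scores
```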
Journal ArticleDOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Book

Deep Learning

TL;DR: Deep learning, as described in this book, is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts; it is used in many applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games.
Journal ArticleDOI

Human-level control through deep reinforcement learning

TL;DR: This work bridges the divide between high-dimensional sensory inputs and actions, resulting in the first artificial agent that is capable of learning to excel at a diverse array of challenging tasks.
Journal ArticleDOI

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.