Open Access · Journal Article · DOI

Could Machine Learning Break the Convection Parameterization Deadlock?

TL;DR
A novel approach to convective parameterization based on machine learning is presented, using an aquaplanet with prescribed sea surface temperatures as a proof of concept. Neural networks trained on a high-resolution model in which moist convection is resolved are shown to be an appealing technique for tackling and better representing moist convection in coarse-resolution climate models.
Abstract
Modeling and representing moist convection in coarse-scale climate models remains one of the main bottlenecks of current climate simulations. Many of the biases present with parameterized convection are strongly reduced when convection is explicitly resolved (in cloud-resolving models at high spatial resolution, on the order of a kilometer). We present a novel approach to convective parameterization based on machine learning, demonstrated on an aquaplanet with prescribed sea surface temperatures. The machine learning model is trained on a superparameterized version of a climate model in which convection is resolved by embedded 2D cloud-resolving models. The machine learning representation of convection, called Cloud Brain (CBRAIN), replicates many of the convective features of the superparameterized climate model, yet reduces its inherent stochasticity. The approach presented here opens a new possibility and is a first step toward better representing convection in climate models and reducing uncertainties in climate predictions.
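The core idea of the abstract can be sketched in a few lines: a neural network is fit by regression to map a coarse-grid column state to the subgrid convective tendencies that a cloud-resolving model would produce. The sketch below is a minimal, hypothetical stand-in, not the CBRAIN architecture: the data are synthetic, and the dimensions, network size, and training loop are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training set: coarse-grid column states (inputs)
# and the convective tendencies a cloud-resolving model would supply (targets).
# In the paper, these pairs come from a superparameterized climate model.
n_samples, n_in, n_out = 512, 8, 4
X = rng.normal(size=(n_samples, n_in))
W_true = rng.normal(size=(n_in, n_out))
Y = np.tanh(X @ W_true)  # nonlinear "subgrid response" to emulate

# One-hidden-layer network trained by plain gradient descent on MSE.
n_hidden = 32
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
lr = 0.05
for step in range(2000):
    H = np.tanh(X @ W1)           # hidden activations
    pred = H @ W2                 # predicted tendencies
    err = pred - Y
    gW2 = H.T @ err / n_samples   # gradient of MSE w.r.t. W2
    gH = err @ W2.T * (1 - H**2)  # backprop through tanh
    gW1 = X.T @ gH / n_samples
    W1 -= lr * gW1
    W2 -= lr * gW2

mse = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))
```

Once trained, such a network replaces the conventional convection scheme: at each grid column and time step, the climate model calls the network instead of the parameterization.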


Citations
Journal ArticleDOI

Deep learning and process understanding for data-driven Earth system science

TL;DR: It is argued that contextual cues should be used as part of deep learning to gain further process understanding of Earth system science problems, improving the predictive ability of seasonal forecasting and modelling of long-range spatial connections across multiple timescales.

Global patterns of land-atmosphere fluxes of carbon dioxide, latent heat, and sensible heat derived from eddy covariance, satellite, and meteorological observations

Abstract: We upscaled FLUXNET observations of carbon dioxide, water, and energy fluxes to the global scale using the machine learning technique model tree ensembles (MTE). We trained MTE to predict site-level gross primary productivity (GPP), terrestrial ecosystem respiration (TER), net ecosystem exchange (NEE), latent energy (LE), and sensible heat (H) based on remote sensing indices, climate and meteorological data, and information on land use. We applied the trained MTEs to generate global flux fields at a 0.5° × 0.5° spatial resolution and a monthly temporal resolution from 1982 to 2008. Cross-validation analyses revealed good performance of MTE in predicting among-site flux variability, with modeling efficiencies (MEf) between 0.64 and 0.84, except for NEE (MEf = 0.32). Performance was also good for predicting seasonal patterns (MEf between 0.84 and 0.89, except for NEE, 0.64). By comparison, predictions of monthly anomalies were not as strong (MEf between 0.29 and 0.52). Improved accounting of disturbance and lagged environmental effects, along with improved characterization of errors in the training data set, would contribute most to further reducing uncertainties. Our global estimates of LE (158 ± 7 × 10^18 J yr^-1), H (164 ± 15 × 10^18 J yr^-1), and GPP (119 ± 6 Pg C yr^-1) were similar to independent estimates. Our global TER estimate (96 ± 6 Pg C yr^-1) was likely underestimated by 5-10%. Hot spot regions of interannual variability in carbon fluxes occurred in semiarid to semihumid regions and were controlled by moisture supply. Overall, GPP was more important to interannual variability in NEE than TER. Our empirically derived fluxes may be used for calibration and evaluation of land surface process models and for exploratory and diagnostic assessments of the biosphere.
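The modeling efficiency (MEf) quoted throughout this abstract is conventionally the Nash-Sutcliffe coefficient: one minus the ratio of residual variance to total variance, so 1 is a perfect fit and 0 is no better than predicting the observed mean. A minimal sketch (the function name is my own, not from the paper):

```python
import numpy as np

def modeling_efficiency(obs, pred):
    """Nash-Sutcliffe modeling efficiency: 1 = perfect fit,
    0 = no better than predicting the observed mean."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    sse = np.sum((obs - pred) ** 2)        # residual sum of squares
    sst = np.sum((obs - obs.mean()) ** 2)  # total sum of squares
    return 1.0 - sse / sst

# A perfect prediction scores 1; predicting the observed mean scores 0.
obs = np.array([1.0, 2.0, 3.0, 4.0])
print(modeling_efficiency(obs, obs))                     # 1.0
print(modeling_efficiency(obs, np.full(4, obs.mean())))  # 0.0
```

Unlike a correlation coefficient, MEf penalizes both bias and amplitude errors, which is why it is the preferred skill score for flux upscaling.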
Journal ArticleDOI

Deep learning to represent subgrid processes in climate models.

TL;DR: In this paper, a deep neural network is used to represent all atmospheric subgrid processes in a climate model by learning from a multiscale model in which convection is treated explicitly.
Journal ArticleDOI

A Transdisciplinary Review of Deep Learning Research and Its Relevance for Water Resources Scientists

TL;DR: It is argued that DL can help address several major new and old challenges facing research in water sciences such as interdisciplinarity, data discoverability, hydrologic scaling, equifinality, and needs for parameter regionalization.
References
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
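The "adaptive estimates of lower-order moments" in the TL;DR are exponentially decaying averages of the gradient (first moment) and of its elementwise square (second moment), each bias-corrected for initialization at zero. A minimal sketch of one update, shown minimizing a simple quadratic (the function name and hyperparameter choices here are illustrative; the defaults follow the paper):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update using adaptive moment estimates."""
    m = b1 * m + (1 - b1) * grad      # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * grad**2   # second moment (uncentered variance)
    m_hat = m / (1 - b1**t)           # bias correction for zero init
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = (theta - 3)^2; its gradient is 2 * (theta - 3).
theta, m, v = 0.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2.0 * (theta - 3.0), m, v, t)
```

The division by the square root of the second moment gives each parameter its own effective step size, which is what makes Adam robust across the loss scales typical of deep networks.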
Journal ArticleDOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Book

Deep Learning

TL;DR: Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts; it is used in many applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games.
Journal ArticleDOI

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning, and evolutionary computation, as well as indirect search for short programs encoding deep and large networks.
Journal ArticleDOI

Mastering the game of Go with deep neural networks and tree search

TL;DR: Using this search algorithm, the program AlphaGo achieved a 99.8% winning rate against other Go programs and defeated the human European Go champion by 5 games to 0, the first time a computer program has defeated a human professional player in the full-sized game of Go.