A Transdisciplinary Review of Deep Learning Research and Its Relevance for Water Resources Scientists
TLDR
It is argued that DL can help address several major new and old challenges facing research in the water sciences, such as interdisciplinarity, data discoverability, hydrologic scaling, equifinality, and the need for parameter regionalization.
About
This article was published in Water Resources Research on 2018-11-01 and is currently open access. It has received 501 citations to date. The article focuses on the topics: Transformative learning & Relevance (information retrieval).
Citations
Journal ArticleDOI
Rainfall-runoff modelling using Long Short-Term Memory (LSTM) networks
TL;DR: In this paper, a novel data-driven approach using the Long Short-Term Memory (LSTM) network, a special type of recurrent neural network, was proposed for modeling storage effects in, e.g., catchments with snow influence.
Journal ArticleDOI
A Brief Review of Random Forests for Water Scientists and Practitioners and Their Recent History in Water Resources
TL;DR: This work popularizes RF and their variants for the practicing water scientist, and discusses related concepts and techniques, which have received less attention from the water science and hydrologic communities.
Journal ArticleDOI
Artificial intelligence for sustainability: Challenges, opportunities, and a research agenda
TL;DR: It is argued that AI can support the derivation of culturally appropriate organizational processes and individual practices to reduce the natural resource and energy intensity of human activities, and that it facilitates and fosters environmental governance.
Journal ArticleDOI
Perspectives on the Future of Land Surface Models and the Challenges of Representing Complex Terrestrial Systems
Rosie A. Fisher, Charles D. Koven +1 more
TL;DR: This review identifies three "grand challenges" in the development and use of LSMs, based around these issues: managing process complexity, representing land surface heterogeneity, and understanding parametric dynamics across the broad set of problems asked of LSMs in a changing world.
Journal ArticleDOI
A review of machine learning applications in wildfire science and management
Piyush Jain, Sean C. P. Coogan, Sriram Subramanian, Mark Crowley, Steve W. Taylor, Mike D. Flannigan +6 more
TL;DR: Artificial intelligence has been applied in wildfire science and management since the 1990s, with early applications including neural networks and expert systems, and its use has rapidly accelerated the field's development.
References
Journal ArticleDOI
Long short-term memory
TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Journal ArticleDOI
A new look at the statistical model identification
TL;DR: In this article, a new estimate, the minimum information theoretic criterion estimate (MAICE), is introduced for the purpose of statistical identification; it is free from the ambiguities inherent in the application of conventional hypothesis testing procedures.
Journal ArticleDOI
Deep learning
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Proceedings ArticleDOI
Going deeper with convolutions
Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, Andrew Rabinovich +8 more
TL;DR: Inception is a deep convolutional neural network architecture that achieved the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14).
Journal Article
Dropout: a simple way to prevent neural networks from overfitting
TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.