Deep learning for geophysics: Current and future trends
Siwei Yu, Jianwei Ma, et al.
TLDR
A new data-driven technique, deep learning (DL), has attracted significantly increasing attention in the geophysical community, and the collision between DL and traditional methods has had a notable impact on the latter.
Abstract:
Recently, deep learning (DL), a new data-driven technique compared with conventional approaches, has attracted increasing attention in the geophysical community, resulting in many opportunities and challenges. DL has been shown to have the potential to predict complex system states accurately and to relieve the “curse of dimensionality” in large temporal and spatial geophysical applications. We address the basic concepts, state-of-the-art literature, and future trends by reviewing DL approaches in various geoscience scenarios. Exploration geophysics, earthquakes, and remote sensing are the main focuses. Further applications, including Earth structure, water resources, atmospheric science, and space science, are also reviewed. Additionally, the difficulties of applying DL in the geophysical community are discussed. The trends of DL in geophysics in recent years are analyzed. Several promising directions are suggested for future research involving DL in geophysics, such as unsupervised learning, transfer learning, multimodal DL, federated learning, uncertainty estimation, and active learning. A coding tutorial and a summary of tips for rapidly exploring DL are presented for beginners and interested readers in geophysics.
Citations
Journal Article
Deep learning for velocity model building with common-image gather volumes
TL;DR: In this paper, a deep learning technique is proposed that constructs subsurface velocity models automatically from common-image gather (CIG) volumes by training a convolutional neural network (CNN).
Journal Article
S-wave velocity inversion and prediction using a deep hybrid neural network
Journal Article
NNetEn2D: Two-Dimensional Neural Network Entropy in Remote Sensing Imagery and Geophysical Mapping
TL;DR: In this article, the authors propose a new method for estimating two-dimensional neural network entropy (NNetEn2D), which evaluates the regularity or predictability of images using the LogNNet neural network model.
Journal Article
Labeling Poststorm Coastal Imagery for Machine Learning: Measurement of Interrater Agreement
Evan B. Goldstein, Daniel Buscombe, Eli D. Lazarus, Somya D. Mohanty, Shah Nafis Rafique, K. Anarde, Andrew D. Ashton, Tomas Beuzen, Katherine A. Castagno, N. Cohn, Matthew P. Conlin, Ashley Ellenson, Megan Gillen, Paige A. Hovenga, Jin-Si R. Over, R. Palermo, K. M. Ratliff, I. R. B. Reeves, Lily H. Sanborn, Jessamin A. Straub, Luke A. Taylor, E. J. Wallace, Jonathan A. Warrick, Phillipe A. Wernette, Hannah Williams, et al.
Abstract: Classifying images using supervised machine learning (ML) relies on labeled training data—classes or text descriptions, for example, associated with each image. Data-driven models are only as good as the data used for training, and this points to the importance of high-quality labeled data for developing an ML model that has predictive skill. Labeling data is typically a time-consuming, manual process. Here, we investigate the process of labeling data, with a specific focus on coastal aerial imagery captured in the wake of hurricanes that affected the Atlantic and Gulf Coasts of the United States. The imagery data set is a rich observational record of storm impacts and coastal change, but the imagery requires labeling to render that information accessible. We created an online interface that served labelers a stream of images and a fixed set of questions. A total of 1,600 images were labeled by at least two or as many as seven coastal scientists. We used the resulting data set to investigate interrater agreement: the extent to which labelers labeled each image similarly. Interrater agreement scores, assessed with percent agreement and Krippendorff's alpha, are higher when the questions posed to labelers are relatively simple, when the labelers are provided with a user manual, and when images are smaller. Experiments in interrater agreement point toward the benefit of multiple labelers for understanding the uncertainty in labeling data for machine learning research.
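The two agreement statistics named in the abstract can be sketched in a few lines. This is a minimal illustration assuming nominal labels and complete ratings (no missing labels); the function names and the toy label set are hypothetical, not taken from the study.

```python
from collections import Counter

def percent_agreement(units):
    """Fraction of units on which all raters gave the same label."""
    return sum(len(set(u)) == 1 for u in units) / len(units)

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data with complete ratings.

    `units` is a list of per-unit label tuples, one label per rater.
    Uses the coincidence-matrix formulation: alpha = 1 - Do/De.
    """
    o = Counter()                      # coincidence matrix o[(c, k)]
    for u in units:
        m = len(u)                     # raters for this unit
        counts = Counter(u)
        for c in counts:
            for k in counts:
                pairs = counts[c] * counts[k] - (counts[c] if c == k else 0)
                o[(c, k)] += pairs / (m - 1)
    n_c = Counter()                    # marginal coincidences per category
    for (c, _k), v in o.items():
        n_c[c] += v
    n = sum(n_c.values())
    do = sum(v for (c, k), v in o.items() if c != k)       # observed disagreement
    de = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k)
    return 1.0 - do * (n - 1) / de

# Toy data: two raters, five images, labels "a"/"b"
labels = [("a", "a"), ("a", "b"), ("b", "b"), ("b", "b"), ("a", "a")]
print(percent_agreement(labels))                       # 0.8
print(round(krippendorff_alpha_nominal(labels), 4))    # 0.64
```

Unlike raw percent agreement, alpha discounts the agreement expected by chance from the label marginals, which is why the two numbers differ on the same data.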
Journal Article
Big Data Seismology
TL;DR: In this article, the authors review recent advances enabled by Big Data Seismology in the context of three major drivers: the development of new data-dense sensor systems, improvements in computing, and the development of new types of techniques and algorithms.
References
Proceedings Article
Deep Residual Learning for Image Recognition
TL;DR: In this article, the authors propose a residual learning framework that eases the training of networks substantially deeper than those used previously; it won 1st place in the ILSVRC 2015 classification task.
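The residual idea summarized above can be sketched with a fully connected stand-in for the paper's convolutional layers (an illustrative simplification, not the actual architecture): the block computes relu(F(x) + x), so when the residual branch F outputs zero the block falls back to the identity path.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """y = relu(F(x) + x): the weights learn only the residual F(x),
    while the identity shortcut carries x forward unchanged."""
    f = relu(x @ w1) @ w2      # two-layer residual branch F(x)
    return relu(f + x)         # identity skip connection

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal((4, d))

# With zero weights F(x) = 0, so the block reduces to relu(x):
w_zero = np.zeros((d, d))
assert np.allclose(residual_block(x, w_zero, w_zero), relu(x))
```

The shortcut is what makes very deep stacks trainable: each block only needs to learn a perturbation of the identity rather than a full mapping.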
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: A deep convolutional neural network achieving state-of-the-art performance is presented; it consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully connected layers with a final 1000-way softmax.
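The layer stack described in that TL;DR can be traced with simple shape arithmetic. The kernel/stride/padding values below are assumed AlexNet-like hyperparameters (the TL;DR gives only the layer counts, not these exact values), used to show how a 227x227 input shrinks to the small feature maps that feed the three fully connected layers.

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a conv/pool layer: floor((n + 2p - k)/s) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

size = 227                        # assumed input resolution
size = conv_out(size, 11, 4, 0)   # conv1
size = conv_out(size, 3, 2)       # max-pool
size = conv_out(size, 5, 1, 2)    # conv2
size = conv_out(size, 3, 2)       # max-pool
size = conv_out(size, 3, 1, 1)    # conv3
size = conv_out(size, 3, 1, 1)    # conv4
size = conv_out(size, 3, 1, 1)    # conv5
size = conv_out(size, 3, 2)       # max-pool
print(size)  # 6 -> 6x6 maps are flattened into the fully connected layers
```

The same one-line formula applies to any conv or pooling layer, so it is a quick sanity check when wiring up a network of this shape.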
Journal Article
Long short-term memory
TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
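The "constant error carousel" mentioned in that TL;DR is the additive cell-state update c = f*c + i*g, which lets error flow across long lags without vanishing. A minimal single-cell sketch (hypothetical weight shapes, NumPy only, not the paper's original formulation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM time step. W maps [x; h] to the four gate pre-activations."""
    z = np.concatenate([x, h]) @ W + b
    i, f, o, g = np.split(z, 4)          # input, forget, output gates + candidate
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c + i * g                    # additive memory update (the carousel)
    h = o * np.tanh(c)                   # gated hidden output
    return h, c

d_in, d_h = 3, 4
rng = np.random.default_rng(1)
W = rng.standard_normal((d_in + d_h, 4 * d_h)) * 0.1
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
for t in range(5):                       # run a short input sequence
    h, c = lstm_step(rng.standard_normal(d_in), h, c, W, b)
```

Because the cell state is updated by addition rather than repeated matrix multiplication, its gradient is scaled only by the forget gate at each step, which is what preserves error signals over long sequences.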
Proceedings Article
Very Deep Convolutional Networks for Large-Scale Image Recognition
Karen Simonyan, Andrew Zisserman, et al.
TL;DR: In this paper, the authors investigate the effect of convolutional network depth on accuracy in the large-scale image recognition setting and show that a significant improvement over prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
Journal Article
Deep learning
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.