Author

Hannes Vasyura-Bathke

Bio: Hannes Vasyura-Bathke is an academic researcher from the University of Potsdam. The author has contributed to research in the topics of Slip (materials science) and Bayesian probability. The author has an h-index of 8 and has co-authored 17 publications receiving 175 citations. Previous affiliations of Hannes Vasyura-Bathke include King Abdullah University of Science and Technology.

Papers
Journal ArticleDOI
TL;DR: Pyrocko-GF defines an extensible GF storage format suitable for a wide range of GF types, especially for elasticity and wave-propagation problems, and is complemented by an online platform that provides many pre-calculated GF stores for local, regional, and global studies.
Abstract: The finite physical source problem is usually studied with the concept of volume and time integrals over Green's functions (GFs), representing delta-impulse solutions to the governing partial differential field equations. In seismology, the use of realistic Earth models requires the calculation of numerical or synthetic GFs, as analytical solutions are rarely available. The computation of such synthetic GFs is computationally and operationally demanding. As a consequence, the on-the-fly recalculation of synthetic GFs in each iteration of an optimisation is time-consuming and impractical. Therefore, the pre-calculation and efficient storage of synthetic GFs on a dense grid of source-to-receiver combinations enables the efficient lookup and utilisation of GFs in time-critical scenarios. We present a Python-based framework and toolkit – Pyrocko-GF – that enables the pre-calculation of synthetic GF stores, which are independent of their numerical calculation method and GF transfer function. The framework aids in the creation of such GF stores by interfacing a suite of established numerical forward modelling codes in seismology (computational back ends). So far, interfaces to back ends for layered Earth model cases have been provided; however, the architecture of Pyrocko-GF is designed to cover back ends for other geometries (e.g. full 3-D heterogeneous media) and other physical quantities (e.g. gravity, pressure, tilt). Therefore, Pyrocko-GF defines an extensible GF storage format suitable for a wide range of GF types, especially handling elasticity and wave propagation problems. The framework assists with visualisations, quality control, and the exchange of GF stores, which is supported through an online platform that provides many pre-calculated GF stores for local, regional, and global studies. The Pyrocko-GF toolkit comes with a well-documented application programming interface (API) for the Python programming language to efficiently facilitate forward modelling of geophysical processes, e.g. synthetic waveforms or static displacements for a wide range of source models.

49 citations
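To illustrate the kind of forward modelling the abstract above describes, here is a minimal sketch using the pyrocko.gf Python API. The store directory, the store identifier 'global_2s', and all source and receiver parameters are placeholder values; exact class and argument names should be checked against the Pyrocko-GF documentation.

from pyrocko import gf

# Engine that looks up pre-calculated GF stores on disk
# ('gf_stores' is an assumed local directory containing downloaded stores).
engine = gf.LocalEngine(store_superdirs=['gf_stores'])

# A simple double-couple point source (all values are illustrative).
source = gf.DCSource(
    lat=35.0, lon=27.0, depth=8e3,       # degrees, degrees, metres
    strike=120., dip=45., rake=-90.,     # degrees
    magnitude=6.0)

# One receiver; 'global_2s' is a placeholder store identifier.
target = gf.Target(
    quantity='displacement',
    lat=36.5, lon=26.0,
    store_id='global_2s',
    codes=('', 'STA', '', 'Z'))

# Look up and combine GFs from the store to produce a synthetic waveform.
response = engine.process(source, [target])
for trace in response.pyrocko_traces():
    print(trace)

Static surface displacements (e.g. for InSAR-type modelling) follow the same pattern, with a static target type in place of the waveform target.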

Journal ArticleDOI
TL;DR: In this article, the authors thank Jiři Vackař, Romain Jolivet, one anonymous reviewer, and the editors for comments that helped to improve the quality of the article.
Abstract: The authors thank Jiři Vackař, Romain Jolivet, one anonymous reviewer, and the editors for their comments that helped to improve the quality of this article. This research was supported by King Abdullah University of Science and Technology (KAUST), under Award Numbers BAS/1/1353-01-01 and BAS/1/1339-01-1. H. V.-B. was partially supported by Geo.X, the Research Network for Geosciences in Berlin and Potsdam, under the Project Number SO_087_GeoX. Henriette Sudhaus, Andreas Steinberg, and Marius Paul Isken acknowledge funding by the German Research Foundation (DFG) through an Emmy-Noether Young Researcher Grant, Number 276464525. Hannes Vasyura-Bathke owes the most gratitude to his beloved wife Olha for her tireless support and tolerance during many evenings and nights spent writing the code and this article.

35 citations

Journal ArticleDOI
TL;DR: In this paper, the authors use both analog subsidence experiments and boundary element modeling to study the three-dimensional geometry and kinematics of caldera subsidence processes as they evolve from an initial downsag to a later collapse stage.

18 citations


Cited by
Journal ArticleDOI
22 Mar 2019-Science
TL;DR: Solid Earth geoscience is a field with a very large set of observations that are ideal for analysis with machine-learning methods; this paper reviews how these methods can be applied to solid Earth datasets.
Abstract:
BACKGROUND: The solid Earth, oceans, and atmosphere together form a complex interacting geosystem. Processes relevant to understanding Earth’s geosystem behavior range in spatial scale from the atomic to the planetary, and in temporal scale from milliseconds to billions of years. Physical, chemical, and biological processes interact and have substantial influence on this complex geosystem, and humans interact with it in ways that are increasingly consequential to the future of both the natural world and civilization as the finiteness of Earth becomes increasingly apparent and limits on available energy, mineral resources, and fresh water increasingly affect the human condition. Earth is subject to a variety of geohazards that are poorly understood, yet increasingly impactful as our exposure grows through increasing urbanization, particularly in hazard-prone areas. We have a fundamental need to develop the best possible predictive understanding of how the geosystem works, and that understanding must be informed by both the present and the deep past. This understanding will come through the analysis of increasingly large geo-datasets and from computationally intensive simulations, often connected through inverse problems. Geoscientists are faced with the challenge of extracting as much useful information as possible and gaining new insights from these data, simulations, and the interplay between the two. Techniques from the rapidly evolving field of machine learning (ML) will play a key role in this effort.
ADVANCES: The confluence of ultrafast computers with large memory, rapid progress in ML algorithms, and the ready availability of large datasets place geoscience at the threshold of dramatic progress. We anticipate that this progress will come from the application of ML across three categories of research effort: (i) automation to perform a complex prediction task that cannot easily be described by a set of explicit commands; (ii) modeling and inverse problems to create a representation that approximates numerical simulations or captures relationships; and (iii) discovery to reveal new and often unanticipated patterns, structures, or relationships. Examples of automation include geologic mapping using remote-sensing data, characterizing the topology of fracture systems to model subsurface transport, and classifying volcanic ash particles to infer eruptive mechanism. Examples of modeling include approximating the viscoelastic response for complex rheology, determining wave speed models directly from tomographic data, and classifying diverse seismic events. Examples of discovery include predicting laboratory slip events using observations of acoustic emissions, detecting weak earthquake signals using similarity search, and determining the connectivity of subsurface reservoirs using groundwater tracer observations.
OUTLOOK: The use of ML in solid Earth geosciences is growing rapidly, but is still in its early stages and making uneven progress. Much remains to be done with existing datasets from long-standing data sources, which in many cases are largely unexplored. Newer, unconventional data sources such as light detection and ranging (LiDAR), fiber-optic sensing, and crowd-sourced measurements may demand new approaches through both the volume and the character of information that they present. Practical steps could accelerate and broaden the use of ML in the geosciences. Wider adoption of open-science principles such as open source code, open data, and open access will better position the solid Earth community to take advantage of rapid developments in ML and artificial intelligence. Benchmark datasets and challenge problems have played an important role in driving progress in artificial intelligence research by enabling rigorous performance comparison and could play a similar role in the geosciences. Testing on high-quality datasets produces better models, and benchmark datasets make these data widely available to the research community. They also help recruit expertise from allied disciplines. Close collaboration between geoscientists and ML researchers will aid in making quick progress in ML geoscience applications. Extracting maximum value from geoscientific data will require new approaches for combining data-driven methods, physical modeling, and algorithms capable of learning with limited, weak, or biased labels. Funding opportunities that target the intersection of these disciplines, as well as a greater component of data science and ML education in the geosciences, could help bring this effort to fruition. The list of author affiliations is available in the full article online.

547 citations
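As a toy illustration of one technique named in the abstract above (detecting weak earthquake signals using similarity search), the sketch below slides a known template waveform over a continuous record and flags windows with high normalized cross-correlation. The synthetic data, window length, and detection threshold are placeholders, not values from the paper.

import numpy as np

def sliding_correlation(record, template):
    # Pearson correlation of the template against every window of the record.
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(record) - n + 1)
    for i in range(len(cc)):
        w = record[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        cc[i] = np.dot(w, t) / n
    return cc

# Synthetic example: a noisy record containing two buried copies of the template.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0.0, 20.0 * np.pi, 200)) * np.hanning(200)
record = rng.normal(scale=1.0, size=5000)
record[1000:1200] += 3.0 * template
record[3500:3700] += 3.0 * template

cc = sliding_correlation(record, template)
detections = np.flatnonzero(cc > 0.5)   # threshold is an arbitrary placeholder
print('candidate detections near samples:', detections[:10])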

01 Jan 2001
TL;DR: The probability of any event is the ratio between the value at which an expectation depending on the happening of the event ought to be computed, and the value of the thing expected upon it's happening.
Abstract: Problem: Given the number of times in which an unknown event has happened and failed: Required the chance that the probability of its happening in a single trial lies somewhere between any two degrees of probability that can be named. SECTION 1. Definition 1. Several events are inconsistent, when if one of them happens, none of the rest can. 2. Two events are contrary when one, or other of them must; and both together cannot happen. 3. An event is said to fail, when it cannot happen; or, which comes to the same thing, when its contrary has happened. 4. An event is said to be determined when it has either happened or failed. 5. The probability of any event is the ratio between the value at which an expectation depending on the happening of the event ought to be computed, and the value of the thing expected upon it's happening.

368 citations
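In modern notation (an interpretive gloss on the 1763 essay, not text from the record above), Definition 5 and the stated problem can be written as follows. If an expectation that pays N upon the event's happening may fairly be purchased for a, then the probability of the event is

P(E) = \frac{a}{N}.

Bayes's answer to the problem, assuming a uniform prior on the unknown single-trial probability p, is that after the event has happened k times and failed n - k times in n trials, the chance that p lies between two named bounds p_1 and p_2 is

P(p_1 \le p \le p_2 \mid k, n) = \frac{\int_{p_1}^{p_2} p^{k} (1-p)^{n-k} \, \mathrm{d}p}{\int_{0}^{1} p^{k} (1-p)^{n-k} \, \mathrm{d}p},

i.e. the posterior for p is the Beta(k+1, n-k+1) distribution.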

Journal ArticleDOI
TL;DR: A high-quality, large-scale, and global data set of local earthquake and non-earthquake signals recorded by seismic instruments is presented, containing two categories: local earthquake waveforms and seismic noise waveforms that are free of earthquake signals.
Abstract: Seismology is a data-rich and data-driven science. Application of machine learning for gaining new insights from seismic data is a rapidly evolving sub-field of seismology. The availability of a large amount of seismic data and computational resources, together with the development of advanced techniques, can foster more robust models and algorithms to process and analyze seismic signals. Known examples, or labeled data sets, are the essential requisite for building supervised models. Seismology has labeled data, but the reliability of those labels is highly variable, and the lack of high-quality labeled data sets to serve as ground truth as well as the lack of standard benchmarks are obstacles to more rapid progress. In this paper, we present a high-quality, large-scale, and global data set of local earthquake and non-earthquake signals recorded by seismic instruments, the STanford EArthquake Dataset (STEAD). The data set in its current state contains two categories: (1) local earthquake waveforms (recorded at “local” distances within 350 km of earthquakes) and (2) seismic noise waveforms that are free of earthquake signals. Together these data comprise ~1.2 million time series or more than 19,000 hours of seismic signal recordings. Constructing such a large-scale database with reliable labels is a challenging task. Here, we present the properties of the data set, describe the data collection, quality control procedures, and processing steps we undertook to ensure accurate labeling, and discuss potential applications. We hope that the scale and accuracy of STEAD present new and unparalleled opportunities to researchers in the seismological community and beyond.

161 citations
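As an illustration of how such a labeled waveform archive can be used, the sketch below selects a few earthquake and noise traces. It assumes the data set is distributed as a CSV metadata table plus an HDF5 waveform file (as STEAD is); the file names, column names ('trace_name', 'trace_category'), category labels, and array layout used here are assumptions that should be checked against the data set documentation.

import h5py
import pandas as pd

# Assumed file names for the metadata table and the waveform archive.
metadata = pd.read_csv('merged.csv')
waveforms = h5py.File('merged.hdf5', 'r')

# Split traces by label; 'earthquake_local' and 'noise' are the assumed
# category names for the two classes described in the abstract.
quakes = metadata[metadata['trace_category'] == 'earthquake_local']
noise = metadata[metadata['trace_category'] == 'noise']
print(f'{len(quakes)} earthquake traces, {len(noise)} noise traces')

# Load the three-component waveform array for a handful of earthquake traces.
for trace_name in quakes['trace_name'].head(5):
    data = waveforms['data/' + trace_name][()]   # assumed layout: (samples, 3)
    print(trace_name, data.shape)

waveforms.close()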

Journal ArticleDOI
TL;DR: In this article, the authors analyzed regional and global seismic and deformation data to provide a one-year-long detailed picture of a deep, rare magmatic process and identified about 7,000 volcano-tectonic earthquakes and 407 very-long-period seismic signals.
Abstract: The dynamics of magma deep in the Earth’s crust are difficult to capture by geophysical monitoring. Since May 2018, a seismically quiet area offshore of Mayotte in the western Indian Ocean has been affected by complex seismic activity, including long-duration, very-long-period signals detected globally. Global Navigation Satellite System stations on Mayotte have also recorded a large surface deflation offshore. Here we analyse regional and global seismic and deformation data to provide a one-year-long detailed picture of a deep, rare magmatic process. We identify about 7,000 volcano-tectonic earthquakes and 407 very-long-period seismic signals. Early earthquakes migrated upward in response to a magmatic dyke propagating from Moho depth to the surface, whereas later events marked the progressive failure of the roof of a magma reservoir, triggering its resonance. An analysis of the very-long-period seismicity and deformation suggests that at least 1.3 km3 of magma drained from a reservoir of 10 to 15 km diameter at 25 to 35 km depth. We demonstrate that such deep offshore magmatic activity can be captured without any on-site monitoring. Recent seismicity near Mayotte in the Indian Ocean is due to dyke propagation from and drainage of a 25–35 km deep magma reservoir, according to an analysis of earthquake and deformation data.

105 citations
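As a rough order-of-magnitude illustration of how surface deformation constrains the drained volume (using the classical Mogi point-source approximation in an elastic half-space with Poisson ratio 0.25, not necessarily the model the authors used), the vertical displacement at horizontal distance r from a small spherical reservoir at depth d that loses a volume ΔV is approximately

u_z(r) \approx -\frac{3\,\Delta V}{4\pi}\,\frac{d}{\left(r^{2}+d^{2}\right)^{3/2}}, \qquad u_z(0) \approx -\frac{3\,\Delta V}{4\pi d^{2}}.

With ΔV ≈ 1.3 km³ and d ≈ 30 km, this gives about 0.35 m of subsidence directly above the reservoir, falling to roughly a third of that value at a horizontal distance equal to the source depth, which illustrates why the deflation remains detectable at GNSS stations on Mayotte some tens of kilometres from the offshore source.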

Journal ArticleDOI
06 Dec 2019-Science
TL;DR: A model of time-evolving reservoir depressurization is developed to jointly explain lava lake withdrawal rate and the rate and spatial pattern of ground subsidence obtained from radar satellites and a dense local monitoring network.
Abstract: Caldera-forming eruptions are among Earth's most hazardous natural phenomena, yet the architecture of subcaldera magma reservoirs and the conditions that trigger collapse are poorly understood. Observations from the formation of a 0.8-cubic kilometer basaltic caldera at Kīlauea Volcano in 2018 included the draining of an active lava lake, which provided a window into pressure decrease in the reservoir. We show that failure began after <4% of magma was withdrawn from a shallow reservoir beneath the volcano's summit, reducing its internal pressure by ~17 megapascals. Several cubic kilometers of magma were stored in the reservoir, and only a fraction was withdrawn before the end of the eruption. Thus, caldera formation may begin after withdrawal of only small amounts of magma and may end before source reservoirs are completely evacuated.

99 citations
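A back-of-the-envelope check (one standard relation, not necessarily the model developed in the study) connects the quoted numbers: for a reservoir of volume V with effective magma-plus-chamber compressibility β, withdrawing a volume ΔV lowers the internal pressure by approximately

\Delta P \approx \frac{\Delta V}{\beta\, V}.

Taking ΔV/V ≈ 0.04 and ΔP ≈ 17 MPa implies an effective compressibility of about 0.04 / 17 MPa ≈ 2.4 GPa⁻¹; and with V of several cubic kilometres, the volume withdrawn before the onset of collapse is of the order of 0.1 km³, small compared with both the stored volume and the final 0.8 km³ collapse volume.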