Journal ArticleDOI

Computer Programs in Seismology: An Evolving Tool for Instruction and Research

01 Nov 2013-Seismological Research Letters (GeoScienceWorld)-Vol. 84, Iss: 6, pp 1081-1088
TL;DR: Students must adapt to research‐quality software and use it to develop deep insight into seismic‐wave excitation and propagation before they can confidently apply the concepts to more sophisticated earth and earthquake models.
Abstract: Earthquake seismology, like most areas of geophysics, is a fascinating mix of theory, computation, and observation. The past 50–60 years of earthquake seismology can be described as a synergistic interaction between the expanding quantity and improving quality of seismic data and important advances in practical wave‐propagation physics and computation. An important impetus for many of these developments was the need to monitor and characterize nuclear explosions in the atmosphere, oceans, and underground, which stimulated a major investment in seismology during the 1960s that, combined with the plate tectonics revolution, initiated large growth in the field. Substantial effort was invested to produce standard analysis packages and software tools for the processing and analysis of seismic observations. As a result, high‐quality seismic data are readily accessible, and sufficient computational power to routinely analyze these observations is available to almost everyone. Turn‐key seismic networks are available from manufacturers who provide sensors, data acquisition and transmission, event location, and archival functions in a manner that requires training, but not a detailed understanding of the hardware or software packages. As we advance toward more sophisticated analysis of ever‐larger data sets, including the effects of 3D structure, the practical and theoretical challenges facing students are substantial. Students must adapt to research‐quality software and use it to develop deep insight into seismic‐wave excitation and propagation before they can confidently apply the concepts to more sophisticated earth and earthquake models. Although beginning students have the skills to use some tools such as word processors and have some scripting experience, many are less comfortable extending old or developing new analysis tools, or even using existing codes to tackle realistic 1D problems. The challenge facing educators is to enable students to quickly become proficient in basic seismic data processing so that they can move on to research …
Citations
Journal ArticleDOI
TL;DR: In this article, the authors used data from more than 2000 seismic stations from multiple networks arrayed throughout China (CEArray, China Array, NECESS, PASSCAL, GSN) and surrounding regions (Korean Seismic Network, F-Net, KNET) and produced isotropic Rayleigh wave group and phase speed maps with uncertainty estimates from 8 to 50s period across the entire region of study, and extend them to 70s period where earthquake tomography is performed.
Abstract: Using data from more than 2000 seismic stations from multiple networks arrayed throughout China (CEArray, China Array, NECESS, PASSCAL, GSN) and surrounding regions (Korean Seismic Network, F-Net, KNET), we perform ambient noise Rayleigh wave tomography across the entire region and earthquake tomography across parts of South China and Northeast China. We produce isotropic Rayleigh wave group and phase speed maps with uncertainty estimates from 8 to 50s period across the entire region of study, and extend them to 70s period where earthquake tomography is performed. Maps of azimuthal anisotropy are estimated simultaneously to minimize anisotropic bias in the isotropic maps, but are not discussed here. The 3D model is produced using a Bayesian Monte Carlo formalism covering all of China, extending eastwards through the Korean Peninsula, into the marginal seas, to Japan. We define the final model as the mean and standard deviation of the posterior distribution at each location on a 0.5° × 0.5° grid from the surface to 150km depth. Surface wave dispersion data do not strongly constrain internal interfaces, but shear wave speeds between the discontinuities in the crystalline crust and uppermost mantle are well determined. We design the resulting model as a reference model, which is intended to be useful to other researchers as a starting model, to predict seismic wave fields and observables and to predict other types of data (e.g. topography, gravity). The model and the data on which it is based are available for download. In addition, the model displays a great variety and considerable richness of geological and tectonic features in the crust and in the uppermost mantle deserving of further focus and continued interpretation.
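The Bayesian Monte Carlo formalism mentioned above can be illustrated with a toy Metropolis sampler: a single shear speed is inverted from synthetic dispersion data, and the posterior mean and standard deviation are reported, mirroring how the final model is defined at each grid point. The forward model, numbers, and prior bounds here are invented for illustration and are not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: predicted phase speed from a single shear speed vs
# (km/s); real dispersion forward problems are far richer than this.
def forward(vs, periods):
    return 0.92 * vs * (1.0 + 0.01 * np.log(periods))

periods = np.linspace(8.0, 50.0, 20)     # s, matching the study's band
vs_true = 3.5                            # km/s (hypothetical target)
sigma = 0.03                             # km/s observational error
observed = forward(vs_true, periods) + rng.normal(0.0, sigma, periods.size)

def log_like(vs):
    r = observed - forward(vs, periods)
    return -0.5 * np.sum((r / sigma) ** 2)

# Metropolis sampling with a uniform prior on [2.5, 4.5] km/s
samples, vs = [], 3.0
ll = log_like(vs)
for _ in range(20000):
    prop = vs + rng.normal(0.0, 0.05)
    if 2.5 <= prop <= 4.5:
        llp = log_like(prop)
        if np.log(rng.random()) < llp - ll:
            vs, ll = prop, llp
    samples.append(vs)

post = np.array(samples[5000:])          # discard burn-in
print(post.mean(), post.std())           # posterior mean and std, as in the model
```

The final model in the study is exactly this pair of summary statistics (mean and standard deviation of the posterior), evaluated on a 0.5° × 0.5° grid with depth.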

247 citations

Journal ArticleDOI
01 Nov 2018-Nature
TL;DR: Seismic images of Earth's crust and uppermost mantle around the Mariana trench derived from Rayleigh-wave analysis of broadband ocean-bottom seismic data show widespread serpentinization, suggesting that much more water is subducted than previously thought.
Abstract: The water cycle at subduction zones remains poorly understood, although subduction is the only mechanism for water transport deep into Earth. Previous estimates of water flux1–3 exhibit large variations in the amount of water that is subducted deeper than 100 kilometres. The main source of uncertainty in these calculations is the initial water content of the subducting uppermost mantle. Previous active-source seismic studies suggest that the subducting slab may be pervasively hydrated in the plate-bending region near the oceanic trench4–7. However, these studies do not constrain the depth extent of hydration and most investigate young incoming plates, leaving subduction-zone water budgets for old subducting plates uncertain. Here we present seismic images of the crust and uppermost mantle around the central Mariana trench derived from Rayleigh-wave analysis of broadband ocean-bottom seismic data. These images show that the low mantle velocities that result from mantle hydration extend roughly 24 kilometres beneath the Moho discontinuity. Combined with estimates of subducting crustal water, these results indicate that at least 4.3 times more water subducts than previously calculated for this region3. If other old, cold subducting slabs contain correspondingly thick layers of hydrous mantle, as suggested by the similarity of incoming plate faulting across old, cold subducting slabs, then estimates of the global water flux into the mantle at depths greater than 100 kilometres must be increased by a factor of about three compared to previous estimates3. Because a long-term net influx of water to the deep interior of Earth is inconsistent with the geological record8, estimates of water expelled at volcanic arcs and backarc basins probably also need to be revised upwards9. Seismic images of Earth’s crust and uppermost mantle around the Mariana trench show widespread serpentinization, suggesting that much more water is subducted than previously thought.

136 citations

Journal ArticleDOI
TL;DR: In this paper, the authors recover daily measurements of seismic velocity (dv/v) in the San Gabriel Valley, California, from cross correlation of the ambient seismic field, which reproduces the groundwater level changes that are marked by the multi-year depletions and rapid recharges typical of California's cycles of droughts and floods.
Abstract: Aquifers are vital groundwater reservoirs for residential, agricultural, and industrial activities worldwide. Tracking their state with high temporal and spatial resolution is critical for water resource management at the regional scale yet is rarely achieved from a single dataset. Here, we show that variations in groundwater levels can be mapped using perturbations in seismic velocity (dv/v). We recover daily measurements of dv/v in the San Gabriel Valley, California, from cross correlation of the ambient seismic field. dv/v reproduces the groundwater level changes that are marked by the multi-year depletions and rapid recharges typical of California’s cycles of droughts and floods. dv/v correlates spatially with vertical surface displacements and deformation measured with GPS. Our results successfully predict the volume of water lost in the San Gabriel Valley during the 2012-2016 drought and thus provide a new approach to monitor groundwater storage.
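A minimal sketch of how dv/v can be measured from repeated cross correlations, using the common stretching technique on synthetic traces; the abstract does not specify this exact estimator, so treat it as an assumed illustration rather than the study's method.

```python
import numpy as np

# Stretching technique: a homogeneous velocity change dv/v shifts coda
# arrival times by dt/t = -dv/v, so the "current" correlation function is
# a compressed or dilated copy of the reference. We resample the reference
# with trial stretch factors and keep the one that maximizes correlation
# with the current trace. Traces here are synthetic and illustrative only.
t = np.linspace(1.0, 20.0, 2000)                       # coda lag times (s)
reference = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 10.0)

true_dvv = 0.004                                        # +0.4% velocity change
current = np.interp(t * (1 + true_dvv), t, reference)   # compressed coda

trial_dvv = np.linspace(-0.01, 0.01, 401)
cc = [np.corrcoef(np.interp(t * (1 + eps), t, reference), current)[0, 1]
      for eps in trial_dvv]
best = trial_dvv[int(np.argmax(cc))]
print(f"estimated dv/v = {best:.4f}")                   # prints 0.0040
```

Applied day by day to ambient-noise correlations, this kind of estimate yields the daily dv/v time series that the study compares against groundwater levels and GPS deformation.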

112 citations


Cites background from "Computer Programs in Seismology: An..."

  • ...We measure dv/v in the 0.5- to 2.0-Hz frequency range, which is greatly sensitive to the upper 500 m of the basin (see supporting information for surface wave sensitivity kernels; Herrmann, 2013)....

Journal ArticleDOI
TL;DR: The InSight lander will deliver geophysical instruments to Mars in 2018, including seismometers installed directly on the surface (Seismic Experiment for Interior Structure, SEIS).
Abstract: The InSight lander will deliver geophysical instruments to Mars in 2018, including seismometers installed directly on the surface (Seismic Experiment for Interior Structure, SEIS). Routine operations will be split into two services, the Mars Structure Service (MSS) and Marsquake Service (MQS), which will be responsible, respectively, for defining the structure models and seismicity catalogs from the mission. The MSS will deliver a series of products before the landing, during the operations, and finally to the Planetary Data System (PDS) archive. Prior to the mission, we assembled a suite of a priori models of Mars, based on estimates of bulk composition and thermal profiles. Initial models during the mission will rely on modeling surface waves and impact-generated body waves independent of prior knowledge of structure. Later modeling will include simultaneous inversion of seismic observations for source and structural parameters. We use Bayesian inversion techniques to obtain robust probability distribution functions of interior structure parameters. Shallow structure will be characterized using the hammering of the heatflow probe mole, as well as measurements of surface wave ellipticity. Crustal scale structure will be constrained by measurements of receiver function and broadband Rayleigh wave ellipticity measurements. Core interacting body wave phases should be observable above modeled martian noise levels, allowing us to constrain deep structure. Normal modes of Mars should also be observable and can be used to estimate the globally averaged 1D structure, while combination with results from the InSight radio science mission and orbital observations will allow for constraint of deeper structure.

98 citations


Cites methods from "Computer Programs in Seismology: An..."

  • ...Seismograms in this model are calculated using a cartesian geometry mode summation approach (Herrmann, 2013) assuming the hammer strokes are sampled at every 1 mm of depth down to the full 5 m of penetration....


Journal ArticleDOI
TL;DR: In this paper, the authors calculate moment tensors and stress drop values for eight recent induced earthquakes in the Western Canadian Sedimentary Basin with magnitudes between 3.2 and 4.4, as well as a nearby magnitude 5.3 event that is interpreted as a natural earthquake.
Abstract: Earthquake source mechanisms and spectra can provide important clues to aid in discriminating between natural and induced events. In this study, we calculate moment tensors and stress drop values for eight recent induced earthquakes in the Western Canadian Sedimentary Basin with magnitudes between 3.2 and 4.4, as well as a nearby magnitude 5.3 event that is interpreted as a natural earthquake. We calculate full moment tensor solutions by performing a waveform-fitting procedure based on a 1-D transversely isotropic velocity model. In addition to a dominant double-couple (DC) signature that is common to nearly all events, most induced events exhibit significant non-double-couple components. A parameter sensitivity analysis indicates that spurious non-DC components are negligible if the signal to noise ratio (SNR) exceeds 10 and if the 1-D model differs from the true velocity structure by less than 5%. Estimated focal depths of induced events are significantly shallower than the typical range of focal depths for intraplate earthquakes in the Canadian Shield. Stress drops of the eight induced events were estimated using a generalized spectral-fitting method and fall within the typical range of 2 to 90 MPa for tectonic earthquakes. Elastic moduli changes due to the brittle damage production at the source, presence of multiple intersecting fractures, dilatant jogs created at the overlapping areas of multiple fractures, or non-planar pre-existing faults may explain the non-DC components for induced events.
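The spectral-fitting step can be sketched with the standard Brune ω² source model: grid-search a corner frequency against a displacement spectrum, then convert it to a stress drop. The abstract does not detail the generalized method used, so this is a simplified, assumed illustration; all numbers below are invented, though the resulting stress drop falls in the 2–90 MPa range the study quotes.

```python
import numpy as np

# Brune omega-squared model: Omega(f) = M0 / (1 + (f/fc)^2). We grid-search
# the corner frequency fc against a synthetic spectrum, then convert fc to
# a stress drop via  delta_sigma = (7/16) * M0 / r^3  with the Brune source
# radius r = 2.34 * beta / (2 * pi * fc). Numbers are illustrative only.
beta = 3500.0                        # shear-wave speed (m/s)
M0 = 1.4e15                          # seismic moment (N*m), roughly Mw 4.1
fc_true = 2.0                        # Hz

f = np.logspace(-1, 1.5, 200)        # 0.1 to ~31.6 Hz
rng = np.random.default_rng(1)
spec = M0 / (1 + (f / fc_true) ** 2) * np.exp(rng.normal(0.0, 0.05, f.size))

def misfit(fc):
    model = M0 / (1 + (f / fc) ** 2)
    return np.sum((np.log(spec) - np.log(model)) ** 2)

fc_grid = np.linspace(0.5, 8.0, 751)
fc_est = fc_grid[int(np.argmin([misfit(fc) for fc in fc_grid]))]

r = 2.34 * beta / (2 * np.pi * fc_est)      # Brune source radius (m)
stress_drop = 7.0 / 16.0 * M0 / r ** 3      # Pa
print(f"fc = {fc_est:.2f} Hz, stress drop = {stress_drop / 1e6:.1f} MPa")
```

Because stress drop scales with the cube of the corner frequency, small errors in fc translate into large errors in stress drop, which is why careful spectral fitting matters in this kind of discrimination study.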

94 citations


Cites methods from "Computer Programs in Seismology: An..."

  • ...In this study, we use full waveforms to perform moment-tensor inversion, based on algorithms in Computer Programs in Seismology (CPS) [Herrmann, 2013]....


References
Journal ArticleDOI
TL;DR: In this article, new empirical traveltime curves for the major seismic phases have been derived from the catalogues of the International Seismological Centre by relocating events using P readings, depth phases, and the iasp91 traveltimes, and then re-associating phase picks.
Abstract: New empirical traveltime curves for the major seismic phases have been derived from the catalogues of the International Seismological Centre by relocating events by using P readings, depth phases and the iasp91 traveltimes, and then re-associating phase picks. A smoothed set of traveltime tables is extracted by a robust procedure which gives estimates of the variance of the traveltimes for each phase branch. This set of smoothed empirical times is then used to construct a range of radial velocity profiles, which are assessed against a number of different measures of the level of fit between the empirical times and the predictions of the models. These measures are constructed from weighted sums of L2 misfits for individual phases. The weights are chosen to provide a measure of the probable reliability of the picks for the different phases. A preferred model, ak135, is proposed which gives a significantly better fit to a broad range of phases than is provided by the iasp91 and sp6 models. The differences in velocity between ak135 and these models are generally quite small except at the boundary of the inner core, where reduced velocity gradients are needed to achieve satisfactory performance for PKP differential time data. The potential resolution of velocity structure has been assessed with the aid of a non-linear search procedure in which 5000 models have been generated in bounds about ak135. Misfit calculations are performed for each of the phases in the empirical traveltime sets, and the models are then sorted using different overall measures of misfit. The best 100 models for each criterion are displayed in a model density plot which indicates the consistency of the different models. The interaction of information from different phases can be analysed by comparing the different misfit measures. Structure in the mantle is well resolved except at the base, and ak135 provides a good representation of core velocities.
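The weighted sums of L2 misfits described above can be sketched in a few lines; the phases, residuals, and weights below are invented for illustration, with weights standing in for the probable reliability of the picks.

```python
import numpy as np

# Combine per-phase L2 misfits into a single weighted score:
#   S(m) = sum over phases p of  w_p * sum_i r_{p,i}^2
# where r_{p,i} are traveltime residuals (observed - predicted, in s).
def weighted_misfit(residuals_by_phase, weights):
    total = 0.0
    for phase, res in residuals_by_phase.items():
        total += weights[phase] * np.sum(np.asarray(res) ** 2)
    return total

# Hypothetical residuals (s) and reliability weights for three phases
residuals = {"P": [0.1, -0.2, 0.05], "S": [0.4, -0.3], "PKP": [0.8]}
weights = {"P": 1.0, "S": 0.5, "PKP": 0.25}   # higher weight = more reliable picks

print(weighted_misfit(residuals, weights))     # prints 0.3375
```

Sorting candidate models by several such scores, each with different weight choices, is what produces the model density plots used to assess the consistency of the 5000 trial models.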

2,925 citations

Book ChapterDOI
TL;DR: A brief overview of the fundamental features of SAC2000 is presented and some recent enhancements that make it a much more powerful tool for seismic analysis are discussed.
Abstract: SAC2000 or SAC, as many users refer to it, is a primary signal processing and analysis tool for a large portion of the international seismological research and engineering communities including academic, government, and business institutions. SAC has extensive, well-documented, well-tested, and well-maintained data processing and analysis capabilities, a macro programming language which allows users to develop new analysis techniques and customized processing programs, and the ability to do both batch and interactive processing. SAC's Strengths also include the ability to process a diverse range of data types. Its extensive usage (> 400 institutions worldwide) has made it much easier for researchers to develop collaborative research projects. SAC is relatively easy to use and is available on a variety of hardware platforms. Part of its popularity is due to its user oriented development philosophy, which has led to consistent, backward compatible development, guided by users input and needs. We present a brief overview of the fundamental features of SAC2000 and discuss some recent enhancements that make it a much more powerful tool for seismic analysis. These new features range from I/O enhancements to significant new processing capabilities and include a number of features that significantly increase user efficiency and productivity. Documentation is also a strength of SAC with detailed manuals available through SAC’s help facility and the world wide web at http://www.llnl.gov/sac. Future plans for SAC involve selected upgrades and re-engineering with object oriented development techniques to provide more flexible and efficient tools for the analysis of large databases or distributed data sets.

423 citations


"Computer Programs in Seismology: An..." refers methods in this paper

  • ...The plot was made using gsac and the SAC files (Goldstein et al., 2003) created by f96tosac....


Book
01 Jan 1978
TL;DR: These notes provide guidelines for internal program documentation and style for student programmers; though geared mainly toward a block-structured language such as Pascal or C, most of the points discussed are applicable to any programming language.

409 citations


"Computer Programs in Seismology: An..." refers background in this paper

  • ...The UNIX experience was beneficial because of the underlying philosophy of creating compatible, flexible tools to perform complicated tasks (Kernighan and Plauger, 1974)....
