Institution

California Geological Survey

About: The California Geological Survey is a state agency based in California. It is known for research contributions in the topics Fault (geology) and Seismic hazard. The organization has 85 authors who have published 133 publications receiving 4462 citations. The organization is also known as the California Division of Mines and Geology.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors present the time independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California.
Abstract: The 2014 Working Group on California Earthquake Probabilities (WGCEP14) present the time‐independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time‐averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation and include multifault ruptures, both limitations of UCERF2. The rates of all earthquakes are solved for simultaneously and from a broader range of data, using a system‐level inversion that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (e.g., magnitude–frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1440 alternative logic‐tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of Mw ≥ 5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip‐rate constraints on faults previously excluded due to lack of geologic data. The grand inversion constitutes a system‐level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg–Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (e.g., constrained to stay close to UCERF2).
Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M 6.5–7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site‐specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements.
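The "grand inversion" sketched in this abstract solves a large, underdetermined system for earthquake rates via simulated annealing. The toy below illustrates only the general technique; the system matrix, data vector, step size, and cooling schedule are invented for demonstration and have nothing to do with the real UCERF3 constraint set.

```python
import math
import random

# Illustrative sketch: solve an underdetermined least-squares system
# A x ~ d for non-negative "rupture rates" x with simulated annealing.
# All numbers here are invented for demonstration.

def misfit(A, x, d):
    """Sum of squared residuals of A x - d."""
    return sum((sum(a * xi for a, xi in zip(row, x)) - di) ** 2
               for row, di in zip(A, d))

def anneal(A, d, steps=20000, t0=1.0, seed=0):
    rng = random.Random(seed)
    n = len(A[0])
    x = [1.0] * n                      # initial rate estimate
    e = misfit(A, x, d)
    for k in range(steps):
        temp = t0 * (1.0 - k / steps)  # linear cooling schedule
        i = rng.randrange(n)
        trial = x[:]
        trial[i] = max(0.0, trial[i] + rng.gauss(0.0, 0.1))  # rates stay >= 0
        e_trial = misfit(A, trial, d)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if e_trial < e or (temp > 0 and rng.random() < math.exp((e - e_trial) / temp)):
            x, e = trial, e_trial
    return x, e

# Underdetermined toy system: 2 constraints, 3 unknown rates.
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]
d = [2.0, 3.0]
x, e = anneal(A, d)
```

Because the system is underdetermined, different random seeds land on different non-negative solutions with comparably low misfit, which is exactly why UCERF3 samples a range of models rather than reporting a single one.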

448 citations

Journal ArticleDOI
TL;DR: In this article, the authors presented the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2), which comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program, and a time-dependent earthquake-probability model based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event.
Abstract: The 2007 Working Group on California Earthquake Probabilities (WGCEP, 2007) presents the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2). This model comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program and a time-dependent earthquake-probability model, based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event. The models were developed from updated statewide earthquake catalogs and fault deformation databases using a uniform methodology across all regions and implemented in the modular, extensible Open Seismic Hazard Analysis framework. The rate model satisfies integrating measures of deformation across the plate-boundary zone and is consistent with historical seismicity data. An overprediction of earthquake rates found at intermediate magnitudes (6.5≤ M ≤7.0) in previous models has been reduced to within the 95% confidence bounds of the historical earthquake catalog. A logic tree with 480 branches represents the epistemic uncertainties of the full time-dependent model. The mean UCERF 2 time-dependent probability of one or more M ≥6.7 earthquakes in the California region during the next 30 yr is 99.7%; this probability decreases to 46% for M ≥7.5 and to 4.5% for M ≥8.0. These probabilities do not include the Cascadia subduction zone, largely north of California, for which the estimated 30 yr, M ≥8.0 time-dependent probability is 10%. The M ≥6.7 probabilities on major strike-slip faults are consistent with the WGCEP (2003) study in the San Francisco Bay Area and the WGCEP (1995) study in southern California, except for significantly lower estimates along the San Jacinto and Elsinore faults, owing to provisions for larger multisegment ruptures. Important model limitations are discussed.
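The time-independent (Poisson-process) component mentioned above rests on a simple relation between an annual event rate and the probability of at least one event in a time window. The sketch below shows only that textbook relation; the 0.05/yr rate is an invented example, not a value from UCERF 2.

```python
import math

# Poisson-process relation between an annual earthquake rate and the
# probability of one or more events in a time window. The example rate
# is invented for illustration.

def poisson_prob(rate_per_yr, window_yr):
    """P(at least one event in the window) for a Poisson process."""
    return 1.0 - math.exp(-rate_per_yr * window_yr)

def rate_from_prob(prob, window_yr):
    """Invert: annual rate implied by a window probability."""
    return -math.log(1.0 - prob) / window_yr

p = poisson_prob(0.05, 30.0)   # hypothetical 0.05/yr rate over 30 yr
```

The same inversion is how a quoted 30-yr probability can be translated back into an equivalent annual rate for comparison across models.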

357 citations

Journal ArticleDOI
13 Oct 2005 - Nature
TL;DR: The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable and the next generation of models that can provide better predictions of the strength and location of damaging ground shaking should be developed.
Abstract: Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal aspects of the earthquake process never before seen. Here we show what these data, when combined with data from earlier Parkfield earthquakes, tell us about earthquake physics and earthquake prediction. The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable. To reduce the societal impact of earthquakes now, we should focus on developing the next generation of models that can provide better predictions of the strength and location of damaging ground shaking.

352 citations

Journal ArticleDOI
TL;DR: In this paper, the authors sorted the available shear-wave velocity data by geologic unit, generalized the geologic units, and prepared a map so that they could use the extent of the map units to transfer the velocity characteristics from the sites where they were measured to sites on the same or similar materials.
Abstract: Consideration of site conditions is a vital step in analyzing and predicting earthquake ground motion. The importance of amplification by soil conditions has long been recognized, but though many seismic-instrument sites have been characterized by their geologic conditions, there has been no consistent, simple classification applied to all sites. As classification of sites by shear-wave velocity has become more common, the need to go back and provide a simple uniform classification for all stations has become apparent. Within the Pacific Earthquake Engineering Research Center’s Next Generation Attenuation equation project, developers of attenuation equations recognized the need to consider site conditions and asked that the California Geological Survey provide site conditions information for all stations that have recorded earthquake ground motion in California. To provide these estimates, we sorted the available shear-wave velocity data by geologic unit, generalized the geologic units, and prepared a map so that we could use the extent of the map units to transfer the velocity characteristics from the sites where they were measured to sites on the same or similar materials. This new map is different from the California Geological Survey “preliminary site-conditions map of California” in that 19 geologically defined categories are used, rather than National Earthquake Hazards Reduction Program categories. Although this map does not yet cover all of California, when completed it may provide a basis for more precise consideration of site conditions in ground-motion calculations.
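The transfer procedure this abstract describes, assigning measured shear-wave velocity characteristics to unmeasured sites on the same generalized geologic unit, can be sketched as a simple grouping step. The unit codes and Vs30 values below are invented placeholders, not data from the actual map.

```python
from statistics import median

# Sketch of the transfer idea: group measured Vs30 values by geologic
# unit, then assign each unmeasured station the median of its unit.
# Units ("Qal", "Kgr") and velocities are invented for illustration.

measured = [
    ("Qal", 270.0), ("Qal", 310.0), ("Qal", 290.0),   # alluvium sites
    ("Kgr", 760.0), ("Kgr", 820.0),                   # granitic-rock sites
]

# Median Vs30 (m/s) per geologic unit from the measured sites.
unit_vs30 = {}
for unit in {u for u, _ in measured}:
    unit_vs30[unit] = median(v for u, v in measured if u == unit)

def estimate_vs30(station_unit):
    """Assign a station the median Vs30 of its mapped geologic unit."""
    return unit_vs30.get(station_unit)  # None if the unit is unmapped

est = estimate_vs30("Qal")   # median of the three alluvium measurements
```

Using a robust statistic like the median per unit keeps one outlier measurement from skewing the value transferred to every other station on that unit.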

229 citations

Journal ArticleDOI
TL;DR: In this paper, the authors presented the time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3) using renewal models to represent elastic-rebound-implied probabilities.
Abstract: The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30 yr M ≥ 6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault-slip rates), with relaxation of segmentation and inclusion of multifault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 size events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M ≥ 6.7 probabilities may not be a good proxy for other hazard or loss measures.
This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis.
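The renewal-model idea in this abstract, that rupture probability is conditioned on the time elapsed since the last event, reduces to a conditional-probability calculation on a recurrence-time distribution. The sketch below uses a lognormal distribution as a stand-in for the renewal models WGCEP actually employs, and the mean recurrence, aperiodicity, and open-interval values are invented for illustration.

```python
import math

# Illustrative elastic-rebound-style conditional probability: given an
# open interval t since the last event, the chance of rupture in the
# next dt years under a lognormal renewal model. A lognormal stands in
# here for the renewal models used in practice; all numbers are invented.

def lognorm_cdf(t, mu, sigma):
    """CDF of a lognormal with log-mean mu and log-std sigma."""
    if t <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(t_open, dt, mean_rec, aperiodicity):
    """P(event in (t_open, t_open + dt] | no event in [0, t_open])."""
    # Choose lognormal parameters matching the given mean and
    # coefficient of variation (aperiodicity).
    sigma = math.sqrt(math.log(1.0 + aperiodicity ** 2))
    mu = math.log(mean_rec) - 0.5 * sigma ** 2
    f0 = lognorm_cdf(t_open, mu, sigma)
    f1 = lognorm_cdf(t_open + dt, mu, sigma)
    return (f1 - f0) / (1.0 - f0)

# Hypothetical fault: 150 yr open interval, 30 yr forecast window,
# 200 yr mean recurrence, aperiodicity 0.5.
p = conditional_prob(150.0, 30.0, 200.0, 0.5)
```

The conditional probability grows as the open interval lengthens, which is the "implied gain" relative to a memoryless Poisson model that the abstract refers to.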

159 citations



Network Information
Related Institutions (5)
GNS Science
3.8K papers, 122.3K citations

76% related

United States Geological Survey
51K papers, 2.4M citations

76% related

Institut de Physique du Globe de Paris
6.2K papers, 255.7K citations

75% related

Lamont–Doherty Earth Observatory
8K papers, 504.5K citations

74% related

Colorado School of Mines
20.6K papers, 602.7K citations

73% related

Performance
Metrics
No. of papers from the Institution in previous years
Year  Papers
2021  11
2020  16
2019  4
2018  3
2017  11
2016  4