scispace - formally typeset
Author

Takeo Ishibe

Bio: Takeo Ishibe is an academic researcher from the University of Tokyo. The author has contributed to research in topics: Induced seismicity & Focal mechanism. The author has an h-index of 15, and has co-authored 36 publications receiving 663 citations.

Papers
Journal ArticleDOI
TL;DR: The authors used the GMT software [Wessel and Smith, 1998] for drawing the figures and benefited from constructive review comments by Costas E. Synolakis (University of Southern California, USA) and Yuichiro Tanioka (Hokkaido University, Japan).
Abstract: Teleseismic data were provided by the Incorporated Research Institutions for Seismology (http://www.iris.edu/wilber3/find_event). Tide gauge data can be found at the Intergovernmental Oceanographic Commission website (http://www.ioc-sealevelmonitoring.org/). DART records were provided by NOAA (http://nctr.pmel.noaa.gov/Dart/). Earthquake catalogs by the USGS National Earthquake Information Center (http://earthquake.usgs.gov/earthquakes/search/) and the Global Instrumental Earthquake Catalogue (1900-2009) of the International Seismological Centre Global Earthquake Model (http://www.globalquakemodel.org/what/seismic-hazard/instrumental-catalogue/) were used in this study. We used the GMT software for drawing the figures [Wessel and Smith, 1998]. This article benefited from constructive review comments by Costas E. Synolakis (University of Southern California, USA) and Yuichiro Tanioka (Hokkaido University, Japan), for which we are grateful. We acknowledge financial support from the Japan Society for the Promotion of Science.

118 citations

Journal ArticleDOI
TL;DR: In this article, the authors applied a combination of qualitative physical modeling and wavelet analyses of the tsunami as well as numerical modeling to propose a source model, and constrained the tsunami source dimension and initial amplitude to the ranges of 1.5–2.5 km and 100–150 m, respectively.

92 citations

Journal ArticleDOI
TL;DR: In this article, the authors used the catalog maintained by the Japan Meteorological Agency (JMA) and also available information on seismic stations that report to JMA for computing the completeness magnitude of earthquakes in Japan.
Abstract: A reliable estimate of the completeness magnitude, Mc, above which all earthquakes are considered to be detected by a seismic network, is vital for seismicity-related studies. We present a comprehensive analysis of Mc in Japan. We use the catalog maintained by the Japan Meteorological Agency (JMA) and the available information on seismic stations that report to JMA. For computing Mc, we adopt a commonly used method based on the Gutenberg–Richter frequency-magnitude law. Presently, Mc = 1.0 might be typical in the mainland, but to have a complete catalog, one needs to use earthquakes with magnitudes of 1.9 or larger. Comparison with the Southern California Seismic Network (SCSN) suggests that recent event detectability in the mainland generally shows completeness levels similar to those in the authoritative region of the SCSN. We argue that the current Mc of Japan is due to the success of network modernization over time. In particular, we show that the spatiotemporal change of Mc closely matches the addition of the Hi-net borehole stations, which enhanced seismic-station density; it started in October 1997 in southwestern Japan and continued to northeastern Japan until 2002. As suggested by this matching, we confirm that Mc inversely correlates with station density. Further, we find that irrespective of the network change after 1997, this correlation is unchanged in time, demonstrating that the influence on Mc from factors beyond station density does not vary in time. Contrary to Alaska and California (Wiemer and Wyss, 2000), our results do not attribute such factors simply to anthropogenic noise. Because the borehole stations reduce ambient noise, we conclude that in Japan anthropogenic noise has an insignificant effect on Mc.
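The Gutenberg–Richter fit underlying this kind of Mc analysis can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: the `aki_b_value` helper, the synthetic catalog, and the chosen seed are all assumptions; only the maximum-likelihood formula (Aki, 1965) is standard.

```python
import numpy as np

def aki_b_value(mags, mc):
    """Maximum-likelihood b-value (Aki, 1965) from magnitudes at/above Mc.
    For binned catalogs, Utsu's correction replaces mc with mc - dm/2."""
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - mc)

# Synthetic Gutenberg-Richter catalog: above Mc, magnitudes follow an
# exponential distribution with rate b*ln(10), i.e. scale log10(e)/b.
rng = np.random.default_rng(0)
b_true, mc = 1.0, 1.0
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=50_000)

b_est = aki_b_value(mags, mc)  # should recover b close to 1.0
```

Choosing Mc too low would bias the estimate because the mean magnitude would include undetected-event gaps; choosing it too high simply discards usable data.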

76 citations

Journal ArticleDOI
TL;DR: In this paper, the Coulomb Failure Function (ΔCFF) was used to forecast an increase in seismicity in and around Tokyo metropolis after the 2011 off the Pacific coast of Tohoku Earthquake.
Abstract: Static changes in the Coulomb Failure Function (ΔCFF) forecast an increase in seismicity in and around the Tokyo metropolis after the 2011 off the Pacific coast of Tohoku Earthquake (magnitude 9.0). Among the 30,694 previous events in this region with various depths and focal mechanisms, almost 19,000 indicate a significant increase of the ΔCFF, while fewer than 6,000 indicate a significant decrease. An increase in seismicity is predicted in southwestern Ibaraki and northern Chiba prefectures, where intermediate-depth earthquakes occur, and in the shallow crust of the Izu and Hakone regions. A comparison of seismicity before and after the 2011 event reveals that the seismicity in the above regions indeed increased as predicted from the ΔCFF.
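The forecasting quantity here is the static Coulomb stress change, ΔCFF = Δτ + μ′Δσn, where Δτ is the shear-stress change resolved in the receiver fault's slip direction, Δσn the normal-stress change (positive when unclamping), and μ′ an effective friction coefficient. A minimal sketch follows; the stress values are illustrative and μ′ = 0.4 is a commonly assumed value, not one stated in the abstract.

```python
def delta_cff(d_tau, d_sigma_n, mu_eff=0.4):
    """Static Coulomb stress change on a receiver fault (MPa).

    d_tau     : shear-stress change in the slip direction
    d_sigma_n : normal-stress change (positive = unclamping)
    mu_eff    : effective friction coefficient (0.4 is a common assumption)

    A positive result moves the fault toward failure, so seismicity is
    expected to increase; a negative one places it in a stress shadow.
    """
    return d_tau + mu_eff * d_sigma_n

# Illustrative values only:
promoted = delta_cff(0.10, 0.05)     # 0.12 MPa: closer to failure
inhibited = delta_cff(-0.08, -0.02)  # -0.088 MPa: stress shadow
```

In a study like this one, Δτ and Δσn are computed for each cataloged event's depth and focal mechanism from the mainshock's slip model, then the sign of ΔCFF is tallied across the catalog.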

52 citations

Journal ArticleDOI
TL;DR: In this article, nine sandy layers in 15 geo-slices collected at distances ranging from 140 to 260 m from the coast, in a lowland back marsh protected from the sea by a high sandy ridge, were correlated with historical tsunamis and storms.

50 citations


Cited by
Journal ArticleDOI
TL;DR: A multiple time window inversion of 53 high-sampling tsunami waveforms on ocean bottom pressure, Global Positioning System, coastal wave, and tide gauges shows a temporal and spatial slip distribution during the 2011 Tohoku earthquake as mentioned in this paper.
Abstract: A multiple time window inversion of 53 high‐sampling tsunami waveforms on ocean‐bottom pressure, Global Positioning System, coastal wave, and tide gauges shows a temporal and spatial slip distribution during the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both deep and shallow parts of the plate interface. A very large slip (approximately 25 m) in the deep part off Miyagi, at a location similar to the previous 869 Jogan earthquake model, was responsible for the initial rise of tsunami waveforms and the recorded tsunami inundation in the Sendai and Ishinomaki plains. A huge slip, up to 69 m, occurred in the shallow part near the trench axis 3 min after the rupture initiation. This delayed shallow rupture extended for 400 km with more than a 10‐m slip, at a location similar to the 1896 Sanriku tsunami earthquake, and was responsible for the peak amplitudes of the tsunami waveforms and the maximum tsunami heights measured on the northern Sanriku coast, 100 km north of the largest slip. The average slip on the entire fault was 9.5 m, and the total seismic moment was 4.2×10²² N·m (Mw 9.0). The large horizontal displacement of the seafloor slope was responsible for 20%–40% of tsunami amplitudes. The 2011 deep slip alone could reproduce the distribution of the 869 tsunami deposits, indicating that the 869 Jogan earthquake source could be similar to the 2011 earthquake, at least in the deep‐plate interface. The large tsunami at the Fukushima nuclear power station is due to either the combination of a deep and shallow slip or a triggering of a shallow slip by a deep slip, which was not accounted for in the previous tsunami‐hazard assessments. Online Material: Table of estimated slip for all subfaults at 0.5 min intervals.
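At its core, a multiple time window waveform inversion is a linear least-squares problem: observed waveforms are modeled as a sum of precomputed Green's functions (the waveform produced at each gauge by unit slip on one subfault during one time window) weighted by the unknown slips. The toy sketch below uses random stand-in Green's functions and a plain least-squares solve; the real inversion uses tsunami-propagation modeling for the Green's functions and constrains slips to be non-negative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_subfaults, n_windows = 60, 4, 3
n_params = n_subfaults * n_windows

# Each column of G: the waveform recorded at the gauges for unit slip on
# one subfault during one time window (random stand-ins here).
G = rng.normal(size=(n_samples, n_params))
slip_true = rng.uniform(0.0, 10.0, size=n_params)  # slip in meters
d = G @ slip_true                                  # synthetic "observed" waveforms

# Least-squares inversion recovers the slip per subfault and time window.
slip_est, *_ = np.linalg.lstsq(G, d, rcond=None)
```

With noiseless synthetic data and more observations than parameters, the recovery is exact; real inversions add smoothing and positivity constraints because the data are noisy and the problem can be ill-conditioned.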

440 citations

Journal ArticleDOI
TL;DR: One of the first case studies demonstrating the use of distributed acoustic sensing deployed on regional unlit fiber-optic telecommunication infrastructure (dark fiber) for broadband seismic monitoring of both near-surface soil properties and earthquake seismology is presented.
Abstract: We present one of the first case studies demonstrating the use of distributed acoustic sensing deployed on regional unlit fiber-optic telecommunication infrastructure (dark fiber) for broadband seismic monitoring of both near-surface soil properties and earthquake seismology. We recorded 7 months of passive seismic data on a 27 km section of dark fiber stretching from West Sacramento, CA to Woodland, CA, densely sampled at 2 m spacing. This dataset was processed to extract surface-wave velocity information using ambient noise interferometry techniques; the resulting shear-wave velocity (VS) profiles were used to map both shallow structural profiles and groundwater depth, thus demonstrating that basin-scale variations in hydrological state could be resolved using this technique. The same array was utilized for detection of regional and teleseismic earthquakes and evaluated for long-period response using records from the M 8.1 Chiapas, Mexico event of 8 September 2017. The combination of these two sets of observations conclusively demonstrates that regionally extensive fiber-optic networks can effectively be utilized for a host of geoscience observation tasks at a combination of scale and resolution previously inaccessible.
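Ambient noise interferometry, as used in this study, cross-correlates noise recorded at pairs of channels: the correlation peak approximates the inter-channel travel time, from which surface-wave velocity (and hence VS) is derived. A minimal sketch with a synthetic delayed signal follows; the sampling rate and the 0.5 s delay are illustrative assumptions, and the delayed-copy channel is an idealization of what long noise stacks converge toward.

```python
import numpy as np

fs = 50.0            # sampling rate in Hz (illustrative)
n = 2048
rng = np.random.default_rng(2)

true_lag = 25        # samples: wavefield reaches channel B 0.5 s later
chan_a = rng.normal(size=n)
chan_b = np.roll(chan_a, true_lag)   # idealized delayed copy of channel A

# Circular cross-correlation via FFT; the peak index gives the delay.
spec = np.conj(np.fft.rfft(chan_a)) * np.fft.rfft(chan_b)
xcorr = np.fft.irfft(spec, n=n)
est_lag = int(np.argmax(xcorr))
travel_time = est_lag / fs           # seconds between the two channels
```

Dividing the known channel spacing by this travel time gives the apparent surface-wave velocity between the channels; repeating over many pairs and frequency bands yields the dispersion curves that are inverted for VS profiles.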

257 citations

DOI
01 Apr 2012
TL;DR: In this article, the authors describe peer-reviewed techniques to estimate and map the lowest magnitude at which all the earthquakes in a space-time volume are detected, and provide examples with real and synthetic earthquake catalogs to illustrate features of various methods and give the pros and cons of each method.
Abstract: Assessing the magnitude of completeness Mc of instrumental earthquake catalogs is an essential and compulsory step for any seismicity analysis. Mc is defined as the lowest magnitude at which all the earthquakes in a space-time volume are detected. A correct estimate of Mc is crucial: a value too high leads to under-sampling, by discarding usable data, while a value too low leads to erroneous seismicity parameter values and thus to a biased analysis, by using incomplete data. In this article, we describe peer-reviewed techniques to estimate and map Mc. We provide examples with real and synthetic earthquake catalogs to illustrate features of the various methods and give the pros and cons of each method. With this article at hand, the reader will get an overview of approaches to assess Mc, understand why Mc evaluation is essential and a non-trivial task, and hopefully be able to select the most appropriate Mc method to include in their seismicity studies.
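Among the techniques such reviews describe, the simplest is the maximum-curvature method: take the most-populated bin of the frequency-magnitude histogram as Mc, usually adding an empirical correction (+0.2 is a commonly cited value). The sketch below applies it to a synthetic catalog that is abruptly complete above M 1.0; real catalogs roll off gradually, which is exactly why corrections and goodness-of-fit checks matter. The helper function and catalog parameters are assumptions for illustration.

```python
import numpy as np

def mc_max_curvature(mags, dm=0.1, correction=0.2):
    """Mc estimate: center of the most-populated magnitude bin, plus an
    empirical correction (+0.2 is a commonly cited adjustment)."""
    edges = np.arange(mags.min(), mags.max() + dm, dm)
    counts, _ = np.histogram(mags, bins=edges)
    return edges[np.argmax(counts)] + dm / 2 + correction

# Synthetic catalog, complete above M 1.0 with b = 1: the histogram peak
# then sits in the lowest bin, and the correction keeps the estimate
# conservative rather than optimistic.
rng = np.random.default_rng(3)
mags = 1.0 + rng.exponential(scale=np.log10(np.e), size=100_000)
mc = mc_max_curvature(mags)   # about 1.25 for this catalog
```

Maximum curvature is fast but known to underestimate Mc when detection degrades gradually, which is why goodness-of-fit methods are usually preferred as a cross-check.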

213 citations

Journal Article
TL;DR: In this article, an elastic half-space model was used to estimate the static stress changes generated by damaging (magnitude M≥5) earthquakes in southern California over the past 26 years, and to investigate the influence of these changes on subsequent earthquake activity.

172 citations

Journal ArticleDOI
TL;DR: In this article, the present status of research and understanding regarding the dynamics and the statistical properties of earthquakes is reviewed, mainly from a statistical physical viewpoint, focusing on the physics of friction and fracture, which provides a microscopic basis for our understanding of earthquake instability, and on the statistical physical modelling of earthquakes, which provides macroscopic aspects of such phenomena.
Abstract: The present status of research and understanding regarding the dynamics and the statistical properties of earthquakes is reviewed, mainly from a statistical physical viewpoint. Emphasis is put both on the physics of friction and fracture, which provides a microscopic basis for our understanding of earthquake instability, and on the statistical physical modelling of earthquakes, which provides macroscopic aspects of such phenomena. Recent numerical results from several representative models are reviewed, with attention to both their critical and their characteristic properties. Some of the relevant notions and related issues are highlighted, including the origin of power laws often observed in statistical properties of earthquakes, apparently contrasting features of characteristic earthquakes or asperities, the nature of precursory phenomena and nucleation processes, and the origin of slow earthquakes.

162 citations