
Showing papers in "Natural Hazards and Earth System Sciences in 2014"


Journal ArticleDOI
TL;DR: In this paper, the authors contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods, and they come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management.
Abstract: Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of local, catchment-specific characteristics, such as meteorology, topography and geology. These traditional views have been beneficial, but they have a narrow framing. In this paper we contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical approaches in flood estimation need to be complemented by the search for the causal mechanisms and dominant processes in the atmosphere, catchment and river system that leave their fingerprints on flood characteristics. (3) Natural climate variability leads to time-varying flood characteristics, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand the interactions between society and floods. (5) Given the global scale and societal importance, we call for the organization of an international multidisciplinary collaboration and data-sharing initiative to further understand the links between climate and flooding and to advance flood research.

273 citations


Journal ArticleDOI
TL;DR: In this paper, the potential ecological risk and trend of soil heavy-metal pollution around a coal gangue dump in Jilin Province (Northeast China) were evaluated, with metal concentrations determined by inductively coupled plasma mass spectrometry (ICP-MS).
Abstract: . The aim of the present study is to evaluate the potential ecological risk and trend of soil heavy-metal pollution around a coal gangue dump in Jilin Province (Northeast China). The concentrations of Cd, Pb, Cu, Cr and Zn were monitored by inductively coupled plasma mass spectrometry (ICP-MS). The potential ecological risk index method developed by Hakanson (1980) was employed to assess the potential risk of heavy-metal pollution. The potential ecological risk factors were ranked ER(Cd) > ER(Pb) > ER(Cu) > ER(Cr) > ER(Zn), which showed that Cd was the most important contributor to risk. Based on the Cd pollution history, the cumulative acceleration and cumulative rate of Cd were estimated, and a model predicting the number of years until the standard is exceeded was established and used to forecast the Cd pollution trend under both an accelerated-accumulation mode and a uniform mode. Pearson correlation analysis and correspondence analysis were employed to identify the sources of the heavy metals and the relationships between sampling points and variables. These findings provide useful insights for devising management strategies to prevent or reduce heavy-metal pollution around the coal gangue dump of the Yangcaogou coal mine and in similar areas elsewhere.

210 citations
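The Hakanson index used in the abstract above has a simple closed form: each metal's risk factor is its toxic-response factor times the ratio of measured to background concentration, and the overall index is their sum. A minimal sketch follows; the toxic-response factors are the standard Hakanson (1980) values, while the concentrations and background levels are invented illustration numbers, not data from the paper.

```python
# Hedged sketch of the Hakanson (1980) potential ecological risk index.
# Toxic-response factors (Tr) are the standard Hakanson values; the measured
# and background concentrations below are hypothetical illustration numbers.

TOXIC_RESPONSE = {"Cd": 30, "Pb": 5, "Cu": 5, "Cr": 2, "Zn": 1}

def ecological_risk(measured, background):
    """Per-metal risk factors ER_i = Tr_i * (C_measured_i / C_background_i)."""
    return {m: TOXIC_RESPONSE[m] * measured[m] / background[m] for m in measured}

def risk_index(er):
    """Overall potential ecological risk index RI = sum of the ER_i."""
    return sum(er.values())

if __name__ == "__main__":
    measured = {"Cd": 1.2, "Pb": 45.0, "Cu": 30.0, "Cr": 60.0, "Zn": 120.0}   # mg/kg, hypothetical
    background = {"Cd": 0.1, "Pb": 20.0, "Cu": 20.0, "Cr": 60.0, "Zn": 80.0}  # hypothetical
    er = ecological_risk(measured, background)
    # Cd dominates because of its high toxic-response factor,
    # mirroring the ER(Cd) > ER(Pb) > ... ranking reported above.
    print(sorted(er, key=er.get, reverse=True))
```

With these numbers, Cd's moderate enrichment still dwarfs the other metals' risk factors because its toxic-response factor is 30, which is why Cd so often dominates this index.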


Journal ArticleDOI
TL;DR: This study of landslides in Lower Austria focuses on the model form uncertainty to assess the quality of a flexible statistical modelling technique, the generalized additive model (GAM), and the implications of spatially varying prediction uncertainties regarding the susceptibility map classes by taking into account the confidence intervals of model predictions.
Abstract: . Landslide susceptibility maps are helpful tools to identify areas potentially prone to future landslide occurrence. As more and more national and provincial authorities ask for these maps to be computed and implemented in spatial planning strategies, several aspects of the quality of the landslide susceptibility model and the resulting classified map are of high interest. In this study of landslides in Lower Austria, we focus on the model form uncertainty to assess the quality of a flexible statistical modelling technique, the generalized additive model (GAM). The study area (15 850 km2) is divided into 16 modelling domains based on lithology classes. A model representing the entire study area is constructed by combining these models. The performances of the models are assessed using repeated k-fold cross-validation with spatial and random subsampling. This reflects the variability of performance estimates arising from sampling variation. Measures of spatial transferability and thematic consistency are applied to empirically assess model quality. We also analyse and visualize the implications of spatially varying prediction uncertainties regarding the susceptibility map classes by taking into account the confidence intervals of model predictions. The 95% confidence limits fall within the same susceptibility class in 85% of the study area. Overall, this study contributes to advancing open communication and assessment of model quality related to statistical landslide susceptibility models.

171 citations
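The repeated k-fold cross-validation with spatial and random subsampling described above can be sketched in a few lines. The GAM itself is not reproduced here; a plain logistic regression stands in for it, and the data, labels and "spatial" blocks are synthetic placeholders, not the Lower Austria data set.

```python
# Sketch of repeated k-fold cross-validation with random vs. spatially grouped
# folds, as used above to estimate performance variability. A logistic
# regression stands in for the GAM; data and spatial blocks are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)  # synthetic "landslide" labels
blocks = np.arange(n) // 40                               # 10 contiguous "spatial" blocks

model = LogisticRegression()

# Random subsampling: repeat 5-fold CV with different shuffles.
random_scores = [
    cross_val_score(model, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=r)).mean()
    for r in range(10)
]

# Spatial subsampling: folds never split a spatial block apart.
spatial_scores = cross_val_score(model, X, y, cv=GroupKFold(5), groups=blocks)

print(f"random CV:  {np.mean(random_scores):.3f} ± {np.std(random_scores):.3f}")
print(f"spatial CV: {spatial_scores.mean():.3f} ± {spatial_scores.std():.3f}")
```

The spread across repetitions is the quantity the abstract highlights: a single k-fold split hides the sampling variability that the repeated and spatially grouped estimates expose.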


Journal ArticleDOI
TL;DR: In this article, the authors presented the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia, which produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources.
Abstract: . Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is found along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500–2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1–10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1–1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

155 citations
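The Monte Carlo logic behind annual exceedance probabilities like the "> 0.5 m" and "> 3.0 m" figures above can be illustrated with a toy synthetic catalogue. Everything numeric below is a hypothetical placeholder: the Gutenberg-Richter b-value, the event rate, and especially the magnitude-to-coastal-height scaling are invented for illustration, not taken from the paper.

```python
# Minimal Monte Carlo hazard-curve sketch in the spirit of a PTHA/PSHA.
# All parameter values are assumed placeholders, not values from the paper.
import numpy as np

rng = np.random.default_rng(42)
b = 1.0                 # Gutenberg-Richter b-value (assumed)
m_min, m_max = 7.0, 9.0
annual_rate = 0.2       # tsunamigenic events per year above m_min (assumed)
n_years = 100_000       # length of the synthetic catalogue

# Sample magnitudes from a truncated Gutenberg-Richter distribution (inverse CDF).
n_events = rng.poisson(annual_rate * n_years)
u = rng.random(n_events)
beta = b * np.log(10)
m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

# Hypothetical scaling from magnitude to tsunami height at the coast (metres),
# with lognormal scatter standing in for aleatory uncertainty.
heights = 0.05 * 10 ** (0.8 * (m - m_min)) * rng.lognormal(0.0, 0.5, n_events)

def annual_exceedance_probability(h):
    """P(at least one tsunami higher than h metres at the coast in a year)."""
    rate = (heights > h).sum() / n_years
    return 1.0 - np.exp(-rate)

for h in (0.5, 3.0):
    print(f"P(height > {h} m) per year ≈ {annual_exceedance_probability(h):.4f}")
```

The real assessment layers logic trees over exactly this kind of sampling, so that each branch contributes its own synthetic catalogue and the spread across branches expresses the epistemic uncertainty.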


Journal ArticleDOI
Christian M. Grams1, Hanin Binder1, Stephan Pfahl1, Nicolas Piaget1, Heini Wernli1 
TL;DR: In this paper, the authors investigated the key atmospheric processes that caused the 2013 Danube and Elbe flood in central Europe and found that the continuous large-scale slantwise ascent in so-called "equatorward ascending" warm conveyor belts (WCBs) associated with these cyclones was the key process that caused a 4 day heavy precipitation period.
Abstract: . In June 2013, central Europe was hit by a century flood affecting the Danube and Elbe catchments after a 4 day period of heavy precipitation and causing severe human and economic loss. In this study model analysis and observational data are investigated to reveal the key atmospheric processes that caused the heavy precipitation event. The period preceding the flood was characterised by a weather regime associated with cool and unusually wet conditions resulting from repeated Rossby wave breaking (RWB). During the event a single RWB established a reversed baroclinicity in the low to mid-troposphere in central Europe with cool air trapped over the Alps and warmer air to the north. The upper-level cut-off resulting from the RWB instigated three consecutive cyclones in eastern Europe that unusually tracked westward during the days of heavy precipitation. Continuous large-scale slantwise ascent in so-called "equatorward ascending" warm conveyor belts (WCBs) associated with these cyclones is identified as the key process that caused the 4 day heavy precipitation period. Fed by moisture sources from continental evapotranspiration, these WCBs unusually ascended equatorward along the southward sloping moist isentropes. Although "equatorward ascending" WCBs are climatologically rare events, they have great potential for causing high-impact weather.

131 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the sensitivity of model selection and quality to different sample sizes in light of the following problem: on the one hand, a sample has to be large enough to cover the variability of geofactors within the study area, and to yield stable and reproducible results; on the other hand, the sample must not be too large, because a large sample is likely to violate the assumption of independent observations due to spatial autocorrelation.
Abstract: . Predictive spatial modelling is an important task in natural hazard assessment and regionalisation of geomorphic processes or landforms. Logistic regression is a multivariate statistical approach frequently used in predictive modelling; it can be conducted stepwise in order to select from a number of candidate independent variables those that lead to the best model. In our case study on a debris flow susceptibility model, we investigate the sensitivity of model selection and quality to different sample sizes in light of the following problem: on the one hand, a sample has to be large enough to cover the variability of geofactors within the study area, and to yield stable and reproducible results; on the other hand, the sample must not be too large, because a large sample is likely to violate the assumption of independent observations due to spatial autocorrelation. Using stepwise model selection with 1000 random samples for a number of sample sizes between n = 50 and n = 5000, we investigate the inclusion and exclusion of geofactors and the diversity of the resulting models as a function of sample size; the multiplicity of different models is assessed using numerical indices borrowed from information theory and biodiversity research. Model diversity decreases with increasing sample size and reaches either a local minimum or a plateau; even larger sample sizes do not further reduce it, and they approach the upper limit of sample size given, in this study, by the autocorrelation range of the spatial data sets. In this way, an optimised sample size can be derived from an exploratory analysis. 
Model uncertainty due to sampling and model selection, and its predictive ability, are explored statistically and spatially through the example of 100 models estimated in one study area and validated in a neighbouring area: depending on the study area and on sample size, the predicted probabilities for debris flow release differed, on average, by 7 to 23 percentage points. In view of these results, we argue that researchers applying model selection should explore the behaviour of the model selection for different sample sizes, and that consensus models created from a number of random samples should be given preference over models relying on a single sample.

126 citations
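The abstract above measures the multiplicity of stepwise-selected models with indices borrowed from information theory and biodiversity research; the Shannon index is the classic example. The sketch below computes it over hypothetical selection outcomes (the model frequencies are invented, with each distinct model encoded by the set of geofactors it retained).

```python
# Sketch of measuring the diversity of stepwise-selected models across random
# samples via the Shannon index, in the spirit of the study above. The model
# frequency counts are made-up illustration numbers.
import math
from collections import Counter

def shannon_diversity(model_counts):
    """Shannon index H = -sum p_i ln p_i over the distinct selected models."""
    total = sum(model_counts.values())
    return -sum((c / total) * math.log(c / total) for c in model_counts.values())

# Hypothetical outcome of 1000 stepwise selections at two sample sizes;
# each model is identified by the frozenset of geofactors it retained.
small_n = Counter({frozenset({"slope"}): 400,
                   frozenset({"slope", "geology"}): 350,
                   frozenset({"curvature"}): 250})
large_n = Counter({frozenset({"slope", "geology"}): 900,
                   frozenset({"slope"}): 100})

# Diversity drops as the sample size grows, mirroring the reported behaviour.
print(shannon_diversity(small_n), shannon_diversity(large_n))
```

If every one of the 1000 samples selected the same model, H would be zero; the plateau described in the abstract corresponds to H no longer decreasing as n grows.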


Journal ArticleDOI
TL;DR: In this paper, the authors analyse the main dynamical processes responsible for the onset, lifecycle, intensity and localisation/propagation of the precipitating systems, using the ISAC convection-permitting model MOLOCH at different spatial resolutions and comparing model output fields with available observations.
Abstract: . The Liguria coastal region in Italy was affected by two heavy rainfall episodes and subsequent severe flooding that occurred at the end of October and the beginning of November 2011. In both cases, the very large accumulated precipitation maxima were associated with intense and quasi-stationary convective systems that developed near the coast, both related to orographic lift and similar low-level mesoscale flow patterns over the Ligurian Sea, giving rise to pronounced convergence lines. This study aims at analysing the main dynamical processes responsible for the onset, lifecycle, intensity and localisation/propagation of the precipitating systems, using the ISAC convection-permitting model MOLOCH applied at different spatial resolutions and comparing model output fields with available observations. The model's ability in quantitative precipitation forecasting (QPF) is tested with respect to initial conditions and model horizontal resolution. Although precipitation maxima remain underestimated in the model experiments, it is shown that errors in QPF in both amount and position tend to decrease with increasing grid resolution. It is shown that model accuracy in forecasting rainfall amounts and localisation of the precipitating systems critically depends on the ability to represent the cold air outflow from the Po Valley to the Ligurian Sea, which determines the position and intensity of the mesoscale convergence lines over the sea. These convergence lines, together with the lifting produced by the Apennines chain surrounding the coast, control the onset of the severe convection.

126 citations


Journal ArticleDOI
TL;DR: The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979-2012 as discussed by the authors.
Abstract: . The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979–2012. The catalogue is intended to be a valuable resource for both academia and industries such as (re)insurance, for example allowing users to characterise extreme European storms, and validate climate and catastrophe models. Several storm severity indices were investigated to find which could best represent a list of known high-loss (severe) storms. The best-performing index was Sft, which is a combination of storm area calculated from the storm footprint and maximum 925 hPa wind speed from the storm track. All the listed severe storms are included in the catalogue, and the remaining ones were selected using Sft. A comparison of the model footprint to station observations revealed that storms were generally well represented, although for some storms the highest gusts were underestimated. Possible reasons for this underestimation include the model failing to simulate strong enough pressure gradients and not representing convective gusts. A new recalibration method was developed to estimate the true distribution of gusts at each grid point and correct for this underestimation. The recalibration model allows for storm-to-storm variation which is essential given that different storms have different degrees of model bias. The catalogue is available at http://www.europeanwindstorms.org .

117 citations
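The abstract above describes Sft only as a combination of footprint-derived storm area and the track's maximum 925 hPa wind speed. The sketch below shows one plausible way such an index could be assembled; the product form, the 25 m/s gust threshold and all the numbers are assumptions for illustration, and the paper's exact definition of Sft should be consulted for the real formula.

```python
# Hypothetical sketch of a storm severity index in the spirit of Sft:
# combine the footprint area exceeding a gust threshold with the maximum
# 925 hPa wind speed from the storm track. The product form and the
# threshold value are assumptions, not the paper's definition.
import numpy as np

def severity(gust_footprint, cell_area_km2, v925_max, threshold=25.0):
    """Assumed index: (area with gusts > threshold, km2) * max 925 hPa wind."""
    area = (gust_footprint > threshold).sum() * cell_area_km2
    return float(area * v925_max)

rng = np.random.default_rng(1)
footprint = rng.gamma(shape=2.0, scale=8.0, size=(50, 50))  # synthetic 3 s gust field, m/s
print(severity(footprint, cell_area_km2=625.0, v925_max=42.0))
```

Whatever the exact functional form, the point of such an index is that either a very windy small storm or a moderately windy vast one can score as severe, which is why it ranked the known high-loss storms well.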


Journal ArticleDOI
TL;DR: In this paper, a new systemic paradigm for the assessment of flood hazard and flood risk in the riverine flood-prone areas is presented, with special emphasis on the urban areas with mild terrain and complicated topography.
Abstract: . Natural hazards have caused severe consequences to the natural, modified and human systems in the past. These consequences seem to increase with time due to both the higher intensity of the natural phenomena and the higher value of elements at risk. Among the water-related hazards, flood hazards have the most destructive impacts. The paper presents a new systemic paradigm for the assessment of flood hazard and flood risk in the riverine flood-prone areas. Special emphasis is given to the urban areas with mild terrain and complicated topography, in which 2-D fully dynamic flood modelling is proposed. Further, the EU flood directive is critically reviewed and examples of its implementation are presented. Some critical points in the flood directive implementation are also highlighted.

117 citations


Journal ArticleDOI
TL;DR: Crowdsourced photos and volunteered geographic data are fused together using a geostatistical interpolation to create an estimation of flood damage in New York City following Hurricane Sandy.
Abstract: . This research proposes a methodology that leverages non-authoritative data to augment flood extent mapping and the evaluation of transportation infrastructure. The novelty of this approach is the application of freely available, non-authoritative data and its integration with established data and methods. Crowdsourced photos and volunteered geographic data are fused together using a geostatistical interpolation to create an estimation of flood damage in New York City following Hurricane Sandy. This damage assessment is utilized to augment an authoritative storm surge map as well as to create a road damage map for the affected region.

114 citations
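The fusion step above interpolates scattered, non-authoritative point observations into a continuous damage surface. As a stand-in for the paper's geostatistical interpolation, the sketch below uses simple inverse-distance weighting; the photo locations and 0-1 damage scores are invented.

```python
# Sketch of fusing scattered crowdsourced damage reports into a surface.
# Inverse-distance weighting stands in for the paper's geostatistical
# interpolation; observation points and damage scores are hypothetical.
import numpy as np

def idw(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of `values` at each grid point."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)                # nearer reports weigh more
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical geotagged photo locations with 0-1 damage scores.
obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
scores = np.array([1.0, 0.2, 0.4])
grid = np.array([[0.1, 0.1], [0.9, 0.1]])       # points to estimate
print(idw(obs, scores, grid))
```

Each grid estimate is pulled toward its nearest reports, so a cluster of high-damage photos produces a local high in the surface, which can then be compared cell by cell against an authoritative storm surge map.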


Journal ArticleDOI
Stephan Pfahl1
TL;DR: In this paper, the authors show that extreme weather events in Europe are closely linked to anomalies of the atmospheric circulation and in particular to circulation features like cyclones and atmospheric blocking, which can serve as a dynamical fingerprint of the extreme events and yield insights into their most important physical driving mechanisms.
Abstract: . Extreme weather events in Europe are closely linked to anomalies of the atmospheric circulation and in particular to circulation features like cyclones and atmospheric blocking. In this study, this linkage is systematically characterised with the help of conditional cyclone and blocking frequencies during precipitation, wind gust and temperature extremes at various locations in Europe. Such conditional frequency fields can serve as a dynamical fingerprint of the extreme events and yield insights into their most important physical driving mechanisms. Precipitation extremes over the ocean and over flat terrain are shown to be closely related to cyclones in the vicinity and the associated dynamical lifting. For extreme precipitation over complex terrain, cyclone anomalies are found at more remote locations, favouring the flow of moist air towards the topography. Wind gust extremes are associated with cyclone and blocking anomalies in opposite directions, with the cyclones occurring mostly over the North and Baltic seas for extreme events in central Europe. This setting is associated with pronounced surface pressure gradients and thus high near-surface wind velocities. Hot temperature extremes in northern and central Europe typically occur in the vicinity of a blocking anticyclone, where subsidence and radiative forcing are strong. Over southern Europe, blocking anomalies are shifted more to the north or northeast, indicating a more important role of warm air advection. Large-scale flow conditions for cold extremes are similar at many locations in Europe, with blocking anomalies over the North Atlantic and northern Europe and cyclone anomalies southeast of the cold extreme, both contributing to the advection of cold air masses. This characterisation of synoptic-scale forcing mechanisms can be helpful for better understanding and anticipating weather extremes and their long-term changes.

Journal ArticleDOI
G. Anderson1, D. Klugmann1
TL;DR: In this paper, the Arrival Time Differing NETwork (ATDnet) is used to create data sets of lightning density across Europe, and results of annual and monthly detected lightning density using data from 2008-2012 are presented, along with more detailed analysis of statistics and features of interest.
Abstract: . The Met Office has operated a very low frequency (VLF) lightning location network since 1987. The long-range capabilities of this network, referred to in its current form as the Arrival Time Differing NETwork (ATDnet), allow for relatively continuous detection efficiency across Europe with only a limited number of sensors. The wide coverage and continuous data obtained by ATDnet are here used to create data sets of lightning density across Europe. Results of annual and monthly detected lightning density using data from 2008–2012 are presented, along with more detailed analysis of statistics and features of interest. No adjustment has been made to the data for regional variations in detection efficiency.
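Turning located strokes into the density maps described above is essentially a 2-D binning of fixes onto a lon/lat grid. A minimal sketch, with synthetic stroke coordinates in place of ATDnet fixes and, like the abstract, no detection-efficiency correction:

```python
# Sketch of gridding located lightning strokes into a density map, as done
# with ATDnet fixes above. Stroke coordinates are synthetic; density is raw
# strokes per cell (no detection-efficiency adjustment, as in the abstract).
import numpy as np

rng = np.random.default_rng(7)
lon = rng.uniform(-10, 30, 5000)   # synthetic stroke longitudes over "Europe"
lat = rng.uniform(35, 60, 5000)    # synthetic stroke latitudes

density, lon_edges, lat_edges = np.histogram2d(
    lon, lat, bins=(40, 25), range=[[-10, 30], [35, 60]]
)
print(density.sum())  # every stroke falls in exactly one cell
```

Dividing each cell's count by its area and the observation period would convert this into the strokes-per-km²-per-year quantity usually plotted in climatological density maps.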

Journal ArticleDOI
TL;DR: This study presents an application of a genetic algorithm and support vector machine (GA–SVM) method with parameter optimization to landslide displacement rate prediction, and shows that all four models achieve high prediction accuracy, with the GA–SVM models slightly more accurate than the plain SVM models.
Abstract: . Prediction of the landslide development process is a long-standing focus of landslide research, and many methods for predicting landslide displacement series have been proposed. The support vector machine (SVM) has proved to perform well, but its performance depends strongly on the correct selection of the model parameters (C and γ). In this study, we present an application of a genetic algorithm and support vector machine (GA–SVM) method with parameter optimization to landslide displacement rate prediction. We selected a typical large-scale landslide in a hydro-electrical engineering area of southwest China as a case study. On the basis of an analysis of the basic characteristics and monitoring data of the landslide, a single-factor GA–SVM model and a multi-factor GA–SVM model were built and compared with single-factor and multi-factor SVM models of the landslide. The results show that all four models have high prediction accuracy, but the GA–SVM models are slightly more accurate than the SVM models, and the multi-factor models slightly more accurate than the single-factor models. The multi-factor GA–SVM model is the most accurate, with the smallest root mean square error (RMSE) of 0.0009 and the highest relation index (RI) of 0.9992.
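The core of the GA–SVM idea above is searching the (C, γ) space with an evolutionary loop instead of a grid. The toy sketch below uses a bare-bones mutate-and-select GA over an SVR on a synthetic displacement-rate series; the population size, mutation scale and data are all invented for illustration, not the paper's configuration.

```python
# Toy sketch of GA-based selection of the SVM parameters (C, gamma) for a
# displacement-rate regression, in the spirit of GA-SVM. The GA is a
# bare-bones rank/keep/mutate loop; the series is synthetic.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
t = np.linspace(0, 4 * np.pi, 120)[:, None]
y = np.sin(t).ravel() + 0.1 * rng.normal(size=120)  # synthetic displacement-rate series
idx = rng.permutation(120)
tr, te = idx[:90], idx[90:]                         # random train/validation split

def rmse(log_c, log_g):
    """Validation RMSE of an SVR with C = 10**log_c, gamma = 10**log_g."""
    pred = SVR(C=10.0 ** log_c, gamma=10.0 ** log_g).fit(t[tr], y[tr]).predict(t[te])
    return float(np.sqrt(np.mean((pred - y[te]) ** 2)))

# Bare-bones GA: rank the population by fitness, keep the best half,
# and refill the rest with mutated copies of the survivors.
pop = rng.uniform(-2, 2, size=(8, 2))               # individuals = (log10 C, log10 gamma)
for _ in range(5):
    pop = pop[np.argsort([rmse(*p) for p in pop])]
    pop[4:] = pop[:4] + 0.3 * rng.normal(size=(4, 2))

print("best (log10 C, log10 gamma):", pop[0], "RMSE:", rmse(*pop[0]))
```

A production GA would add crossover and a convergence criterion, but even this minimal loop shows why it beats hand-tuning: the fitness function is just the validation error, so any parameterisation the SVR accepts can be searched.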

Journal ArticleDOI
TL;DR: In this paper, a catalogue of 186 rainfall events that resulted in 251 shallow landslides in Calabria, southern Italy, from January 1996 to September 2011 was compiled, and the landslide information, and sub-hourly rainfall measurements obtained from two complementary networks of rain gauges, were used to determine cumulated event vs. rainfall duration (ED) thresholds.
Abstract: . In many areas, rainfall is the primary trigger of landslides. Determining the rainfall conditions responsible for landslide occurrence is important, and may contribute to saving lives and properties. In a long-term national project for the definition of rainfall thresholds for possible landslide occurrence in Italy, we compiled a catalogue of 186 rainfall events that resulted in 251 shallow landslides in Calabria, southern Italy, from January 1996 to September 2011. Landslides were located geographically using Google Earth®, and were assigned mapping and temporal accuracies. We used the landslide information, and sub-hourly rainfall measurements obtained from two complementary networks of rain gauges, to determine cumulated event rainfall vs. rainfall duration (ED) thresholds for Calabria. For this purpose, we adopted an existing method used to prepare rainfall thresholds and to estimate their associated uncertainties in central Italy. The regional thresholds for Calabria were found to be nearly identical to previous ED thresholds for Calabria obtained using a reduced set of landslide information, and slightly higher than the ED thresholds obtained for central Italy. We segmented the regional catalogue of rainfall events with landslides in Calabria into lithology, soil regions, rainfall zones, and seasonal periods. The number of events in each subdivision was insufficient to determine reliable thresholds, but allowed for preliminary conclusions about the role of the environmental factors in the rainfall conditions responsible for shallow landslides in Calabria. We further segmented the regional catalogue based on administrative subdivisions used for hydro-meteorological monitoring and operational flood forecasting, and we determined separate ED thresholds for the Tyrrhenian and the Ionian coasts of Calabria. We expect the ED rainfall thresholds for Calabria to be used in regional and national landslide warning systems. 
The thresholds can also be used for landslide hazard and risk assessments, and for erosion and landscape evolution studies, in the study area and in similar physiographic regions in the Mediterranean area.
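ED thresholds of the kind described above are conventionally power laws, E = α·D^β, fitted so that most triggering events lie above the curve. The sketch below fits such a curve by least squares in log-log space and then lowers the intercept to the 5th percentile of the residuals; this mimics the frequentist flavour of the cited method, but the (D, E) pairs and all constants are invented.

```python
# Sketch of fitting a cumulated event rainfall vs. duration (ED) threshold
# of the power-law form E = alpha * D^beta. The log-log fit with a
# percentile offset mimics the frequentist style of threshold definition;
# the (D, E) pairs below are synthetic, not the Calabria catalogue.
import numpy as np

rng = np.random.default_rng(5)
D = 10 ** rng.uniform(0, 2.5, 200)                    # rainfall duration, hours
E = 5.0 * D ** 0.4 * 10 ** rng.normal(0, 0.15, 200)   # triggering rainfall, mm

# Ordinary least squares in log-log space gives the central fit ...
logD, logE = np.log10(D), np.log10(E)
beta, intercept = np.polyfit(logD, logE, 1)

# ... and shifting the intercept to the 5th percentile of the residuals
# gives a threshold expected to lie below ~95% of triggering events.
resid = logE - (beta * logD + intercept)
alpha5 = 10 ** (intercept + np.percentile(resid, 5))

print(f"T5 threshold: E = {alpha5:.2f} * D^{beta:.2f}")
```

Segmenting the catalogue, as the abstract does for lithology or coasts, simply means repeating this fit on each subset, which is why small subsets yield unreliable thresholds: the percentile of a handful of residuals is poorly constrained.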

Journal ArticleDOI
TL;DR: In this paper, the authors proposed an approach to develop rainfall thresholds to be used in civil protection warning systems for the occurrence of landslides at regional scale (i.e. tens of thousands of square kilometres), and applied it to Tuscany, Italy (23 000 km2).
Abstract: . We propose an original approach to develop rainfall thresholds to be used in civil protection warning systems for the occurrence of landslides at regional scale (i.e. tens of thousands of square kilometres), and we apply it to Tuscany, Italy (23 000 km2). Purpose-developed software is used to define statistical intensity–duration rainfall thresholds by means of an automated and standardized analysis of rainfall data. The automation and standardization of the analysis brings several advantages that in turn have a positive impact on the applicability of the thresholds to operational warning systems. Moreover, the possibility of defining a threshold in very short times compared to traditional analyses allowed us to subdivide the study area into several alert zones to be analysed independently, with the aim of setting up a specific threshold for each of them. As a consequence, a mosaic of several local rainfall thresholds is set up in place of a single regional threshold. Even if pertaining to the same region, the local thresholds vary substantially and can have very different equations. We subsequently analysed how the physical features of the test area influence the parameters and the equations of the local thresholds, and found that some threshold parameters can be put in relation with the prevailing lithology. In addition, we investigated the possible relations between effectiveness of the threshold and number of landslides used for the calibration. A validation procedure and a quantitative comparison with some literature thresholds showed that the performance of a threshold can be increased if the areal extent of its test area is reduced, as long as a statistically significant landslide sample is present. In particular, we demonstrated that the effectiveness of a warning system can be significantly enhanced if a mosaic of site-specific thresholds is used instead of a single regional threshold.

Journal ArticleDOI
TL;DR: The Support to Aviation Control Service (SACS, http://sacs.aeronomie.be) is a free online service initiated by ESA for the near real-time (NRT) satellite monitoring of volcanic plumes of SO2 and ash as discussed by the authors.
Abstract: Volcanic eruptions emit plumes of ash and gases into the atmosphere, potentially at very high altitudes. Ash-rich plumes are hazardous for airplanes, as ash is very abrasive and easily melts inside their engines. With more than 50 active volcanoes per year and the ever-increasing number of commercial flights, the safety of airplanes is a real concern. Satellite measurements are ideal for monitoring global volcanic activity and, in combination with atmospheric dispersion models, for tracking and forecasting volcanic plumes. Here we present the Support to Aviation Control Service (SACS, http://sacs.aeronomie.be), which is a free online service initiated by ESA for the near real-time (NRT) satellite monitoring of volcanic plumes of SO2 and ash. It combines data from two UV-visible (OMI, GOME-2) and two infrared (AIRS, IASI) spectrometers. This new multi-sensor warning system of volcanic plumes, running since April 2012, is based on the detection of SO2 and is optimised to avoid false alerts while at the same time limiting the number of notifications in case of large plumes. The system shows successful results, with 95% of our notifications corresponding to true volcanic activity.

Journal ArticleDOI
TL;DR: The authors conducted two national surveys to measure the perception of landslide and flood risk amongst the population of Italy, finding that the public fears technological risks more than natural risks, that the occurrence of recent damaging events influenced risk perception locally, and that the heightened perception persisted longer for earthquakes and decreased more rapidly for landslides and floods.
Abstract: . Inundations and landslides are widespread phenomena in Italy, where they cause severe damage and pose a threat to the population. Little is known about the public perception of landslide and flood risk. This is surprising, as an accurate perception is important for the successful implementation of many risk reduction or adaptation strategies. In an attempt to address this gap, we have conducted two national surveys to measure the perception of landslide and flood risk amongst the population of Italy. The surveys were conducted in 2012 and 2013, and consisted of approximately 3100 computer-assisted telephone interviews for each survey. The samples of the interviewees were statistically representative for a national-scale quantitative assessment. The interviewees were asked questions designed to obtain information on (i) their perception of natural, environmental, and technological risks, (ii) direct experience or general knowledge of the occurrence of landslides and floods in their municipality, (iii) perception of the possible threat posed by landslides and floods to their safety, (iv) general knowledge on the number of victims affected by landslides or floods, and on (v) the factors that the interviewees considered important for controlling landslide and flood risks in Italy. The surveys revealed that the population of Italy fears technological risks more than natural risks. Of the natural risks, earthquakes were considered more dangerous than floods, landslides, and volcanic eruptions. Examination of the temporal and geographical distributions of the responses revealed that the occurrence of recent damaging events influenced risk perception locally, and that the perception persisted longer for earthquakes and decreased more rapidly for landslides and floods. We explain the difference by the diverse consequences of the risks. 
The interviewees considered inappropriate land management the main cause of landslide and flood risk, followed by illegal construction, abandonment of the territory, and climate change. Comparison of the risk perception with actual measures of landslide and flood risk, including the number of fatal events, the number of fatalities, and the mortality rates, revealed that in most of the Italian regions the perception of the threat did not match the long-term risk posed to the population by landslides and floods. This outcome points to a need to foster public understanding of landslide and flood hazards and risks in Italy.

Journal ArticleDOI
TL;DR: In this paper, the impact of data set quality on landslide susceptibility mapping using multivariate statistical modelling methods at a detailed scale is assessed, and the best maps obtained with each set of data are compared on the basis of different statistical accuracy indicators (ROC curves and relative error calculation), linear cross correlation and expert opinion.
Abstract: . This paper aims to assess the impact of data set quality on landslide susceptibility mapping using multivariate statistical modelling methods at a detailed scale. This research is conducted on the Pays d'Auge plateau (Normandy, France) with a scale objective of 1/10 000, in order to fit the French guidelines on risk assessment. Five sets of data of increasing quality (considering accuracy, scale fitting, and geomorphological significance) and cost of acquisition are used to map the landslide susceptibility using logistic regression. The best maps obtained with each set of data are compared on the basis of different statistical accuracy indicators (ROC curves and relative error calculation), linear cross correlation and expert opinion. The results highlight that only high-quality sets of data supplied with detailed geomorphological variables (i.e. field inventory and surficial formation maps) can predict a satisfactory proportion of landslides in the study area.
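As a minimal illustration of the statistical core of such studies, the sketch below fits a logistic regression to a synthetic landslide inventory and scores it with the ROC AUC; all predictor names, coefficients, and data are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical predictors for 500 terrain cells: slope (degrees),
# a mapped surficial-formation flag, and distance to the nearest stream (m).
n = 500
X = np.column_stack([
    rng.uniform(0, 45, n),       # slope
    rng.integers(0, 2, n),       # surficial formation present (0/1)
    rng.uniform(0, 1000, n),     # distance to stream
])

# Synthetic landslide inventory: failures more likely on steep slopes
# and on mapped surficial formations.
true_logit = 0.12 * X[:, 0] + 1.5 * X[:, 1] - 0.003 * X[:, 2] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by gradient ascent on the log-likelihood.
Z = (X - X.mean(0)) / X.std(0)            # standardise predictors
A = np.column_stack([np.ones(n), Z])      # add an intercept column
beta = np.zeros(A.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-A @ beta))
    beta += 0.5 * A.T @ (y - p) / n

# Susceptibility = fitted probability per cell; ROC AUC summarises skill.
score = A @ beta
pos, neg = score[y == 1], score[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
print(round(auc, 2))
```

In the paper's setting, the interesting comparison is how the AUC changes as the predictor layers are replaced by cheaper, lower-quality versions.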

Journal ArticleDOI
TL;DR: In this paper, trends in annual and seasonal maxima of peak river flow and catchment-average daily rainfall are explored by fitting a two-parameter log-normal distribution whose mean varies linearly with time, and dimensionless magnification factors are estimated from the fitted models.
Abstract: When designing or maintaining a hydraulic structure, an estimate of the frequency and magnitude of extreme events is required. The most common methods to obtain such estimates rely on the assumption of stationarity, i.e. the assumption that the stochastic process under study is not changing. The public perception and worry of a changing climate have led to a wide debate on the validity of this assumption. In this work, trends for annual and seasonal maxima in peak river flow and catchment-average daily rainfall are explored. Assuming a two-parameter log-normal distribution, a linear regression model is applied, allowing the mean of the distribution to vary with time. For the river flow data, the linear model is extended to include an additional variable, the 99th percentile of the daily rainfall for a year. From the fitted models, dimensionless magnification factors are estimated and plotted on a map, shedding light on whether or not geographical coherence can be found in the significant changes. The implications of the identified trends from a decision-making perspective are then discussed, in particular with regard to the Type I and Type II error probabilities. One striking feature of the estimated trends is that the high variability found in the data leads to very inconclusive test results. Indeed, for most stations it is impossible to make a statement regarding whether or not the current design standards for the 2085 horizon can be considered precautionary. The power of tests on trends is further discussed in the light of statistical power analysis and sample size calculations. Given the observed variability in the data, sample sizes of some hundreds of years would be needed to confirm or negate the current safety margins when using at-site analysis.
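The modelling idea can be sketched in a few lines: assume annual maxima are log-normal with a mean that varies linearly in time, estimate the slope by regressing log-maxima on time, and express the change as a dimensionless magnification factor at a future horizon. All numbers below are synthetic, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic annual maximum flows (m^3 s^-1) for 1960-2012, with an assumed
# weak upward trend of 0.4 % per year in the mean of log-flow.
years = np.arange(1960, 2013)
t = years - years[0]
q = np.exp(4.0 + 0.004 * t + rng.normal(0, 0.35, t.size))

# Two-parameter log-normal with time-varying mean: regress log(q) on time.
slope, intercept = np.polyfit(t, np.log(q), 1)

# Dimensionless magnification factor: ratio of a design quantile at the
# 2085 horizon to its present-day value (the constant-sigma term cancels,
# so the factor is the same for every exceedance probability).
mf = np.exp(slope * (2085 - years[-1]))
print(round(mf, 2))
```

The paper's point about inconclusive tests shows up here too: with only ~50 years of noisy data, the fitted slope, and hence the magnification factor, carries a wide uncertainty band.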

Journal ArticleDOI
TL;DR: In this paper, a new procedure for data collection and storage is proposed for post-event damage assessment in Italy, which can serve numerous purposes: to create a reliable and consistent database on the basis of which damage models can be defined or validated; and to supply a comprehensive scenario of flooding impacts according to which priorities can be identified during the emergency and recovery phase, and the compensation due to citizens from insurers or local authorities can be established.
Abstract: . In recent years, awareness of a need for more effective disaster data collection, storage, and sharing of analyses has developed in many parts of the world. In line with this advance, Italian local authorities have expressed the need for enhanced methods and procedures for post-event damage assessment in order to obtain data that can serve numerous purposes: to create a reliable and consistent database on the basis of which damage models can be defined or validated; and to supply a comprehensive scenario of flooding impacts according to which priorities can be identified during the emergency and recovery phase, and the compensation due to citizens from insurers or local authorities can be established. This paper studies this context, and describes ongoing activities in the Umbria and Sicily regions of Italy intended to identify new tools and procedures for flood damage data surveys and storage in the aftermath of floods. In the first part of the paper, the current procedures for data gathering in Italy are analysed. The analysis shows that the available knowledge does not enable the definition or validation of damage curves, as information is poor, fragmented, and inconsistent. A new procedure for data collection and storage is therefore proposed. The entire analysis was carried out at a local level for the residential and commercial sectors only. The objective of the next steps for the research in the short term will be (i) to extend the procedure to other types of damage, and (ii) to make the procedure operational with the Italian Civil Protection system. The long-term aim is to develop specific depth–damage curves for Italian contexts.

Journal ArticleDOI
TL;DR: The international MEDiterranean EXperiment (MEDEX), aimed at better understanding and forecasting of cyclones that produce high-impact weather in the Mediterranean, is reviewed in this paper; MEDEX was framed within other WMO actions such as the ALPine EXperiment (ALPEX) and the Mediterranean Cyclones Study Project (MCP).
Abstract: The general objective of the international MEDiterranean EXperiment (MEDEX) was the better understanding and forecasting of cyclones that produce high impact weather in the Mediterranean. This paper reviews the motivation and foundation of MEDEX, the gestation, history and organisation of the project, as well as the main products and scientific achievements obtained from it. MEDEX obtained the approval of the World Meteorological Organisation (WMO) and can be considered as framed within other WMO actions, such as the ALPine EXperiment (ALPEX), the Mediterranean Cyclones Study Project (MCP) and, to a certain extent, THe Observing system Research and Predictability EXperiment (THORPEX) and the HYdrological cycle in Mediterranean EXperiment (HyMeX). Through two phases (2000–2005 and 2006–2010), MEDEX has produced a specific database, with information about cyclones and severe or high impact weather events, several main reports and a specific data targeting system field campaign (DTS-MEDEX-2009). The scientific achievements are significant in fields like climatology, dynamical understanding of the physical processes and social impact of cyclones, as well as in aspects related to the location of sensitive zones for individual cases, the climatology of sensitivity zones and the improvement of the forecasts through innovative methods like mesoscale ensemble prediction systems.

Journal ArticleDOI
TL;DR: A wildfire spread simulator combined with an ensemble-based DA algorithm is a promising approach to reduce uncertainties in the forecast position of the fire front and to introduce a paradigm shift in the wildfire emergency response.
Abstract: . This paper is the first part in a series of two articles and presents a data-driven wildfire simulator for forecasting wildfire spread scenarios, at a reduced computational cost that is consistent with operational systems. The prototype simulator features the following components: an Eulerian front propagation solver FIREFLY that adopts a regional-scale modeling viewpoint, treats wildfires as surface propagating fronts, and uses a description of the local rate of fire spread (ROS) as a function of environmental conditions based on Rothermel's model; a series of airborne-like observations of the fire front positions; and a data assimilation (DA) algorithm based on an ensemble Kalman filter (EnKF) for parameter estimation. This stochastic algorithm partly accounts for the nonlinearities between the input parameters of the semi-empirical ROS model and the fire front position, and is sequentially applied to provide a spatially uniform correction to wind and biomass fuel parameters as observations become available. A wildfire spread simulator combined with an ensemble-based DA algorithm is therefore a promising approach to reduce uncertainties in the forecast position of the fire front and to introduce a paradigm-shift in the wildfire emergency response. In order to reduce the computational cost of the EnKF algorithm, a surrogate model based on a polynomial chaos (PC) expansion is used in place of the forward model FIREFLY in the resulting hybrid PC-EnKF algorithm. The performance of EnKF and PC-EnKF is assessed on synthetically generated simple configurations of fire spread to provide valuable information and insight on the benefits of the PC-EnKF approach, as well as on a controlled grassland fire experiment. The results indicate that the proposed PC-EnKF algorithm features similar performance to the standard EnKF algorithm, but at a much reduced computational cost. 
In particular, the re-analysis and forecast skills of DA strongly relate to the spatial and temporal variability of the errors in the ROS model parameters.
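A toy version of the EnKF parameter-estimation step can convey the idea: a scalar wind factor drives a trivial stand-in for the front-propagation model, and an ensemble of parameter values is corrected with a Kalman gain built from ensemble covariances as front-position observations arrive. The forward model and all values below are invented for illustration; the paper's FIREFLY solver and its ROS model are far richer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the forward model: a 1-D front whose position after
# time T is ROS * T, with ROS proportional to an uncertain wind factor.
def forward(wind, T):
    ros = 0.05 * wind            # rate of spread, m/s (hypothetical)
    return ros * T               # front position, m

true_wind, obs_sigma = 8.0, 5.0  # truth and observation noise (invented)

# Prior ensemble of the wind parameter; observations of the front
# position at successive times are assimilated sequentially.
N = 50
wind = rng.normal(5.0, 2.0, N)
for T in (200.0, 400.0, 600.0):
    obs = forward(true_wind, T) + rng.normal(0, obs_sigma)
    hx = forward(wind, T)                       # ensemble forecasts
    cov_wy = np.cov(wind, hx)[0, 1]             # param-obs covariance
    var_y = hx.var(ddof=1) + obs_sigma**2
    gain = cov_wy / var_y                       # scalar Kalman gain
    # Stochastic (perturbed-observation) EnKF update of the parameter.
    wind = wind + gain * (obs + rng.normal(0, obs_sigma, N) - hx)

print(round(wind.mean(), 1))
```

The ensemble mean pulls toward the wind value consistent with the observed front positions, which is exactly the correction the paper applies to wind and fuel parameters (and then accelerates with a polynomial chaos surrogate).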

Journal ArticleDOI
TL;DR: In this article, the authors evaluated the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information system (GIS) and found that only 72 (about 3%) out of 2125 school buildings of the study area will have a very high seismic vulnerability.
Abstract: . The objective of the current study is to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information system (GIS). To this end, the peak ground acceleration, slope, and soil liquefaction layers were utilized for developing a geotechnical map. Also, the construction materials of structures, age of construction, the quality, and the seismic resonance coefficient layers were defined as major factors affecting the structural vulnerability of school buildings. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion via pairwise comparison at all levels. Finally, the geotechnical and structural spatial layers were overlaid to develop the seismic vulnerability map of school buildings in Tehran. The results indicated that the destruction rate will be very high in only 72 (about 3%) of the 2125 school buildings in the study area, and that their reconstruction should therefore be seriously considered.
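The AHP weighting step can be illustrated with a small example: criterion weights are the normalised principal eigenvector of a Saaty pairwise-comparison matrix, and a consistency ratio checks that the judgments are acceptably coherent (CR < 0.1). The matrix below is hypothetical, not the study's actual comparisons.

```python
import numpy as np

# Hypothetical Saaty pairwise-comparison matrix for three structural
# criteria (say, construction material vs. age vs. quality); entry a_ij
# states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# AHP weights: normalised principal eigenvector of the reciprocal matrix.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# with Saaty's random index RI = 0.58 for n = 3; CR < 0.1 is acceptable.
ci = (vals[k].real - 3) / (3 - 1)
cr = ci / 0.58
print(np.round(w, 2), round(cr, 2))
```

In the study, such weights are computed at every level of the hierarchy and then combined with the GIS layers by weighted overlay.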

Journal ArticleDOI
TL;DR: In this article, the authors highlight the hydrological effects of multi-temporal land use changes in flood hazard within the Yialias catchment area, located in central Cyprus.
Abstract: Floods are one of the most common natural disasters worldwide, leading to economic losses and loss of human lives. This paper highlights the hydrological effects of multi-temporal land use changes on flood hazard within the Yialias catchment area, located in central Cyprus. A calibrated hydrological model was firstly developed to describe the hydrological processes and internal basin dynamics of the three major subbasins, in order to study the diachronic effects of land use changes. For the implementation of the hydrological model, land use, soil and hydrometeorological data were incorporated. The climatic and stream flow data were derived from rain and flow gauge stations located in the wider area of the watershed basin. In addition, the land use and soil data were extracted after the application of object-oriented nearest neighbor algorithms to ASTER satellite images. Subsequently, the cellular automata (CA)–Markov chain analysis was implemented to predict the 2020 land use/land cover (LULC) map and incorporate it into the hydrological impact assessment. The results denoted the increase of runoff in the catchment area due to the recorded extensive urban sprawl phenomenon of the last decade.
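The Markov-chain half of a CA–Markov projection reduces to multiplying land-cover fractions by a transition matrix; the cellular-automaton half then allocates the projected amounts in space, typically near existing patches of each class. A minimal numeric sketch, with an invented transition matrix and cover fractions:

```python
import numpy as np

# Hypothetical decadal transition matrix between three cover classes
# (urban, agriculture, natural); row i gives where class i ends up.
P = np.array([
    [0.98, 0.01, 0.01],   # urban rarely reverts
    [0.06, 0.90, 0.04],   # agriculture partly urbanises
    [0.03, 0.02, 0.95],   # some natural land is converted
])

state = np.array([0.15, 0.45, 0.40])   # cover fractions at the base year

# One Markov step projects the fractions to the target date (e.g. 2020).
projected = state @ P
print(np.round(projected, 3))
```

With a matrix like this the urban fraction grows at the expense of the other classes, which is the mechanism behind the runoff increase the paper attributes to urban sprawl.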

Journal ArticleDOI
TL;DR: In this paper, the authors show how comprehensive property level information can be used for the assessment of exposure to flooding on a national scale, and how this information provides valuable input to discussions on possible risk financing practices.
Abstract: . The effectiveness of disaster risk management and financing mechanisms depends on an accurate assessment of current and future hazard exposure. The increasing availability of detailed data offers policy makers and the insurance sector new opportunities to understand trends in risk, and to make informed decisions on ways to deal with these trends. In this paper we show how comprehensive property level information can be used for the assessment of exposure to flooding on a national scale, and how this information provides valuable input to discussions on possible risk financing practices. The case study used is the Netherlands, which is one of the countries most exposed to flooding globally, and which is currently undergoing a debate on strategies for the compensation of potential losses. Our results show that flood exposure has increased rapidly between 1960 and 2012, and that the growth of the building stock and its economic value in flood-prone areas has been higher than in non-flood-prone areas. We also find that property values in flood-prone areas are lower than those in non-flood-prone areas. We argue that the increase in the share of economic value located in potential flood-prone areas can have a negative effect on the feasibility of private insurance schemes in the Netherlands. The methodologies and results presented in this study are relevant for many regions around the world where the effects of rising flood exposure create a challenge for risk financing.

Journal ArticleDOI
TL;DR: In this paper, the authors present a conceptual framework for threshold selection in over-threshold modelling, reviewing the available methods for the application of both steps and illustrating them with a double threshold approach.
Abstract: . The evaluation of the probability of occurrence of extreme natural events is important for the protection of urban areas, industrial facilities and others. Traditionally, extreme value theory (EVT) offers a valid theoretical framework on this topic. In an over-threshold modelling (OTM) approach, Pickands' theorem (Pickands, 1975) states that, for a sample composed of independent and identically distributed (i.i.d.) values, the distribution of the data exceeding a given threshold converges to a generalized Pareto distribution (GPD). Following this theoretical result, the analysis of realizations of environmental variables exceeding a threshold has spread widely in the literature. However, applying this theorem to an auto-correlated time series logically involves two successive and complementary steps: the first is required to build a sample of i.i.d. values from the available information, as required by the EVT; the second to set the threshold for the optimal convergence toward the GPD. In the past, the same threshold was often employed both for sampling observations and for meeting the hypothesis of extreme value convergence. This confusion can lead to an erroneous understanding of the methodologies and tools available in the literature. This paper aims at clarifying the conceptual framework involved in threshold selection, reviewing the available methods for the application of both steps, and illustrating them with a double threshold approach.
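A compact sketch of the double-threshold idea: a first threshold with runs declustering turns an auto-correlated series into (approximately) i.i.d. cluster maxima, and a second, higher threshold is used for the GPD fit, here with simple method-of-moments estimators. Data, thresholds, and the run length are synthetic choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic auto-correlated daily series (an AR(1) surrogate for an
# environmental variable such as surge or discharge).
n = 20000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.7 * x[i - 1] + rng.normal()

# Step 1 (sampling threshold u1): decluster to build an i.i.d. sample --
# keep only the maximum of each cluster of exceedances, where clusters
# end after at least 3 consecutive values below u1 (runs declustering).
u1 = np.quantile(x, 0.95)
peaks, current, gap = [], None, 0
for v in x:
    if v > u1:
        current = v if current is None else max(current, v)
        gap = 0
    elif current is not None:
        gap += 1
        if gap >= 3:
            peaks.append(current)
            current = None
if current is not None:
    peaks.append(current)
peaks = np.array(peaks)

# Step 2 (statistical threshold u2): fit a GPD to peak excesses over u2
# by the method of moments: xi = (1 - m^2/s^2)/2, sigma = m(1 - xi).
u2 = np.quantile(peaks, 0.5)
exc = peaks[peaks > u2] - u2
m, s2 = exc.mean(), exc.var(ddof=1)
xi = 0.5 * (1 - m * m / s2)
sigma = m * (1 - xi)
print(round(xi, 2), round(sigma, 2))
```

Using one threshold for both jobs, as the paper warns against, conflates the independence requirement with the tail-convergence requirement; here the two are explicitly separated.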

Journal ArticleDOI
TL;DR: In this paper, the authors used the Canadian Fire Weather Index (FWI), calculated from two reanalysis data sets, ERA-40 and ERA-Interim, to examine the temporal variation of forest fire danger in Europe in 1960–2012.
Abstract: . Understanding how fire weather danger indices changed in the past and how such changes affected forest fire activity is important in a changing climate. We used the Canadian Fire Weather Index (FWI), calculated from two reanalysis data sets, ERA-40 and ERA-Interim, to examine the temporal variation of forest fire danger in Europe in 1960–2012. Additionally, we used national forest fire statistics from Greece, Spain and Finland to examine the relationship between fire danger and fires. There is no obvious trend in fire danger for the time period covered by ERA-40 (1960–1999), whereas for the period 1980–2012 covered by ERA-Interim, the mean FWI shows an increasing trend for southern and eastern Europe which is significant at the 99% confidence level. The cross correlations calculated at the national level in Greece, Spain and Finland between total area burned and mean FWI of the current season are of the order of 0.6, demonstrating the extent to which the current fire-season weather can explain forest fires. To summarize, fire risk is multifaceted, and while climate is a major determinant, other factors can contribute to it, either positively or negatively.
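The headline statistic, the lag-zero correlation between same-season mean FWI and total area burned, can be reproduced on synthetic data in a few lines (the series below are invented, not the national statistics used in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical national series for 33 fire seasons (1980-2012): seasonal
# mean FWI, and log area burned constructed so that fire-season weather
# explains part, but not all, of the burned area.
fwi = rng.normal(30.0, 5.0, 33)
log_area = 0.1 * fwi + rng.normal(0, 0.6, 33)

# Cross correlation at lag zero between fire danger and area burned,
# the statistic the paper reports at around 0.6 for the three countries.
r = np.corrcoef(fwi, log_area)[0, 1]
print(round(r, 2))
```

A correlation of this size squared (~0.35) is exactly the paper's point: weather is a major but partial determinant, leaving room for the other factors it mentions.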

Journal ArticleDOI
TL;DR: In this article, the authors applied an econometric evaluation technique called propensity score matching (PSM) to a survey of German households along three major rivers that were flooded in 2002, 2005, and 2006.
Abstract: . The employment of damage mitigation measures (DMMs) by individuals is an important component of integrated flood risk management. In order to promote efficient damage mitigation measures, accurate estimates of their damage mitigation potential are required. That is, for correctly assessing the damage mitigation measures' effectiveness from survey data, one needs to control for sources of bias. A biased estimate can occur if risk characteristics differ between individuals who have, or have not, implemented mitigation measures. This study removed this bias by applying an econometric evaluation technique called propensity score matching (PSM) to a survey of German households along three major rivers that were flooded in 2002, 2005, and 2006. The application of this method detected substantial overestimates of mitigation measures' effectiveness if bias is not controlled for, ranging from nearly EUR 1700 to 15 000 per measure. Bias-corrected effectiveness estimates of several mitigation measures show that these measures are still very effective since they prevent between EUR 6700 and 14 000 of flood damage per flood event. This study concludes with four main recommendations regarding how to better apply propensity score matching in future studies, and makes several policy recommendations.
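A stripped-down sketch of propensity score matching: fit a propensity model for measure adoption, match each treated household to its nearest control on the propensity score, and compare the matched damage means. The data are simulated so that adoption correlates with flood exposure, making the naive comparison of means visibly biased while the matched estimate recovers the built-in effect; every variable and coefficient is invented, not taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic households: flood experience drives both adoption of a damage
# mitigation measure and expected damage, so adopters are riskier a priori.
n = 1000
exper = rng.random(n)                    # prior flood experience (0-1)
value = rng.lognormal(11.0, 0.4, n)      # building value, EUR
treated = rng.random(n) < 1 / (1 + np.exp(-(3.0 * exper - 1.5)))

# Built-in truth: the measure prevents EUR 8000 of damage per event.
damage = (0.05 * value + 20000 * exper - 8000 * treated
          + rng.normal(0, 2000, n))

# Step 1: propensity scores from a logistic regression (gradient ascent).
Z = np.column_stack([exper, np.log(value)])
A = np.column_stack([np.ones(n), (Z - Z.mean(0)) / Z.std(0)])
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-A @ beta))
    beta += 0.5 * A.T @ (treated - p) / n
ps = 1 / (1 + np.exp(-A @ beta))

# Step 2: match each treated household to the control with the nearest
# propensity score; the ATT is the mean matched damage difference.
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(1)]
att = (damage[t_idx] - damage[matches]).mean()

naive = damage[treated].mean() - damage[~treated].mean()
print(round(naive), round(att))
```

The naive estimate understates the measure's benefit because adopters face higher exposure; matching on the propensity score removes that selection bias, mirroring the overestimates and corrections the paper quantifies in euros.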

Journal ArticleDOI
TL;DR: In this article, temporal variations in wind speed and significant wave height (SWH) at a location in the eastern Arabian Sea are studied using ERA-Interim reanalysis data from 1979 to 2012; a shallow-water location is selected since measured buoy data are available close by for comparison with the reanalysis data.
Abstract: . Temporal variations in wind speed and significant wave height (SWH) at a location in the eastern Arabian Sea are studied using ERA-Interim reanalysis data from 1979 to 2012. A shallow water location is selected for the study since measured buoy data are available close to the location for comparison with the reanalysis data. The annual mean wind speed shows a statistically significant decreasing trend of 1.5 cm s−1 year−1, whereas a statistically insignificant increasing trend of 3.6 cm s−1 year−1 is observed for annual maximum wind speed due to the local events that altered the trend in annual maximum wind speed. Weakening of SWH during one of the peak monsoon months (August) is identified from the monthly analysis of SWH, which shows a higher upward trend in SWH during the southwest monsoon period, with an exception during August. The annual mean SWH shows a slight upward trend (0.012 cm year−1), whereas a larger upward trend (1.4 cm year−1) is observed for annual maximum SWH. Both identified trends are statistically insignificant. The influence of tropical cyclone activity is also studied and it is found that the maximum SWH and wind speed during 1996 are directly related to the cyclonic event.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a scenario-based approach to assess the hazard associated with tephra dispersal and sedimentation at various scales and for multiple sources, including Hekla, Katla, Eyjafjallajokull and Askja.
Abstract: . In order to assist the elaboration of proactive measures for the management of future volcanic eruptions in Iceland, we developed a new scenario-based approach to assess the hazard associated with tephra dispersal and sedimentation at various scales and for multiple sources. The target volcanoes are Hekla, Katla, Eyjafjallajokull and Askja, selected either for their high probabilities of eruption and/or their high potential impact. By coupling tephrostratigraphic studies, probabilistic techniques and modelling, we developed comprehensive eruption scenarios for both short- and long-lasting eruptions and compiled hazard maps for tephra ground deposition at a national scale and air concentration at a European scale using the TEPHRA2 and FALL3D models, respectively. New algorithms for the identification of realistic sets of eruptive source parameters are investigated, which assist the generation of probability density functions of eruption source parameters for the selected scenarios. Aggregation processes were accounted for using various empirical models. Outcomes, i.e. probabilities conditioned to the occurrence of an eruption, help the assessment and comparison of hazard levels at different scales. For example, at a national scale Askja has a 5–10% probability of blanketing the easternmost half of the country with a tephra accumulation of at least 1 kg m−2. At a continental scale, Katla has a 5–10% probability of producing ash clouds with concentrations of 2 mg m−3 over the UK, Scandinavia and northern Europe with a mean arrival time of 48–72 h and a mean persistence time of 6–18 h. In a companion paper, Scaini et al. (2014) present a vulnerability assessment for Iceland to ground deposition of tephra and for the European air traffic to airborne ash which, combined with the outcomes of the present paper, constitutes one of the first comprehensive multi-scale risk assessments associated with tephra dispersal and sedimentation.