Author

Susana Almeida

Bio: Susana Almeida is an academic researcher from the University of Bristol. The author has contributed to research in the topics of natural hazards and slope stability, has an h-index of 7, and has co-authored 7 publications receiving 192 citations. Previous affiliations of Susana Almeida include Imperial College London and the University of Leeds.

Papers
Journal ArticleDOI
TL;DR: In this article, a parameter selection process, similar to a likelihood weighting procedure, was applied for 1,023 possible combinations of 10 different data sources, ranging from using 1 to all 10 of these products.
Abstract: The calibration of hydrological models without streamflow observations is problematic, and the simultaneous, combined use of remotely sensed products for this purpose has not been exhaustively tested thus far. Our hypothesis is that the combined use of products can (1) reduce the parameter search space and (2) improve the representation of internal model dynamics and hydrological signatures. Five different conceptual hydrological models were applied to 27 catchments across Europe. A parameter selection process, similar to a likelihood weighting procedure, was applied for 1,023 possible combinations of 10 different data sources, ranging from using 1 to all 10 of these products. Distances between the two empirical distributions of model performance metrics, with and without using a specific product, were determined to assess the added value of that product. In a similar way, the performance of the models in reproducing 27 hydrological signatures was evaluated relative to the unconstrained model. Significant reductions in the parameter space were obtained when combinations included Advanced Microwave Scanning Radiometer - Earth Observing System and Advanced Scatterometer soil moisture, Gravity Recovery and Climate Experiment total water storage anomalies, and, in snow-dominated catchments, the Moderate Resolution Imaging Spectroradiometer snow cover products. The evaporation products of Land Surface Analysis - Satellite Application Facility and MOD16 were less effective for deriving meaningful, well-constrained posterior parameter distributions. The hydrological signature analysis indicated that most models profited from constraining with an increasing number of data sources. In conclusion, constraining models with multiple data sources simultaneously was shown to be valuable for at least four of the five hydrological models for determining model parameters in the absence of streamflow observations.
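
As a rough illustration of the combinatorial setup described above, the sketch below enumerates the 1,023 non-empty combinations of 10 products and scores the added value of a product via a distance between two empirical distributions of a performance metric. This is not the authors' code: the product names and metric samples are placeholders, and the Kolmogorov-Smirnov statistic is only assumed here as the distance measure, since the abstract does not name one.

from itertools import combinations
import numpy as np
from scipy.stats import ks_2samp

products = [f"product_{i}" for i in range(1, 11)]  # placeholder names for the 10 products

# All non-empty subsets of the 10 products: 2**10 - 1 = 1,023 combinations
all_combos = [c for r in range(1, len(products) + 1)
              for c in combinations(products, r)]
assert len(all_combos) == 1023

def added_value(metric_with, metric_without):
    # Distance between the empirical distributions of a performance metric for
    # parameter sets retained with vs. without a given product
    # (Kolmogorov-Smirnov statistic assumed here as the distance measure).
    return ks_2samp(metric_with, metric_without).statistic

# Synthetic metric samples, purely for illustration
rng = np.random.default_rng(0)
print(added_value(rng.normal(0.7, 0.05, 500), rng.normal(0.6, 0.10, 500)))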

85 citations

Journal ArticleDOI
TL;DR: It is suggested that the use of simple aleatory distributional models, common in current practice, will underestimate the potential variability in assessing hazards, consequences, and risks.
Abstract: This paper discusses how epistemic uncertainties are currently considered in the most widely occurring natural hazard areas, including floods, landslides and debris flows, dam safety, droughts, earthquakes, tsunamis, volcanic ash clouds and pyroclastic flows, and wind storms. Our aim is to provide an overview of the types of epistemic uncertainty in the analysis of these natural hazards and to discuss how they have been treated so far to bring out some commonalities and differences. The breadth of our study makes it difficult to go into great detail on each aspect covered here; hence the focus lies on providing an overview and on citing key literature. We find that in current probabilistic approaches to the problem, uncertainties are all too often treated as if, at some fundamental level, they are aleatory in nature. This can be a tempting choice when knowledge of more complex structures is difficult to determine, but not acknowledging the epistemic nature of many sources of uncertainty will compromise any risk analysis. We do not imply that probabilistic uncertainty estimation necessarily ignores the epistemic nature of uncertainties in natural hazards; expert elicitation, for example, can be set within a probabilistic framework to do just that. However, we suggest that the use of simple aleatory distributional models, common in current practice, will underestimate the potential variability in assessing hazards, consequences, and risks. A commonality across all approaches is that every analysis is necessarily conditional on the assumptions made about the nature of the sources of epistemic uncertainty. It is therefore important to record the assumptions made and to evaluate their impact on the uncertainty estimate. Additional guidelines for good practice based on this review are suggested in the companion paper (Part 2).

57 citations

Journal ArticleDOI
TL;DR: In this paper, the authors used the Combined Hydrology And Stability Model (CHASM) with sensitivity analysis and Classification And Regression Trees (CART) to identify critical thresholds in slope properties and climatic (rainfall) drivers that lead to slope failure.
Abstract: Landslides have large negative economic and societal impacts, including loss of life and damage to infrastructure. Slope stability assessment is a vital tool for landslide risk management, but high levels of uncertainty often challenge its usefulness. Uncertainties are associated with the numerical model used to assess slope stability and its parameters, with the data characterizing the geometric, geotechnical and hydrologic properties of the slope, and with hazard triggers (e.g. rainfall). Uncertainties associated with many of these factors are also likely to be exacerbated further by future climatic and socio-economic changes, such as increased urbanization and resultant land use change. In this study, we illustrate how numerical models can be used to explore the uncertain factors that influence potential future landslide hazard using a bottom-up strategy. Specifically, we link the Combined Hydrology And Stability Model (CHASM) with sensitivity analysis and Classification And Regression Trees (CART) to identify critical thresholds in slope properties and climatic (rainfall) drivers that lead to slope failure. We apply our approach to a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. For this particular slope, we find that uncertainties regarding some slope properties (namely the thickness and effective cohesion of the topsoil) are as important as the uncertainties related to future rainfall conditions. Furthermore, we show that 89 % of the expected behaviour of the studied slope can be characterized based on only two variables – the ratio of topsoil thickness to cohesion and the ratio of rainfall intensity to duration.
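
The CART step of this workflow can be sketched as follows. The sketch is purely illustrative and is not CHASM: the feature ranges and the synthetic failure rule stand in for an ensemble of slope-stability simulations, and a shallow scikit-learn tree is used only to expose split thresholds analogous to the ratios reported above.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(0.5, 5.0, n),    # topsoil thickness [m] (assumed range)
    rng.uniform(2.0, 20.0, n),   # effective cohesion [kPa] (assumed range)
    rng.uniform(5.0, 100.0, n),  # rainfall intensity [mm/h] (assumed range)
    rng.uniform(1.0, 48.0, n),   # rainfall duration [h] (assumed range)
])
# Hypothetical failure rule standing in for an ensemble of slope-stability runs
failure = ((X[:, 0] / X[:, 1] > 0.4) & (X[:, 2] / X[:, 3] > 2.0)).astype(int)

# A shallow tree keeps the critical split thresholds easy to read off
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, failure)
print(export_text(tree, feature_names=["thickness_m", "cohesion_kPa",
                                       "intensity_mm_h", "duration_h"]))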

49 citations

Journal ArticleDOI
TL;DR: There is potential for further exchange of ideas and experience between natural hazard research communities on decision analysis approaches; broader application of decision methodologies to natural hazard management and evaluation of existing decision approaches can potentially lead to more efficient allocation of scarce resources.
Abstract: Losses from natural hazards, including geophysical and hydrometeorological hazards, have been increasing worldwide. This review focuses on the process by which scientific evidence about natural hazards is applied to support decision making. Decision analysis typically involves estimating the probability of extreme events; assessing the potential impacts of those events from a variety of perspectives; and evaluating options to plan for, mitigate, or react to events. We consider issues that affect decisions made across a range of natural hazards, summarize decision methodologies, and provide examples of applications of decision analysis to the management of natural hazards. We conclude that there is potential for further exchange of ideas and experience between natural hazard research communities on decision analysis approaches. Broader application of decision methodologies to natural hazard management and evaluation of existing decision approaches can potentially lead to more efficient allocation of scarce resources.

48 citations

Journal ArticleDOI
TL;DR: The results show that the consideration of the inter-signature error structure may improve predictions when the error correlations are strong, but other uncertainties such as model structure and observational error may outweigh the importance of these correlations.
Abstract: A recurrent problem in hydrology is the absence of streamflow data to calibrate rainfall–runoff models. A commonly used approach in such circumstances conditions model parameters on regionalized response signatures. While several different signatures are often available to be included in this process, an outstanding challenge is the selection of signatures that provide useful and complementary information. Different signatures do not necessarily provide independent information, and this has led to signatures being omitted or included on a subjective basis. This paper presents a method that accounts for the inter-signature error correlation structure so that regional information is neither neglected nor double-counted when multiple signatures are included. Using 84 catchments from the MOPEX database, observed signatures are regressed against physical and climatic catchment attributes. The derived relationships are then utilized to assess the joint probability distribution of the signature regionalization errors, which is subsequently used in a Bayesian procedure to condition a rainfall–runoff model. The results show that the consideration of the inter-signature error structure may improve predictions when the error correlations are strong. However, other uncertainties such as model structure and observational error may outweigh the importance of these correlations. Further, these other uncertainties repeatedly cause some signatures to appear misinformative.
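
A minimal sketch of the conditioning idea, assuming Gaussian regionalization errors (the paper's full Bayesian procedure is more involved): the inter-signature error covariance is estimated from regression residuals and then used in a joint likelihood, so that correlated signatures are not double-counted. All variable names and values below are illustrative.

import numpy as np
from scipy.stats import multivariate_normal

# Synthetic residuals standing in for the errors of the attribute-based
# regressions of three signatures over 84 catchments (values assumed)
rng = np.random.default_rng(1)
residuals = rng.multivariate_normal(
    mean=[0.0, 0.0, 0.0],
    cov=[[1.0, 0.6, 0.2],
         [0.6, 1.0, 0.4],
         [0.2, 0.4, 1.0]],
    size=84)
error_cov = np.cov(residuals, rowvar=False)  # joint inter-signature error structure

def log_likelihood(simulated_signatures, regionalized_signatures):
    # Log-likelihood of a parameter set: model-simulated signatures compared
    # with their regionalized estimates under correlated Gaussian errors
    return multivariate_normal(mean=regionalized_signatures,
                               cov=error_cov).logpdf(simulated_signatures)

print(log_likelihood(np.array([0.5, 0.3, 0.1]), np.array([0.4, 0.2, 0.2])))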

24 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, the authors analyze how earthquakes initiate processes that change mountain landscapes, highlight research gaps, and suggest pathways toward a more complete understanding of the seismic effects on the Earth's surface.
Abstract: Large earthquakes initiate chains of surface processes that last much longer than the brief moments of strong shaking. Most moderate‐ and large‐magnitude earthquakes trigger landslides, ranging from small failures in the soil cover to massive, devastating rock avalanches. Some landslides dam rivers and impound lakes, which can collapse days to centuries later, and flood mountain valleys for hundreds of kilometers downstream. Landslide deposits on slopes can remobilize during heavy rainfall and evolve into debris flows. Cracks and fractures can form and widen on mountain crests and flanks, promoting increased frequency of landslides that lasts for decades. More gradual impacts involve the flushing of excess debris downstream by rivers, which can generate bank erosion and floodplain accretion as well as channel avulsions that affect flooding frequency, settlements, ecosystems, and infrastructure. Ultimately, earthquake sequences and their geomorphic consequences alter mountain landscapes over both human and geologic time scales. Two recent events have attracted intense research into earthquake‐induced landslides and their consequences: the magnitude M 7.6 Chi‐Chi, Taiwan earthquake of 1999, and the M 7.9 Wenchuan, China earthquake of 2008. Using data and insights from these and several other earthquakes, we analyze how such events initiate processes that change mountain landscapes, highlight research gaps, and suggest pathways toward a more complete understanding of the seismic effects on the Earth's surface.

424 citations

01 Dec 2004
TL;DR: In this article, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation, which involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.
Abstract: Although uncertainty about structures of environmental models (conceptual uncertainty) is often acknowledged to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, may fail to adequately sample the relevant space of plausible conceptual models. As such, it is prone to modelling bias and underestimation of predictive uncertainty. In this paper we review a range of strategies for assessing structural uncertainties in models. The existing strategies fall into two categories depending on whether field data are available for the predicted variable of interest. To date, most research has focussed on situations where inferences on the accuracy of a model structure can be made directly on the basis of field data. This corresponds to a situation of ‘interpolation’. However, in many cases environmental models are used for ‘extrapolation’; that is, beyond the situation and the field data available for calibration. In the present paper, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. It involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.

417 citations

Journal ArticleDOI
TL;DR: The authors introduce a typology of four reasons for using storylines to represent uncertainty in physical aspects of climate change; one is improving risk awareness by framing risk in an event-oriented rather than a probabilistic manner, which corresponds more directly to how people perceive and respond to risk.
Abstract: As climate change research becomes increasingly applied, the need for actionable information is growing rapidly. A key aspect of this requirement is the representation of uncertainties. The conventional approach to representing uncertainty in physical aspects of climate change is probabilistic, based on ensembles of climate model simulations. In the face of deep uncertainties, the known limitations of this approach are becoming increasingly apparent. An alternative is thus emerging which may be called a ‘storyline’ approach. We define a storyline as a physically self-consistent unfolding of past events, or of plausible future events or pathways. No a priori probability of the storyline is assessed; emphasis is placed instead on understanding the driving factors involved, and the plausibility of those factors. We introduce a typology of four reasons for using storylines to represent uncertainty in physical aspects of climate change: (i) improving risk awareness by framing risk in an event-oriented rather than a probabilistic manner, which corresponds more directly to how people perceive and respond to risk; (ii) strengthening decision-making by allowing one to work backward from a particular vulnerability or decision point, combining climate change information with other relevant factors to address compound risk and develop appropriate stress tests; (iii) providing a physical basis for partitioning uncertainty, thereby allowing the use of more credible regional models in a conditioned manner and (iv) exploring the boundaries of plausibility, thereby guarding against false precision and surprise. Storylines also offer a powerful way of linking physical with human aspects of climate change.

274 citations

01 Apr 2009
TL;DR: In this article, the authors investigated how the clustering of wintertime extra-tropical cyclones depends on the vorticity intensity of the cyclones and on the sampling time period over which cyclone transits are counted.
Abstract: This study has investigated how the clustering of wintertime extra-tropical cyclones depends on the vorticity intensity of the cyclones, and the sampling time period over which cyclone transits are counted. Clustering is characterized by the dispersion (ratio of the variance and the mean) of the counts of eastward transits of cyclone tracks obtained by objective tracking of 850 hPa vorticity features in NCEP-NCAR reanalyses. The counts are aggregated over non-overlapping time periods lasting from 4 days up to 6-month-long October–March winters over the period 1950–2003. Clustering is found to be largest in the exit region of the North Atlantic storm track (i.e. over the NE Atlantic and NW Europe). Furthermore, it increases considerably for the intense cyclones; for example, the dispersion of the 3-monthly counts near Berlin increases from 1.45 for all cyclones to 1.80 for the 25 % most intense cyclones. The dispersion also increases quasi-linearly with the logarithm of the length of the aggregation period; for example, near Berlin the dispersion is 1.08, 1.33, and 1.45 for weekly, monthly, and 3-monthly totals, respectively. The increases and the sampling uncertainties in dispersion can be reproduced using a simple Poisson regression model with a time-varying rate that depends on large-scale teleconnection indices such as the North Atlantic Oscillation, the East Atlantic Pattern, the Scandinavian pattern, and the East Atlantic/West Russia pattern. Increased dispersion for intense cyclones is found to be due to the rate becoming more dependent on the indices for such cyclones, whereas increased dispersion for longer aggregation periods is related to the small amounts of intraseasonal persistence in the indices. Increased clustering with cyclone intensity and aggregation period has important implications for the accurate modelling of aggregate insurance losses.
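
The clustering diagnostic and the regression model can be sketched as follows, using synthetic data in place of the tracked cyclone counts and the observed teleconnection indices; only the statistics mirror those described above.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_periods = 500
nao_index = rng.normal(0.0, 1.0, n_periods)      # stand-in for a teleconnection index
true_rate = np.exp(1.0 + 0.3 * nao_index)        # time-varying Poisson rate
counts = rng.poisson(true_rate)                  # cyclone transits per aggregation period

dispersion = counts.var(ddof=1) / counts.mean()  # variance/mean; values > 1 indicate clustering
print(f"dispersion = {dispersion:.2f}")

# Poisson regression of the counts on the index recovers the rate dependence
model = sm.GLM(counts, sm.add_constant(nao_index), family=sm.families.Poisson()).fit()
print(model.params)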

117 citations

Journal ArticleDOI
TL;DR: In this paper, the authors review the challenges and opportunities related to the nature of oceans and to actors involved in, the scale of, and knowledge informing their governance, in relation to nine new and emerging issues: small-scale fisheries, aquaculture, biodiversity conservation on the high seas, large marine protected areas (LMPAs), tuna fisheries, deep-sea mining, ocean acidification (OA), blue carbon (BC), and plastics pollution.
Abstract: Increased interest in oceans is leading to new and renewed global governance efforts directed toward ocean issues in areas of food production, biodiversity conservation, industrialization, global environmental change, and pollution. Global oceans governance efforts face challenges and opportunities related to the nature of oceans and to actors involved in, the scale of, and knowledge informing their governance. We review these topics generally and in relation to nine new and emerging issues: small-scale fisheries (SSFs), aquaculture, biodiversity conservation on the high seas, large marine protected areas (LMPAs), tuna fisheries, deep-sea mining, ocean acidification (OA), blue carbon (BC), and plastics pollution.

115 citations