Journal Article · DOI

Probabilistic-based structural assessment of a historic stone arch bridge

04 Mar 2021 · Structure and Infrastructure Engineering (Informa UK Limited) · Vol. 17, Iss. 3, pp. 379–391
Abstract: A probabilistic analysis approach for reliability-based assessment of masonry arch bridges is presented in this work. The methodology is tested on a medieval in-service construction located in Gali...
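At its core, a reliability-based assessment of this kind propagates uncertain resistance and load variables through a limit state function and estimates a failure probability and reliability index. A minimal Monte Carlo sketch of that idea (the limit state g = R - S, the distribution families, and all numbers below are illustrative assumptions, not values from the paper):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative random variables: resistance R and load effect S.
# Distribution families and parameters are assumed for the sketch.
R = rng.lognormal(mean=np.log(500.0), sigma=0.15, size=n)  # resistance, kN
S = rng.normal(loc=300.0, scale=40.0, size=n)              # load effect, kN

g = R - S                    # limit state function: failure when g <= 0
pf = np.mean(g <= 0.0)       # Monte Carlo estimate of failure probability
beta = -norm.ppf(pf)         # reliability index implied by pf

print(f"pf ~ {pf:.2e}, beta ~ {beta:.2f}")
```

In a real assessment the limit state would come from a structural model of the arch (e.g., limit analysis or finite elements) rather than a closed-form R - S.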
Citations
Journal Article · DOI
TL;DR: In this paper, discontinuum-based computational models are used to simulate the composite and low-bond-strength characteristics of masonry walls, and the same models are then used to determine the influence of uncertainties in the material properties on the macro behavior of the URM walls.

18 citations

Journal Article · DOI
TL;DR: Seismic structural assessment of masonry arch bridges is essential due to their cultural, economic and strategic importance and vulnerability, and such an assessment must be realistic and rigorous.
Abstract: Seismic structural assessment of masonry arch bridges is essential due to their cultural, economic and strategic importance and vulnerability. A realistic and rigorous structural assessment is nece...

15 citations

Journal Article · DOI
08 Oct 2020
TL;DR: In this article, a discontinuum modeling strategy based on the discrete element method was developed to investigate the tensile fracture mechanism of masonry wallettes parallel to the bed joints considering the inherent variation in the material properties.
Abstract: Nonhomogeneous material characteristics of masonry lead to complex fracture mechanisms, which require substantial analysis regarding the influence of masonry constituents. In this context, this study presents a discontinuum modeling strategy, based on the discrete element method, developed to investigate the tensile fracture mechanism of masonry wallettes parallel to the bed joints considering the inherent variation in the material properties. The applied numerical approach utilizes polyhedral blocks to represent masonry and integrates the equations of motion explicitly to compute nodal velocities for each block in the system. The mechanical interaction between the adjacent blocks is computed at the active contact points, where the contact stresses are calculated and updated based on the implemented contact constitutive models. In this research, different fracture mechanisms of masonry wallettes under tension are explored, developing at the unit–mortar interface and/or within the units. The contact properties are determined based on certain statistical variations. Emphasis is given to the influence of the material properties on the fracture mechanism and capacity of the masonry assemblages. The results of the analysis reveal and quantify the importance of the contact properties for unit and unit–mortar interfaces (e.g., tensile strength, cohesion, and friction coefficient) in terms of capacity and corresponding fracture mechanism for masonry wallettes.
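The statistical variation of the contact properties is what drives the scatter the study quantifies. A minimal sketch of that sampling step, assuming lognormal variation with invented means and CoVs (the DEM solver itself, with polyhedral blocks and explicit integration, is beyond a short example):

```python
import numpy as np

rng = np.random.default_rng(42)
n_joints = 200  # illustrative number of unit-mortar contact interfaces

def lognormal(mean, cov, size):
    """Sample a lognormal variable parameterized by mean and CoV."""
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean) - 0.5 * sigma**2
    return rng.lognormal(mu, sigma, size)

# Contact properties per joint; all means and CoVs are assumptions,
# standing in for the paper's statistically varied interface data.
joints = {
    "tensile_strength_MPa": lognormal(0.30, 0.30, n_joints),
    "cohesion_MPa":         lognormal(0.45, 0.30, n_joints),
    "friction_coeff":       lognormal(0.75, 0.15, n_joints),
}

# Each sampled set would feed the contact constitutive model at the
# corresponding contact points; here we just report the spread.
for name, vals in joints.items():
    print(f"{name}: mean={vals.mean():.3f}, std={vals.std():.3f}")
```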

12 citations

Journal Article · DOI
TL;DR: In this article, the authors present a holistic methodology for the non-destructive experimental characterization and reliability-based structural assessment of historical steel bridges, spanning from experimental data acquisition through finite element model updating to the probabilistic structural assessment that yields reliability indices for the serviceability and ultimate limit states.

5 citations

DOI
17 Nov 2021
TL;DR: In this article, the authors present a framework for quantitative risk assessment which guides an integrated assessment of the risk components: hazard, exposure, vulnerability and consequences of a malfunctioning transportation infrastructure.
Abstract: Keeping transport links open in adverse conditions and being able to restore connections quickly after extreme events are important and demanding tasks for infrastructure owners/operators. This paper is developed within the H2020 project SAFEWAY, whose main goal is to increase the resilience of terrestrial transportation infrastructure. Risk-based approaches are excellent tools to aid in the decision-making process of planning maintenance and implementation of risk mitigation measures with the ultimate goal of reducing risk and increasing resilience. This paper presents a framework for quantitative risk assessment which guides an integrated assessment of the risk components: hazard, exposure, vulnerability and consequences of a malfunctioning transportation infrastructure. The paper guides the identification of failure modes for transportation infrastructure exposed to extreme events (natural and human-made) and provides models for and examples of hazard, vulnerability and risk assessment. Each assessment step must be made in coherence with the other risk components as an integral part of the risk assessment.
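As a toy worked example of how the four risk components combine quantitatively, consider expected annual loss summed over hazard scenarios (all probabilities, fragilities and cost figures below are invented for illustration; they are not SAFEWAY values):

```python
# risk = sum over scenarios of P(hazard) * P(failure | hazard) * consequence
scenarios = [
    # (name, annual hazard probability, vulnerability = P(failure | hazard),
    #  consequence in EUR if the exposed asset fails) -- all values assumed
    ("flood_100yr", 0.010, 0.20, 5_000_000),
    ("flood_500yr", 0.002, 0.60, 5_000_000),
    ("earthquake",  0.001, 0.40, 8_000_000),
]

annual_risk = sum(p_h * vuln * cons for _, p_h, vuln, cons in scenarios)
for name, p_h, vuln, cons in scenarios:
    print(f"{name}: {p_h * vuln * cons:,.0f} EUR/yr")
print(f"expected annual loss ~ {annual_risk:,.0f} EUR/yr")
```

Exposure enters through which assets (and how many) appear in the scenario list; mitigation measures are compared by how much they reduce vulnerability or consequences.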

2 citations

References
Journal Article · DOI
TL;DR: A Bayesian calibration technique is presented that improves on the traditional approach in two respects, including an attempt to correct for any inadequacy of the model revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values.
Abstract: We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.
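A minimal sketch of the calibration-with-discrepancy idea on a toy problem: the data come from a process the simulator cannot fully capture, and the calibration parameter and a discrepancy amplitude are inferred jointly. The Gaussian-process discrepancy of the actual method is replaced here by a one-parameter stand-in, and all model forms and numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def eta(x, theta):
    return theta * x  # toy computer model

# Synthetic observations: true process = 2x plus a structural discrepancy
# 0.3*sin(3x) that eta() cannot represent, plus observation noise.
x_obs = np.linspace(0.0, 1.0, 20)
y_obs = 2.0 * x_obs + 0.3 * np.sin(3 * x_obs) + rng.normal(0, 0.05, 20)

def log_post(theta, a):
    # Parametric stand-in for the GP discrepancy: delta(x) = a*sin(3x).
    resid = y_obs - eta(x_obs, theta) - a * np.sin(3 * x_obs)
    return -0.5 * np.sum(resid**2) / 0.05**2 - 0.5 * (theta**2 + a**2) / 10.0

# Random-walk Metropolis over (theta, a).
cur = np.array([1.0, 0.0])
lp = log_post(*cur)
samples = []
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.05, 2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        cur, lp = prop, lp_prop
    samples.append(cur.copy())

samples = np.array(samples[5_000:])  # discard burn-in
print("posterior mean theta ~", samples[:, 0].mean())
print("posterior mean discrepancy amplitude ~", samples[:, 1].mean())
```

Ignoring the discrepancy term (fixing a = 0) would bias theta, which is precisely the model-inadequacy effect the paper corrects for.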

3,745 citations

Journal Article · DOI
TL;DR: Existing and new practices for sensitivity analysis of model output are compared, with recommendations offered to help practitioners choose suitable techniques.

2,265 citations

Journal Article · DOI
TL;DR: In this article, the authors present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis, which allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than standard Monte Carlo methods.
Abstract: In many areas of science and technology, mathematical models are built to simulate complex real world phenomena. Such models are typically implemented in large computer programs and are also very complex, such that the way that the model responds to changes in its inputs is not transparent. Sensitivity analysis is concerned with understanding how changes in the model inputs influence the outputs. This may be motivated simply by a wish to understand the implications of a complex model but often arises because there is uncertainty about the true values of the inputs that should be used for a particular application. A broad range of measures have been advocated in the literature to quantify and describe the sensitivity of a model's output to variation in its inputs. In practice the most commonly used measures are those that are based on formulating uncertainty in the model inputs by a joint probability distribution and then analysing the induced uncertainty in outputs, an approach which is known as probabilistic sensitivity analysis. We present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis. The Bayesian approach is computationally highly efficient. It allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than standard Monte Carlo methods. Furthermore, all measures of interest may be computed from a single set of runs.
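The measure most of these tools target is the first-order variance-based index S_i = Var(E[Y | X_i]) / Var(Y). A naive double-loop Monte Carlo estimator makes explicit what is being computed, and why it is expensive, which is the cost the paper's emulator-based approach avoids (the three-input test model is an assumption for the sketch):

```python
import numpy as np

rng = np.random.default_rng(7)
d = 3

def model(X):
    # Assumed test model with standard normal inputs.
    return X[:, 0] + 0.5 * X[:, 1]**2 + 0.1 * X[:, 2]

var_Y = np.var(model(rng.normal(size=(100_000, d))))

# Double-loop estimate of S_i = Var(E[Y | X_i]) / Var(Y): the outer loop
# fixes X_i at a sampled value, the inner loop averages over the rest.
n_outer, n_inner = 500, 500
for i in range(d):
    cond_means = []
    for _ in range(n_outer):
        X = rng.normal(size=(n_inner, d))
        X[:, i] = rng.normal()  # hold X_i fixed at a sampled value
        cond_means.append(model(X).mean())
    print(f"S_{i+1} ~ {np.var(cond_means) / var_Y:.2f}")
```

This costs d * n_outer * n_inner model runs; the paper's point is that an emulator fitted to a single, much smaller set of runs yields all such measures at once.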

1,074 citations

Journal Article · DOI
TL;DR: In this article, the sensitivity of the solutions of large sets of coupled nonlinear rate equations to uncertainties in the rate coefficients is investigated, and it is shown via an application of Weyl's ergodic theorem that a subset of the Fourier coefficients is related to 〈∂ci/∂kl〉, the rate of change of the concentration of species i with respect to the rate constant for reaction l averaged over the uncertainties of all the other rate coefficients.
Abstract: A method has been developed to investigate the sensitivity of the solutions of large sets of coupled nonlinear rate equations to uncertainties in the rate coefficients. This method is based on varying all the rate coefficients simultaneously through the introduction of a parameter in such a way that the output concentrations become periodic functions of this parameter at any given time t. The concentrations of the chemical species are then Fourier analyzed at time t. We show via an application of Weyl's ergodic theorem that a subset of the Fourier coefficients is related to 〈∂ci/∂kl〉, the rate of change of the concentration of species i with respect to the rate constant for reaction l averaged over the uncertainties of all the other rate coefficients. Thus a large Fourier coefficient corresponds to a large sensitivity, and a small Fourier coefficient corresponds to a small sensitivity. The amount of numerical integration required to calculate these Fourier coefficients is considerably less than that requi...
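A compact sketch of the Fourier idea: drive all inputs simultaneously along one search curve with distinct integer frequencies, Fourier-analyze the output, and read each input's influence from the power at its frequency and harmonics. The test model, frequencies and search curve below are assumptions chosen for illustration:

```python
import numpy as np

omegas = np.array([11, 35, 67])            # one integer frequency per input
s = np.linspace(-np.pi, np.pi, 4097)[:-1]  # one period, 4096 samples
X = np.sin(np.outer(s, omegas))            # simple search curve into [-1, 1]

def model(X):
    # Assumed additive test model.
    return 2.0 * X[:, 0] + X[:, 1]**2 + 0.1 * X[:, 2]

y = model(X)
power = np.abs(np.fft.rfft(y) / len(s))**2
total = power[1:].sum()  # exclude the DC term

for i, w in enumerate(omegas):
    # Sum power at the fundamental and first few harmonics of omega_i.
    idx = [w * k for k in range(1, 5) if w * k < len(power)]
    print(f"input {i+1}: sensitivity share ~ {power[idx].sum() / total:.2f}")
```

Large power at an input's frequency signals high sensitivity, mirroring the paper's link between Fourier coefficients and the averaged derivatives; the frequencies must be chosen so that their low-order harmonics do not overlap.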

954 citations

Book
14 Jan 2000
TL;DR: This book discusses the concepts of limit states and limit state functions, presents methodologies for calculating reliability indices and calibrating partial safety factors, and supplies information on the probability distributions and parameters used to characterize both applied loads and member resistances.
Abstract: This book enables both students and practicing engineers to appreciate how to value and handle reliability as an important dimension of structural design. The book discusses the concepts of limit states and limit state functions, and presents methodologies for calculating reliability indices and calibrating partial safety factors. It also supplies information on the probability distributions and parameters used to characterize both applied loads and member resistances. The book includes discussion of United States (US) and international codes and the issues underlying their development, along with a significant discussion of Monte Carlo simulation. The book's emphasis is on the practical applications of structural reliability theory rather than the theory itself. Consequently, probability theory is treated as a tool, and enough is given to show the novice reader how to calculate reliability. Some background in structural engineering and structural mechanics is assumed.
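For the simplest case such books cover, a linear limit state g = R - S with independent normal resistance and load, the reliability index has the closed form beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2). A quick sketch checking that formula against Monte Carlo simulation, with assumed values:

```python
import numpy as np
from scipy.stats import norm

# Assumed normal resistance R and load effect S (units arbitrary).
mu_R, sd_R = 500.0, 50.0
mu_S, sd_S = 300.0, 60.0

beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)  # Cornell reliability index
pf_exact = norm.cdf(-beta)

# Crude Monte Carlo cross-check.
rng = np.random.default_rng(3)
n = 2_000_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mc = np.mean(g <= 0.0)

print(f"beta = {beta:.2f}, pf exact = {pf_exact:.2e}, pf MC = {pf_mc:.2e}")
```

Partial safety factor calibration runs this logic in reverse: factors on nominal loads and resistances are chosen so that designs satisfying the code achieve a target beta.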

944 citations