04 Mar 2021 - Structure and Infrastructure Engineering (Informa UK Limited) - Vol. 17, Iss. 3, pp. 379-391

Abstract: A probabilistic analysis approach for reliability-based assessment of masonry arch bridges is presented in this work. The methodology is tested on a medieval in-service construction located in Gali...


Topics: Probabilistic analysis of algorithms (54%), Reliability (statistics) (54%), Probabilistic logic (51%)


5 results found


08 Oct 2020

Abstract: Nonhomogeneous material characteristics of masonry lead to complex fracture mechanisms, which require substantial analysis regarding the influence of the masonry constituents. In this context, this study presents a discontinuum modeling strategy, based on the discrete element method, developed to investigate the tensile fracture mechanism of masonry wallettes loaded parallel to the bed joints, considering the inherent variation in the material properties. The applied numerical approach utilizes polyhedral blocks to represent masonry and integrates the equations of motion explicitly to compute nodal velocities for each block in the system. The mechanical interaction between adjacent blocks is computed at the active contact points, where the contact stresses are calculated and updated based on the implemented contact constitutive models. In this research, the different fracture mechanisms of masonry wallettes under tension, developing at the unit–mortar interface and/or within the units, are explored. The contact properties are determined based on defined statistical variations. Emphasis is given to the influence of the material properties on the fracture mechanism and capacity of the masonry assemblages. The results of the analysis reveal and quantify the importance of the contact properties of the unit and unit–mortar interfaces (e.g., tensile strength, cohesion, and friction coefficient) in terms of capacity and the corresponding fracture mechanism for masonry wallettes.


Topics: Masonry (64%), Fracture (geology) (52%)

6 Citations
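The contact constitutive update described in the abstract above (tensile strength, cohesion, friction coefficient at each contact point) can be sketched as follows. This is a minimal illustrative version of a Mohr-Coulomb contact law with a tension cutoff, as commonly used in discrete element codes; the function and parameter names (`kn`, `ks`, `ft`, `c`, `mu`) are generic assumptions, not taken from the paper's implementation.

```python
# Illustrative DEM-style contact constitutive update: elastic trial stresses,
# a tension cutoff, and a Coulomb slip criterion. Tension is positive here.

def contact_update(sigma_n, tau, d_un, d_us, kn, ks, ft, c, mu):
    """Update normal stress sigma_n and shear stress tau at one contact point.

    d_un, d_us : relative normal/shear displacement increments (opening > 0)
    kn, ks     : normal/shear contact stiffnesses
    ft         : tensile strength (tension cutoff)
    c, mu      : cohesion and friction coefficient
    """
    # Elastic trial state
    sigma_n += kn * d_un          # opening (d_un > 0) moves stress toward tension
    tau += ks * d_us

    # Tension cutoff: the bond breaks once the tensile strength is exceeded
    if sigma_n > ft:
        return 0.0, 0.0, "tensile failure"

    # Coulomb criterion: |tau| <= c + mu * (normal compression)
    tau_max = c + mu * max(-sigma_n, 0.0)
    if abs(tau) > tau_max:
        tau = tau_max if tau > 0 else -tau_max
        return sigma_n, tau, "sliding"
    return sigma_n, tau, "elastic"
```

Sampling `ft`, `c`, and `mu` from statistical distributions at each contact, as the study does, then yields the spread of fracture mechanisms and capacities it reports.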


Abstract: In this study, the lateral capacity and load-deflection behavior of unreinforced stone masonry walls are analyzed based on the discrete element method, considering the uncertainties in the material properties. Through this research, discontinuum-based computational models are used to simulate the composite and low-bond-strength characteristics of masonry. The unreinforced masonry walls are replicated via a system of deformable rectangular blocks interacting along their boundaries. Local failure modes of masonry (cracking, sliding, and crushing) are taken into consideration at the joints, where the contact stresses are calculated and updated based on the relative displacements between adjacent blocks and the adopted contact constitutive models. First, the numerical approach is experimentally validated using previous test results. The same models are then used to determine the influence of uncertainties in the material properties on the macro behavior of the URM walls. The importance of considering the inherent variability in the material and modeling parameters is highlighted. Results clearly show the informative outcomes of the stochastic analysis and provide a deeper understanding of the structural behavior of URM walls in terms of the governing failure mechanisms, displacements, and load-carrying capacities. These results also underline the importance of the variability in the force and displacement capacities, which should be considered when defining performance limits for stone masonry in the future, as they are currently unavailable in national standards for Turkey and the US.


Topics: Unreinforced masonry building (65%), Masonry (63%)

3 Citations


Abstract: Seismic structural assessment of masonry arch bridges is essential due to their cultural, economic and strategic importance and vulnerability. A realistic and rigorous structural assessment is nece...


Topics: First-order reliability method (69%), Reliability (statistics) (59%)

3 Citations


15 Apr 2021

Abstract: This paper presents the design and development of a structural health monitoring (SHM) system specifically tailored for transportation infrastructure components such as bridges. It focuses mainly on the application of statistical machine learning (ML) algorithms to classify deformation datasets of a bridge. A model of a steel bridge was constructed and contactless sensors were placed to collect deformation data. Four loads were applied at each of four pre-defined locations identified to represent heavy loads across the real bridge. Computer simulation in ANSYS and application of gradient boosting neural networks were performed to produce a comparative and predictive analysis of the behavior of transportation infrastructures, which can be used to understand the health of the structure and make informed decisions. Deformation levels at 100 critical locations on the bridge model were collected in each experiment using the sensors. The experiments were repeated to obtain average data for processing. The Python programming language was used for coding, and the analysis was performed in a Google Colaboratory notebook. Development and training of the models were done using PyCaret, a Python-based framework that supports a variety of ML tools. The performance of each ML technique was evaluated by means of its accuracy. The final system is capable of simulating multiple load conditions on structures, identifying possible failure points, and detecting and predicting failure scenarios. Both hardware and software implementations of a bridge model were performed as a pilot project to validate the proposed system.


Topics: Structural health monitoring (54%), Python (programming language) (53%), Bridge (nautical) (52%)


17 Nov 2021

Abstract: Keeping transport links open in adverse conditions and being able to restore connections quickly after extreme events are important and demanding tasks for infrastructure owners/operators. This paper is developed within the H2020 project SAFEWAY, whose main goal is to increase the resilience of terrestrial transportation infrastructure. Risk-based approaches are excellent tools to aid in the decision-making process of planning maintenance and implementation of risk mitigation measures with the ultimate goal of reducing risk and increasing resilience. This paper presents a framework for quantitative risk assessment which guides an integrated assessment of the risk components: hazard, exposure, vulnerability and consequences of a malfunctioning transportation infrastructure. The paper guides the identification of failure modes for transportation infrastructure exposed to extreme events (natural and human-made) and provides models for and examples of hazard, vulnerability and risk assessment. Each assessment step must be made in coherence with the other risk components as an integral part of the risk assessment.


Topics: Risk management (63%), Risk assessment (61%), Hazard (56%)
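The risk-component chain in the framework above (hazard, exposure, vulnerability, consequences) reduces, in its simplest quantitative form, to an expected-loss aggregation over hazard scenarios. The sketch below illustrates that arithmetic only; the scenario names, probabilities, and consequence values are invented for illustration and are not from the SAFEWAY project.

```python
# Toy quantitative risk aggregation in the spirit of the framework:
# risk = sum over hazard scenarios of
#        annual hazard probability x vulnerability (P(failure | hazard)) x consequence.

scenarios = [
    # (name, annual probability of hazard, P(failure | hazard), consequence in EUR)
    ("flood, 100-year",  0.010, 0.30, 5_000_000),
    ("scour, 50-year",   0.020, 0.10, 2_000_000),
    ("vehicle impact",   0.005, 0.05, 1_000_000),
]

def expected_annual_loss(scenarios):
    """Expected annual loss: a basic risk metric for ranking mitigation measures."""
    return sum(p_h * vuln * cons for _, p_h, vuln, cons in scenarios)

eal = expected_annual_loss(scenarios)
print(f"Expected annual loss: {eal:,.0f} EUR")
```

Comparing this figure before and after a proposed mitigation measure (which lowers a scenario's hazard probability or vulnerability) is the decision-making use the paper describes.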


56 results found


Abstract: We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.


Topics: Calibration (statistics) (57%), Uncertainty analysis (57%), Gaussian process emulator (55%)

3,190 Citations
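The first improvement claimed above, that calibrated predictions should carry the remaining uncertainty over the fitted parameters, can be shown with a much simpler device than the paper's Gaussian-process formulation: a grid posterior over one calibration parameter. The linear "computer code", the data, and all numbers below are an illustrative assumption, not the paper's example.

```python
# Minimal grid-based sketch of Bayesian calibration: infer a model parameter
# theta from noisy observations, then propagate the remaining parameter
# uncertainty into a prediction at a new input.
import math
import random

random.seed(0)
true_theta, noise_sd = 1.5, 0.2
xs = [0.5, 1.0, 1.5, 2.0]
ys = [true_theta * x + random.gauss(0.0, noise_sd) for x in xs]  # synthetic data

def model(x, theta):            # the "computer code": here just a line
    return theta * x

# Posterior over theta on a grid (flat prior, Gaussian likelihood)
grid = [0.5 + 0.002 * i for i in range(1001)]        # theta in [0.5, 2.5]
logpost = [
    sum(-0.5 * ((y - model(x, th)) / noise_sd) ** 2 for x, y in zip(xs, ys))
    for th in grid
]
m = max(logpost)
w = [math.exp(lp - m) for lp in logpost]
z = sum(w)
post = [wi / z for wi in w]

theta_mean = sum(p * th for p, th in zip(post, grid))

# Calibrated prediction at a new input is a distribution, not a single number,
# because the fitted parameter is still uncertain.
x_new = 3.0
pred_mean = sum(p * model(x_new, th) for p, th in zip(post, grid))
pred_var = sum(p * (model(x_new, th) - pred_mean) ** 2 for p, th in zip(post, grid))
```

The paper's second improvement, a discrepancy term correcting for model inadequacy, would add a second unknown function to this posterior; the grid sketch omits it for brevity.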


Andrea Saltelli, Paola Annoni, Ivano Azzini, Francesca Campolongo, et al.

Abstract: Variance-based methods have established themselves as versatile and effective among the various available techniques for sensitivity analysis of model output. Practitioners can in principle describe the sensitivity pattern of a model Y = f(X1, X2, …, Xk) with k uncertain input factors via a full decomposition of the variance V of Y into terms depending on the factors and their interactions. More often, practitioners are satisfied with computing just the k first-order effects and k total effects, the latter describing synthetically the interactions among input factors. In sensitivity analysis a key concern is the computational cost of the analysis, defined in terms of the number of evaluations of f(X1, X2, …, Xk) needed to complete the analysis, as f is often a numerical model which may take a long processing time. While the computational cost is relatively cheap and weakly dependent on k for estimating first-order effects, it remains expensive and strictly k-dependent for the total-effect indices. In the present note we compare existing and new practices for this index and offer recommendations on which to use.


Topics: Variance-based sensitivity analysis (63%), Morris method (56%), Elementary effects method (55%)

1,763 Citations
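The first-order and total-effect indices discussed above are usually estimated from two independent sample matrices A and B plus k "hybrid" matrices. The sketch below uses the Saltelli-et-al.-2010-style estimator for the first-order effect and the Jansen estimator for the total effect; the additive test model and sample size are illustrative choices (for it, analytically S1 = T1 = 0.8 and S2 = T2 = 0.2).

```python
# Monte Carlo estimation of first-order (S_i) and total-effect (T_i) indices
# from sample matrices A and B and the hybrid matrices AB_i (A with column i
# taken from B). Plain random sampling; production codes use quasi-random points.
import random

random.seed(1)
k, N = 2, 20000

def f(x):                       # additive test model: no interactions
    return 2.0 * x[0] + 1.0 * x[1]

A = [[random.random() for _ in range(k)] for _ in range(N)]
B = [[random.random() for _ in range(k)] for _ in range(N)]
fA = [f(x) for x in A]
fB = [f(x) for x in B]
mean = sum(fA) / N
V = sum((y - mean) ** 2 for y in fA) / (N - 1)   # total variance of Y

S, T = [], []
for i in range(k):
    AB = [A[j][:i] + [B[j][i]] + A[j][i + 1:] for j in range(N)]
    fAB = [f(x) for x in AB]
    # First-order effect (Saltelli et al. 2010 estimator)
    S.append(sum(fB[j] * (fAB[j] - fA[j]) for j in range(N)) / N / V)
    # Total effect (Jansen estimator)
    T.append(sum((fA[j] - fAB[j]) ** 2 for j in range(N)) / (2 * N) / V)
```

Note the k-dependence of the cost that the abstract highlights: the loop requires N extra model evaluations per factor, i.e. N(k + 2) in total.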


Abstract: Summary. In many areas of science and technology, mathematical models are built to simulate complex real world phenomena. Such models are typically implemented in large computer programs and are also very complex, such that the way that the model responds to changes in its inputs is not transparent. Sensitivity analysis is concerned with understanding how changes in the model inputs influence the outputs. This may be motivated simply by a wish to understand the implications of a complex model but often arises because there is uncertainty about the true values of the inputs that should be used for a particular application. A broad range of measures have been advocated in the literature to quantify and describe the sensitivity of a model's output to variation in its inputs. In practice the most commonly used measures are those that are based on formulating uncertainty in the model inputs by a joint probability distribution and then analysing the induced uncertainty in outputs, an approach which is known as probabilistic sensitivity analysis. We present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis. The Bayesian approach is computationally highly efficient. It allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than standard Monte Carlo methods. Furthermore, all measures of interest may be computed from a single set of runs.


Topics: Elementary effects method (70%), Gaussian process emulator (62%), Sensitivity analysis (60%)

981 Citations


14 Jan 2000

Abstract: This book enables both students and practicing engineers to appreciate how to value and handle reliability as an important dimension of structural design. The book discusses the concepts of limit states and limit state functions, and presents methodologies for calculating reliability indices and calibrating partial safety factors. It also supplies information on the probability distributions and parameters used to characterize both applied loads and member resistances. The book contains extensive discussions of United States (US) and international codes and the issues underlying their development, as well as a significant discussion of Monte Carlo simulation. The book's emphasis is on the practical applications of structural reliability theory rather than the theory itself. Consequently, probability theory is treated as a tool, and enough background is given to show the novice reader how to calculate reliability. Some background in structural engineering and structural mechanics is assumed.


Topics: Reliability (statistics) (58%), Limit state design (54%)

884 Citations
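The reliability-index and Monte Carlo machinery the book covers can be shown on the textbook limit state g = R - S (resistance minus load effect): simulate the failure probability and compare it with the closed-form reliability index, which is exact when R and S are independent normals. The distribution parameters below are illustrative numbers, not from the book.

```python
# Monte Carlo estimate of the failure probability P(R - S < 0) versus the
# closed-form result via the reliability index beta for independent normals.
import random
from statistics import NormalDist

random.seed(2)
mu_R, sd_R = 10.0, 1.0      # member resistance
mu_S, sd_S = 6.0, 1.5       # load effect

N = 200_000
failures = sum(
    1 for _ in range(N)
    if random.gauss(mu_R, sd_R) - random.gauss(mu_S, sd_S) < 0.0
)
pf_mc = failures / N

# Reliability index and exact failure probability for independent normal R, S:
# beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2),  pf = Phi(-beta)
beta = (mu_R - mu_S) / (sd_R ** 2 + sd_S ** 2) ** 0.5
pf_exact = NormalDist().cdf(-beta)
```

Partial safety factor calibration, as discussed in the book, works this logic in reverse: design-code factors are tuned so that members designed with them achieve a target beta.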


Study of the sensitivity of coupled reaction systems to uncertainties in rate coefficients. I Theory

Robert I. Cukier, C. M. Fortuin, Kurt E. Shuler, A. G. Petschek, et al.

Abstract: A method has been developed to investigate the sensitivity of the solutions of large sets of coupled nonlinear rate equations to uncertainties in the rate coefficients. This method is based on varying all the rate coefficients simultaneously through the introduction of a parameter in such a way that the output concentrations become periodic functions of this parameter at any given time t. The concentrations of the chemical species are then Fourier analyzed at time t. We show via an application of Weyl's ergodic theorem that a subset of the Fourier coefficients is related to ⟨∂c_i/∂k_l⟩, the rate of change of the concentration of species i with respect to the rate constant for reaction l, averaged over the uncertainties of all the other rate coefficients. Thus a large Fourier coefficient corresponds to a large sensitivity, and a small Fourier coefficient corresponds to a small sensitivity. The amount of numerical integration required to calculate these Fourier coefficients is considerably less than that requi...


Topics: Fourier analysis (60%), Fourier series (59%), Fourier amplitude sensitivity testing (59%)

877 Citations
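The Fourier idea above (the FAST method) can be sketched without any rate equations: drive all inputs along one search curve using distinct frequencies, Fourier-analyze the output, and read each input's sensitivity from the spectral power at its frequency and harmonics. The frequencies, the search curve, and the test model below are illustrative choices, not the paper's chemical-kinetics application.

```python
# Toy FAST sketch: sensitivity shares from the output's Fourier amplitudes.
import math

omegas = [11, 21]                 # one (interference-free) frequency per input
N = 2049                          # sample points along the search curve

def f(x):                         # test model: x1 four times as influential as x2
    return 2.0 * x[0] + 1.0 * x[1]

ss = [-math.pi + 2.0 * math.pi * (j + 0.5) / N for j in range(N)]
ys = []
for s in ss:
    # classic FAST search curve, mapping s to each input in (0, 1)
    x = [0.5 + math.asin(math.sin(w * s)) / math.pi for w in omegas]
    ys.append(f(x))

def spectral_power(freq, harmonics=4):
    """Output variance carried by freq and its first few harmonics."""
    D = 0.0
    for p in range(1, harmonics + 1):
        A = sum(y * math.cos(p * freq * s) for y, s in zip(ys, ss)) / N
        B = sum(y * math.sin(p * freq * s) for y, s in zip(ys, ss)) / N
        D += 2.0 * (A * A + B * B)
    return D

D = [spectral_power(w) for w in omegas]
S = [d / sum(D) for d in D]       # normalized sensitivity shares per input
```

As the abstract notes, a large amplitude at an input's frequency signals a large sensitivity; here the whole analysis costs a single sweep of N model evaluations regardless of how the coefficients co-vary.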