Book

Structural Reliability Methods

Ove Ditlevsen, H. O. Madsen
01 Jun 1996
TL;DR: Partial Safety Factor Method; Probabilistic Information; Simple Reliability Index; Geometric Reliability Index; Generalized Reliability Index; Transformation; Sensitivity Analysis; Monte Carlo Methods; Load Combinations; Statistical and Model Uncertainty; Decision Philosophy; Reliability of Existing Structures; System Reliability Analysis.
Abstract: Partial Safety Factor Method; Probabilistic Information; Simple Reliability Index; Geometric Reliability Index; Generalized Reliability Index; Transformation; Sensitivity Analysis; Monte Carlo Methods; Load Combinations; Statistical and Model Uncertainty; Decision Philosophy; Reliability of Existing Structures; System Reliability Analysis; Introduction to Process Descriptions.
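
As a quick illustration of the "Simple Reliability Index" topic listed in the contents: for a linear limit state g = R - S with independent normal resistance R and load S, the reliability index is beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) and the failure probability is Pf = Phi(-beta). The sketch below uses hypothetical numbers, not values from the book.

```python
# Minimal sketch of the simple (Cornell-type) reliability index for a linear
# limit state g = R - S with independent normal R and S.  All numbers are
# illustrative assumptions.
from scipy.stats import norm

mu_R, sigma_R = 10.0, 1.5   # assumed resistance mean and standard deviation
mu_S, sigma_S = 6.0, 2.0    # assumed load mean and standard deviation

beta = (mu_R - mu_S) / (sigma_R**2 + sigma_S**2) ** 0.5   # reliability index
pf = norm.cdf(-beta)                                      # failure probability
print(f"beta = {beta:.3f}, Pf = {pf:.3e}")
```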
Citations
Journal ArticleDOI
TL;DR: In this article, generalized polynomial chaos expansions (PCE) are used to build surrogate models that allow one to compute the Sobol' indices analytically as a post-processing of the PCE coefficients.

1,934 citations
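
A hedged sketch of the post-processing step described in the TL;DR above: for a PCE in an orthonormal basis, the partial variances, and hence the Sobol' indices, are sums of squared coefficients grouped by multi-index. The multi-indices and coefficient values below are hypothetical placeholders.

```python
# Sobol' indices computed analytically from PCE coefficients (orthonormal basis
# assumed).  coeffs maps a multi-index tuple (a1, ..., ad) to its coefficient.
import numpy as np

def sobol_from_pce(coeffs):
    d = len(next(iter(coeffs)))                             # number of input variables
    var = sum(c**2 for a, c in coeffs.items() if any(a))    # total variance (mean term excluded)
    first, total = np.zeros(d), np.zeros(d)
    for a, c in coeffs.items():
        if not any(a):
            continue                                        # skip the mean term
        for i in range(d):
            if a[i] > 0:
                total[i] += c**2                            # any term involving variable i
                if all(a[j] == 0 for j in range(d) if j != i):
                    first[i] += c**2                        # terms involving variable i only
    return first / var, total / var

# toy two-variable expansion with made-up coefficients
coeffs = {(0, 0): 1.0, (1, 0): 0.8, (0, 1): 0.5, (1, 1): 0.2, (2, 0): 0.1}
S1, ST = sobol_from_pce(coeffs)
print("first-order indices:", S1, "total indices:", ST)
```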

Journal ArticleDOI
TL;DR: In this article, the sources and character of uncertainties in engineering modeling for risk and reliability analyses are discussed; uncertainties are categorized as epistemic if the modeler sees a possibility of reducing them by gathering more data or by refining models, and as aleatory otherwise.

1,835 citations


Cites background or methods from "Structural Reliability Methods"

  • ...A detailed discussion of the philosophy of this objectivity issue is given in Ditlevsen and Madsen [10]....


  • ...This is the approach adopted in Ditlevsen [7] and further developed in Ditlevsen and Madsen [10]....


Journal ArticleDOI
TL;DR: An iterative approach based on Monte Carlo Simulation and a Kriging metamodel (AK-MCS) is proposed to assess the reliability of structures more efficiently; the method is shown to yield a very accurate probability of failure with only a small number of calls to the performance function.

1,234 citations
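
A rough sketch of the AK-MCS idea summarized above, assuming a scikit-learn Gaussian process as the Kriging surrogate, the classical U learning function, and a made-up two-dimensional performance function; it is an illustration under those assumptions, not the authors' implementation.

```python
# Adaptive Kriging + Monte Carlo sketch: enrich a small design of experiments
# where the surrogate is least certain about the sign of g, then estimate Pf
# on the Monte Carlo population using the surrogate mean.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):                                    # hypothetical limit state: failure when g <= 0
    return 7.0 - x[:, 0] ** 2 - 2.0 * x[:, 1]

X_mc = rng.standard_normal((20_000, 2))      # Monte Carlo population (standard normal space)
idx = rng.choice(len(X_mc), size=12, replace=False)
X_doe, y_doe = X_mc[idx], g(X_mc[idx])       # small initial design of experiments

for _ in range(50):                          # enrichment loop
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X_doe, y_doe)
    mu, sigma = gp.predict(X_mc, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)        # learning function U
    best = np.argmin(U)
    if U[best] >= 2.0:                       # common stopping criterion on min U
        break
    X_doe = np.vstack([X_doe, X_mc[best]])
    y_doe = np.append(y_doe, g(X_mc[best:best + 1]))

pf = np.mean(mu <= 0.0)                      # failure probability from the surrogate
print(f"estimated Pf approx {pf:.4f} with {len(X_doe)} calls to g")
```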

Journal ArticleDOI
TL;DR: Verification and validation of computational simulations are the primary methods for building and quantifying confidence in modeling and simulation.
Abstract: Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, i.e., experimental data, is the issue.

735 citations
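
To make the verification half of that distinction concrete, here is a minimal sketch: it checks that a simple discretization converges to a known exact solution at the expected rate, a typical code-verification exercise. The toy problem (forward Euler on a scalar ODE) is an assumption made for the example, not content from the cited paper.

```python
# Code verification by observed order of accuracy: solve y' = -y, y(0) = 1 on
# [0, 1] with forward Euler on successively refined steps and compare against
# the exact solution exp(-1).
import numpy as np

def solve_euler(n_steps):
    h, y = 1.0 / n_steps, 1.0
    for _ in range(n_steps):
        y += h * (-y)
    return y

exact = np.exp(-1.0)
errors = {n: abs(solve_euler(n) - exact) for n in (100, 200, 400)}

# observed order of accuracy from one refinement (expected ~1 for forward Euler)
p = np.log(errors[100] / errors[200]) / np.log(2.0)
print("errors:", errors, "observed order approx", round(p, 2))
```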

Journal ArticleDOI
TL;DR: A non-intrusive method that builds a sparse polynomial chaos (PC) expansion is proposed, together with an adaptive regression-based algorithm that automatically detects the significant coefficients of the PC expansion in a suitable polynomial chaos basis.

710 citations


Cites background from "Structural Reliability Methods"

  • ...crude Monte Carlo, FORM and importance sampling [40]) to the response surface (A....

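
A hedged sketch of the regression-based sparse PC idea summarized in the TL;DR above: evaluate a model on a small experimental design, assemble a Hermite polynomial basis, and let a LARS-type sparse regression select the significant coefficients. The model, polynomial degree, and sample size are illustrative assumptions, not the authors' setup.

```python
# Sparse PC expansion by regression: tensorized probabilists' Hermite basis plus
# LARS-based sparse selection (LassoLarsCV) of the significant coefficients.
import numpy as np
from itertools import product
from numpy.polynomial.hermite_e import hermevander
from sklearn.linear_model import LassoLarsCV

rng = np.random.default_rng(1)

def model(x):                                    # hypothetical black-box model
    return 1.0 + x[:, 0] + 0.5 * x[:, 0] * x[:, 1] + 0.1 * x[:, 1] ** 3

X = rng.standard_normal((200, 2))                # experimental design (standard normal inputs)
y = model(X)

deg = 4
# total-degree multi-indices, excluding the constant (captured by the intercept)
alphas = [a for a in product(range(deg + 1), repeat=2) if 0 < sum(a) <= deg]
H = [hermevander(X[:, i], deg) for i in range(2)]                         # 1-D Hermite evaluations
Psi = np.column_stack([H[0][:, a0] * H[1][:, a1] for a0, a1 in alphas])   # tensorized basis

fit = LassoLarsCV(cv=5).fit(Psi, y)              # adaptive sparse coefficient selection
kept = [(a, round(c, 3)) for a, c in zip(alphas, fit.coef_) if abs(c) > 1e-8]
print("mean term:", round(fit.intercept_, 3))
print("retained PC terms (multi-index, coefficient):", kept)
```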

References
Book
01 Jun 1981
TL;DR: A number of new classes of life distributions arising naturally in reliability models are treated systematically and each provides a realistic probabilistic description of a physical property occurring in the reliability context, thus permitting more realistic modeling of commonly occurring reliability situations.
Abstract: This is the first of two books on the statistical theory of reliability and life testing. The present book concentrates on probabilistic aspects of reliability theory, while the forthcoming book will focus on inferential aspects of reliability and life testing, applying the probabilistic tools developed in this volume. This book emphasizes the newer, research aspects of reliability theory. The concept of a coherent system serves as a unifying theme for much of the book. A number of new classes of life distributions arising naturally in reliability models are treated systematically: the increasing failure rate average, new better than used, decreasing mean residual life, and other classes of distributions. As the names would seem to indicate, each such class of life distributions provides a realistic probabilistic description of a physical property occurring in the reliability context. Also various types of positive dependence among random variables are considered, thus permitting more realistic modeling of commonly occurring reliability situations.

3,876 citations
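
As a small numerical illustration of the life-distribution classes mentioned above, the sketch below assumes a Weibull life distribution with shape k > 1, which has an increasing failure rate and a decreasing mean residual life; the parameter values are arbitrary examples, not taken from the book.

```python
# Failure (hazard) rate h(t) = f(t)/S(t) and mean residual life
# m(t) = E[T - t | T > t] for a Weibull distribution with shape k = 2:
# h(t) increases (IFR) and m(t) decreases (DMRL) with t.
import numpy as np
from scipy.stats import weibull_min
from scipy.integrate import quad

k, scale = 2.0, 1.0                      # hypothetical Weibull shape and scale
dist = weibull_min(k, scale=scale)

t = np.array([0.5, 1.0, 2.0])
hazard = dist.pdf(t) / dist.sf(t)        # failure rate at each t

def mrl(t0):                             # mean residual life at t0
    return quad(lambda u: dist.sf(u), t0, np.inf)[0] / dist.sf(t0)

print("hazard rate:", hazard)                         # increasing in t -> IFR
print("mean residual life:", [mrl(x) for x in t])     # decreasing in t -> DMRL
```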

Book
04 Aug 1975
Abstract: Keywords: Probabilistic; Statistical method. Reference record created on 2004-09-07, modified on 2016-08-08.

2,679 citations

Book
01 Jan 1970
Abstract: Keywords: Statistics; Probability; Handbook. Reference record created on 2004-09-07, modified on 2016-08-08.

2,336 citations