scispace - formally typeset
Author

François Deheeger

Bio: François Deheeger is an academic researcher. The author has contributed to research in topics: Importance sampling & Kriging. The author has an h-index of 3, co-authored 3 publications receiving 376 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the authors propose to use a kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The approach can be applied to analytical and finite element reliability problems and proves efficient up to 100 basic random variables.

389 citations

Posted Content
TL;DR: A kriging surrogate of the performance function is proposed as a means to build a quasi-optimal importance sampling density; the approach proves efficient up to 100 random variables.
Abstract: Structural reliability methods aim at computing the probability of failure of systems with respect to some prescribed performance functions. In modern engineering such functions usually resort to running an expensive-to-evaluate computational model (e.g. a finite element model). In this respect, simulation methods, which may require $10^{3-6}$ runs, cannot be used directly. Surrogate models such as quadratic response surfaces, polynomial chaos expansions or kriging (which are built from a limited number of runs of the original model) are then introduced as a substitute for the original model to cope with the computational cost. In practice, though, it is almost impossible to quantify the error made by this substitution. In this paper we propose to use a kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The probability of failure is eventually obtained as the product of an augmented probability, computed by substituting the meta-model for the original performance function, and a correction term which ensures that there is no bias in the estimation even if the meta-model is not fully accurate. The approach is applied to analytical and finite element reliability problems and proves efficient up to 100 random variables.
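The two-stage structure of the estimator described in the abstract (an augmented probability under the surrogate, times a bias-removing correction term) can be sketched numerically. The toy below is not the authors' implementation: a hypothetical surrogate mean `mu` (deliberately biased) and constant predictive standard deviation `sigma` stand in for a fitted kriging model, on a 1-D problem whose exact failure probability is known.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
Phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))

# Toy 1-D problem: failure when g(x) <= 0, with X ~ N(0,1).
g = lambda x: 3.0 - x                       # exact P_f = 1 - Phi(3) ~ 1.35e-3

# Hypothetical surrogate standing in for a fitted kriging model:
# a deliberately biased mean and a constant predictive standard deviation.
mu = lambda x: g(x) + 0.2
sigma = 0.3

# Probabilistic classification: pi(x) = Pr[ surrogate value at x <= 0 ].
pi = lambda x: Phi(-mu(x) / sigma)

# Stage 1: augmented probability P_eps = E_f[ pi(X) ] by crude Monte Carlo.
n = 400_000
x = rng.standard_normal(n)
px = pi(x)
p_eps = px.mean()

# Stage 2: draw from the quasi-optimal IS density h(x) ~ pi(x) f(x)
# by rejection sampling (accept x ~ f with probability pi(x) <= 1).
xs = x[rng.random(n) < px]

# Correction term alpha = E_h[ 1{g(X)<=0} / pi(X) ] removes the surrogate
# bias, so p_eps * alpha estimates P_f even though mu is inaccurate.
alpha = ((g(xs) <= 0.0) / pi(xs)).mean()
p_f = p_eps * alpha

exact = 1.0 - float(Phi(3.0))
print(f"estimate {p_f:.2e}  exact {exact:.2e}")
```

Despite the 0.2 bias injected into the surrogate mean, the product estimator lands near the exact value, which is the point of the correction term.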

74 citations

Posted Content
TL;DR: The proposed alternative estimator takes advantage of the kriging meta-modeling and importance sampling techniques and is applied to a finite element based structural reliability analysis.
Abstract: In the field of structural reliability, the Monte-Carlo estimator is considered the reference probability estimator. However, it is still intractable for real engineering cases since it requires a high number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One such approach consists in replacing the original experiment with a surrogate which is much faster to evaluate. Nevertheless, it is often difficult (or even impossible) to quantify the error made by this substitution. In this paper an alternative approach is developed. It takes advantage of the kriging meta-modeling and importance sampling techniques. The proposed alternative estimator is finally applied to a finite element based structural reliability analysis.

25 citations

Journal ArticleDOI
09 Sep 2022
TL;DR: This work derives a novel importance weighting algorithm which scales to large datasets by using a neural network to predict the instance weights, and appears to be the only one able to give relevant reweighting in a reasonable time for large datasets with up to two million data points.
Abstract: Bias in datasets can be very detrimental for appropriate statistical estimation. In response to this problem, importance weighting methods have been developed to match any biased distribution to its corresponding target unbiased distribution. The seminal Kernel Mean Matching (KMM) method is nowadays still considered the state of the art in this research field. However, one of the main drawbacks of this method is the computational burden for large datasets. Building on previous works by Huang et al. (2007) and de Mathelin et al. (2021), we derive a novel importance weighting algorithm which scales to large datasets by using a neural network to predict the instance weights. We show, on multiple public datasets, under various sample biases, that our proposed approach drastically reduces the computational time on large datasets while maintaining similar sample bias correction performance compared to other importance weighting methods. The proposed approach appears to be the only one able to give relevant reweighting in a reasonable time for large datasets with up to two million data points.
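The paper's algorithm trains a neural network against a KMM-type objective; as a minimal illustration of the underlying idea of learned importance weights (not the authors' method), the sketch below fits a log-linear weight model by gradient descent so that weighted source moments match target moments. The 1-D Gaussian setup and all names are illustrative assumptions; here the true density ratio exp(x - 1/2) lies in the model family, so the learned slope should approach 1.

```python
import numpy as np

rng = np.random.default_rng(1)
xs = rng.normal(0.0, 1.0, 5000)   # biased "source" sample
xt = rng.normal(1.0, 1.0, 5000)   # "target" sample; true ratio exp(x - 1/2)

feats = lambda x: np.stack([x, x ** 2], axis=1)   # moments to match
m_t = feats(xt).mean(axis=0)                      # target moments

F = feats(xs)                                     # (n, 2) source features
U = np.stack([np.ones_like(xs), xs], axis=1)      # d log w / d theta
theta = np.zeros(2)                               # log-linear weight model

for _ in range(2000):
    p = np.exp(U @ theta)
    p /= p.sum()                   # self-normalized instance weights
    m = p @ F                      # weighted source moments
    r = m - m_t                    # moment mismatch
    # Gradient of || m - m_t ||^2 w.r.t. theta for self-normalized weights.
    grad = 2.0 * r @ (F.T @ (p[:, None] * U) - np.outer(m, p @ U))
    theta -= 0.02 * grad

print(theta, p @ xs, xt.mean())
```

Replacing the log-linear model with a small neural network (and the moment loss with an MMD-style objective) gives the flavor of the scalable approach the abstract describes, at the cost of needing automatic differentiation.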

Cited by
Journal ArticleDOI
TL;DR: Results show that LIF and the new method proposed in this research are very efficient when dealing with nonlinear performance functions, small probabilities, complicated limit states, and high-dimensional engineering problems.

268 citations

Journal ArticleDOI
TL;DR: PC-Kriging is derived as a new non-intrusive meta-modeling approach combining PCE and Kriging: PCE approximates the global behavior of the computational model, whereas Kriging manages the local variability of the model output.
Abstract: Computer simulation has become the standard tool in many engineering fields for designing and optimizing systems, as well as for assessing their reliability. Optimization and uncertainty quantification problems typically require a large number of runs of the computational model at hand, which may not be feasible with high-fidelity models directly. Thus surrogate models (a.k.a. metamodels) have been increasingly investigated in the last decade. Polynomial Chaos Expansions (PCE) and Kriging are two popular non-intrusive metamodelling techniques. PCE surrogates the computational model with a series of orthonormal polynomials in the input variables, where the polynomials are chosen in coherence with the probability distributions of those input variables. A least-squares minimization technique may be used to determine the coefficients of the PCE. Kriging, on the other hand, assumes that the computer model behaves as a realization of a Gaussian random process whose parameters are estimated from the available computer runs, i.e. input vectors and response values. These two techniques have so far been developed more or less in parallel, with little interaction between the researchers in the two fields. In this paper, PC-Kriging is derived as a new non-intrusive meta-modeling approach combining PCE and Kriging. A sparse set of orthonormal polynomials (PCE) approximates the global behavior of the computational model, whereas Kriging manages the local variability of the model output. An adaptive algorithm similar to the least angle regression algorithm determines the optimal sparse set of polynomials. PC-Kriging is validated on various benchmark analytical functions which are easy to sample for reference results. From the numerical investigations it is concluded that PC-Kriging performs better than, or at least as well as, the two distinct meta-modeling techniques. A larger gain in accuracy is obtained when the experimental design has a limited size, which is an asset when dealing with demanding computational models.

220 citations

Journal ArticleDOI
TL;DR: The modification overcomes an important limitation of the original AK-IS: it gives the algorithm the flexibility to deal with multiple failure regions characterized by complex, non-linear limit states.

205 citations

Journal ArticleDOI
TL;DR: A new structural reliability method based on the recently developed polynomial-chaos kriging (PC-kriging) approach coupled with an active learning algorithm known as adaptive kriging Monte Carlo simulation (AK-MCS) is developed.
Abstract: Structural reliability analysis aims at computing the probability of failure of systems whose performance may be assessed by using complex computational models (e.g., expensive-to-run finite-element models). A direct use of Monte Carlo simulation is not feasible in practice, unless a surrogate model (such as kriging, also known as Gaussian process modeling) is used. Such metamodels are often used in conjunction with adaptive experimental designs (i.e., design enrichment strategies), which allow one to iteratively increase the accuracy of the surrogate for the estimation of the failure probability while keeping the overall number of runs of the costly original model low. This paper develops a new structural reliability method based on the recently developed polynomial-chaos kriging (PC-kriging) approach coupled with an active learning algorithm known as adaptive kriging Monte Carlo simulation (AK-MCS). The problem is formulated in such a way that the computation of both small probabilities of fail...
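The AK-MCS ingredient mentioned above can be sketched in 1-D: enrich the design at the Monte Carlo point minimizing the learning function U(x) = |mu(x)|/sigma(x), and stop once min U >= 2 (the standard threshold). The sketch below uses a toy linear performance function and a hand-rolled Gaussian-process model with fixed length-scale; it is an illustration of the active-learning loop under those assumptions, not the paper's PC-kriging surrogate.

```python
import numpy as np

rng = np.random.default_rng(3)
g = lambda x: 2.0 - x                    # failure when g(x) <= 0, X ~ N(0,1)
xmc = rng.standard_normal(10_000)        # fixed Monte Carlo population

ell = 1.0                                # fixed kernel length-scale (assumed)
def fit(Xd, yd):
    s2 = max(float(np.var(yd)), 1e-6)    # crude process-variance estimate
    K = s2 * np.exp(-0.5 * ((Xd[:, None] - Xd[None, :]) / ell) ** 2)
    K += 1e-6 * s2 * np.eye(len(Xd))     # nugget for numerical stability
    Kinv = np.linalg.inv(K)
    def predict(x):
        kx = s2 * np.exp(-0.5 * ((x[:, None] - Xd[None, :]) / ell) ** 2)
        mu = kx @ (Kinv @ yd)
        var = s2 - np.einsum('ij,jk,ik->i', kx, Kinv, kx)
        return mu, np.sqrt(np.clip(var, 1e-12, None))
    return predict

Xd = np.array([-2.0, -0.5, 1.0, 2.5])    # small initial design
yd = g(Xd)
for _ in range(15):
    mu, sd = fit(Xd, yd)(xmc)
    Ufun = np.abs(mu) / sd               # AK-MCS learning function U(x)
    if Ufun.min() >= 2.0:                # sign of g is confidently predicted
        break
    xnew = xmc[Ufun.argmin()]            # enrich where misclassification risk is highest
    Xd = np.append(Xd, xnew)
    yd = np.append(yd, g(xnew))

# Failure probability from the surrogate sign over the MC population
# (exact value for this toy problem: 1 - Phi(2) ~ 2.28e-2).
p_f = float((mu <= 0.0).mean())
print(p_f, len(Xd))
```

Only the handful of enrichment points, not the full Monte Carlo population, ever require a run of the (here trivially cheap) performance function, which is the economy the abstract describes.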

198 citations