Journal Article · DOI

L-moments-based uncertainty quantification for scarce samples including extremes

30 Jun 2021 · Structural and Multidisciplinary Optimization (Springer Berlin Heidelberg) · Vol. 64, Iss. 2, pp. 505-539
TL;DR: An L-moment ratio diagram built from higher-order L-moments is adopted to choose an appropriate distribution for uncertainty quantification; the resulting probabilistic estimates are found to be less sensitive to extremes in the data than those obtained from the conventional-moments approach.
Abstract: Sampling-based uncertainty quantification demands large amounts of data. When the available sample is scarce, it is therefore customary to assume a distribution and estimate its moments from the scarce data to characterize the uncertainties. However, an inaccurate distributional assumption leads to flawed decisions. In addition, extremes present in scarce data are prone to being classified as outliers and neglected, which leads to erroneous moment estimates. It is therefore desirable to develop a method that (i) is distribution independent, or allows distribution identification from scarce samples, and (ii) accounts for the extremes in the data while keeping the moment estimates insensitive, or at least less sensitive, to them. We propose using L-moments to develop a distribution-independent, robust moment estimation approach that characterizes the uncertainty and propagates it through the system model. An L-moment ratio diagram, which uses higher-order L-moments, is adopted to choose an appropriate distribution for uncertainty quantification. This allows better characterization of the output distribution, and the probabilistic estimates obtained using L-moments are found to be less sensitive to extremes in the data than results from the conventional-moments approach. The efficacy of the proposed approach is demonstrated on conventional distributions covering all types of tails and on several engineering examples, including a sheet metal manufacturing process, a seven-variable speed reducer, and probabilistic fatigue life estimation.
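The sample L-moments the abstract relies on can be computed directly from order statistics. The sketch below is a minimal, illustrative implementation (the function name and structure are my own, not the paper's code), using the standard unbiased probability-weighted-moment (PWM) estimators; the ratios tau3 (L-skewness) and tau4 (L-kurtosis) are the quantities plotted on an L-moment ratio diagram.

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via unbiased probability weighted
    moments (PWMs); requires at least 4 observations."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)                       # ranks 1..n
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1 = b0                                       # L-location (the mean)
    l2 = 2 * b1 - b0                              # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2               # (l1, l2, tau3, tau4)

# For a symmetric, evenly spaced sample both L-moment ratios vanish:
l1, l2, tau3, tau4 = sample_lmoments([1, 2, 3, 4, 5])   # tau3 == tau4 == 0
```

Because each L-moment is a linear combination of order statistics, a single extreme observation shifts tau3 and tau4 far less than it shifts conventional skewness and kurtosis, which is the robustness property the abstract exploits.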
References
Book · DOI
01 Jan 1986
TL;DR: A survey of density estimation methods, covering kernel methods for univariate and multivariate data and illustrating density estimation in practice.
Abstract: Introduction. Survey of Existing Methods. The Kernel Method for Univariate Data. The Kernel Method for Multivariate Data. Three Important Methods. Density Estimation in Action.

15,499 citations

Journal Article · DOI
Jonathan R. M. Hosking1
TL;DR: Defines L-moments as expectations of certain linear combinations of order statistics, which exist for any random variable whose mean exists and which form the basis of a general theory covering the summarization and description of theoretical probability distributions.
Abstract: L-moments are expectations of certain linear combinations of order statistics. They can be defined for any random variable whose mean exists and form the basis of a general theory which covers the summarization and description of theoretical probability distributions, the summarization and description of observed data samples, estimation of parameters and quantiles of probability distributions, and hypothesis tests for probability distributions. The theory involves such established procedures as the use of order statistics and Gini's mean difference statistic, and gives rise to some promising innovations such as the measures of skewness and kurtosis and new methods of parameter estimation.

2,668 citations

Monograph · DOI
TL;DR: Presents a regional frequency analysis methodology based on L-moments: screening the data, identifying homogeneous regions, and choosing and estimating a frequency distribution for each region.
Abstract: Preface 1. Regional frequency analysis 2. L-moments 3. Screening the data 4. Identification of homogeneous regions 5. Choice of a frequency distribution 6. Estimation of the frequency distribution 7. Performance of the regional L-moment algorithm 8. Other topics 9. Examples Appendix References Index of notation.

2,329 citations

Journal Article · DOI
TL;DR: Probability weighted moments are introduced and shown to be useful for expressing the parameters of distributions whose inverse forms are explicitly defined, such as Tukey's lambda, which can be difficult to fit by more conventional means.
Abstract: Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.

1,147 citations
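As an illustration of the probability-weighted-moments idea in this reference, the sketch below fits a Gumbel distribution from the first two PWMs, using the standard relations lambda2 = alpha·ln 2 and lambda1 = xi + gamma·alpha (gamma is Euler's constant). This is my own minimal example, not code from the paper; the data and all names are hypothetical.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def gumbel_pwm_fit(x):
    """Fit Gumbel(location xi, scale alpha) from the first two unbiased
    PWMs b0, b1, via lambda2 = alpha*ln(2) and lambda1 = xi + gamma*alpha."""
    xs = sorted(x)
    n = len(xs)
    b0 = sum(xs) / n
    # enumerate yields i = 0..n-1, i.e. (rank - 1), matching the unbiased b1
    b1 = sum(i * v for i, v in enumerate(xs)) / (n * (n - 1))
    lam2 = 2 * b1 - b0
    alpha = lam2 / math.log(2)
    xi = b0 - EULER_GAMMA * alpha
    return xi, alpha

# Hypothetical usage: draw from Gumbel(xi=10, alpha=2) via the inverse CDF
random.seed(0)
data = [10 - 2 * math.log(-math.log(random.random())) for _ in range(5000)]
xi_hat, alpha_hat = gumbel_pwm_fit(data)
```

Because the Gumbel quantile function is explicit, the inverse-CDF draw and the PWM fit together show exactly the situation the paper targets: distributions easy to sample from but awkward to fit by conventional moment matching.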

01 Jan 1982
TL;DR: Data-based methods for choosing the histogram bin width and the smoothing parameter of kernel density estimators are studied, and two closely related risk function estimators are given.
Abstract: Methods of choosing histogram bin width and the smoothing parameter of kernel density estimators by use of data are studied. They are based on estimators of risk functions corresponding to mean integrated squared error and the Kullback-Leibler information measure. Two closely related risk function estimators are given, one of which is derived from cross-validation. In examples with simulated and real data, the methods are applied to estimation of probability densities and the rate function of a time-dependent Poisson process.

801 citations
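The cross-validation idea described in this reference can be sketched for a Gaussian-kernel density estimate. The following is my own minimal illustration, not the paper's method verbatim: it evaluates a least-squares cross-validation score, using the fact that the Gaussian kernel convolved with itself is a normal density with variance 2, and picks the bandwidth minimizing the score over a grid. The data and all names are hypothetical.

```python
import numpy as np

def lscv_score(x, h):
    """Least-squares cross-validation estimate of the risk (integrated
    squared error up to a data-independent constant) of a Gaussian-kernel
    density estimate with bandwidth h."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = (x[:, None] - x[None, :]) / h
    # integral of fhat^2: Gaussian kernel convolved with itself is N(0, 2)
    int_f2 = np.exp(-d**2 / 4.0).sum() / (n**2 * h * np.sqrt(4.0 * np.pi))
    # leave-one-out term: drop the diagonal (j == i) kernel evaluations
    k = np.exp(-d**2 / 2.0) / np.sqrt(2.0 * np.pi)
    loo_sum = (k.sum() - k.trace()) / ((n - 1) * h)
    return int_f2 - 2.0 * loo_sum / n

rng = np.random.default_rng(1)
data = rng.normal(size=200)                     # hypothetical sample
hs = np.linspace(0.1, 1.5, 30)
h_best = float(hs[np.argmin([lscv_score(data, h) for h in hs])])
```

Minimizing this score over h approximates minimizing the mean integrated squared error without knowing the true density, which is the point of the data-based bandwidth selection studied in the reference.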