
Showing papers on "Probability density function" published in 2023


Journal ArticleDOI
TL;DR: In this paper, a consistent seismic hazard and fragility framework considering combined capacity-demand uncertainties is proposed in light of the probability density evolution method (PDEM), and a combined performance index (CPI), evaluated through static pushover and dynamic time-history analyses, is defined as the physical variable of concern in the PDEM.

22 citations


Journal ArticleDOI
TL;DR: In this paper, a stochastic seismic sequence model was established and its generation method derived based on the source–path–site mechanism; the representative point sets of seismic parameters were chosen based on generalized F-discrepancy, and the correlation between the mainshock and aftershock parameters was determined using copula theory.
Abstract: A novel approach for nonlinear stochastic dynamic analysis is proposed and illustrated with nonlinear building structures subjected to mainshock–aftershock sequences. First, a stochastic seismic sequence model with stochastic parameters was established, and its generation method was derived based on the source–path–site mechanism. Then, the representative point sets of seismic parameters were chosen based on generalized F-discrepancy, and the correlation between the mainshock and aftershock parameters was determined using copula theory. Finally, the stochastic dynamic response was obtained by solving the probability density integral equation (PDIE). Furthermore, the first-passage dynamic reliability was obtained by the direct probability integral method (DPIM) combined with the absorbing-condition approach. This novel approach was used to obtain stochastic dynamic results for four structures subjected to stochastic seismic sequences, which were compared with results from Monte Carlo simulation (MCS) and the probability density evolution method (PDEM) to demonstrate the proposed method's correctness and efficiency. Additionally, the influence of aftershocks on nonlinear structures is explained from the perspective of probability for the first time.

20 citations
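
As a rough illustration of the Monte Carlo baseline that the proposed DPIM approach is benchmarked against, the sketch below estimates a first-passage failure probability for a linear single-degree-of-freedom oscillator under white-noise excitation. This is a generic stand-in, not the paper's structures or seismic sequence model; the oscillator parameters, threshold, and sample count are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_response(omega, zeta, accel, dt):
    """Integrate x'' + 2*zeta*omega*x' + omega^2*x = -a(t) by semi-implicit
    Euler and return the peak absolute displacement."""
    x, v, xmax = 0.0, 0.0, 0.0
    for a in accel:
        v += dt * (-2 * zeta * omega * v - omega**2 * x - a)
        x += dt * v
        xmax = max(xmax, abs(x))
    return xmax

# First-passage reliability by plain Monte Carlo: P(max |x(t)| > threshold)
n_samples, dt, n_steps = 500, 0.01, 2000
threshold = 0.4  # hypothetical displacement limit
failures = 0
for _ in range(n_samples):
    accel = rng.normal(0.0, 1.0, n_steps) / np.sqrt(dt)  # discrete white noise
    if peak_response(omega=2 * np.pi, zeta=0.05, accel=accel, dt=dt) > threshold:
        failures += 1

print("estimated first-passage failure probability:", failures / n_samples)
```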


Journal ArticleDOI
TL;DR: In this paper, a two-stage distributionally robust optimization (TSDRO) model based on kernel density estimation (KDE) and the Wasserstein metric is proposed to coordinate robustness, economy, environmental protection, and efficiency.

5 citations
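
The TSDRO formulation itself is not reproduced in this listing, but its two named ingredients are easy to sketch: a kernel density estimate built from historical data, and an ambiguity set of all distributions within a chosen Wasserstein distance of the resulting empirical scenarios. A minimal SciPy sketch with synthetic data (the error sample, scenario count, and radius are hypothetical):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Synthetic historical forecast errors, a stand-in for real operational data
errors = rng.normal(0.0, 0.1, 500) + 0.05 * rng.standard_t(3, size=500)

# Kernel density estimate of the uncertain quantity
kde = gaussian_kde(errors)

# Scenarios drawn from the KDE; in a two-stage DRO model these would form
# the empirical distribution at the center of the Wasserstein ambiguity set
scenarios = kde.resample(100, seed=2).ravel()

eps = 0.02  # Wasserstein radius: hedge against all distributions within eps
print("scenario mean:", round(float(scenarios.mean()), 4),
      "KDE bandwidth factor:", round(float(kde.factor), 4))
```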



Journal ArticleDOI
TL;DR: In this article, a hopscotch scan is used to estimate the PDF uncertainty on key LHC cross sections at 13 TeV obtained with the public NNPDF4.0 fitting code, while accounting for the likelihood distribution.
Abstract: In global QCD fits of parton distribution functions (PDFs), a large part of the estimated uncertainty on the PDFs originates from the choices of parametric functional forms and fitting methodology. We argue that these types of uncertainties can be underestimated with common PDF ensembles in high-stakes measurements at the Large Hadron Collider and Tevatron. A fruitful approach to quantify these uncertainties is to view them as arising from sampling of allowed PDF solutions in a multidimensional parametric space. This approach applies powerful insights gained in recent statistical studies of large-scale population surveys and quasi-Monte Carlo integration methods. In particular, PDF fits may be affected by the big data paradox, which stipulates that more experimental data do not automatically raise the accuracy of PDFs: close attention to data quality and to the sampling of possible PDF solutions is just as essential. To test whether the sampling of the PDF uncertainty of an experimental observable is truly representative of all acceptable solutions, we introduce a technique (a "hopscotch scan") based on a combination of parameter scans and stochastic sampling. With this technique, we examine the PDF uncertainty on key LHC cross sections at 13 TeV obtained with the public NNPDF4.0 fitting code, while accounting for the likelihood distribution. We show that the uncertainties on the charm distribution at large momentum fraction $x$ and the gluon PDF at small $x$ are enlarged. In PDF ensembles obtained in the analytic minimization (Hessian) formalism, the tolerance on the PDF uncertainty must be based on sufficiently complete sampling of PDF functional forms and choices of experiments.

4 citations
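
The abstract describes the hopscotch scan only at a high level: deterministic scans along chosen directions in parameter space alternate with stochastic sampling, and candidate solutions are retained while the likelihood stays within a tolerance. The toy sketch below applies that loop to a two-parameter quadratic chi-squared surface; it is an interpretation of the stated idea, not the NNPDF4.0 implementation, and every function and value in it is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def chi2(theta):
    """Toy stand-in for a global-fit chi-squared over fit parameters."""
    a, b = theta
    return (a - 1.0) ** 2 / 0.04 + (b + 0.5) ** 2 / 0.09 + 0.5 * a * b

center = np.array([1.0, -0.5])  # pretend central fit
tol = 4.0                       # chi2 tolerance defining acceptable solutions
accepted = [center]

# Deterministic scan along one chosen parameter direction
direction = np.array([1.0, 0.3]) / np.hypot(1.0, 0.3)
for t in np.linspace(-2, 2, 81):
    theta = center + t * direction
    if chi2(theta) - chi2(center) < tol:
        accepted.append(theta)

# Stochastic "hops" around every accepted scan point
for theta in list(accepted):
    for _ in range(20):
        trial = theta + rng.normal(0.0, 0.1, size=2)
        if chi2(trial) - chi2(center) < tol:
            accepted.append(trial)

accepted = np.array(accepted)
# The spread of accepted solutions is the sampled uncertainty band
print(len(accepted), "solutions; parameter a in",
      (round(float(accepted[:, 0].min()), 3), round(float(accepted[:, 0].max()), 3)))
```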


Journal ArticleDOI
TL;DR: In this article, a semi-analytical method using a radial basis function neural network (RBFNN) is proposed to obtain the transient probability density of the randomly excited Bouc-Wen system.

4 citations


Journal ArticleDOI
TL;DR: In this article, the authors presented a novel TDRA method for small failure probabilities based on point evolution kernel density (PKDE) and adaptive surrogate modeling (SLSM) to efficiently reduce the computational burden.

3 citations


Journal ArticleDOI
TL;DR: In this article, a bimodal generalization of the Gumbel distribution is proposed, and its hazard rate function, mode, bimodality, moment generating function, and moments are derived.
Abstract: The Gumbel model is a very popular statistical model due to its wide applicability, for instance in survival, environmental, financial, or reliability studies. In this work, we introduce a bimodal generalization of the Gumbel distribution that can be an alternative for modeling bimodal data. We derive the analytical shapes of the corresponding probability density function and hazard rate function and provide graphical illustrations. Furthermore, we discuss properties of this density such as the mode, bimodality, moment generating function, and moments. Our results were verified using the Markov chain Monte Carlo simulation method. The maximum likelihood method is used for parameter estimation. Finally, we carry out an application to real data that demonstrates the usefulness of the proposed distribution.

3 citations
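
The paper's bimodal generalization is a specific construction that is not spelled out in this listing; purely as an illustration of bimodality built on the Gumbel family, the sketch below evaluates a two-component Gumbel mixture and checks numerically that it integrates to one and has two modes. The mixture is an assumption for demonstration, not the authors' density.

```python
import numpy as np

def gumbel_pdf(x, mu, beta):
    """Gumbel (max) density with location mu and scale beta."""
    z = (x - mu) / beta
    return np.exp(-z - np.exp(-z)) / beta

def gumbel_mixture(x, w=0.45, mu1=0.0, beta1=1.0, mu2=5.0, beta2=1.2):
    """Two-component Gumbel mixture, one simple bimodal density on the
    Gumbel family (illustrative only, not the paper's generalization)."""
    return w * gumbel_pdf(x, mu1, beta1) + (1 - w) * gumbel_pdf(x, mu2, beta2)

x = np.linspace(-4, 12, 1601)
pdf = gumbel_mixture(x)
print("integral ~ 1:", round(float(np.sum(pdf) * (x[1] - x[0])), 4))

# Two interior local maxima confirm bimodality
modes = [x[i] for i in range(1, len(x) - 1)
         if pdf[i] > pdf[i - 1] and pdf[i] > pdf[i + 1]]
print("modes near:", [round(m, 2) for m in modes])
```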


Journal ArticleDOI
TL;DR: In this paper, a coupled model of the metro train, track, shield tunnel, and foundation soil is established based on the probability density evolution method (PDEM) and stochastic field theory, and the ground surface vibration displacements under different metro train speeds are analyzed.

3 citations


Journal ArticleDOI
TL;DR: In this paper, the accuracy and efficiency of two methods for stochastic analysis, the probability density evolution method (PDEM) and Monte Carlo simulation (MCS), are compared in terms of how well they reflect the physical properties of the system.

3 citations


Journal ArticleDOI
TL;DR: In this paper, the simulation results of 360 T/Y-joints reinforced with collar plates were used to propose theoretical probability distribution models for the ultimate capacity under ambient and fire conditions.

Journal ArticleDOI
TL;DR: In this paper, a probabilistic model is developed for the threshold stress intensity factor range, a critical parameter in infinite-fatigue-life design in the presence of material flaws. The model is based on the proposed concept of probability of propagation within a probabilistic framework, allowing the probability density function of the threshold stress intensity factor range to be derived.

Journal ArticleDOI
TL;DR: In this article, a new probability density model of stress amplitude is proposed, composed of an exponential distribution and a two-parameter Weibull distribution, and a connection is found between the first three moments of rain-flow amplitude distributions and the spectral parameters.

Journal ArticleDOI
13 Jan 2023 - Sensors
TL;DR: In this article, the authors propose a new privatization mechanism based on a naive theory of additive perturbations of a probability distribution using wavelets, in the way that noise perturbs the signal of a digital image sensor.
Abstract: A naive theory of additive perturbations on a continuous probability distribution is presented. We propose a new privatization mechanism based on this theory, in which a wavelet perturbs a probability distribution the way noise perturbs the signal of a digital image sensor. The cumulative wavelet integral function is defined, and the perturbations are built up with the help of this function. We show that an arbitrary cumulative distribution function subjected to such an additive perturbation is still a cumulative distribution function, which can be seen as a privatized distribution, with the privatization mechanism being a wavelet function. Thus, we offer a mathematical method for choosing a suitable probability distribution for data by starting from some guessed initial distribution. The areas of artificial intelligence and machine learning are constantly in need of data-fitting techniques, closely related to sensors. The proposed privatization mechanism is therefore a contribution to increasing the scope of existing techniques.
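
The abstract's central claim, that a CDF plus the cumulative integral of a wavelet is still a CDF, can be checked numerically in a few lines. The sketch below uses the Haar wavelet, whose cumulative integral is a tent that returns to zero, so the limits at plus and minus infinity are preserved; the base distribution and perturbation amplitude are hypothetical choices, with the amplitude kept small enough that the perturbed density stays nonnegative.

```python
import numpy as np
from scipy.stats import norm

def cumulative_haar(t):
    """Integral of the Haar wavelet up to t: rises to 1/2 on [0, 1/2),
    falls back to 0 on [1/2, 1), and vanishes outside [0, 1)."""
    return np.where(t < 0, 0.0,
           np.where(t < 0.5, t,
           np.where(t < 1.0, 1.0 - t, 0.0)))

x = np.linspace(-5, 5, 2001)
eps = 0.1  # perturbation amplitude (must keep F' = f + eps*psi >= 0)
F_priv = norm.cdf(x) + eps * cumulative_haar(x)

# Still a valid CDF: correct limits and nondecreasing
print("limits:", round(float(F_priv[0]), 6), round(float(F_priv[-1]), 6))
print("nondecreasing:", bool(np.all(np.diff(F_priv) >= 0)))
```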

Journal ArticleDOI
TL;DR: In this paper, a novel RC-based method is proposed to determine the ACLR in a convenient way, and a comprehensive analysis of the corresponding measurement uncertainty is presented, in which the probability density function and associated statistics (e.g., expectation and variance) of the measured ACLR are derived.
Abstract: Thanks to the appealing properties of the reverberation chamber (RC), it has become a standard solution for over-the-air (OTA) tests [e.g., total radiated power (TRP) and total isotropic sensitivity] according to the Cellular Telecommunications and Internet Association. However, the adjacent channel leakage power ratio (ACLR) has been overlooked in existing research. In this work, a novel RC-based method is proposed to determine the ACLR in a convenient way, and a comprehensive analysis of the corresponding measurement uncertainty is presented. The probability density function (PDF) and associated statistics (e.g., expectation and variance) of the measured ACLR are derived. Based on these statistics, an unbiased estimator of the ACLR is further proposed. By utilizing the theoretical uncertainty model of the TRP, an approximate uncertainty model of the ACLR is also established, providing a convenient way to assess the relative uncertainty when the PDF and associated statistics of the measurand are unavailable. Extensive simulations and measurements are performed to validate the proposed method and uncertainty models. Good agreement is observed for various situations. The findings of this work allow not only simple and convenient RC-based ACLR measurements but also rigorous and efficient uncertainty analyses.

Journal ArticleDOI
TL;DR: Wang et al. proposed a remaining useful life (RUL) prediction framework based on performance evaluation and geometric fractional Lévy stable motion (GFLSM) with adaptive nonlinear drift, in which early fault identification in the degradation process is realized by setting a threshold for a constructed monotonic health indicator (HI).

Journal ArticleDOI
TL;DR: In this paper, the density function of the normal distribution is used to estimate the probability at a particular point for lattice integer-valued random variables, and the third moment condition of the local limit theorem is relaxed.
Abstract: One of the most fundamental probabilities is the probability at a particular point. The local limit theorem is the well-known theorem that estimates this probability. In this paper, we estimate this probability by the density function of the normal distribution in the case of lattice integer-valued random variables. Our technique is the characteristic function method. We relax the third moment condition of Siripraparat and Neammanee (J. Inequal. Appl. 2021:57, 2021) and the references therein, and we also obtain explicit constants for the error bound.
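
A quick numerical check of the kind of estimate the local limit theorem provides: for a lattice random variable such as a Binomial(n, p), the point probability P(X = k) is approximated by the normal density evaluated at k. This is the generic textbook statement, not the paper's sharpened error bound.

```python
import numpy as np
from scipy.stats import binom, norm

# Lattice variable X ~ Binomial(n, p); LLT: P(X = k) ~ phi((k - mu)/sigma)/sigma
n, p = 400, 0.3
mu, sigma = n * p, np.sqrt(n * p * (1 - p))

ks = np.arange(100, 141)
exact = binom.pmf(ks, n, p)
approx = norm.pdf(ks, loc=mu, scale=sigma)

# The pointwise error shrinks as n grows
print("max pointwise error:", float(np.max(np.abs(exact - approx))))
```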

Journal ArticleDOI
TL;DR: In this paper, a new probability distribution model (the normal PDF) was tested for modeling wind speed at several wind locations in Jordan, and the results show high compatibility between this model and the wind resources in Jordan.
Abstract: Estimating wind energy at a specific wind site depends on how well the real wind data in that area can be represented using an appropriate distribution function. In fact, wind sites differ in the extent to which their wind data can be represented from one region to another, despite the widespread use of the Weibull function to represent wind speed at various wind locations in the world. In this study, a new probability distribution model (the normal PDF) was tested for modeling wind speed at several wind locations in Jordan. The results show high compatibility between this model and the wind resources in Jordan. Therefore, this model was used to estimate the wind energy and the extracted energy of wind turbines, compared with the values obtained using the Weibull PDF. Several artificial intelligence techniques (GA, BFOA, SA, and a neuro-fuzzy method) were used to estimate and predict the parameters of both the normal and Weibull PDFs against the actual observed wind probability data. Afterward, the goodness of fit was assessed with the aid of two performance indicators (RMSE and MAE). Surprisingly, in this study, the normal probability density function (PDF) outperformed the Weibull PDF, and interestingly, BFOA and SA were the most accurate methods. In the last stage, machine learning was used to classify and predict the error level between the actual and estimated probabilities based on the trained and tested data of the PDF parameters. The proposed novel methodology aims to predict the most accurate parameters, as the subsequent wind energy calculation phases depend on the proper selection of these parameters. Hence, 24 classifier algorithms were used in this study. The medium tree classifier showed the best performance in terms of accuracy and training time, while the ensemble boosted trees classifier performed poorly in providing correct predictions.
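
The fit-and-score pipeline described above, estimating parameters of candidate PDFs and comparing them by RMSE against empirical wind-speed frequencies, can be sketched with SciPy's built-in maximum likelihood fitting in place of the paper's GA/BFOA/SA/neuro-fuzzy estimators. The data below are synthetic draws from a Weibull distribution, so the Weibull naturally wins here; the paper's opposite ranking is a property of the Jordanian measurements, not of the method.

```python
import numpy as np
from scipy.stats import weibull_min, norm

rng = np.random.default_rng(4)

# Synthetic stand-in for observed wind speeds (m/s)
speeds = weibull_min.rvs(c=2.0, scale=7.0, size=5000, random_state=rng)

# Empirical density from a histogram
counts, edges = np.histogram(speeds, bins=30, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])

# Maximum likelihood fits of the two candidate models
c, loc_w, scale_w = weibull_min.fit(speeds, floc=0)
mu, sd = norm.fit(speeds)

def rmse(model_pdf):
    return float(np.sqrt(np.mean((model_pdf(mids) - counts) ** 2)))

print("Weibull RMSE:", rmse(lambda x: weibull_min.pdf(x, c, loc_w, scale_w)))
print("Normal RMSE: ", rmse(lambda x: norm.pdf(x, mu, sd)))
```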

Journal ArticleDOI
TL;DR: In this article, it is shown that the transition density function of a sticky Feller diffusion X can be expressed in closed form by means of a convolution integral involving a new special function and a modified Bessel function of the second kind.
Abstract: We show that (i) the process $X$ is a weak solution to the SDE system

$$dX_t = (bX_t + c)\,I(X_t > 0)\,dt + \sqrt{2aX_t}\,dB_t,$$
$$I(X_t = 0)\,dt = \frac{1}{\mu}\,d\ell_t^0(X),$$

where $b \in \mathbb{R}$ and $0 < c < a$ are given and fixed, $B$ is a standard Brownian motion, and $\ell^0(X)$ is a diffusion local time process of $X$ at $0$, and (ii) the transition density function of $X$ can be expressed in closed form by means of a convolution integral involving a new special function and a modified Bessel function of the second kind. The new special function embodies the stickiness of $X$ entirely and reduces to the Mittag-Leffler function when $b = 0$. We determine a (sticky) boundary condition at zero that characterises the transition density function of $X$ as a unique solution to the Kolmogorov forward/backward equation of $X$. Letting $\mu \downarrow 0$ (absorption) and $\mu \uparrow \infty$ (instantaneous reflection), the closed-form expression for the transition density function of $X$ reduces to the ones found by Feller [6] and Molchanov [14] respectively. The results derived for sticky Feller diffusions translate over to yield closed-form expressions for the transition density functions of (a) sticky Cox-Ingersoll-Ross processes and (b) sticky reflecting Vasicek processes that can be used to model slowly reflecting interest rates.


Journal ArticleDOI
TL;DR: In this paper, a neural network is used to learn the drift and diffusion terms of a stochastic differential equation, and a new loss function is introduced containing the Hellinger distance between the observation data and the learned stationary probability density function.
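
The Hellinger term in the loss is straightforward on a grid, even though the neural-network drift/diffusion estimation around it is not reproduced here. A minimal sketch, with two illustrative Gaussian densities standing in for the observed and learned stationary PDFs:

```python
import numpy as np

def hellinger_sq(p, q, dx):
    """Squared Hellinger distance on a grid:
    H^2(p, q) = 0.5 * integral (sqrt(p) - sqrt(q))^2 dx."""
    return 0.5 * float(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx)

x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]
p_obs = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)            # observed density
q_fit = np.exp(-0.5 * (x - 0.3) ** 2) / np.sqrt(2 * np.pi)  # learned, slightly off

# In the paper's setting a term like this is added to the training loss so
# the learned SDE reproduces the observed stationary distribution
print("H^2 =", round(hellinger_sq(p_obs, q_fit, dx), 5))
```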

Journal ArticleDOI
23 Feb 2023 - Symmetry
TL;DR: The half-logistic modified Kies exponential (HLMKEx) distribution is a three-parameter model introduced to extend the modified Kies exponential distribution and improve its flexibility in modeling real-world data.
Abstract: The half-logistic modified Kies exponential (HLMKEx) distribution is a novel three-parameter model introduced in the current work to expand the modified Kies exponential distribution and improve its flexibility in modeling real-world data. Due to its versatility, the density function of the HLMKEx distribution offers symmetrical, asymmetrical, unimodal, and reversed-J shapes, as well as increasing, reversed-J-shaped, and upside-down hazard rate forms. An infinite linear representation can be used to represent the HLMKEx density. The HLMKEx model's fundamental mathematical features are obtained, such as the quantile function, moments, incomplete moments, and moments of residuals. Additionally, some measures of uncertainty as well as stochastic orderings are derived. To estimate its parameters, eight estimation methods are used. With the use of detailed simulation data, we compare the performance of each estimation technique and obtain partial and total ranks for the accuracy measures of absolute bias, mean squared error, and mean absolute relative error. Two real data sets from the field of engineering are investigated to demonstrate the adaptability and applicability of the suggested distribution. The findings demonstrate that, in contrast to other competing distributions, the proposed distribution fits the data more accurately.

Journal ArticleDOI
TL;DR: In this paper, Bayesian probability inference is applied to the frequency-modulated continuous-wave reflectometry and far-infrared laser interferometer diagnostic systems on the HL-2A tokamak, providing integrated data analysis for electron density profile reconstruction.
Abstract: In fusion research, diagnostic data are obtained from different diagnostic systems that are relatively independent (in terms of response function, noise, calibration, etc.). The consequence is that multiple measurements of the same physical quantity can yield different results. In this work, Bayesian probability inference has been applied to the frequency-modulated continuous-wave reflectometry and far-infrared laser interferometer diagnostic systems on the HL-2A tokamak, offering integrated data analysis (IDA) for electron density profile reconstruction. With this implementation, it is demonstrated that more comprehensive inference can be delivered by IDA compared to the traditional individual data analysis technique. A data analysis program based on the Bayesian inference model has been developed to reconstruct the two-dimensional electron density profile, which permits further implementation of the HL-2A/2M IDA framework in the near future.
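
Stripped of the diagnostic forward models, the core of integrated data analysis is that independent measurements of the same quantity enter one joint likelihood. In the simplest conjugate-Gaussian case this reduces to precision weighting, sketched below with made-up numbers; the real analysis inverts reflectometry and interferometry forward models to reconstruct a full 2D profile.

```python
import numpy as np

# Two diagnostics measure the same electron density (m^-3) with different
# Gaussian error models (all values hypothetical)
m1, s1 = 3.4e19, 0.4e19  # diagnostic 1: estimate and 1-sigma uncertainty
m2, s2 = 3.1e19, 0.2e19  # diagnostic 2: more precise

# With a flat prior, the joint posterior is Gaussian with a
# precision-weighted mean and combined precision
w1, w2 = 1 / s1**2, 1 / s2**2
post_mean = (w1 * m1 + w2 * m2) / (w1 + w2)
post_sigma = np.sqrt(1 / (w1 + w2))

# The posterior is tighter than either diagnostic alone, which is the
# practical payoff of combining them
print(f"posterior: {post_mean:.3e} +/- {post_sigma:.2e} m^-3")
```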

Journal ArticleDOI
TL;DR: In this paper , the authors proposed a new distribution called the Exponentiated Power Lindley-Logarithmic Distribution for modeling real life data, motivated by the exponential power Lindley distribution.
Abstract: This article proposes a new distribution, called the Exponentiated Power Lindley-Logarithmic distribution, for modeling real-life data. The distribution is motivated by the exponentiated power Lindley distribution. The quantile function is derived, and the maximum likelihood estimates of the parameters are also derived. The distribution performed better in a simulation study than the competing distributions. The distribution can model real-life biomedical phenomena and agricultural events.

Journal ArticleDOI
TL;DR: In this article, the existence and eligibility of the globally evolving-based generalized density evolution equation (GE-GDEE) are established for multi-dimensional linear fractional differential systems subject to Gaussian white noise.

Journal ArticleDOI
TL;DR: In this paper, path integral solutions are derived and verified for general n-dimensional stochastic differential equations (SDEs) with α-stable Lévy noise.

Journal ArticleDOI
01 Jan 2023
TL;DR: In this paper, an analytical approach to the coverage probability analysis of UAV-assisted cellular networks with imperfect beam alignment is proposed, in which all users are distributed around base stations according to a Poisson cluster process (PCP), in particular a Thomas cluster process (TCP).
Abstract: With the rapid development of emerging 5G and beyond (B5G) networks, Unmanned Aerial Vehicles (UAVs) are increasingly important for improving the performance of dense cellular networks. As a conventional metric, coverage probability has been widely studied in communication systems due to the increasing density of users and the complexity of heterogeneous environments. In recent years, stochastic geometry has attracted more attention as a mathematical tool for modeling mobile network systems. In this paper, an analytical approach to the coverage probability analysis of UAV-assisted cellular networks with imperfect beam alignment is proposed. It is assumed that all users are distributed around base stations according to a Poisson cluster process (PCP), in particular a Thomas cluster process (TCP). Using this model, the impact of beam alignment errors on the coverage probability was investigated. First, the probability density function (PDF) of the directional antenna gain between a user and its serving base station was obtained. Then, the association probability with each tier was obtained. A tractable expression was derived for the coverage probability for both line-of-sight (LoS) and non-line-of-sight (NLoS) links. Numerical results demonstrated that at low UAV altitudes, beam alignment errors significantly degrade coverage performance. Moreover, for a small cluster size, alignment errors do not necessarily affect the coverage performance.
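
Sampling the user geometry assumed in the paper, a Thomas cluster process, is typically the first step when validating such coverage expressions by simulation: base stations form a Poisson point process and users scatter around each one with Gaussian displacements. A minimal sketch (window size, intensities, and cluster spread are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)

side = 1000.0               # square window side (m)
lam_parent = 10 / side**2   # base-station (parent) intensity
mean_users = 8              # mean users per cluster
sigma = 50.0                # std dev of Gaussian scattering (m)

# Parent points: homogeneous Poisson point process on the window
n_parents = rng.poisson(lam_parent * side**2)
parents = rng.uniform(0.0, side, size=(n_parents, 2))

# Daughter points: Poisson number of users, Gaussian-displaced from parents
clusters = [p + rng.normal(0.0, sigma, size=(rng.poisson(mean_users), 2))
            for p in parents]
users = np.vstack(clusters) if clusters else np.empty((0, 2))

print(f"{n_parents} base stations, {len(users)} clustered users")
# A coverage simulation would next draw beam-alignment errors from their
# PDF, compute antenna gains and SINR per user, and average the result
```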


Journal ArticleDOI
TL;DR: In this paper, a failure-probability-based global sensitivity index is proposed to estimate the effect of uncertain inputs on time-variant reliability by comparing the unconditional probability density functions of the input variables with their conditional probability density functions in the failure state.

Journal ArticleDOI
01 Jun 2023 - Entropy
TL;DR: In this article, a nonlocal generalization of probability is proposed using Luchko's general fractional calculus (GFC) and its extension in the form of the multi-kernel GFC of arbitrary order.
Abstract: Using Luchko's general fractional calculus (GFC) and its extension in the form of the multi-kernel general fractional calculus of arbitrary order (GFC of AO), a nonlocal generalization of probability is suggested. Nonlocal and general fractional (GF) extensions of probability density functions (PDFs), cumulative distribution functions (CDFs), and probability are defined, and their properties are described. Examples of general nonlocal probability distributions of AO are considered. The application of the multi-kernel GFC allows us to consider a wider class of operator kernels and a wider class of nonlocality in probability theory.