
Showing papers on "Probability density function published in 2007"


Journal ArticleDOI
TL;DR: The coupled climate models used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change are evaluated in this paper, focusing on 12 regions of Australia for the daily simulation of precipitation, minimum temperature, and maximum temperature.
Abstract: The coupled climate models used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change are evaluated. The evaluation is focused on 12 regions of Australia for the daily simulation of precipitation, minimum temperature, and maximum temperature. The evaluation is based on probability density functions, and a simple quantitative measure of how well each climate model can capture the observed probability density functions for each variable and each region is introduced. Across all three variables, the coupled climate models perform better than expected. Precipitation is simulated reasonably by most and very well by a small number of models, although the problem with excessive drizzle is apparent in most models. Averaged over Australia, 3 of the 14 climate models capture more than 80% of the observed probability density functions for precipitation. Minimum temperature is simulated well, with 10 of the 13 climate models capturing more than 80% of the observed probability density functions.
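As an illustration of the kind of PDF-overlap skill measure the abstract describes, the sketch below sums the bin-wise minima of the observed and modelled probability density functions; the bin settings and the synthetic temperature data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def pdf_skill_score(obs, model, bins=50):
    """Overlap between observed and modelled PDFs, in [0, 1]:
    1 means the model reproduces the observed PDF exactly."""
    lo = min(obs.min(), model.min())
    hi = max(obs.max(), model.max())
    edges = np.linspace(lo, hi, bins + 1)
    p_obs = np.histogram(obs, bins=edges)[0].astype(float)
    p_mod = np.histogram(model, bins=edges)[0].astype(float)
    return float(np.minimum(p_obs / p_obs.sum(), p_mod / p_mod.sum()).sum())

# A model with a small warm bias still captures most of the observed PDF.
rng = np.random.default_rng(0)
obs = rng.normal(20.0, 5.0, 10_000)      # "observed" daily maximum temperature
mod = rng.normal(21.0, 5.5, 10_000)      # "modelled" daily maximum temperature
print(round(pdf_skill_score(obs, mod), 3))
```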

614 citations


Journal ArticleDOI
TL;DR: A representation space is introduced, to be called the complexity-entropy causality plane, which contains suitable functionals of the pertinent probability distribution, namely, the entropy of the system and an appropriate statistical complexity measure, respectively.
Abstract: Chaotic systems share with stochastic processes several properties that make them almost indistinguishable. In this communication we introduce a representation space, to be called the complexity-entropy causality plane. Its horizontal and vertical axes are suitable functionals of the pertinent probability distribution, namely, the entropy of the system and an appropriate statistical complexity measure, respectively. These two functionals are evaluated using the Bandt-Pompe recipe to assign a probability distribution function to the time series generated by the system. Several well-known model-generated time series, usually regarded as being of either stochastic or chaotic nature, are analyzed so as to illustrate the approach. The main achievement of this communication is the possibility of clearly distinguishing between them in our representation space, something that is rather difficult otherwise.
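A minimal sketch of the Bandt-Pompe step mentioned in the abstract: build ordinal-pattern probabilities from a time series and compute the normalized permutation entropy (the horizontal coordinate of the plane). The statistical complexity measure forming the vertical coordinate is omitted here, and the embedding parameters are illustrative assumptions.

```python
import numpy as np
from itertools import permutations
from math import log

def bandt_pompe_probs(x, d=4, tau=1):
    """Ordinal-pattern (Bandt-Pompe) probabilities of a time series
    for embedding dimension d and delay tau (illustrative defaults)."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + d * tau:tau]
        counts[tuple(np.argsort(window))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def normalized_permutation_entropy(p):
    """Shannon entropy of the ordinal distribution, normalized to [0, 1]."""
    nz = p[p > 0]
    return float(-(nz * np.log(nz)).sum() / log(len(p)))

# White noise sits near H = 1; a deterministic chaotic map sits lower.
rng = np.random.default_rng(1)
noise = rng.normal(size=5000)
logistic = np.empty(5000)
logistic[0] = 0.4
for i in range(1, 5000):
    logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])
print("noise   :", normalized_permutation_entropy(bandt_pompe_probs(noise)))
print("logistic:", normalized_permutation_entropy(bandt_pompe_probs(logistic)))
```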

516 citations


Journal ArticleDOI
TL;DR: This letter analyzes the performance of cooperative diversity wireless networks using amplify-and-forward relaying over independent, non-identical, Nakagami-m fading channels and shows that the derived error rate and outage probability are tight lower bounds particularly at medium and high SNR.
Abstract: This letter analyzes the performance of cooperative diversity wireless networks using amplify-and-forward relaying over independent, non-identical, Nakagami-m fading channels. The error rate and the outage probability are determined using the moment generating function (MGF) of the total signal-to-noise-ratio (SNR) at the destination. Since it is hard to find a closed form for the probability density function (PDF) of the total SNR, we use an approximate value instead. We first derive the PDF and the MGF of the approximate value of the total SNR. Then, the MGF is used to determine the error rate and the outage probability. We also use simulation to verify the analytical results. Results show that the derived error rate and outage probability are tight lower bounds, particularly at medium and high SNR.
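For context, the sketch below runs a Monte Carlo comparison between the exact two-hop amplify-and-forward SNR, gamma = g1*g2/(g1 + g2 + 1), and the min(g1, g2) upper bound that is commonly used as the approximate SNR in this literature; whether that is exactly the approximation adopted in the letter is an assumption, and the Nakagami-m parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(12)
n = 1_000_000
m1, m2, avg1, avg2 = 1.5, 2.5, 10.0, 8.0          # non-identical Nakagami-m hops

# Per-hop instantaneous SNRs are gamma-distributed under Nakagami-m fading.
g1 = rng.gamma(m1, avg1 / m1, n)
g2 = rng.gamma(m2, avg2 / m2, n)

gamma_exact = g1 * g2 / (g1 + g2 + 1)             # end-to-end AF relay SNR
gamma_approx = np.minimum(g1, g2)                 # upper bound used as approximation

threshold = 10 ** 0.5                             # 5 dB outage threshold
print("outage with exact SNR:        ", float(np.mean(gamma_exact < threshold)))
print("outage with min approximation:", float(np.mean(gamma_approx < threshold)))
```

Because the approximation never falls below the exact SNR, the outage probability computed from it is a lower bound, which is consistent with the bounds reported in the abstract.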

513 citations


Journal ArticleDOI
TL;DR: A simple, novel, and general method for approximating the sum of independent or arbitrarily correlated lognormal random variables (RV) by a single lognormal RV without the extremely precise numerical computations at a large number of points that were required by the previously proposed methods.
Abstract: A simple, novel, and general method is presented in this paper for approximating the sum of independent or arbitrarily correlated lognormal random variables (RV) by a single lognormal RV. The method is also shown to be applicable for approximating the sum of lognormal-Rice and Suzuki RVs by a single lognormal RV. A sum consisting of a mixture of the above distributions can also be easily handled. The method uses the moment generating function (MGF) as a tool in the approximation and does so without the extremely precise numerical computations at a large number of points that were required by the previously proposed methods in the literature. Unlike popular approximation methods such as the Fenton-Wilkinson method and the Schwartz-Yeh method, which have their own respective short-comings, the proposed method provides the parametric flexibility to accurately approximate different portions of the lognormal sum distribution. The accuracy of the method is measured both visually, as has been done in the literature, as well as quantitatively, using curve-fitting metrics. An upper bound on the sensitivity of the method is also provided.
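For comparison, here is a short sketch of the Fenton-Wilkinson moment-matching baseline named in the abstract (not the paper's MGF-based method): the first two moments of the lognormal sum are matched by a single lognormal.

```python
import numpy as np

def fenton_wilkinson(mus, sigmas):
    """Match the first two moments of a sum of independent lognormals
    exp(N(mu_i, sigma_i^2)) with a single lognormal exp(N(mu, sigma^2))."""
    mus, sigmas = np.asarray(mus, float), np.asarray(sigmas, float)
    mean_sum = np.exp(mus + sigmas**2 / 2).sum()
    var_sum = (np.exp(2 * mus + sigmas**2) * (np.exp(sigmas**2) - 1)).sum()
    sigma2 = np.log(1 + var_sum / mean_sum**2)
    return np.log(mean_sum) - sigma2 / 2, np.sqrt(sigma2)

# Monte Carlo sanity check on the mean of the approximating lognormal.
rng = np.random.default_rng(2)
mus, sigmas = [0.0, 0.5, 1.0], [0.5, 0.7, 0.9]
samples = sum(rng.lognormal(m, s, 200_000) for m, s in zip(mus, sigmas))
mu, sigma = fenton_wilkinson(mus, sigmas)
print(samples.mean(), np.exp(mu + sigma**2 / 2))
```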

356 citations


Journal ArticleDOI
TL;DR: The object of this paper is to illustrate the utility of the data-driven approach to damage identification by means of a number of case studies.
Abstract: In broad terms, there are two approaches to damage identification. Model-driven methods establish a high-fidelity physical model of the structure, usually by finite element analysis, and then establish a comparison metric between the model and the measured data from the real structure. If the model is for a system or structure in normal (i.e. undamaged) condition, any departures indicate that the structure has deviated from normal condition and damage is inferred. Data-driven approaches also establish a model, but this is usually a statistical representation of the system, e.g. a probability density function of the normal condition. Departures from normality are then signalled by measured data appearing in regions of very low density. The algorithms that have been developed over the years for data-driven approaches are mainly drawn from the discipline of pattern recognition, or more broadly, machine learning. The object of this paper is to illustrate the utility of the data-driven approach to damage identification by means of a number of case studies.
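A minimal sketch of the data-driven idea: fit a probability density to features from the normal (undamaged) condition and flag new measurements that land in regions of very low density. The Gaussian features, the kernel density estimator, and the 1% threshold are illustrative assumptions, not the paper's case studies.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Features measured on the structure in its normal (undamaged) condition.
normal_features = rng.multivariate_normal([0.0, 0.0],
                                          [[1.0, 0.3], [0.3, 1.0]], 500).T

density = gaussian_kde(normal_features)                    # model of normality
threshold = np.quantile(density(normal_features), 0.01)    # low-density cut-off

new_ok = np.array([[0.2], [-0.1]])        # measurement near the normal cloud
new_damaged = np.array([[4.0], [4.0]])    # measurement far from it
for measurement in (new_ok, new_damaged):
    flag = "damage suspected" if density(measurement)[0] < threshold else "normal"
    print(flag)
```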

342 citations


Journal IssueDOI
TL;DR: These results are applied to analyze two efficient algorithms for sampling from a logconcave distribution in n dimensions, with no assumptions on the local smoothness of the density function.
Abstract: The class of logconcave functions in R^n is a common generalization of Gaussians and of indicator functions of convex sets. Motivated by the problem of sampling from logconcave density functions, we study their geometry and introduce a technique for “smoothing” them out. These results are applied to analyze two efficient algorithms for sampling from a logconcave distribution in n dimensions, with no assumptions on the local smoothness of the density function. Both algorithms, the ball walk and the hit-and-run walk, use a random walk (Markov chain) to generate a random point. After appropriate preprocessing, they produce a point from approximately the right distribution in time O*(n^4), and in amortized time O*(n^3) if n or more sample points are needed (where the asterisk indicates that dependence on the error parameter and factors of log n are not shown). These bounds match previous bounds for the special case of sampling from the uniform distribution over a convex body.
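The sketch below illustrates the hit-and-run idea on a simple logconcave target (a correlated Gaussian): pick a uniformly random direction, then resample along that line from the one-dimensional restriction of the density. It is a toy illustration only; it omits the preprocessing the paper requires and carries none of its complexity guarantees.

```python
import numpy as np

def hit_and_run(logdensity, x0, n_steps=3000, rng=None):
    """Toy hit-and-run sampler for a logconcave density on R^n.

    Each step picks a uniformly random direction and resamples the point
    along that line from the (one-dimensional, still logconcave)
    restriction of the density, here via a crude grid/inverse-CDF step.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, float)
    ts = np.linspace(-8.0, 8.0, 801)          # line-search range (assumed)
    out = np.empty((n_steps, x.size))
    for i in range(n_steps):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)
        logp = np.array([logdensity(x + t * d) for t in ts])
        w = np.exp(logp - logp.max())
        x = x + rng.choice(ts, p=w / w.sum()) * d
        out[i] = x
    return out

# Target: a correlated Gaussian (logconcave) in three dimensions.
cov = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.2],
                [0.0, 0.2, 1.5]])
cov_inv = np.linalg.inv(cov)
samples = hit_and_run(lambda v: -0.5 * v @ cov_inv @ v, np.zeros(3))
print(np.cov(samples[500:].T).round(2))       # should approach cov above
```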

311 citations


Journal ArticleDOI
TL;DR: In this paper, the idea of an equivalent extreme-value event and a new approach to evaluating structural system reliability are elaborated, and it is discussed in detail how to construct the equivalent extreme-value event and then implement the procedure numerically.

309 citations


Journal ArticleDOI
TL;DR: Experimental results show that an index such as this presents some desirable features that resemble those from an ideal image quality function, constituting a suitable quality index for natural images, and it is shown that the new measure is well correlated with classical reference metrics such as the peak signal-to-noise ratio.
Abstract: We describe an innovative methodology for determining the quality of digital images. The method is based on measuring the variance of the expected entropy of a given image upon a set of predefined directions. Entropy can be calculated on a local basis by using a spatial/spatial-frequency distribution as an approximation for a probability density function. The generalized Renyi entropy and the normalized pseudo-Wigner distribution (PWD) have been selected for this purpose. As a consequence, a pixel-by-pixel entropy value can be calculated, and therefore entropy histograms can be generated as well. The variance of the expected entropy is measured as a function of the directionality, and it has been taken as an anisotropy indicator. For this purpose, directional selectivity can be attained by using an oriented 1-D PWD implementation. Our main purpose is to show how such an anisotropy measure can be used as a metric to assess both the fidelity and quality of images. Experimental results show that an index such as this presents some desirable features that resemble those from an ideal image quality function, constituting a suitable quality index for natural images. Namely, in-focus, noise-free natural images have shown a maximum of this metric in comparison with other degraded, blurred, or noisy versions. This result provides a way of identifying in-focus, noise-free images from other degraded versions, allowing an automatic and nonreference classification of images according to their relative quality. It is also shown that the new measure is well correlated with classical reference metrics such as the peak signal-to-noise ratio.

303 citations


Journal ArticleDOI
TL;DR: The new expressions are used to prove that MIMO-MRC achieves the maximum available spatial diversity order, and to demonstrate the effect of spatial correlation.
Abstract: We consider multiple-input multiple-output (MIMO) transmit beamforming systems with maximum ratio combining (MRC) receivers. The operating environment is Rayleigh fading with both transmit and receive spatial correlation. We present exact expressions for the probability density function (pdf) of the output signal-to-noise ratio, as well as the system outage probability. The results are based on explicit closed-form expressions which we derive for the pdf and cumulative distribution function of the maximum eigenvalue of double-correlated complex Wishart matrices. For systems with two antennas at either the transmitter or the receiver, we also derive exact closed-form expressions for the symbol-error rate. The new expressions are used to prove that MIMO-MRC achieves the maximum available spatial diversity order, and to demonstrate the effect of spatial correlation. The analysis is validated through comparison with Monte Carlo simulations.
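A small Monte Carlo sketch of the quantity analysed here: under the Kronecker correlation model, the MIMO-MRC output SNR is proportional to the largest eigenvalue of the double-correlated Wishart matrix H^H H. The correlation matrices, antenna counts, and outage threshold below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
nt, nr, snr_avg, trials = 2, 2, 10.0, 50_000

def exp_corr(n, rho):
    """Exponential correlation matrix, an illustrative correlation model."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

Rt_half = np.linalg.cholesky(exp_corr(nt, 0.5))   # transmit correlation^(1/2)
Rr_half = np.linalg.cholesky(exp_corr(nr, 0.7))   # receive correlation^(1/2)

lam_max = np.empty(trials)
for t in range(trials):
    Hw = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
    H = Rr_half @ Hw @ Rt_half.conj().T           # double-correlated channel
    lam_max[t] = np.linalg.eigvalsh(H.conj().T @ H)[-1]

output_snr = snr_avg * lam_max                    # MIMO-MRC output SNR
print("outage P(SNR < 5 dB):", float(np.mean(output_snr < 10 ** 0.5)))
```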

297 citations


Journal ArticleDOI
TL;DR: A Bayesian probabilistic inferential framework, which provides a natural means for incorporating both errors and prior information about the source, is presented, and the inverse source determination method is validated against real data sets acquired in a highly disturbed flow field in an urban environment.

266 citations


Journal ArticleDOI
TL;DR: An integrated platform for multi-sensor equipment diagnosis and prognosis based on the hidden semi-Markov model (HSMM) is presented; results show that the increase in the correct diagnostic rate is very promising and that equipment prognosis can be implemented within the same integrated framework.

Journal ArticleDOI
Jianbing Chen, Jie Li
TL;DR: In this article, a new approach for evaluating the extreme value distribution and assessing the dynamic reliability of nonlinear structures with uncertain parameters is proposed, based on the newly developed probability density evolution method, which is capable of capturing the instantaneous probability density function (PDF) of the responses of nonlinear stochastic structures and its evolution.

Journal ArticleDOI
TL;DR: A data-driven method is described for identifying the direction of propagation of disturbances from historical process data; the novel concept is the application of transfer entropy, a measure based on conditional probability density functions that captures the directionality of variation.
Abstract: In continuous chemical processes, variations of process variables usually travel along propagation paths in the direction of the control path and process flow. This paper describes a data-driven method for identifying the direction of propagation of disturbances using historical process data. The novel concept is the application of transfer entropy, a method based on the conditional probability density functions that measures directionality of variation. It is sensitive to directionality even in the absence of an observable time delay. Its performance is studied in detail and default settings for the parameters in the algorithm are derived so that it can be applied in a large scale setting. Two industrial case studies demonstrate the method.
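A minimal histogram-based estimate of transfer entropy, the directionality measure the paper applies; the bin count and lag below are illustrative defaults rather than the tuned settings the paper derives for large-scale use.

```python
import numpy as np

def transfer_entropy(x, y, bins=8, lag=1):
    """Histogram estimate of the transfer entropy T(x -> y).

    T(x->y) = sum p(y_next, y, x) * log[ p(y_next | y, x) / p(y_next | y) ].
    """
    y_next, y_now, x_now = y[lag:], y[:-lag], x[:-lag]
    joint, _ = np.histogramdd((y_next, y_now, x_now), bins=bins)
    p_xyz = joint / joint.sum()          # p(y_next, y, x)
    p_yy = p_xyz.sum(axis=2)             # p(y_next, y)
    p_yx = p_xyz.sum(axis=0)             # p(y, x)
    p_y = p_yy.sum(axis=0)               # p(y)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_xyz)):
        te += p_xyz[i, j, k] * np.log(p_xyz[i, j, k] * p_y[j] /
                                      (p_yy[i, j] * p_yx[j, k]))
    return te

# A disturbance travelling x -> y should give T(x->y) > T(y->x).
rng = np.random.default_rng(5)
x = rng.normal(size=5000)
y = 0.8 * np.roll(x, 1) + 0.2 * rng.normal(size=5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```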

Journal ArticleDOI
TL;DR: This work considers the problem of scheduling under uncertainty, where the uncertain problem parameters can be described by a known probability distribution function; a small number of auxiliary variables and additional constraints are introduced into the original MILP problem, generating a deterministic robust counterpart problem that provides the optimal/feasible solution.

Journal ArticleDOI
TL;DR: In this article, an efficient method for uncertainty analysis of flow in random porous media is explored, on the basis of combination of Karhunen-Loeve expansion and probabilistic collocation method (PCM).
Abstract: An efficient method for uncertainty analysis of flow in random porous media is explored in this study, on the basis of a combination of the Karhunen-Loeve expansion and the probabilistic collocation method (PCM). The random log-transformed hydraulic conductivity field is represented by the Karhunen-Loeve expansion and the hydraulic head is expressed by the polynomial chaos expansion. The probabilistic collocation method is used to determine the coefficients of the polynomial chaos expansion by solving for the hydraulic head fields for different sets of collocation points. The procedure is straightforward and analogous to the Monte Carlo method, but the number of simulations required in PCM is significantly reduced. Steady state flows in saturated random porous media are simulated with the probabilistic collocation method, and comparisons are made with other stochastic methods: the Monte Carlo method, the traditional polynomial chaos expansion (PCE) approach based on the Galerkin scheme, and the moment-equation approach based on the Karhunen-Loeve expansion (KLME). This study reveals that PCM and KLME are more efficient than the Galerkin PCE approach. While the computational efforts are greatly reduced compared to the direct sampling Monte Carlo method, the PCM and KLME approaches are able to accurately estimate the statistical moments and probability density function of the hydraulic head.
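The sketch below illustrates only the Karhunen-Loeve representation step: the log conductivity on a 1-D grid with exponential covariance is expanded in the eigenvectors of its discretized covariance, so each realization is driven by a handful of independent standard normal variables (the inputs for which collocation points would then be chosen). Domain, variance, and correlation length are illustrative assumptions, and the collocation solve itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(6)
n, L, var, corr_len, n_terms = 200, 1.0, 1.0, 0.3, 10

# Exponential covariance of Y = ln K on a 1-D grid, and its discretized
# Karhunen-Loeve eigenpairs.
x = np.linspace(0.0, L, n)
dx = L / n
cov = var * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(cov * dx)
order = np.argsort(eigval)[::-1][:n_terms]
eigval, eigvec = eigval[order], eigvec[:, order]

# One realization of ln K, driven by n_terms independent N(0,1) variables;
# these are the random inputs for which collocation points would be chosen.
xi = rng.normal(size=n_terms)
Y = eigvec @ (np.sqrt(eigval) * xi) / np.sqrt(dx)
print("variance fraction captured by", n_terms, "terms:", eigval.sum() / (var * L))
```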

Proceedings ArticleDOI
17 Jun 2007
TL;DR: A "spherical" version of the Fisher-Rao metric is proposed that provides closed-form expressions for geodesies and distances, and allows fast computation of sample statistics and an application in planar shape classification is presented.
Abstract: Applications in computer vision involve statistically analyzing an important class of constrained, non-negative functions, including probability density functions (in texture analysis), dynamic time-warping functions (in activity analysis), and re-parametrization or non-rigid registration functions (in shape analysis of curves). For this one needs to impose a Riemannian structure on the spaces formed by these functions. We propose a "spherical" version of the Fisher-Rao metric that provides closed-form expressions for geodesies and distances, and allows fast computation of sample statistics. To demonstrate this approach, we present an application in planar shape classification.
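The closed-form behaviour referred to in the abstract comes from the square-root map: sending a density p to psi = sqrt(p) places it on the unit sphere in L2, where the Fisher-Rao geodesic distance reduces (up to a convention-dependent constant) to the arc length arccos of the inner product. A small numerical sketch, with illustrative densities:

```python
import numpy as np

def fisher_rao_distance(p, q, dx):
    """Geodesic distance between two densities under the square-root map:
    psi = sqrt(p) lies on the unit sphere in L2, so the distance is the
    arc length arccos(<psi_p, psi_q>)."""
    inner = np.clip(np.sum(np.sqrt(p) * np.sqrt(q)) * dx, -1.0, 1.0)
    return float(np.arccos(inner))

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
p = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)                           # N(0, 1)
q = np.exp(-0.5 * (x - 1.0)**2 / 1.5**2) / (1.5 * np.sqrt(2 * np.pi))  # N(1, 1.5^2)
print(fisher_rao_distance(p, q, dx))
```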

Journal ArticleDOI
TL;DR: The algorithms underlying various GRNGs are described, their computational requirements are compared, and the quality of the random numbers are examined with emphasis on the behaviour in the tail region of the Gaussian probability density function.
Abstract: Rapid generation of high quality Gaussian random numbers is a key capability for simulations across a wide range of disciplines. Advances in computing have brought the power to conduct simulations with very large numbers of random numbers and with it, the challenge of meeting increasingly stringent requirements on the quality of Gaussian random number generators (GRNG). This article describes the algorithms underlying various GRNGs, compares their computational requirements, and examines the quality of the random numbers with emphasis on the behaviour in the tail region of the Gaussian probability density function.
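A small sketch of the kind of tail check the article discusses: generate variates with two different methods and compare the observed frequency of |x| > 4 sigma with the exact value 2*Phi(-4). The polar (Marsaglia) generator coded below and NumPy's ziggurat-based generator are illustrative choices; the article surveys many more GRNGs and much deeper tails.

```python
import numpy as np
from scipy.stats import norm

def polar_method(n, rng):
    """Marsaglia polar method: an alternative Gaussian generator."""
    out = []
    while len(out) < n:
        u = rng.uniform(-1.0, 1.0, size=(n, 2))
        s = (u ** 2).sum(axis=1)
        ok = (s > 0) & (s < 1)
        f = np.sqrt(-2.0 * np.log(s[ok]) / s[ok])
        out.extend((u[ok] * f[:, None]).ravel())
    return np.array(out[:n])

rng = np.random.default_rng(7)
n = 2_000_000
expected = 2 * norm.cdf(-4.0)             # exact P(|X| > 4) for a standard normal

for name, x in [("NumPy Generator (ziggurat-type)", rng.normal(size=n)),
                ("Marsaglia polar method", polar_method(n, rng))]:
    print(f"{name}: observed {np.mean(np.abs(x) > 4.0):.2e}, expected {expected:.2e}")
```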

Journal ArticleDOI
TL;DR: In this paper, it is shown that, almost surely, the empirical distribution of the eigenvalues of (1/N) R_n R_n^* converges in distribution to a nonrandom probability distribution function.

Journal ArticleDOI
TL;DR: The exact generating function of Q at large tau and the large deviation function are computed; the generating function has a symmetry satisfying the steady-state fluctuation theorem without any quantum corrections, and the distribution P(Q) is non-Gaussian with clear exponential tails.
Abstract: We consider steady-state heat conduction across a quantum harmonic chain connected to reservoirs modeled by infinite collection of oscillators. The heat, Q, flowing across the oscillator in a time interval tau is a stochastic variable and we study the probability distribution function P(Q). We compute the exact generating function of Q at large tau and the large deviation function. The generating function has a symmetry satisfying the steady-state fluctuation theorem without any quantum corrections. The distribution P(Q) is non-Gaussian with clear exponential tails. The effect of finite tau and nonlinearity is considered in the classical limit through Langevin simulations. We also obtain the prediction of quantum heat current fluctuations at low temperatures in clean wires.


Journal ArticleDOI
TL;DR: In this article, a localized spectral analysis based on wavelet bases rather than on periodic ones has been applied to decompose the instantaneous clearness index signal into a set of orthonormal subsignals.

Journal ArticleDOI
TL;DR: The core of the methodology is the novel concept of a "probabilistically constrained rectangle", which permits effective pruning/validation of nonqualifying/qualifying data; a new index structure called the U-tree is developed to minimize the query overhead.
Abstract: In an uncertain database, every object o is associated with a probability density function, which describes the likelihood that o appears at each position in a multidimensional workspace. This article studies two types of range retrieval fundamental to many analytical tasks. Specifically, a nonfuzzy query returns all the objects that appear in a search region rq with at least a certain probability tq. On the other hand, given an uncertain object q, fuzzy search retrieves the set of objects that are within distance eq from q with no less than probability tq. The core of our methodology is a novel concept of “probabilistically constrained rectangle”, which permits effective pruning/validation of nonqualifying/qualifying data. We develop a new index structure called the U-tree for minimizing the query overhead. Our algorithmic findings are accompanied with a thorough theoretical analysis, which reveals valuable insight into the problem characteristics, and mathematically confirms the efficiency of our solutions. We verify the effectiveness of the proposed techniques with extensive experiments.
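A brute-force sketch of the nonfuzzy query predicate, for intuition only: estimate by Monte Carlo the probability that an uncertain object falls inside an axis-aligned search region and compare it with the threshold t_q. The Gaussian object is an illustrative assumption, and the paper's actual contribution, pruning such checks with probabilistically constrained rectangles and the U-tree, is not implemented here.

```python
import numpy as np

def prob_in_region(sampler, region_lo, region_hi, n=100_000, rng=None):
    """Monte Carlo estimate of P(uncertain object lies in an axis-aligned region)."""
    rng = rng or np.random.default_rng(0)
    pts = sampler(n, rng)
    inside = np.all((pts >= region_lo) & (pts <= region_hi), axis=1)
    return float(inside.mean())

# Uncertain object: a 2-D Gaussian around (0, 0); query region [0,1] x [0,1].
sampler = lambda n, rng: rng.multivariate_normal([0.0, 0.0], 0.25 * np.eye(2), n)
p = prob_in_region(sampler, np.array([0.0, 0.0]), np.array([1.0, 1.0]))
t_q = 0.2                                  # query probability threshold
print(f"appearance probability {p:.3f}:", "qualifies" if p >= t_q else "does not qualify")
```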

Journal ArticleDOI
TL;DR: The fundamental solution for the Cauchy problem is interpreted as a probability density of a self-similar non-Markovian stochastic process related to a phenomenon of sub-diffusion (the variance grows in time sub-linearly).

Proceedings ArticleDOI
01 Jul 2007
TL;DR: A method is proposed for producing the complete predictive probability density function (PDF) based on kernel density estimation techniques; the preliminary results show that this method is on a par with a state-of-the-art one while being fast and producing the complete PDF.
Abstract: Wind power forecasting tools have been developed for some time. The majority of such tools usually provides single-valued (spot) predictions. Such predictions limit the use of the tools for decision-making under uncertainty. In this paper we propose a method for producing the complete predictive probability density function (PDF). The method is based on kernel density estimation techniques. The preliminary results show that this method is on a par with a state-of-the-art one while being fast and producing the complete PDF. The results were obtained using real data from three French wind farms.
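A minimal sketch of a kernel-density predictive PDF in this spirit: collect historical power outputs observed under conditions similar to the forecast conditions and fit a KDE to them. The synthetic power curve, the similarity window, and the default bandwidth are illustrative assumptions, not the paper's estimator.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)

# Synthetic history: forecast wind speed and the normalized power observed.
wind_speed = rng.uniform(0.0, 20.0, 5000)
power = np.clip((wind_speed / 12.0) ** 3 + rng.normal(0.0, 0.05, 5000), 0.0, 1.0)

# Predictive PDF for a new forecast: fit a KDE to the power values observed
# in historically similar situations (here, similar forecast wind speed).
forecast_speed = 9.0
similar = np.abs(wind_speed - forecast_speed) < 1.0
predictive_pdf = gaussian_kde(power[similar])

grid = np.linspace(0.0, 1.0, 201)
print("mode of predictive PDF:", grid[np.argmax(predictive_pdf(grid))])
print("P(power > 0.5):", float(predictive_pdf.integrate_box_1d(0.5, 1.0)))
```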

Journal ArticleDOI
TL;DR: In this article, the authors study nonparametric maximum likelihood estimation of a log-concave probability density and its distribution and hazard function, and show that the rate of convergence with respect to the supremum norm on a compact interval for the density and hazard rate estimator is at least $(\log(n)/n)^{1/3}$ and typically $(\log(n)/n)^{2/5}$.
Abstract: We study nonparametric maximum likelihood estimation of a log-concave probability density and its distribution and hazard function. Some general properties of these estimators are derived from two characterizations. It is shown that the rate of convergence with respect to supremum norm on a compact interval for the density and hazard rate estimator is at least $(\log(n)/n)^{1/3}$ and typically $(\log(n)/n)^{2/5}$, whereas the difference between the empirical and estimated distribution function vanishes with rate $o_{\mathrm{p}}(n^{-1/2})$ under certain regularity assumptions.

Journal ArticleDOI
TL;DR: In this article, the skeleton formalism is generalized to 3D density fields and a numerical method for computing a local approximation of the skeleton is presented and validated on Gaussian random fields.
Abstract: The skeleton formalism, which aims at extracting and quantifying the filamentary structure of our Universe, is generalized to 3D density fields. A numerical method for computing a local approximation of the skeleton is presented and validated here on Gaussian random fields. It involves solving the equation H∇ρ × ∇ρ = 0, where ∇ρ and H are the gradient and Hessian matrix of the field. This method traces well the filamentary structure in 3D fields such as those produced by numerical simulations of the dark matter distribution on large scales, and is insensitive to monotonic biasing. Two of its characteristics, namely its length and differential length, are analysed for Gaussian random fields. Its differential length per unit normalized density contrast scales like the probability distribution function of the underlying density contrast times the total length times a quadratic Edgeworth correction involving the square of the spectral parameter. The total length scales like the inverse square smoothing length, with a scaling factor given by 0.21 (5.28 + n), where n is the power index of the underlying field. This dependency implies that the total length can be used to constrain the shape of the underlying power spectrum, hence the cosmology. Possible applications of the skeleton to galaxy formation and cosmology are discussed. As an illustration, the orientation of the spin of dark haloes and the orientation of the flow near the skeleton is computed for cosmological dark matter simulations. The flow is laminar along the filaments, while spins of dark haloes within 500 kpc of the skeleton are preferentially orthogonal to the direction of the flow at a level of 25 per cent.
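The sketch below only evaluates the local skeleton criterion quoted above on a sampled 3-D field: compute ∇ρ and the Hessian H by finite differences and form S = H∇ρ × ∇ρ, which vanishes on the skeleton. The Gaussian-blob field and the tolerance are illustrative; the tracing algorithm of the paper is not implemented.

```python
import numpy as np

# Sample a smooth 3-D density field (two Gaussian blobs, illustrative only).
n = 64
x = np.linspace(-3.0, 3.0, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
rho = np.exp(-(X**2 + Y**2 + Z**2)) + 0.5 * np.exp(-((X - 1)**2 + Y**2 + Z**2))

# Finite-difference gradient and Hessian of the field.
grad = np.stack(np.gradient(rho, x, x, x))                          # (3, n, n, n)
hess = np.stack([np.stack(np.gradient(g, x, x, x)) for g in grad])  # (3, 3, n, n, n)

# Local skeleton criterion: S = H grad(rho) x grad(rho) vanishes on the skeleton.
Hg = np.einsum("ij...,j...->i...", hess, grad)
S = np.cross(Hg, grad, axisa=0, axisb=0, axisc=0)
print("fraction of cells with |S| below 1e-4:",
      float((np.linalg.norm(S, axis=0) < 1e-4).mean()))
```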

Journal ArticleDOI
Jie Li, Jianbing Chen
TL;DR: In this paper, a strategy for determining representative point sets through the number theoretical method (NTM) in the analysis of nonlinear stochastic structures is proposed; combined with the probability density evolution method, which is applicable to general nonlinear structures involving random parameters, it is capable of capturing the instantaneous probability density function.
Abstract: A strategy of determining representative point sets through the number theoretical method (NTM) in analysis of nonlinear stochastic structures is proposed. The newly developed probability density evolution method, applicable to general nonlinear structures involving random parameters, is capable of capturing the instantaneous probability density function. In the present paper, the NTM is employed to pick out the representative point sets in a hypercube, i.e., the multi-dimensional random parameters space. Further, a hyper-ball is imposed on the point sets to greatly reduce the number of the finally selected points. The accuracy of the proposed method is ensured in that the error estimate is proved. Numerical examples are studied to verify and validate the proposed method. The investigations indicate that the proposed method is of fair accuracy and efficiency, with the computational efforts of a problem involving multiple random parameters reduced to the level of that involving only one single random parameter.
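A small sketch of the two selection steps described in the abstract: generate a number-theoretic (good lattice point) set in the unit hypercube, map it to the random-parameter space, and retain only the points inside a hyper-ball. The generating vector, the mapping to standard normals, and the ball radius are illustrative assumptions, not values tabulated in the paper.

```python
import numpy as np
from scipy.stats import norm

def good_lattice_points(n_points, generating_vector):
    """Number-theoretic (good lattice point) set in the unit hypercube:
    x_k = frac(k * h / n), k = 1..n, for a generating vector h."""
    k = np.arange(1, n_points + 1)[:, None]
    h = np.asarray(generating_vector, dtype=float)[None, :]
    return np.mod(k * h / n_points, 1.0)

n, h = 610, [1, 233, 377]                 # illustrative generating vector
pts = good_lattice_points(n, h)           # representative points in [0, 1]^3

# Map to the random-parameter space (standard normals here) and keep only
# the points inside a hyper-ball, mirroring the sieving step of the paper.
theta = norm.ppf(np.clip(pts, 1e-9, 1 - 1e-9))
radius = 3.0                              # illustrative ball radius
selected = theta[np.linalg.norm(theta, axis=1) <= radius]
print(len(selected), "of", n, "representative points retained")
```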

Journal ArticleDOI
TL;DR: Given a set of observations, a nonparametric estimate of the underlying density function is constructed, and subsets of points with high density are formed through suitable manipulation of the associated Delaunay triangulation.
Abstract: Although Hartigan (1975) had already put forward the idea of connecting identification of subpopulations with regions with high density of the underlying probability distribution, the actual development of methods for cluster analysis has largely shifted towards other directions, for computational convenience. Current computational resources allow us to reconsider this formulation and to develop clustering techniques directly in order to identify local modes of the density. Given a set of observations, a nonparametric estimate of the underlying density function is constructed, and subsets of points with high density are formed through suitable manipulation of the associated Delaunay triangulation. The method is illustrated with some numerical examples.
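A minimal sketch of the idea: estimate the density nonparametrically, keep the high-density points, and group them through connectivity taken from the Delaunay triangulation. The threshold, the data, and the simple connected-components step are illustrative; the paper manipulates the triangulation more carefully.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.stats import gaussian_kde

rng = np.random.default_rng(10)
data = np.vstack([rng.normal([0.0, 0.0], 0.4, (150, 2)),
                  rng.normal([5.0, 5.0], 0.4, (150, 2))])

# Keep the points whose estimated density is high (top 60% here).
dens = gaussian_kde(data.T)(data.T)
keep_set = set(np.where(dens > np.quantile(dens, 0.4))[0].tolist())

# Adjacency among the kept points, taken from the Delaunay triangulation.
tri = Delaunay(data)
adj = {i: set() for i in keep_set}
for simplex in tri.simplices:
    for a in simplex:
        for b in simplex:
            if a != b and a in keep_set and b in keep_set:
                adj[int(a)].add(int(b))

# Connected components of the high-density Delaunay graph define the clusters;
# with well-separated groups this typically recovers the two modes.
labels, n_clusters = {}, 0
for start in keep_set:
    if start in labels:
        continue
    stack = [start]
    while stack:
        node = stack.pop()
        if node in labels:
            continue
        labels[node] = n_clusters
        stack.extend(adj[node].difference(labels))
    n_clusters += 1
print("clusters found among high-density points:", n_clusters)
```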

Journal ArticleDOI
P. Reegen
TL;DR: SigSpec as discussed by the authors is based on an analytical solution of the probability that a DFT peak of a given amplitude does not arise from white noise in a non-equally spaced data set.
Abstract: Identifying frequencies with low signal-to-noise ratios in time series of stellar photometry and spectroscopy, and measuring their amplitude ratios and peak widths accurately, are critical goals for asteroseismology. These are also challenges for time series with gaps or whose data are not sampled at a constant rate, even with modern Discrete Fourier Transform (DFT) software. Also the False-Alarm Probability introduced by Lomb and Scargle is an approximation which becomes less reliable in time series with longer data gaps. A rigorous statistical treatment of how to determine the significance of a peak in a DFT, called SigSpec, is presented here. SigSpec is based on an analytical solution of the probability that a DFT peak of a given amplitude does not arise from white noise in a non-equally spaced data set. The underlying Probability Density Function (PDF) of the amplitude spectrum generated by white noise can be derived explicitly if both frequency and phase are incorporated into the solution. In this paper, I define and evaluate an unbiased statistical estimator, the "spectral significance", which depends on frequency, amplitude, and phase in the DFT, and which takes into account the time-domain sampling. I also compare this estimator to results from other well established techniques and demonstrate the effectiveness of SigSpec with a few examples of ground- and space-based photometric data, illustrating how SigSpec deals with the effects of noise and time-domain sampling in determining significant frequencies.

Journal ArticleDOI
TL;DR: It is shown explicitly that all the observed quantities depend both on the threshold value and system size, and hence there is no simple scaling observed.
Abstract: We study the statistics of return intervals between events above a certain threshold in multifractal data sets without linear correlations. We find that nonlinear correlations in the record lead to a power-law (i) decay of the autocorrelation function of the return intervals, (ii) increase in the conditional return period, and (iii) decay in the probability density function of the return intervals. We show explicitly that all the observed quantities depend both on the threshold value and system size, and hence there is no simple scaling observed. We also demonstrate that this type of behavior can be observed in real economic records and can be used to improve considerably risk estimation.
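For reference, a short sketch of the basic quantities studied: return intervals between threshold exceedances and their autocorrelation. On an uncorrelated record these intervals are essentially uncorrelated, whereas the multifractal records analysed in the paper produce the power-law decays listed above. The quantile threshold and record length are illustrative.

```python
import numpy as np

def return_intervals(x, quantile=0.95):
    """Waiting times between consecutive exceedances of an empirical quantile."""
    threshold = np.quantile(x, quantile)
    event_times = np.nonzero(x > threshold)[0]
    return np.diff(event_times)

def autocorr(r, max_lag=5):
    """Sample autocorrelation of the return-interval sequence."""
    r = (r - r.mean()) / r.std()
    return np.array([np.mean(r[:-k] * r[k:]) for k in range(1, max_lag + 1)])

rng = np.random.default_rng(11)
x = rng.normal(size=200_000)              # uncorrelated record, for contrast
r = return_intervals(x, 0.95)
print("mean return interval:", r.mean())  # ~ 1 / (1 - 0.95) = 20
print("lag-1 autocorrelation:", autocorr(r)[0])
```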