
Showing papers on "Cumulative distribution function published in 1996"


Book
28 Feb 1996
TL;DR: Elements of Probability Theory as mentioned in this paper include Random Variables, Densities, and Cumulative Distribution Functions, Mathematical Expectation and Moments, Parametric Families of Density Functions, and Basic Asymptotics.
Abstract: Elements of Probability Theory.- Random Variables, Densities, and Cumulative Distribution Functions.- Mathematical Expectation and Moments.- Parametric Families of Density Functions.- Basic Asymptotics.- Sampling, Sample Moments, Sampling Distributions, and Simulation.- Elements of Point Estimation Theory.- Point Estimation Methods.- Elements of Hypothesis-Testing Theory.- Hypothesis-Testing Methods.- Appendices.- Index.

145 citations


Journal ArticleDOI
TL;DR: In this paper, an analysis of evacuation life safety in a one-room public assembly building has been carried out with regard to uncertainty and risk; the importance analysis, carried out analytically, gives data of fundamental significance for an understanding of the practical design problem.

92 citations


Journal ArticleDOI
TL;DR: In this paper, the authors characterise priors which match, up to O(n^(-1)), the posterior joint cumulative distribution function of multiple parametric functions with the corresponding frequentist cumulative distribution functions.
Abstract: SUMMARY We characterise priors which match, up to O(n^(-1)), the posterior joint cumulative distribution function of multiple parametric functions with the corresponding frequentist cumulative distribution function. This work extends and unifies the work of Ghosh & Mukerjee (1993) and Datta & Ghosh (1995a) on the topic of probability-matching priors. A set of necessary and sufficient conditions is obtained for the above characterisation. Some of these conditions depend only on the parametric functions and not on the prior. Examples are given where the joint probability matching is possible and where it is not possible.

59 citations


Journal ArticleDOI
TL;DR: Nonparametric estimators of the Kaplan-Meier and cumulative-hazard types are constructed from the theoretical results; the generalized failure rate function is used to characterize the distribution function and to obtain the properties that any function must have to be a generalized failure rate function, both for continuous and discrete random variables.
Abstract: The failure rate function is generalized for doubly-truncated random variables. This generalized function is used to characterize the distribution function and to obtain the properties that any function must have to be a generalized failure rate function, both for continuous and discrete random variables. From the theoretical results, nonparametric estimators of the Kaplan-Meier and cumulative-hazard types are constructed.

47 citations


Journal ArticleDOI
TL;DR: In this article, an extension of a data-based power estimation method presented by Collings and Hamilton (1988) was investigated, which requires no prior knowledge of the underlying population distributions beyond what is necessary to perform the Kruskal-Wallis test for a location shift.
Abstract: Power calculations of a statistical test require that the underlying population distribution(s) be completely specified. Statisticians, in practice, may not have complete knowledge of the entire nature of the underlying distribution(s) and are at a loss for calculating the exact power of the test. In such cases, an estimate of the power would provide a suitable substitute. In this paper, we are interested in estimating the power of the Kruskal-Wallis one-way analysis of variance by ranks test for a location shift. We investigated an extension of a data-based power estimation method presented by Collings and Hamilton (1988), which requires no prior knowledge of the underlying population distributions beyond what is necessary to perform the Kruskal-Wallis test for a location shift. This method utilizes bootstrapping techniques to produce a power estimate based on the empirical cumulative distribution functions of the sample data. We performed a simulation study of the extended power estimator under the conditions of k = 3 and k = 5 samples of equal sizes m = 10 and m = 20, with four underlying continuous distributions that possessed various location configurations. Our simulation study demonstrates that the Extended Average X & Y power estimation method is a reliable estimator of the power of the Kruskal-Wallis test for k = 3 samples, and ranges from mildly conservative to a mild overestimator of the true power for k = 5 samples.
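The flavor of this bootstrap power estimate can be sketched in a few lines. The sketch below is a simplified stand-in for the Collings-Hamilton-style procedure, not the authors' exact algorithm: all groups are resampled from the pooled empirical cdf, hypothesized location shifts are added, and the rejection rate of the Kruskal-Wallis test (chi-square critical value, no tie correction) is recorded. The function names and example data are illustrative.

```python
import random

def kruskal_wallis_h(samples):
    """Kruskal-Wallis H statistic (no tie correction, for brevity)."""
    pooled = sorted((x, gi) for gi, grp in enumerate(samples) for x in grp)
    n = len(pooled)
    rank_sums = [0.0] * len(samples)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(grp) for rs, grp in zip(rank_sums, samples)
    ) - 3.0 * (n + 1)

def bootstrap_power(samples, shifts, n_boot=500, crit=5.991, rng=None):
    """Estimate power for hypothesized shifts by resampling the pooled
    empirical cdf; crit = 95th percentile of chi-square with 2 d.o.f."""
    rng = rng or random.Random()
    pool = [x for grp in samples for x in grp]
    hits = 0
    for _ in range(n_boot):
        boot = [[rng.choice(pool) + d for _ in grp]
                for grp, d in zip(samples, shifts)]
        if kruskal_wallis_h(boot) > crit:
            hits += 1
    return hits / n_boot

rng = random.Random(42)
data = [[rng.gauss(0.0, 1.0) for _ in range(10)] for _ in range(3)]
power_null = bootstrap_power(data, [0.0, 0.0, 0.0], rng=rng)   # near alpha
power_shift = bootstrap_power(data, [0.0, 1.5, 3.0], rng=rng)  # large shifts
```

With no shifts the rejection rate should sit near the nominal level, while large hypothesized shifts should drive the estimate toward one.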

38 citations


Journal ArticleDOI
TL;DR: The computation of the cumulative distribution function, the complementary cdf, and the density of certain shot-noise random variables is discussed; a general method approximates samples of a cdf or ccdf by summing a Fourier series whose coefficients are modulated samples of the characteristic function.
Abstract: The computation of the cumulative distribution (cdf), the complementary cdf (ccdf), and the density of certain shot-noise random variables is discussed. After subtracting off a few terms that can be computed in closed form, what remains can be approximated by a general method for approximating samples of a cdf or ccdf by summing a Fourier series whose coefficients are modulated samples of their characteristic function. To approximate the density, a spline is fit to the cdf samples and then differentiated. When the density has corners, it is important that the spline have coincident knots at these locations. For shot-noise densities, these locations are easily identified.
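The general idea of recovering cdf samples from modulated samples of the characteristic function can be illustrated with a plain Gil-Pelaez-type inversion sum on a midpoint grid. This is a generic sketch applied to a standard normal, not the paper's shot-noise algorithm; the step size and truncation below are illustrative choices.

```python
import cmath
import math

def cdf_from_cf(phi, x, h=0.01, terms=2000):
    """Approximate F(x) = 1/2 - (1/pi) * integral of Im(e^{-itx} phi(t))/t dt
    by a sum over t_k = (k + 1/2) h; the midpoint grid avoids t = 0."""
    s = 0.0
    for k in range(terms):
        t = (k + 0.5) * h
        s += (cmath.exp(-1j * t * x) * phi(t)).imag / t
    return 0.5 - h * s / math.pi

def phi_normal(t):
    """Characteristic function of the standard normal."""
    return cmath.exp(-0.5 * t * t)

F0 = cdf_from_cf(phi_normal, 0.0)
F196 = cdf_from_cf(phi_normal, 1.96)
```

For the standard normal the sum reproduces F(0) = 0.5 exactly and F(1.96) close to 0.975.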

36 citations


Journal ArticleDOI
TL;DR: In this paper, a formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su).

34 citations


Journal ArticleDOI
TL;DR: In this paper, the authors presented an easily applied statistical methodology for deriving the theoretical distribution of the persistence of marine environmental parameters from relatively short records, based on the assumption that the process may be regarded as a stationary first order Markov process whose transition matrix may be established from the given record.
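As an illustrative sketch of that idea (not the paper's full methodology), one can estimate the self-transition probability of a two-state (above/below threshold) first-order Markov chain from a short record; the persistence of an exceedance is then geometric. The function below is a hypothetical helper, not from the paper.

```python
def persistence_cdf(record, threshold, k_max=20):
    """Estimate P(stay above threshold | currently above) from a record,
    then return P(run length <= k), k = 1..k_max, for a run that has
    started; for a first-order two-state chain the run length is
    geometric: P(L = k) = p^(k-1) * (1 - p)."""
    above = [x > threshold for x in record]
    stays = trans = 0
    for a, b in zip(above, above[1:]):
        if a:                 # transition starting from the "above" state
            trans += 1
            stays += b
    p = stays / trans if trans else 0.0
    return [1.0 - p ** k for k in range(1, k_max + 1)]

record = [0.2, 0.8, 0.9, 0.1, 0.7, 0.8, 0.9, 0.2]
cdf = persistence_cdf(record, 0.5)
```

In the toy record above, 3 of the 5 transitions out of the "above" state stay above, so p = 0.6 and P(L <= 1) = 0.4.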

29 citations


Journal ArticleDOI
TL;DR: In this article, a new measure of ordinal variation, the LSQ, is developed using a geometric representation involving the cumulative distribution function, and connections among it and previously suggested measures, the LOV, IOV, and COV, are clarified.
Abstract: A new measure of ordinal variation, the LSQ, is developed using a geometric representation involving the cumulative distribution function. Connections among it and previously suggested measures, the LOV, IOV, and COV, are clarified. This geometric perspective helps demonstrate that all these statistics measure the distance between the observed cumulative distribution and that corresponding to the maximally dispersed distribution, given the sample size and the number of categories for the ordinal variable. From this perspective, it is clear that none of these measures relies on supra-ordinal assumptions concerning intercategory distances. Recent questions concerning scale invariance and unreasonable values for these measures are also clarified.

24 citations


Journal ArticleDOI
Atsushi Hiraiwa1, Makoto Ogasawara1, Nobuyoshi Natsuaki1, Yutaka Itoh1, Hidetoshi Iwai1 
TL;DR: In this paper, a statistical model to investigate the distribution of dynamic random access memory data retention times is proposed, which assumes that the retention time is determined by a junction leakage current generated at carrier traps by a Shockley-Read-Hall process, and the trap levels are randomly distributed not only among the memory cells but also within a cell.
Abstract: A statistical model to investigate the distribution of dynamic random access memory data retention times is proposed. The model assumes that the retention time is determined by a junction leakage current generated at carrier traps by a Shockley–Read–Hall process, and that the trap levels are randomly distributed not only among the memory cells but also within a cell. Monte Carlo results based on the model were in excellent agreement with experimental results, which confirmed the validity of the model. An analytical expression of the retention time distribution was also derived, and proved to be a good approximation near the 50% cumulative probability. Based on the model, variation in the retention time distributions among samples was found to be related to different trap-level distributions at the SiO2/Si interface.

23 citations


Journal ArticleDOI
TL;DR: In this paper, a new construction of the Gaussian distribution is introduced and proven, which consists of using fractal interpolating functions, with graphs having increasing fractal dimensions, to transform arbitrary continuous probability measures defined over a closed interval.
Abstract: A new construction of the Gaussian distribution is introduced and proven. The procedure consists of using fractal interpolating functions, with graphs having increasing fractal dimensions, to transform arbitrary continuous probability measures defined over a closed interval. Specifically, let X be any probability measure on the closed interval I with a continuous cumulative distribution. And let f_{Θ,D}: I → R be a deterministic continuous fractal interpolating function, as introduced by Barnsley (1986), with parameters Θ and fractal dimension for its graph D. Then, the derived measure Y = f_{Θ,D}(X) tends to a Gaussian for all parameters Θ such that D → 2, for all X. This result illustrates that plane-filling fractal interpolating functions are 'intrinsically Gaussian'. It explains that close approximations to the Gaussian may be obtained transforming any continuous probability measure via a single nearly plane-filling fractal interpolator.

Journal ArticleDOI
TL;DR: In this paper, the distribution of joint spacings in a granitic massif in Saudi Arabia is found to be well-described by a power-law with characteristic exponent μ ≃ 0.5.
Abstract: The distribution of joint spacings in a granitic massif in Saudi Arabia is found to be well-described by a power-law with characteristic exponent μ ≃ 0.5. We compare the cumulative and density distributions and show how to correct the cumulative distribution for bias due to the finite sampling size. The exponent μ is close to those obtained for size distribution in fragmentation processes. We study simple models of fragmentation/jointing processes, which predict that the power law distribution must be decorated by a log-periodic modulation if the fragmentation involves a preferred ratio (even approximately so, i.e. with disorder) corresponding to an approximate discrete scale invariance. We corroborate this prediction by carrying out a more detailed analysis of the density distribution and find at least 6 log-periodic oscillations. This implies that the exponent μ possesses an imaginary part, embodying the existence of an average discrete scaling structure with preferred fragmentation ratio close to 1/2. The confidence level of this result is found better than 97% from synthetic tests.

Journal ArticleDOI
Kenneth R. Baker1
TL;DR: The probability density function used to model the statistical behavior of avalanche photodiodes (APD), known as the Webb, McIntyre and Conradi (1974) WMC density, is shown to be an inverse Gaussian density, and a closed-form solution for the cumulative distribution function exists.
Abstract: The probability density function (PDF) used to model the statistical behavior of avalanche photodiodes (APD), known as the Webb, McIntyre and Conradi (1974) WMC density, is shown to be an inverse Gaussian density. As a consequence, a closed-form solution for the cumulative distribution function exists. A closed-form for the cumulative distribution of the WMC PDF has not been presented in the literature before. With this result, alternate methods for the generation of WMC PDF distributed random variates are now available. These methods are reviewed and discussed.
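The closed form referred to here can be illustrated with the standard inverse Gaussian cdf written in terms of the normal cdf. The (mu, lambda) parameterization below is the textbook one for IG(mu, lambda), not necessarily the WMC parameter names from the paper.

```python
import math

def norm_cdf(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def inv_gauss_cdf(x, mu, lam):
    """Closed-form cdf of the inverse Gaussian distribution IG(mu, lam):
    F(x) = Phi(a (x/mu - 1)) + exp(2 lam / mu) Phi(-a (x/mu + 1)),
    with a = sqrt(lam / x)."""
    if x <= 0.0:
        return 0.0
    a = math.sqrt(lam / x)
    return (norm_cdf(a * (x / mu - 1.0)) +
            math.exp(2.0 * lam / mu) * norm_cdf(-a * (x / mu + 1.0)))
```

The closed form makes inversion-based generation of variates straightforward, e.g. by bisection on F.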

Journal ArticleDOI
TL;DR: A method that does not use numerical integration is presented for approximating the cumulative distribution of integer-valued random variables from their characteristic functions and is used to compute photomultiplier counting distributions.
Abstract: A method that does not use numerical integration is presented for approximating the cumulative distribution of integer-valued random variables from their characteristic functions. Bounds on the approximation error are also given. The method is then used to compute photomultiplier counting distributions.
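For an integer-valued random variable whose support fits in {0, ..., n-1} (up to negligible tail mass), the pmf, and hence the cdf, follows from the characteristic function by an inverse discrete Fourier transform with no numerical integration. This sketch is a generic illustration of that principle, not the paper's specific algorithm or its error bounds; the Poisson example and the truncation point n = 40 are illustrative.

```python
import cmath
import math

def pmf_from_cf(phi, n):
    """Recover p_k for an integer rv essentially supported on {0,...,n-1}:
    p_k = (1/n) * sum_j phi(2 pi j / n) * e^{-2 pi i j k / n}."""
    return [
        sum(phi(2.0 * math.pi * j / n) * cmath.exp(-2j * math.pi * j * k / n)
            for j in range(n)).real / n
        for k in range(n)
    ]

# Poisson(3) example: cf is exp(lam * (e^{it} - 1)); mass above 40 is negligible
lam = 3.0
def phi(t):
    return cmath.exp(lam * (cmath.exp(1j * t) - 1.0))

pmf = pmf_from_cf(phi, 40)
cdf = [sum(pmf[:k + 1]) for k in range(40)]
```

The only error is aliasing of the (here astronomically small) tail beyond the truncation point back onto {0, ..., 39}.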

Journal ArticleDOI
TL;DR: In this article, it was shown that the probability that G ∈ G(n, p) has no triangles is asymptotic to exp(−p^3 n^3/6 + p^5 n^4/4 − 7p^7 n^5/12) for p = o(n^(−2/3)).
Abstract: When a discrete random variable in a discrete space is asymptotically Poisson, there is often a powerful method of estimating its distribution, by calculating the ratio of the probabilities of adjacent values of the variable. The versatility of this method is demonstrated by finding asymptotically the probability that a random graph has no triangles, provided the edge density is not too large. In particular, the probability that G ∈ G(n, p) has no triangles is asymptotic to exp(−p^3 n^3/6 + p^5 n^4/4 − 7p^7 n^5/12) for p = o(n^(−2/3)), and for G ∈ G(n, m) it is asymptotic to exp(−d^3 n^3/6) for d = 2m/(n(n−1)) = o(n^(−2/3)).
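A quick seeded Monte Carlo check of the leading-order (Poisson) term is easy to run: for small n and sparse p, the probability of no triangle is close to exp(-C(n,3) p^3), which the paper's expansion refines with higher-order corrections. The parameters below are illustrative, and only the crude Poisson approximation is compared here.

```python
import math
import random
from itertools import combinations

def has_triangle(n, p, rng):
    """Sample G(n, p) and report whether it contains a triangle."""
    adj = [[False] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        adj[i][j] = adj[j][i] = rng.random() < p
    return any(adj[i][j] and adj[j][k] and adj[i][k]
               for i, j, k in combinations(range(n), 3))

rng = random.Random(7)
n, p, trials = 12, 0.08, 20000
est = sum(not has_triangle(n, p, rng) for _ in range(trials)) / trials
poisson = math.exp(-math.comb(n, 3) * p ** 3)   # leading-order approximation
```

For these parameters the expected number of triangles is about 0.11, so both numbers should sit near 0.89.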

Journal ArticleDOI
TL;DR: The techniques and expressions developed for the robust, efficient, and accurate calculation of P_N(X, Y), or equivalently, the generalized Marcum (1960) Q-function, are extended to the calculation of the noncentral chi-square distribution function and its complement.
Abstract: Generalizations of two known noncentral chi-square results are presented. The first generalization concerns extending the expression for the probability that one (first-order) Ricean random variable exceeds another to the expression that one nth-order Ricean random variable exceeds another nth-order Ricean random variable. This latter result is shown to be equivalent to the probability of one noncentral chi-square random variable with 2n degrees of freedom exceeding another. The form of the resulting expression is such that it can easily be evaluated by a recurrence relation. The second generalization deals with the fact that the noncentral chi-square distribution function, with d degrees of freedom, differs from the complementary probability of detection, 1 − P_N(X, Y), only in that the latter is restricted to even degrees of freedom. We can expand the techniques and expressions developed for the robust, efficient, and accurate calculation of P_N(X, Y), or equivalently, the generalized Marcum (1960) Q-function, to the calculation of the noncentral chi-square distribution function and its complement.

Journal ArticleDOI
TL;DR: A recent paper presented a new efficient data structure, the 'binary indexed tree', for maintaining the cumulative frequencies for arithmetic data compression, but significant problems remained with the method for determining the symbol corresponding to a known frequency.
Abstract: A recent paper presented a new efficient data structure, the "Binary Indexed Tree", for maintaining the cumulative frequencies for arithmetic data compression. While algorithms were presented for all of the necessary operations, significant problems remained with the method for determining the symbol corresponding to a known frequency. This report corrects that deficiency. This report is being filed also with the Department of Computer Science, The University of Auckland, as Technical Report 110, ISSN 1173-3500.
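The structure in question, and the cumulative-frequency-to-symbol search that the report addresses, can be sketched as follows. This is a standard Fenwick-tree implementation with a top-down descent for the lookup, offered as an illustration rather than the report's exact corrected algorithm.

```python
class FenwickTree:
    """Binary indexed tree over symbol frequencies (symbols 1..n)."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def add(self, i, delta):
        """Add delta to the frequency of symbol i."""
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)

    def cumfreq(self, i):
        """Cumulative frequency of symbols 1..i."""
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)
        return s

    def find(self, f):
        """Smallest symbol whose cumulative frequency is >= f, found by
        descending the implicit tree in O(log n) without repeated
        cumfreq calls."""
        pos, mask = 0, 1 << self.n.bit_length()
        while mask:
            nxt = pos + mask
            if nxt <= self.n and self.tree[nxt] < f:
                f -= self.tree[nxt]
                pos = nxt
            mask >>= 1
        return pos + 1
```

In arithmetic coding, `find` maps a decoded cumulative count back to its symbol; the naive alternative of binary-searching with `cumfreq` costs O(log^2 n).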

Journal ArticleDOI
TL;DR: In this article, higher-order terms in the double-saddlepoint expansion of Skovgaard for a unidimensional conditional cumulative distribution function were derived for continuous and lattice random variables.
Abstract: This paper derives higher-order terms in the double-saddlepoint expansion of Skovgaard for a unidimensional conditional cumulative distribution function. Expansions for continuous and lattice random variables are derived. Results are applied to the sufficient statistic in logistic regression.

Journal ArticleDOI
P. Wu, P. Fang, L. Wu, Q. Tao, Y. Yang 
TL;DR: In this article, a distribution function of rice yield deviations from the mean was developed from field experiments with 555 plots at 16 sites in Zhejiang province, China, for three years.
Abstract: A distribution function of rice yield deviations from the mean was developed from field experiments with 555 plots at 16 sites in Zhejiang province, China, for three years. The deviation distribution in intervals of 50 kg/ha appeared as a symmetrical distribution with a high peak (mean = 0.279 kg/ha, SD = 240.686 kg/ha). A normality test using the Kolmogorov-Smirnov test between the observed cumulative distribution and the normal cumulative distribution function indicates that the observed deviation distribution is not normal. An empirical exponential cumulative distribution function was developed. The distribution function was used to remove outliers during the development of a rice yield fertilizer response model, based on data from a non-replicated NPK field experiment.
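The Kolmogorov-Smirnov comparison used here is simply the sup distance between the empirical cdf and the candidate normal cdf. A minimal sketch (generic, not the authors' code; the three-point example data are illustrative):

```python
import math

def ks_statistic(data, cdf):
    """Sup distance between the empirical cdf of `data` and a model `cdf`,
    checked at both sides of every jump of the empirical cdf."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        d = max(d, fx - i / n, (i + 1) / n - fx)
    return d

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

d = ks_statistic([-1.0, 0.0, 1.0], normal_cdf)
```

The statistic is then compared against a critical value, roughly 1.36/sqrt(n) at the 5% level for large n.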

Journal ArticleDOI
TL;DR: This work presents a method where the measured dynamic properties of substructures are used instead as the random variables and the residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues.
Abstract: Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. We present a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

Book ChapterDOI
01 Jan 1996
TL;DR: The S-distribution as mentioned in this paper is defined in the form of a four-parameter nonlinear differential equation, with the cumulative distribution function of the survival time as the dependent variable and the survival times as the independent variable.
Abstract: The S-distribution is defined in the form of a four-parameter nonlinear differential equation, with the cumulative distribution function of the survival time as the dependent variable and the survival time as the independent variable. The first parameter characterizes the location, the second the scale, and the other two the shape of the model. The S-distribution covers the logistic distribution and the exponential distribution as special cases and approximates other common survival models with rather high precision. The S-distribution is used to classify common survival distributions within a two-dimensional space in which characteristics related to the shape of the density function and the hazard function can be studied. Nonlinear regression methods are used in the classification procedure.
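The defining differential equation can be integrated numerically to evaluate the cdf. The sketch below assumes the common form dF/dt = alpha (F^g - F^h) with F = 1/2 at the left end of the grid; parameter names are illustrative. With g = 1, h = 2 the equation reduces to the logistic law dF/dt = alpha F (1 - F), which is the special case mentioned above.

```python
def s_distribution_cdf(t_grid, alpha, g, h, f0=0.5):
    """Integrate dF/dt = alpha * (F**g - F**h) with F(t_grid[0]) = f0
    by classical fourth-order Runge-Kutta; returns F on the grid."""
    def rhs(f):
        return alpha * (f ** g - f ** h)

    out = [f0]
    f = f0
    for a, b in zip(t_grid, t_grid[1:]):
        dt = b - a
        k1 = rhs(f)
        k2 = rhs(f + 0.5 * dt * k1)
        k3 = rhs(f + 0.5 * dt * k2)
        k4 = rhs(f + dt * k3)
        f += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
        out.append(f)
    return out

# logistic special case: g = 1, h = 2 gives F(t) = 1 / (1 + exp(-alpha * t))
grid = [0.01 * i for i in range(501)]          # t in [0, 5]
F = s_distribution_cdf(grid, alpha=1.0, g=1.0, h=2.0)
```

Other (g, h) pairs trace out the two-dimensional shape classification the chapter describes.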

Journal ArticleDOI
TL;DR: In this article, a simple interpretation of multifractality in terms of global c-d-f- scaling is shown to collapse the inertial range c − d-f − into a single curve, directly related to the codimension.
Abstract: The probabilistic reformulation of the multifractal model iii is obtained directly from the structure functions written as integrals over cumulative distribution functions (c.d.f.) by the steepest descent method. The saddle point being a function of scale, we perform a change of variable to obtain expressions that are asymptotically valid in the inertial range. Starting directly from the inertial range behavior of the c-d-f-, our algorithm yields values for the scaling exponents and codimension that are identical to those obtained from structure functions. Furthermore, a simple interpretation of multifractality in terms of global c-d-f- scaling is shown to collapse the inertial range c-d-f- into a single curve, directly related to the codimension. Our method determines a new length scale, larger than the integral scale, that gives a quantitative measure of the degree of multifractality of the data. Finally, some possible future applications are mentioned.

Journal ArticleDOI
TL;DR: In this paper, a technique based on Toeplitz matrices was used to calculate the probability distribution for certain random walks on a lattice in continuous time where the walker can take steps of various sizes in each direction and where the probability of a step depends on the nature of a finite set of previous steps.
Abstract: We use a technique based on Toeplitz matrices to calculate the probability distribution for certain random walks on a lattice in continuous time where the walker can take steps of various sizes in each direction and where the probability of a step depends on the nature of a finite set of previous steps. If k(ij) is the rate constant for a step of j units given a history of type i, then we can solve the random walk problem for the special case when the sum over k(ij) is independent of j.

ReportDOI
01 Mar 1996
TL;DR: A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components as mentioned in this paper.
Abstract: A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
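The product-space structure translates computationally into a double sampling loop: an outer loop over subjective (epistemic) parameter values, an inner loop over stochastic futures given those values, each outer iteration yielding one complementary cumulative distribution function. The sketch below is a generic toy, not the WIPP PA; the exponential "release" model and all parameter ranges are invented stand-ins.

```python
import random

def ccdf(values, thresholds):
    """Complementary cdf: fraction of values strictly exceeding each threshold."""
    n = len(values)
    return [sum(v > t for v in values) / n for t in thresholds]

def family_of_ccdfs(n_subjective=20, n_stochastic=500, seed=1):
    """Outer loop: subjective uncertainty about a rate parameter.
    Inner loop: stochastic variability given that parameter.
    (Exponential release magnitudes are a stand-in consequence model.)"""
    rng = random.Random(seed)
    thresholds = [0.5 * k for k in range(1, 11)]
    curves = []
    for _ in range(n_subjective):
        rate = rng.uniform(0.5, 2.0)               # epistemically uncertain
        releases = [rng.expovariate(rate) for _ in range(n_stochastic)]
        curves.append(ccdf(releases, thresholds))
    return thresholds, curves

thresholds, curves = family_of_ccdfs()
```

The spread of the resulting family of CCDFs displays subjective uncertainty; each individual curve displays stochastic uncertainty, which is how such results are typically compared against a regulatory limit line.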

Journal ArticleDOI
Xiaoping Xiong1
TL;DR: In this article, a new method is proposed to obtain absorption probability distributions of random paths on any boundaries or barrier sets, where the random path is a sequence of partial sums of observations from the population.
Abstract: The sum of observations from a population is an important statistic. However its distribution in a sequential analysis is usually mathematically intractable when the stopping rule depends on the observed outcomes, which makes statistical analyses difficult. To overcome the difficulty, a new method is proposed to obtain absorption probability distributions of random paths on any boundaries or barrier sets, where the random path is a sequence of partial sums of observations from the population. By this method, sample spaces for sequential rules are interpreted as partitions of a common sample space. The absorption probability on the boundary is the product of a latent function and the likelihood function of the random path, where the latent function is defined as the conditional absorption probability. Latent functions for a variety of stopping rules are given as iterative convolutions over boundaries. Though this method is general, our discussion is focused on finite populations. Advantages of the method ...
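The "iterative convolutions over boundaries" idea can be shown in a toy setting (this is an illustration of the general mechanism, not the paper's construction): propagate the sub-probability distribution of the still-unabsorbed partial sums of Bernoulli observations one stage at a time, removing and recording the mass that crosses the boundary.

```python
def absorption_probs(p, boundary, n_max):
    """Partial sums S_k of Bernoulli(p) observations; the path is absorbed
    the first time S_k >= boundary[k-1]. Returns the mass absorbed at each
    stage and the residual distribution of unabsorbed paths."""
    alive = {0: 1.0}                   # S_0 = 0 with probability 1
    absorbed = []
    for k in range(1, n_max + 1):
        nxt = {}
        for s, mass in alive.items():  # convolve with one Bernoulli step
            for step, w in ((0, 1.0 - p), (1, p)):
                nxt[s + step] = nxt.get(s + step, 0.0) + mass * w
        b = boundary[k - 1]
        absorbed.append(sum(m for s, m in nxt.items() if s >= b))
        alive = {s: m for s, m in nxt.items() if s < b}
    return absorbed, alive

absorbed, alive = absorption_probs(0.5, [2, 2, 3, 3, 4], 5)
```

Total probability is conserved between the absorbed masses and the surviving distribution, which is a useful sanity check for any boundary.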

Journal ArticleDOI
TL;DR: Three idealized model probability distributions for this problem are tested and a hypergeometric distribution where the effects of mutual exclusion between particles are incorporated by a coverage dependent effective volume per particle is shown to give best agreement with simulation results.
Abstract: The distribution of the number of hard-core objects deposited in a finite region through a random sequential adsorption process is analyzed. Three idealized model probability distributions for this problem are tested by comparing them with computer simulation distributions for the random sequential adsorption of hard-disks on a flat surface. A hypergeometric distribution where the effects of mutual exclusion between particles are incorporated by a coverage dependent effective volume per particle is shown to give best agreement with simulation results. This effective volume is derived in terms of the fluctuations in the particle number. Finite size effects in the fluctuations are taken into account.
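A minimal simulation of the deposition process itself is easy to write, and its empirical counts are what the candidate distributions above would be fitted to. The sketch below uses a unit square without periodic boundaries and a fixed attempt budget; all parameter values are illustrative.

```python
import random

def rsa_disk_count(radius=0.05, attempts=2000, rng=None):
    """One random sequential adsorption run: propose uniform centers in the
    unit square and accept a disk iff it overlaps no accepted disk."""
    rng = rng or random.Random()
    centers = []
    d2 = (2.0 * radius) ** 2          # squared minimum center separation
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - a) ** 2 + (y - b) ** 2 >= d2 for a, b in centers):
            centers.append((x, y))
    return len(centers)

rng = random.Random(3)
counts = [rsa_disk_count(rng=rng) for _ in range(50)]
```

Because accepted disks exclude area for later ones, the count fluctuations come out sub-Poissonian (variance below the mean), which is the effect the coverage-dependent effective volume in the hypergeometric model captures.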

Journal ArticleDOI
B. Rauhut1
TL;DR: In this article, the authors considered the non-degenerated limit distributions for the n-fold mapping of a given probability distribution, where the mapping used for the iteration procedure is a probability generating function of a positive integer-valued random variable.
Abstract: In this paper the possible nondegenerated limit distributions for the n-fold mapping of a given probability distribution are considered. If the mapping used for the iteration procedure is a probability generating function of a positive integer-valued random variable then the results can be applied to the max-stability of distributions of random variables with random sample size.
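The n-fold mapping by a probability generating function is easy to compute directly. As a small, generic illustration (not the paper's construction), iterating a pgf from 0 converges to its smallest fixed point in [0, 1], which for a branching process is the extinction probability; the offspring distribution below is an invented example.

```python
def iterate_pgf(f, s0=0.0, n=200):
    """n-fold composition f(f(...f(s0)...)). For the pgf of a nonnegative
    integer offspring distribution, iterates from 0 converge to the
    smallest fixed point of f in [0, 1]."""
    s = s0
    for _ in range(n):
        s = f(s)
    return s

# offspring 0, 1, 2 with probabilities 1/4, 1/4, 1/2:
def f(s):
    return 0.25 + 0.25 * s + 0.5 * s * s

q = iterate_pgf(f)   # smallest root of s = f(s)
```

Here s = f(s) has roots 1/2 and 1, and the iteration settles on the smaller one.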

Journal ArticleDOI
TL;DR: In this article, the concept of driving-periodic random environment reflecting an accumulated risk is developed, which suggests that the random environment has effects on accumulating and increasing the risk that some event will happen within any specific period of length c > 0.
Abstract: The concept of driving-periodic random environment reflecting an accumulated risk is developed. We suggest that the random environment has effects on accumulating and increasing the risk that some event will happen within any specific period of length c > 0. The waiting time until the event occurs defines a random variable of a special kind whose probability distribution and additional properties are studied. The form of this probability distribution is derived, and some useful representations of related random variables are obtained. Phenomena of this type appear in a series of environmental, maintenance and financial processes. An illustrative example is briefly discussed.

Journal ArticleDOI
TL;DR: In this article, the authors present some useful results in constructing statistical confidence regions for the entire failure cumulative distribution function (cdf), F(x) from which a random sample has been drawn.

01 Aug 1996
TL;DR: A method is presented for the generation of realizations of zero-mean non-Gaussian random time histories with a specified ASD and pdf, starting from a realization of a Gaussian-distributed waveform with a known ASD.
Abstract: It is recognized that some dynamic and noise environments are characterized by time histories which are not Gaussian. An example is high intensity acoustic noise. Another example is some transportation vibration. A better simulation of these environments can be generated if a zero mean non-Gaussian time history can be reproduced with a specified auto (or power) spectral density (ASD or PSD) and a specified probability density function (pdf). After the required time history is synthesized, the waveform can be used for simulation purposes. For example, modern waveform reproduction techniques can be used to reproduce the waveform on electrodynamic or electrohydraulic shakers. Or the waveforms can be used in digital simulations. A method is presented for the generation of realizations of zero mean non-Gaussian random time histories with a specified ASD and pdf. First a Gaussian time history with the specified auto (or power) spectral density (ASD) is generated. A monotonic nonlinear function relating the Gaussian waveform to the desired realization is then established based on the Cumulative Distribution Function (CDF) of the desired waveform and the known CDF of a Gaussian waveform. The established function is used to transform the Gaussian waveform to a realization of the desired waveform. Since the transformation preserves the zero-crossings and peaks of the original Gaussian waveform, and does not introduce any substantial discontinuities, the ASD is not substantially changed. Several methods are available to generate a realization of a Gaussian distributed waveform with a known ASD. The method of Smallwood and Paez (1993) is an example. However, the generation of random noise with a specified ASD but with a non-Gaussian distribution is less well known.
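The monotonic CDF-to-CDF transform described above can be sketched as y = F_target^{-1}(Phi(x)). The sketch below is a simplified stand-in for the paper's procedure: the spectral shaping is a crude moving average (which keeps the series Gaussian) rather than a properly synthesized ASD, and the zero-mean Laplace target is an invented example of a non-Gaussian pdf.

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def laplace_inv_cdf(u, b=1.0):
    """Inverse cdf of a zero-mean Laplace distribution with scale b."""
    return -b * math.copysign(1.0, u - 0.5) * math.log(1.0 - 2.0 * abs(u - 0.5))

def non_gaussian_history(n=4096, smooth=8, seed=0):
    """1) white Gaussian series; 2) moving-average filter to shape the
    spectrum (still Gaussian, unit variance); 3) monotone transform
    Phi -> Laplace, which preserves zero crossings and peak locations."""
    rng = random.Random(seed)
    white = [rng.gauss(0.0, 1.0) for _ in range(n + smooth)]
    shaped = [sum(white[i:i + smooth]) / math.sqrt(smooth) for i in range(n)]
    out = []
    for x in shaped:
        u = min(max(norm_cdf(x), 1e-12), 1.0 - 1e-12)   # guard log(0)
        out.append(laplace_inv_cdf(u))
    return out

y = non_gaussian_history()
```

Because the transform is monotone and smooth, the shaped spectrum survives largely intact while the marginal distribution becomes exactly the heavy-tailed target.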