
Showing papers on "Cumulative distribution function published in 1989"


Journal ArticleDOI
TL;DR: In this article, a class of models indexed by two shape parameters is introduced, both to extend the scope of the standard logistic model to asymmetric probability curves and to improve the fit in the noncentral probability regions.

71 citations


Proceedings ArticleDOI
11 Jun 1989
TL;DR: A load control method and load monitor function are proposed for continuous bit-rate-oriented (CBO) and variable bit-rate (VBR) services in asynchronous transfer mode networks and the need to implement a monitor function to verify the assumptions made by the load control is demonstrated.
Abstract: A load control method and load monitor function are proposed for continuous bit-rate-oriented (CBO) and variable bit-rate (VBR) services in asynchronous transfer mode networks. The allocation algorithm features low complexity and optimizes quality of service and statistical multiplexing gain for variable bit-rate sources. The need to implement a monitor function to verify the assumptions made by the load control is demonstrated. Every single connection is monitored and the negative cumulative distribution of the cell rate is compared with an envelope. The load monitor function ensures the proper functioning of the load control mechanisms under all input conditions.
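The monitoring step this abstract describes can be sketched as an empirical check of the cell-rate exceedance distribution against a declared envelope. This is a minimal illustration, not the paper's algorithm; the function names, thresholds, and trace values are hypothetical.

```python
def ccdf(samples, thresholds):
    """Empirical complementary CDF: fraction of samples exceeding each threshold."""
    n = len(samples)
    return [sum(1 for s in samples if s > t) / n for t in thresholds]

def within_envelope(samples, envelope):
    """envelope: list of (threshold, max_allowed_exceedance_fraction) pairs."""
    observed = ccdf(samples, [t for t, _ in envelope])
    return all(obs <= cap for obs, (_, cap) in zip(observed, envelope))

# A well-behaved cell-rate trace checked against a declared envelope.
trace = [10, 12, 9, 30, 11, 10, 45, 12, 10, 11]
env = [(20, 0.3), (40, 0.15), (60, 0.0)]
print(within_envelope(trace, env))  # True: exceedance fractions are 0.2, 0.1, 0.0
```

A source that violated its declared traffic contract would push the observed exceedance curve above the envelope at some threshold, which is the condition the monitor flags.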

68 citations


Journal ArticleDOI
TL;DR: An alternative model of soft selection is presented which has strikingly different consequences for the resemblance between relatives and the softer the threshold, the more the correlation resembles that in the underlying population.
Abstract: Martin and Wilson (1982) describe two forms of sampling bias in twin studies. One is “hard selection,” where individuals above a threshold participate, and those below do not. The second is “soft selection,” where the probability of including a pair of relatives varies over the range of the character. We present an alternative model of soft selection which has strikingly different consequences for the resemblance between relatives. In general, the softer the threshold, the more the correlation resembles that in the underlying population. Results are presented where the probability of selection equals the cumulative distribution function of a normal distribution with 10% of the variance of the selected variable. In these circumstances, soft selection usually leads to less severely attenuated correlations than truncate selection.
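The contrast between hard and soft selection can be illustrated with a small simulation, assuming a bivariate normal pair with correlation 0.6. The selection rules below are simplified stand-ins for the paper's model (hard: keep if x > 0; soft: keep with probability equal to the standard normal CDF of x), chosen only to show the direction of the attenuation.

```python
import math, random

random.seed(7)

def phi_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return sxy / (sx * sy)

rho, n = 0.6, 40000
pairs = [(z0, rho * z0 + math.sqrt(1 - rho ** 2) * random.gauss(0, 1))
         for z0 in (random.gauss(0, 1) for _ in range(n))]

hard = [(x, y) for x, y in pairs if x > 0]                         # hard threshold
soft = [(x, y) for x, y in pairs if random.random() < phi_cdf(x)]  # soft: P(select) = Phi(x)

r_full = pearson([x for x, _ in pairs], [y for _, y in pairs])
r_hard = pearson([x for x, _ in hard], [y for _, y in hard])
r_soft = pearson([x for x, _ in soft], [y for _, y in soft])
print(r_hard < r_soft < r_full)  # soft selection attenuates the correlation less
```

The simulation reproduces the qualitative claim: the softer the selection rule, the closer the observed correlation stays to the population value.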

60 citations


Book ChapterDOI
01 Jan 1989
TL;DR: In this article, the cumulative distribution function of a parabolic function of independent standard normal random variables is computed by inversion of the corresponding characteristic function using the saddle-point method in conjunction with the trapezoidal rule.
Abstract: The probability density function and the cumulative distribution function of a parabolic function of independent standard normal random variables are computed by inversion of the corresponding characteristic function. The method uses the saddle-point method in conjunction with the trapezoidal rule. The result is useful in second order reliability analysis.
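A simplified sketch of recovering a CDF by inverting the characteristic function: the Gil-Pelaez formula evaluated with a plain midpoint rule. The paper combines the trapezoidal rule with a saddle-point method for a general parabolic function; here the quadratic form is specialized to a sum of two squared standard normals (a chi-square with 2 degrees of freedom) so the numerical answer can be checked against 1 - exp(-x/2).

```python
import cmath, math

def chi2_cf(t, df=2):
    """Characteristic function of a chi-square variable (sum of df squared normals)."""
    return (1 - 2j * t) ** (-df / 2)

def cdf_by_inversion(x, cf, h=0.005, n=40000):
    """Gil-Pelaez inversion, F(x) = 1/2 - (1/pi) * integral of Im(e^{-itx} cf(t))/t dt,
    approximated on (0, n*h] with a midpoint rule (midpoints avoid t = 0)."""
    total = 0.0
    for k in range(n):
        t = h * (k + 0.5)
        total += (cmath.exp(-1j * t * x) * cf(t)).imag / t
    return 0.5 - (h / math.pi) * total

val = cdf_by_inversion(2.0, chi2_cf)
print(val)  # exact value is 1 - exp(-1), roughly 0.6321
```

A saddle-point deformation of the integration path, as in the paper, damps the oscillatory integrand and allows far fewer quadrature nodes than this naive version needs.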

56 citations


Journal ArticleDOI
TL;DR: In this article, the authors demonstrate that naive application of the usual derivative formula to estimate changes in probabilities caused by changes in independent variables in probabilistic choice models can yield nonsensical results.
Abstract: This note demonstrates that naive application of the usual derivative formula to estimate changes in probabilities caused by changes in independent variables in probabilistic choice models can yield nonsensical results. This can occur because the usual derivative formula does not constrain the estimated change in probability. When the probabilities are large, the usual derivative expression can lead to violations of the constraint that probabilities must sum to one. A better approach is to use the cumulative probability functions to calculate the difference in probabilities. Since this answer is based on the cumulative probability function, the answer cannot be implausibly large in absolute value. This alternative calculation eliminates the inconsistencies that are possible when the derivative expression is used.
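The note's point can be reproduced numerically for a probit model, where P(y=1) = Phi(a + b*x): for a large change in x starting from a point where the probability is already high, the derivative formula pushes the estimated probability past 1, while the CDF difference stays valid by construction. The coefficients below are illustrative, not from the paper.

```python
import math

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def norm_pdf(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

a, b = 1.0, 2.0     # probit coefficients (illustrative)
x0, dx = 0.5, 1.5   # starting point and a large change in x

linear = norm_pdf(a + b * x0) * b * dx                   # usual derivative formula
exact = norm_cdf(a + b * (x0 + dx)) - norm_cdf(a + b * x0)  # CDF difference

print(norm_cdf(a + b * x0) + linear > 1.0)   # True: derivative formula overshoots 1
print(0 <= norm_cdf(a + b * x0) + exact <= 1)  # True: CDF difference stays a probability
```

Because the exact change is a difference of two CDF values, it is automatically bounded, which is exactly the constraint the linearized estimate ignores.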

53 citations



Journal ArticleDOI
TL;DR: In this article, the limiting cumulative distribution and probability density functions of the least-squares estimator in a first-order autoregressive regression when the true model is near-integrated in the sense of Phillips are tabulated.
Abstract: We tabulate the limiting cumulative distribution and probability density functions of the least-squares estimator in a first-order autoregressive regression when the true model is near-integrated in the sense of Phillips. The results are obtained using an exact numerical method which integrates the appropriate limiting moment generating function. The adequacy of the approximation is examined for various first-order autoregressive processes with a root close to unity.

48 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider the case where a decision maker wishes to maximise E[u(z(x, b))] where x is a random variable and b is a decision variable.
Abstract: A decision maker wishes to pick b to maximise E[u(z(x, b))] where x is a random variable. A shift in the probability density function of x from f(x) to g(x) is considered, where g(x) can be interpreted as representing greater uncertainty. A set of conditions on the behavior of the ratio of f(x) to g(x) over the support of x is derived which is sufficient to sign the effect of the shift on the decision variable. In most decision making models there is an element of uncertainty about the outcome. An important question is how changes in the distribution of beliefs about the underlying variables will affect the optimal value of some choice variable. In particular it is of interest to analyse whether any general statements can be made about shifts in the distributions which allow the sign of the effect on the decision variable to be analysed. In this paper we consider the case where a decision maker wishes to maximise E[u(z(x, b))] where x is a random variable and b is a decision variable. For this model the results of Rothschild and Stiglitz (1971) cannot in general be used, as shown by Meyer and Ormiston (1983). Meyer and Ormiston derive a set of sufficient conditions on changes in the distribution of x which permit the effect on the optimal choice of b to be signed. We show that these conditions are unduly restrictive and derive a less stringent set of sufficient conditions. The direct argument of the decision maker's utility function, z, is assumed to be a function of a random variable, x, and the decision variable, b. In the analysis which follows we assume that z(x, b) is such that z_x > 0, z_bx ≥ 0, z_bb ≤ 0, and u''(z) ≤ 0, i.e. the decision maker is risk neutral or risk averse. The random variable, x, is assumed to have an initial distribution characterised by the cumulative distribution function (c.d.f.) F(x) with associated probability density function (p.d.f.), or probability function, f(x). The comparative statics of a shift to a c.d.f. G(x) with associated p.d.f. g(x) are analysed. In the analysis x is assumed to be a continuous random variable, but Lemma 1 and Theorem 1 can be adapted for the case where x is discrete. G(x) is assumed to represent a distribution which can be considered to represent greater uncertainty than F(x). The supports of x under G(x) are [x1, x6] and under F(x) are [x2, x5], where x1 ≤ x2 < x5 ≤ x6. If the strict inequalities hold, extreme values of x which cannot occur under F(x) become possible under G(x). Meyer and Ormiston

46 citations


Journal ArticleDOI
TL;DR: In this paper, a new statistical theory involving the gamma distribution is presented for the local average rate of dissipation, which leads to results for the high-order moments of the velocity structure function that lie between those predicted by the lognormal model and β model and closely follow those of the multifractal model.
Abstract: A new statistical theory involving the gamma distribution is presented for the local average rate of dissipation εr. The gamma distribution for εr leads to results for the high-order moments of the velocity structure function that lie between those predicted by the lognormal model and β model and closely follow those of the multifractal model. Comparisons of results predicted by the gamma distribution are made with previously published experimental data, showing excellent agreement. Based on the gamma distribution for εr, the one-dimensional K distribution is developed as a plausible model for the distribution of energy dissipation ε. Some laboratory measurements of velocity fluctuations and certain derived quantities from a modified airjet are also discussed. A good agreement is found by making comparisons of the measured cumulative probability distribution of the squared time derivative of the streamwise velocity component with that predicted by this new model.

44 citations


Journal ArticleDOI
TL;DR: In this article, the Breslow-Crowley bound for the difference between the empirical cumulative hazard and the Kaplan-Meier cumulative hazard estimators of the survival function was used to derive a Berry-Esseen bound for U-statistics.
Abstract: The Berry-Esseen bound for U-statistics, established by Helmers and Van Zwet (1982), is combined with the Breslow-Crowley (1974) bounds for the difference between the empirical cumulative hazard and the Kaplan-Meier cumulative hazard estimators of the survival function to derive a Berry-Esseen bound for the Kaplan-Meier estimator. We show that there exists an absolute quantity K such that the absolute difference between the standardized distribution function of the Kaplan-Meier estimator at a fixed time point t and the standard normal cumulative distribution function is bounded above by a quantity involving K, where S(·) is the survival function and σ1 is defined in Lemma 1.

43 citations


Journal ArticleDOI
K. Pahlavan, S.J. Howard1
TL;DR: In this article, the cumulative distribution function (CDF) of the 3dB bandwidth of the frequency correlation function is presented, and experimental results relating the RMS delay spread of the channel and the inverse of the 3dB width of the correlation function are given.
Abstract: Frequency responses of the indoor radio channel for 128 locations in an office and a research laboratory are analysed. Some statistics on the number of fades are determined. The cumulative distribution function (CDF) of the 3dB bandwidth of the frequency correlation function is presented, and experimental results relating the RMS delay spread of the channel and the inverse of the 3dB width of the frequency correlation function are given.

Journal ArticleDOI
TL;DR: In this article, it was shown that a random variable X having an inverse-Gaussian or gamma distribution can be written as a linear combination of X and a chi-square random variable and, conversely, X can be characterized through this relationship.
Abstract: For the case where Y is a length-biased random variable corresponding to a random variable X having an inverse-Gaussian or gamma distribution, it is shown that Y can be written as a linear combination of X and a chi-square random variable and, conversely, X can be characterized through this relationship. Finally, the Wald distribution is characterized.

Book ChapterDOI
01 Jan 1989
TL;DR: In the past twenty years, the term stochastic dominance (SD) has been used by economists to describe a particular set of rules for ranking random variables as discussed by the authors, which apply to pairs of random variables, and indicate when one is to be ranked higher than the other by specifying a condition which the difference between their cumulative distribution functions (CDF) must satisfy.
Abstract: During the past twenty years, the term stochastic dominance (SD) has been used by economists to describe a particular set of rules for ranking random variables. These rules apply to pairs of random variables, and indicate when one is to be ranked higher than the other by specifying a condition which the difference between their cumulative distribution functions (CDF) must satisfy. Various SD ranking procedures have been employed in both empirical and theoretical analysis. First degree stochastic dominance, second degree stochastic dominance, and Rothschild and Stiglitz’ definition of increasing risk are prominent examples.
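The ranking rules the chapter describes can be written directly against empirical CDFs: first degree stochastic dominance requires one CDF to lie everywhere below the other, and second degree dominance requires the running integral of the CDF difference to stay non-positive. This is a minimal grid-based sketch, not an efficient implementation.

```python
def ecdf(sample, x):
    """Empirical CDF of a sample evaluated at x."""
    return sum(1 for s in sample if s <= x) / len(sample)

def first_degree_sd(a, b, grid):
    """A dominates B (FSD) if F_A(x) <= F_B(x) at every grid point."""
    return all(ecdf(a, x) <= ecdf(b, x) for x in grid)

def second_degree_sd(a, b, grid):
    """A dominates B (SSD) if the running integral of F_A - F_B stays <= 0."""
    acc, step = 0.0, grid[1] - grid[0]
    for x in grid:
        acc += (ecdf(a, x) - ecdf(b, x)) * step
        if acc > 1e-12:
            return False
    return True

better = [2, 3, 4, 5]   # every outcome shifted up by 1
worse = [1, 2, 3, 4]
grid = [i * 0.1 for i in range(0, 61)]
print(first_degree_sd(better, worse, grid))  # True
print(first_degree_sd(worse, better, grid))  # False
```

First degree dominance implies second degree dominance, which the shifted example above also satisfies; Rothschild and Stiglitz' increasing-risk ordering corresponds to the SSD-style integral condition between distributions with equal means.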

Book ChapterDOI
G. A. Whitmore1
01 Jan 1989
TL;DR: In this paper, a decision maker with utility function u(x) for wealth x assigns the following subjective value to an uncertain prospect with cumulative distribution function F(x), assuming that wealth level x is positive and that prospect F has moments of all orders.
Abstract: According to the expected utility axioms, a decision maker with utility function u(x) for wealth x assigns the following subjective value to an uncertain prospect with cumulative distribution function F(x). $$ E(u;F) = \int_0^\infty u(x)\,dF(x) $$ (1) It is assumed here that wealth level x is positive and that prospect F has moments of all orders.
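For a discrete prospect, the integral in (1) reduces to a probability-weighted sum, which makes the consequence of risk aversion easy to verify numerically. The utility function and lottery below are illustrative choices, not from the chapter.

```python
import math

def expected_utility(prospect, u):
    """E(u;F) for a discrete prospect given as (outcome, probability) pairs."""
    return sum(p * u(x) for x, p in prospect)

u = math.log  # a concave (risk-averse) utility over positive wealth
safe = [(100.0, 1.0)]                    # certain wealth of 100
lottery = [(50.0, 0.5), (150.0, 0.5)]    # same expected wealth of 100

print(expected_utility(safe, u) > expected_utility(lottery, u))  # True, by Jensen's inequality
```

Both prospects have the same mean, but the concave u penalizes the spread, so the certain outcome carries the higher subjective value E(u;F).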

Journal ArticleDOI
TL;DR: In this article, a new method for evaluating the probability density function of the mean energy noise level is proposed; it draws on the statistical properties of the noise of freely flowing cars and is based on the hypothesis of Poisson-type traffic flow.

Journal ArticleDOI
TL;DR: In this article, a table of the exact percentage points of a weighted form of the Kolmogorov-Smirnov statistic is given; these points are required for the best of the confidence bands for the shift function Δ(x) defined by F(x)=G(x+Δ(x)).
Abstract: Let F and G be the cumulative distribution functions corresponding to two independent random variables. Define the shift function, Δ(x), by F(x)=G(x+Δ(x)). Doksum and Sievers (1976) compared two confidence bands for Δ(x). The confidence band they found to be best requires the percentage points of a weighted form of the Kolmogorov-Smirnov statistic. The goal in this paper is to supply a table of some of the exact percentage points.

Journal ArticleDOI
TL;DR: In this article, the time to failure of a cracked component is modeled as a random variable and the analysis is cast as a first-passage-time problem.

Journal ArticleDOI
Kevin L. Cook1
TL;DR: Billboard Top 40 singles chart data were examined to determine if the frequency distribution of artist productivity fits either of two laws of scattering, and the Kolmogorov–Smirnov Goodness of Fit test showed the relationships between the theoretical and the observed were not statistically significant for either law.
Abstract: Billboard Top 40 singles chart data were examined to determine if the frequency distribution of artist productivity fits either of two laws of scattering. Data were ranked by artist and the observed cumulative distribution functions were compared with the theoretical cumulative distribution functions for empirical laws developed by Lotka and Bradford. The Kolmogorov–Smirnov Goodness of Fit test showed the relationships between the theoretical and the observed were not statistically significant for either law. However, the marginal deviation from statistical significance may have been due to data contamination in that Billboard charts use some data manipulation rather than strictly objective sales or air play data. © 1989 John Wiley & Sons, Inc.

Book ChapterDOI
01 Jan 1989
TL;DR: In this chapter, the concept of the CDF is extended in two steps, one emphasizing the type of function which is "random" and the other its "distribution"; the first step (definition) is a technical matter needed for the second, since only measurable functions have distributions.
Abstract: All of the arithmetic associated with testing in the hyper-geometric distribution and with confidence intervals in the binomial distribution (respectively, Lessons 15, 19, Part I) was based on their CDFs. Here the extension of this latter concept will be made in two steps, one emphasizing the type of function which is “random”, the other emphasizing its “distribution”; the first step (definition) is actually a technical matter needed in the second: only measurable functions have distributions.

Book ChapterDOI
01 Jan 1989
TL;DR: In this article, the authors considered an example of a continuous variable, namely the height of a student, and summarized the heights of 50 students by a histogram, Fig. 3.1, reproduced here as Fig. 7.1.
Abstract: In Chapter 3 we considered an example of a continuous variable, namely the height of a student, and we summarized the heights of 50 students by a histogram, Fig. 3.1, reproduced here as Fig. 7.1.

Journal ArticleDOI
TL;DR: The exact probability density function of linear combinations of k=k(n) order statistics selected from the whole order statistics (L-statistic) based on a random sample of size n from the uniform distribution on [0, 1] was derived by Matsunawa.
Abstract: The exact probability density function of linear combinations of k=k(n) order statistics selected from the whole order statistics (L-statistic) based on a random sample of size n from the uniform distribution on [0, 1] was derived by Matsunawa (1985, Ann. Inst. Statist. Math., 37, 1–16). As the main expression for the density function given by Matsunawa is not complete for the general situation, we first provide the corrections for this formula. Second, we propose a simple scheme involving symbolic computing for evaluating the corrected version of the density function. The cumulative distribution function and the r-th mean of his L-statistic are also derived.

Journal ArticleDOI
01 Apr 1989
TL;DR: In this paper, the statistical stability of cumulative distributions of rainfall rate is investigated for the design of reliable radio relay systems using frequencies above 10 GHz, and the dependence of the cumulative distribution on this risk is discussed.
Abstract: Rainfall rate data acquired from the UK Meteorological Office are analysed to investigate the statistical stability of cumulative distributions of rainfall rate. Information on the statistical stability of rainfall rate is useful for the design of reliable radio relay systems using frequencies above 10 GHz. By using the rainfall rate data measured at 10 typical locations which have nearly the same long-term cumulative distributions of rainfall rate, distributions of yearly and worst-month cumulative time percentages at several rainfall rate threshold levels are derived. The distributions of the yearly and the worst-month time percentages are approximated by the Gamma distribution and log-normal distribution, respectively. It is found that the parameters used in these distributions depend on the rainfall rate threshold level. By using this empirical result, a concept of a prediction risk is introduced. The dependence of the cumulative distribution on this risk is discussed.

Journal ArticleDOI
P. Kittl1, G. Díaz1
TL;DR: In this paper, three types of random variables are considered for computing the cumulative probability of fracture or plastic deformation: (1) the properties of the material, (2) the boundary conditions, and (3) external agents, such as cyclic forces that give rise to fatigue, or earthquakes.

Proceedings ArticleDOI
01 May 1989
TL;DR: It is demonstrated that the PDF of the out-of-band IM interference is not necessarily Gaussian and that this out-of-band IM noise may lead to more frequent peaks and to higher peaks or crest factors than practical Gaussian noise sources.
Abstract: The authors study the probability distribution function (PDF) and cumulative probability distribution function (CPDF) of the IM (intermodulation) products in several nonlinearly amplified systems by computer simulations and hardware measurements. The statistics of the in-band and out-of-band IM products are found. Particular emphasis is given to the study of the CPDF of out-of-band IM products since it is shown that these can be the major cause of adjacent-channel-interference (ACI). It is demonstrated that the PDF of the out-of-band IM interference is not necessarily Gaussian and that this out-of-band IM noise may lead to more frequent peaks and to higher peaks or crest factors than practical Gaussian noise sources. Thus the BER (bit error rate) will be worse than that in the WGN (white Gaussian noise) case. On the contrary, in the case of in-band IM noise, the BER performance is better than that of the WGN case when C/N (carrier/noise ratio) is high enough.

Journal ArticleDOI
TL;DR: The method gives improved characterization of analytical data distributions, particularly in the distribution extremities, and avoids the biases from improper handling of censored data arising from measurements near the analytical detection limit.
Abstract: A numerical method was developed for estimating the shapes of unknown distributions of analytical data and for estimating the expected values of censored data points. The method is based conceptually on the normal probability plot. Data are ordered and then transformed by using a power function to achieve approximate linearity with respect to a computed normal cumulative probability scale. The exponent used in the power transformation is an index of the distribution shape, which covers a continuum on which normality is defined as d = 1 and log normality is defined as d = 0. Expected transformed values of censored points are computed from a straight line fitted to the transformed, accepted data, and these are then back-transformed to the original distribution. The method gives improved characterization of analytical data distributions, particularly in the distribution extremities. It also avoids the biases from improper handling of censored data arising from measurements near the analytical detection limit. Illustrative applications were computed for atmospheric SO2 data and for mineral concentrations in hamburgers.
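The core idea, choosing a power-transform exponent d that linearizes ordered data against normal plotting positions, can be sketched as follows. The plotting-position formula and the R-squared linearity criterion here are common simplifications and are assumed rather than taken from the published method.

```python
import math, random
from statistics import NormalDist

def power_transform(x, d):
    """Box-Cox-style power transform: d=1 keeps the shape, d -> 0 tends to log."""
    return math.log(x) if abs(d) < 1e-9 else (x ** d - 1.0) / d

def linearity_r2(sample, d):
    """R^2 of transformed order statistics regressed on normal plotting positions."""
    ys = sorted(power_transform(v, d) for v in sample)
    n = len(ys)
    xs = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy * sxy / (sxx * syy)

random.seed(3)
lognormal = [math.exp(random.gauss(0.0, 1.0)) for _ in range(400)]
# For lognormal data, the log transform (d=0) should linearize better than d=1.
print(linearity_r2(lognormal, 0.0) > linearity_r2(lognormal, 1.0))  # True
```

Once the best d is found, censored points below the detection limit can be assigned their expected values from the fitted line in transformed space and then back-transformed, which is the mechanism the abstract describes for avoiding censoring bias.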

Journal ArticleDOI
W.B. Joyce1
TL;DR: In this article, the standard deviation of the natural logarithm of the lifetime and the median of the lifetime were proposed as generic parameters for semiconductor-device lifetimes.
Abstract: In statistics and in reliability it is conventional to define a set of parameters for each cumulative distribution function. In contrast, here the use of generic parameters is considered. In particular, the standard deviation of the natural logarithm of the lifetime and the median of the lifetime are shown to be attractive generic parameters for semiconductor-device lifetimes. These two generic parameters are then evaluated for Weibull, exponential, lognormal, and nonparametric representations of the lifetime distribution.
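The two generic parameters have closed forms for the Weibull and lognormal cases mentioned in the abstract. A small sketch, using the standard shape/scale parameterizations (assumed here, not taken from the paper): for a Weibull lifetime T with shape k and scale λ, ln T equals ln λ plus (1/k) times a Gumbel variate, so the SD of ln T is π/(k√6) and the median is λ(ln 2)^(1/k).

```python
import math

def weibull_generic(shape, scale):
    """(median, SD of log-lifetime) for a Weibull(shape k, scale lambda) lifetime."""
    median = scale * math.log(2) ** (1.0 / shape)
    sigma_log = math.pi / (shape * math.sqrt(6.0))  # SD of a Gumbel variate scaled by 1/k
    return median, sigma_log

def lognormal_generic(mu, sigma):
    """(median, SD of log-lifetime) for a lognormal(mu, sigma) lifetime."""
    return math.exp(mu), sigma  # median = e^mu; log-lifetime is exactly N(mu, sigma)

med, s = weibull_generic(2.0, 1000.0)
print(round(med, 1))  # 1000 * sqrt(ln 2) ≈ 832.6
```

The exponential case is the Weibull with shape 1, for which the SD of the log-lifetime is the constant π/√6, one reason these two quantities work as distribution-free summaries.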

Book
31 Oct 1989
TL;DR: This one-of-a-kind work includes a brief introductory section which outlines the inverse Gaussian distribution and explains the tables, which are computed from the closed-form expression for the cumulative distribution function.
Abstract: The purpose of this handbook is to provide comprehensive tables of percentage points of the inverse Gaussian distribution. There is no other publication available today which condenses these tables to such an extent in a concise, straightforward manner. The inverse Gaussian distribution is not only important for determining boundary crossing probabilities of Brownian Motion, probabilities which determine the operating characteristics of many sequential sampling procedures in statistics. It is also used in quality control procedures. This one-of-a-kind work includes a brief introductory section which outlines the inverse Gaussian distribution and explains the tables. The tables are produced in a fine grid of cumulative probabilities and use the closed-form expression for the cumulative distribution function. This easy-to-use table reference also includes an excellent discussion of searching ordered tables. This handbook is a helpful, indispensable guide for all who are involved with statistics, mathematics, and computers. Mechanical engineers and physicists will find it useful also.
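The closed-form expression referred to is the standard inverse Gaussian (Wald) CDF in terms of the normal CDF Φ: F(x; μ, λ) = Φ(√(λ/x)(x/μ − 1)) + e^(2λ/μ) Φ(−√(λ/x)(x/μ + 1)). A minimal sketch (this is the textbook formula, not code from the handbook):

```python
import math
from statistics import NormalDist

def inv_gaussian_cdf(x, mu, lam):
    """Closed-form CDF of the inverse Gaussian (Wald) distribution."""
    if x <= 0:
        return 0.0
    Phi = NormalDist().cdf
    a = math.sqrt(lam / x)
    return Phi(a * (x / mu - 1.0)) + math.exp(2.0 * lam / mu) * Phi(-a * (x / mu + 1.0))

vals = [inv_gaussian_cdf(x, 1.0, 1.0) for x in (0.2, 1.0, 5.0)]
print(all(0.0 < v < 1.0 for v in vals) and vals == sorted(vals))  # True: valid and increasing
```

Percentage points, as tabulated in the handbook, are obtained by inverting this CDF numerically, e.g. by bisection on x for a target probability.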

Book ChapterDOI
01 Jan 1989
TL;DR: In this paper, the authors present a stochastic dominance technique which can be used to quantify differences in cumulative probability distributions of data, and demonstrate this technique by quantifying the probability of default as assessed by the bond market.
Abstract: The purpose of this paper is two-fold: (1) to present a stochastic dominance technique which can be used to quantify differences in cumulative probability distributions of data, and (2) to demonstrate this technique by quantifying the probability of default as assessed by the bond market. We suggest, then, that the contribution of this paper lies in its introduction of a new methodology which we then use to answer a question in economics and finance.

Proceedings ArticleDOI
20 Sep 1989
TL;DR: In this paper, a log-normal underlying distribution was chosen over a linear distribution for forward-looking infrared (FLIR) detection theory, since the threshold value is identical for both distributions and the distributions are essentially the same from 20 to 80 percent cumulative probability.
Abstract: The assumed underlying probability of detection distribution is important when trying to estimate the mean of the population from a limited sample such as that obtained from minimum resolvable temperature (MRT) tests. If the underlying population is normally distributed, then the best estimate of the population mean is the arithmetic average of the observers' individual MRT values. On the other hand, nearly all visual psychophysical data is plotted on a logarithmic scale with log intensity increments. Thus it is reasonable to assume that MRT responses should also be treated as a logarithmic response. With a log-normal distribution, the population mean is estimated from the geometric average of the observers' responses. Choosing a log-normal underlying distribution over a linear distribution does not affect existing forward-looking infrared (FLIR) detection theory, since the threshold value is identical for both distributions and the distributions are essentially the same from 20 to 80 percent cumulative probability. The linear normal distribution is not bounded and therefore, mathematically, can provide a finite probability of detection for unrealizable negative signal-to-noise ratios. Since the log-normal distribution is bounded to positive values it appears to adequately represent the real world. To determine the underlying distribution for the standard MRT four-bar target, 2700 detection responses were obtained for various signal-to-noise ratios and spatial frequencies. Although any individual observer may have a linear distribution, the resultant composite data easily fits a log-normal distribution. Furthermore, limited laboratory data on actual FLIRs also indicate that a log-normal distribution provides a better mathematical approximation. As a result, the geometric average of observer responses is recommended for determining the composite MRT value. © (1989) COPYRIGHT SPIE--The International Society for Optical Engineering.

Journal ArticleDOI
TL;DR: In this article, a class of nonparametric test statistics for testing the equality of c continuous multivariate cumulative distribution functions against the alternative that they differ in location/scale simultaneously is proposed and studied.
Abstract: A class of nonparametric test statistics for testing the equality of c continuous multivariate cumulative distribution functions against the alternative that they differ in location/scale simultaneously is proposed and studied. The asymptotic distribution theory is developed for the statistics, both under the null hypothesis and the alternative. The asymptotic relative efficiency of the statistics is considered.