
Showing papers on "Cumulative distribution function published in 1972"


Book
01 Jan 1972
TL;DR: An Introduction to Probability, Design and Analysis of Single-Factor Experiments: The Analysis of Variance, and some Important Discrete Distributions.
Abstract: An Introduction to Probability. One-Dimensional Random Variables. Functions of One Random Variable and Expectation. Joint Probability Distributions. Some Important Discrete Distributions. Some Important Continuous Distributions. The Normal Distribution. Introduction to Statistics and Data Description. Random Samples and Sampling Distributions. Parameter Estimation. Tests of Hypotheses. Design and Analysis of Single-Factor Experiments: The Analysis of Variance. Design of Experiments with Several Factors. Simple Linear Regression and Correlation. Multiple Regression. Nonparametric Statistics. Statistical Quality Control and Reliability Engineering. Stochastic Processes and Queueing. Computer Simulation. Appendix. References. Answers to Selected Exercises. Index.

240 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the problem of finding the expected value of functions of a random variable X of the form f(X) = (X+A)^(−n), where X+A > 0 a.s. and n is a non-negative integer.
Abstract: We investigate the problem of finding the expected value of functions of a random variable X of the form f(X) = (X+A)^(−n), where X+A > 0 a.s. and n is a non-negative integer. The technique is to successively integrate the probability generating function and is suggested by the well-known result that successive differentiation leads to the positive moments. The technique is applied to the problem of finding E[1/(X+A)] for the binomial and Poisson distributions.
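The integration technique rests on the identity ∫₀¹ t^(X+A−1) dt = 1/(X+A), which gives E[1/(X+A)] = ∫₀¹ t^(A−1) G(t) dt, where G is the probability generating function. A minimal numerical sketch for the Poisson case (illustrative code, not the authors'):

```python
import math

def pgf_poisson(t, lam):
    # Probability generating function E[t^X] for X ~ Poisson(lam)
    return math.exp(lam * (t - 1.0))

def expected_reciprocal(lam, a=1, n=100_000):
    # E[1/(X+a)] = integral_0^1 of t^(a-1) * G(t) dt, via the midpoint rule
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        total += t ** (a - 1) * pgf_poisson(t, lam) * h
    return total

lam = 3.0
approx = expected_reciprocal(lam)
exact = (1.0 - math.exp(-lam)) / lam  # closed form of E[1/(X+1)] for Poisson
```

Integrating the pgf once per unit of A replaces an infinite series over the Poisson pmf with a one-dimensional integral, which is the appeal of the method.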

145 citations


Journal ArticleDOI
TL;DR: A sequence of transforming functions is proposed to convert nongaussian distributions often seen in laboratory data to gaussian form, which enables smooth curves to be drawn through observed cumulative distributions plotted on arithmetic or gaussian probability scales.
Abstract: A sequence of transforming functions is proposed to convert nongaussian distributions often seen in laboratory data to gaussian form. These transforms are chosen to eliminate or substantially reduce nongaussian characteristics of positive skewness and peakedness that result from two factors: ( a ) increases in variance with increasing mean values, and ( b ) general heterogeneity among intrapersonal variances. Use of these transforms, demonstrated on many sets of clinical laboratory data, enables smooth curves to be drawn through observed cumulative distributions plotted on arithmetic or gaussian probability scales. From such curves, normal ranges or proportions below a specified measurement may be estimated easily and with greater precision than possible through nonparametric methods. Formulas are given for obtaining confidence limits corresponding to these estimates. The entire process of transforming the original variable to gaussian form and graphing the cumulative distribution curve has been computerized. Programs are available to others interested in applying these methods.
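A common first member of such a transform sequence is the logarithm, which removes the positive skewness that arises when variance grows with the mean. A sketch on synthetic lab-like data (the lognormal model and sample size are illustrative assumptions, not taken from the paper):

```python
import math, random

def skewness(xs):
    # Sample skewness: third central moment over variance^(3/2)
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    s3 = sum((x - m) ** 3 for x in xs) / n
    return s3 / s2 ** 1.5

random.seed(42)
# Positively skewed synthetic "lab values" (lognormal: variance grows with mean)
raw = [math.exp(random.gauss(1.0, 0.5)) for _ in range(5000)]
transformed = [math.log(x) for x in raw]  # gaussian after the log transform
```

After the transform, a cumulative distribution plotted on a gaussian probability scale is close to a straight line, which is what makes smooth-curve fitting and normal-range estimation straightforward.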

91 citations


Journal ArticleDOI
TL;DR: In this paper, a priori assumptions about the degree of smoothness of the probability density to be estimated are used to construct estimates of the density function and its derivatives that are distinguished by a high rate of decrease of the estimation error as the sample size increases.
Abstract: We investigate statistical estimates of a probability density distribution function and its derivatives. As the starting point of the investigation we take a priori assumptions about the degree of smoothness of the probability density to be estimated. By using these assumptions we can construct estimates of the probability density function itself and its derivatives which are distinguished by the high rate of decrease of the error in the estimate as the sample size increases.
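A standard way to exploit a smoothness assumption is a kernel estimate whose bandwidth shrinks at a rate tied to the assumed number of derivatives. The sketch below is an illustrative kernel estimator, not necessarily the authors' construction; it uses the n^(−1/5) bandwidth rate appropriate for twice-differentiable densities:

```python
import math, random

def kde(xs, x, h):
    # Gaussian-kernel density estimate at point x with bandwidth h
    c = 1.0 / math.sqrt(2.0 * math.pi)
    return sum(c * math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs) / (len(xs) * h)

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
h = len(sample) ** -0.2        # n^(-1/5): rate-optimal under two derivatives
est = kde(sample, 0.0, h)      # true N(0,1) density at 0 is 1/sqrt(2*pi)
```

Assuming more derivatives permits higher-order kernels and a wider bandwidth, which is what drives the faster error decay the abstract refers to.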

85 citations



Journal ArticleDOI
TL;DR: In this paper, the cumulative probability distribution function of the roll rate is derived in closed form for three- and four-finned, single-stage vehicles.
Abstract: Using basic statistical techniques, the cumulative probability distribution function F(p; t) of the roll rate p at time t is obtained. This function gives, at each time t, the probability that the roll rate is less than or equal to any value p. Closed-form solutions for F(p; t) are obtained for three- and four-finned, single-stage vehicles. The detailed derivations of F(p; t) are given in Ref. 1.

21 citations


Journal ArticleDOI
TL;DR: In this paper, methods are considered for evaluating the joint cumulative probability integral associated with the random variables F_k = (X_k/r_k)/(Y/s), k = 1, 2, ..., n, where the X_k and Y are independently distributed as χ²(r_k) and χ²(s), respectively.
Abstract: Methods for evaluating the joint cumulative probability integral associated with random variables F_k = (X_k/r_k)/(Y/s), k = 1, 2, ..., n, are considered, where the X_k and Y are independently distributed as χ²(r_k) and χ²(s), respectively. For n = 2, series representations in terms of incomplete beta distributions are given, while a quadrature with efficient procedures for the integrand is presented for n ≥ 2. The results for n = 2 are applied to the evaluation of the correlated bivariate F distribution.
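Because all the F_k share the same denominator Y, the joint CDF can be checked by Monte Carlo; with all degrees of freedom equal to 2 the chi-square variates are exponential and the joint CDF has a simple closed form obtained by conditioning on Y. A sketch (the parameter choices are illustrative, not from the paper):

```python
import math, random

def chi2_2():
    # Chi-square with 2 degrees of freedom = exponential with mean 2
    return -2.0 * math.log(random.random())

random.seed(1)
r = s = 2          # illustrative small degrees of freedom
f1, f2 = 1.5, 2.5  # point at which the joint CDF is evaluated
n = 200_000
hits = 0
for _ in range(n):
    y = chi2_2()   # shared denominator induces the correlation
    F1 = (chi2_2() / r) / (y / s)
    F2 = (chi2_2() / r) / (y / s)
    if F1 <= f1 and F2 <= f2:
        hits += 1
mc = hits / n
# Closed form for these df, by conditioning on Y and integrating out
exact = 1 - 1 / (1 + f1) - 1 / (1 + f2) + 1 / (1 + f1 + f2)
```

The same conditioning step is what reduces the general problem to a one-dimensional quadrature over the denominator variable.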

21 citations


Journal ArticleDOI
01 Dec 1972-Metrika
TL;DR: In this article, the transient behaviour of a queueing problem is considered wherein the arrivals at two consecutive transition marks are correlated, the queue discipline is first-come-first-served, the service time distribution is exponential, and the capacity of the service channel is a random variable.
Abstract: This paper considers the transient behaviour of a queueing problem wherein (i) the arrivals at two consecutive transition marks are correlated, (ii) the queue discipline is first-come-first-served, (iii) the service time distribution is exponential, and (iv) the capacity of the service channel is a random variable. The Laplace transforms of the probability generating functions are obtained for two models. Finally, some particular cases are derived.

10 citations


Journal ArticleDOI
TL;DR: In this paper, the scalar and vector partitions of the ranked probability score, RPS, are compared and compared, and it is shown that the vector partition of the RPS can also be considered to provide measures of the reliability and resolution of vector forecasts.
Abstract: Scalar and vector partitions of the ranked probability score, RPS, are described and compared. These partitions are formulated in the same manner as the scalar and vector partitions of the probability score, PS, recently described by Murphy. However, since the RPS is defined in terms of cumulative probability distributions, the scalar and vector partitions of the RPS provide measures of the reliability and resolution of scalar and vector cumulative forecasts, respectively. The scalar and vector partitions of the RPS provide similar, but not equivalent (i.e., linearly related), measures of these attributes. Specifically, the reliability (resolution) of cumulative forecasts according to the scalar partition is equal to or greater (less) than their reliability (resolution) according to the vector partition. A sample collection of forecasts is used to illustrate the differences between the scalar and vector partitions of the RPS and between the vector partitions of the RPS and the PS. Several questions related to the interpretation and use of the scalar and vector partitions of the RPS are briefly discussed, including the information that these partitions provide about the reliability and resolution of forecasts (as opposed to cumulative forecasts) and the relative merits of these partitions. These discussions indicate that, since a one-to-one correspondence exists between vector and vector cumulative forecasts, the vector partition of the RPS can also be considered to provide measures of the reliability and resolution of vector forecasts, and that the vector partition is generally more appropriate than the scalar partition.
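Since the RPS compares cumulative forecast and cumulative observation distributions category by category, it can be sketched in a few lines (this is the standard unnormalized definition; the paper's partitions are not reproduced here):

```python
def rps(forecast, observed_category):
    # Ranked probability score: sum of squared differences between the
    # cumulative forecast and the cumulative (step-function) observation.
    cum_f = 0.0
    cum_o = 0.0
    score = 0.0
    for k, p in enumerate(forecast):
        cum_f += p
        if k == observed_category:
            cum_o = 1.0   # observation CDF jumps to 1 at the observed category
        score += (cum_f - cum_o) ** 2
    return score

# Three-category forecast; the event fell in the middle category:
# terms are 0.2^2 + (0.7 - 1)^2 + 0 = 0.13
print(rps([0.2, 0.5, 0.3], 1))
```

Working on cumulative distributions is what makes the score "ranked": probability placed in a category adjacent to the observed one is penalized less than probability placed far away.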

9 citations


Journal ArticleDOI
TL;DR: In this paper, the cumulative probability distributions for diffusion-controlled coarsening of spherical precipitates, grain-boundary precipitates and fibers in unidirectionally solidified eutectic alloys are derived.

8 citations


Journal ArticleDOI
TL;DR: In this article, it is noted that when one makes a decision concerning the probability distribution of a random variable by means of observing that variable, statisticians recommend considering certain functions of the operating characteristic (O.C.) of the decision function as measures of the reliability of the actual decision made.
Abstract: When one is making a decision concerning the probability distribution of a random variable by means of observing this random variable, statisticians recommend considering certain functions of the operating characteristic (O.C.) of the decision function as measures of the reliability of the actual decision made. For instance, the confidence coefficient of an interval estimator will as a rule be regarded as a measure of our confidence in the interval.

Journal ArticleDOI
TL;DR: In this article, a mathematical formulation based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics is proposed to evaluate the probability of stress-corrosion fracture.

Journal ArticleDOI
01 Nov 1972
TL;DR: In this article, a numerical method for determining the cumulative probability distribution of a nonnegative random variable is based on the steepest descent approximation of the inverse Laplace transform of its moment-generating function.
Abstract: A numerical method for determining the cumulative probability distribution of a nonnegative random variable is based on the steepest descent approximation of the inverse Laplace transform of its moment-generating function. Good numerical agreement with the cumulative exponential and Poisson distributions is demonstrated.
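A closely related steepest-descent idea is the saddlepoint approximation to the CDF. The sketch below applies the Lugannani-Rice form (a later refinement, used here only to illustrate the approach, not the authors' exact scheme) to the exponential distribution, where the exact CDF is available for comparison:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def saddlepoint_cdf_exponential(x):
    # Saddlepoint (steepest-descent) approximation to P(X <= x), X ~ Exp(1).
    # CGF: K(s) = -log(1 - s); solve K'(s) = 1/(1 - s) = x for the saddlepoint.
    s_hat = 1.0 - 1.0 / x                     # saddlepoint (requires x != 1)
    K = -math.log(1.0 - s_hat)
    K2 = 1.0 / (1.0 - s_hat) ** 2             # K''(s_hat)
    w = math.copysign(math.sqrt(2.0 * (s_hat * x - K)), s_hat)
    u = s_hat * math.sqrt(K2)
    return norm_cdf(w) + norm_pdf(w) * (1.0 / w - 1.0 / u)

x = 2.0
approx = saddlepoint_cdf_exponential(x)
exact = 1.0 - math.exp(-x)
```

Only the moment-generating function (through its logarithm and two derivatives at the saddlepoint) is needed, which is why the approach suits distributions known only through their transforms.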

Journal ArticleDOI
TL;DR: A new sampling method, conditional bit sampling, which is suited for hardwired sampling devices because of its generality, simplicity, and accuracy, and agreement between actual and theoretical performance was excellent.
Abstract: Many cases arise in practice where a versatile hardwired pseudorandom number or pseudonoise generator would be extremely useful. General-purpose pseudonoise devices are not available today. We present a new sampling method, conditional bit sampling, which is suited for hardwired sampling devices because of its generality, simplicity, and accuracy. Random variables sampled from an arbitrary distribution are generated bit by bit from high- to low-order bits with the conditional bit algorithm. The result of a comparison of a uniform number to a conditional probability determines whether a bit in the sampled random number is set to one. The conditional probabilities are easily calculated for any probability distribution and must be arranged in special order. Simple Fortran programs make all necessary computations. Agreement between actual and theoretical performance of the conditional bit algorithm was excellent when sampling accuracy was evaluated for several examples of continuous and discrete densities. Sampling from empirically known, perhaps erratic-shaped, densities presents no problems. Only a small memory containing the conditional probabilities needs to be changed to alter the sampled distribution. The conditional bit algorithmic process always remains the same.
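The bit-by-bit idea can be sketched as follows: keep track of the block of values consistent with the bits fixed so far, and set the next bit by comparing a fresh uniform to the conditional probability that the bit is 1 given that block. This is an illustrative software reconstruction from the abstract, not the authors' hardwired algorithm:

```python
import random

def conditional_bit_sample(pmf, bits, u=random.random):
    # Draw one value from pmf (indexed 0 .. 2**bits - 1), deciding bits from
    # high to low order; each conditional probability is computed on the fly
    # (a hardware version would read them from a small precomputed memory).
    lo, hi = 0, len(pmf)            # block of values matching the prefix so far
    value = 0
    for b in range(bits - 1, -1, -1):
        mid = lo + (1 << b)         # values with this bit set start at mid
        p_one = sum(pmf[mid:hi]) / sum(pmf[lo:hi])
        if u() < p_one:             # uniform vs. conditional probability
            value |= 1 << b
            lo = mid
        else:
            hi = mid
    return value

random.seed(7)
pmf = [0.1, 0.2, 0.3, 0.4]          # any discrete target distribution
n = 50_000
counts = [0] * len(pmf)
for _ in range(n):
    counts[conditional_bit_sample(pmf, 2)] += 1
freqs = [c / n for c in counts]     # should track pmf
```

Changing the target distribution only changes the table of conditional probabilities; the sampling logic itself never changes, which matches the generality claim in the abstract.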

Journal ArticleDOI
TL;DR: In this paper, the distribution of a certain sum of a normal random variable and the reciprocal of a gamma random variable is examined in the course of doing a Bayesian analysis of the lognormal process.
Abstract: Properties of the distribution of a certain sum of a normal random variable and the reciprocal of a gamma random variable are examined. This sum appears naturally in the course of doing a Bayesian analysis of the lognormal process.

01 Aug 1972
TL;DR: In this article, the authors investigated the behavior of cumulative distribution functions which are defined on an abstract linearly ordered space and showed how a number of nonparametric statistical procedures can be extended to include situations of multivariate and time dependent data.
Abstract: The author investigates the behavior of (univariate) cumulative distribution functions which are defined on an abstract linearly ordered space. Special emphasis is given to the study of a class of linearly ordered spaces which J.H.B. Kemperman introduced into the subject of nonparametric tolerance regions. Distribution functions on such spaces can be decomposed. Considerable attention is given to applications. In particular, it is shown how a number of nonparametric statistical procedures can be extended to include situations of multivariate and time-dependent data.

01 Jun 1972
TL;DR: In this report, a quantitative measure of the effect of energy-absorber stroke length on the cumulative probability of spinal injury is presented; the response of a seated subject to a vertical deceleration pulse representing a helicopter crash was calculated for a given seat and clothing weight.
Abstract: The report presents a quantitative measure of the effects of energy absorber stroke length upon cumulative probability of injury. The response of a seated subject to a vertical deceleration pulse representing a helicopter crash was calculated for a given seat and clothing weight. The seat-man system was analytically supported by a 'square wave' energy absorber selected to generate no greater than 23 G for a 5th percentile man. The calculated response, DRI, provides a probability of spinal injury for the seated occupant. Since all parameters necessary for response computations had known statistical properties, it was possible to calculate the joint probability of injury for a particular deceleration pulse and subject weight. By calculating the statistical values for many deceleration and weight combinations, sufficient to represent the total population of both, a cumulative probability of injury was generated. Stroke length of the energy absorber required for each combination of deceleration and weight was calculated. By examining the effects of a limited stroke length, it was possible to generate a curve of stroke length available versus cumulative probability of injury. The curve indicates that for a realistic stroke length (12 inches), the cumulative probability of injury is 0.119. By doubling the stroke available or by halving it, the cumulative probability is decreased or increased by 7 percent. Comparisons with previously reported data indicate that the injury potential of the square wave is significantly higher than is theoretically achievable.

01 Jan 1972
TL;DR: An iterative procedure is presented for determining the acquisition behavior of discrete or digital implementations of a tracking loop based on the theory of Markov chains and provides the cumulative probability of acquisition in the loop as a function of time in the presence of noise and a given set of initial condition probabilities.
Abstract: An iterative procedure is presented for determining the acquisition behavior of discrete or digital implementations of a tracking loop. The technique is based on the theory of Markov chains and provides the cumulative probability of acquisition in the loop as a function of time in the presence of noise and a given set of initial condition probabilities. A digital second-order tracking loop to be used in the Viking command receiver for continuous tracking of the command subcarrier phase was analyzed using this technique, and the results agree closely with experimental data.
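The iterative procedure can be sketched with an absorbing Markov chain: the locked state is absorbing, and the probability mass it has accumulated after t transitions is the cumulative probability of acquisition by time t. The chain below is a toy example; its transition probabilities are illustrative, not those of the Viking loop:

```python
def step(dist, P):
    # One transition of the Markov chain: dist' = dist * P (row vector times matrix)
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Toy 3-state loop: states 0 and 1 are "not yet locked"; state 2 ("locked")
# is absorbing, so its mass can only grow.
P = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.0, 0.0, 1.0]]
dist = [1.0, 0.0, 0.0]   # initial-condition probabilities
acq = []                 # cumulative probability of acquisition vs. time
for t in range(50):
    dist = step(dist, P)
    acq.append(dist[2])
```

Noise and initial conditions enter only through the transition probabilities and the starting distribution, which is what makes the iteration a convenient analysis tool for digital loops.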

Journal ArticleDOI
TL;DR: In this paper, a specific design for a simple, compact, electronic random alarm mechanism is presented, which can accommodate any chosen continuous distribution of interarrival times, since these times are obtained by appropriate transformation in a device that produces the uniform probability distribution of some physical quantity.
Abstract: A random alarm is a signalling device with random interarrival times having a prescribed probability distribution. In this paper, a specific design for a simple, compact, electronic random alarm mechanism is presented. The design can accommodate any chosen continuous distribution of interarrival times, since these times are obtained by appropriate transformation in a device that produces the uniform probability distribution of some physical quantity. Nonstored-table varieties of this type may be constructed by combining an RC charging network with different types of potentiometers. A general prescription is given here for the output potential of the potentiometer as a function of the angle of rotation, where the angle is a uniformly distributed random variable. The ability of nonstored-table mechanisms to realize arbitrary distributions is dependent upon three theoretical asymptotic considerations concerning the normal distribution, the uniform distribution and independence. An empirical investigation with data from twists of a simulated potentiometer is used to verify that these results are sufficiently robust for practical application.
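The transformation of a uniformly distributed physical quantity into arbitrary interarrival times is the inverse-CDF method: if U is uniform on (0, 1) and F is the desired distribution, then F⁻¹(U) is distributed according to F. A sketch for exponential interarrival times (the rate is an illustrative choice):

```python
import math, random

def interarrival(u, rate=0.5):
    # Inverse-CDF transform: for F(t) = 1 - exp(-rate * t),
    # F^{-1}(u) = -ln(1 - u) / rate is exponential with the given rate.
    return -math.log(1.0 - u) / rate

random.seed(3)
times = [interarrival(random.random()) for _ in range(100_000)]
mean = sum(times) / len(times)   # should approach 1/rate = 2.0
```

In the hardware described, the shaped potentiometer plays the role of F⁻¹, converting a uniformly distributed rotation angle directly into the prescribed interarrival-time distribution.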

Journal ArticleDOI
01 Apr 1972
TL;DR: It is shown that for counts scattered about a true signal value in accord with the Poisson probability distribution, the optimum linear signal estimate is identical to the optimum nonlinear estimate if the signal has a gamma probability density.
Abstract: It is shown that for counts scattered about a true signal value in accord with the Poisson probability distribution, the optimum linear signal estimate is identical to the optimum nonlinear estimate if the signal has a gamma probability density.
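The gamma/Poisson pairing is conjugate: with a Gamma(α, β) prior on the Poisson rate, the posterior given a count n is Gamma(α + n, β + 1), so the posterior-mean estimate (α + n)/(β + 1) is linear in the observed count. A numerical check (parameter values are illustrative):

```python
import math

def posterior_mean_numeric(n, alpha, beta, grid=200_000, lmax=60.0):
    # Posterior density of the rate lam given count n is proportional to
    # lam^(alpha + n - 1) * exp(-(beta + 1) * lam); integrate numerically.
    h = lmax / grid
    num = den = 0.0
    for i in range(1, grid):
        lam = i * h
        w = lam ** (alpha + n - 1) * math.exp(-(beta + 1) * lam)
        num += lam * w
        den += w
    return num / den

alpha, beta = 2.0, 1.0
n = 5
numeric = posterior_mean_numeric(n, alpha, beta)
linear = (alpha + n) / (beta + 1)   # optimum estimate, linear in the count n
```

For other signal priors the optimum (posterior-mean) estimate is generally a nonlinear function of the count; the gamma prior is the case where the two coincide.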