
Showing papers on "Probability distribution published in 1980"


Journal ArticleDOI
TL;DR: In this paper, the authors developed a generalized probability density function, the double bounded probability density function (DB-PDF), which can be applied to practical problems of parameter estimation and simulation of random variables.

799 citations


Journal ArticleDOI
TL;DR: In this article, a uniform asymptotic series for the probability distribution of the sum of a large number of independent random variables is derived, which is based on the fact that the major components of the distribution are determined by a saddle point and a singularity.
Abstract: In the present paper a uniform asymptotic series is derived for the probability distribution of the sum of a large number of independent random variables. In contrast to the usual Edgeworth-type series, the uniform series gives good accuracy throughout its entire domain. Our derivation uses the fact that the major components of the distribution are determined by a saddle point and a singularity at the origin. The analogous series for the probability density, due to Daniels, depends only on the saddle point. Two illustrative examples are presented that show excellent agreement with the exact distributions.

696 citations


Journal ArticleDOI
TL;DR: In this article, a scaling theory of localization is based on an expression for the conductivity of a system of random elastic scatterers in terms of its scattering properties at a fixed energy. It is shown that scaling leads to a well-behaved probability distribution of the scaling variable and to a simple scaling law not previously given in the literature.
Abstract: We base a scaling theory of localization on an expression for conductivity of a system of random elastic scatterers in terms of its scattering properties at a fixed energy. This expression, proposed by Landauer, is first derived and generalized to a system of indefinite size and number of scattering channels (a "wire"), and then an exact scaling theory for the one-dimensional chain is given. It is shown that the appropriate scaling variable is f(ρ) = ln(1 + ρ), where ρ is the dimensionless resistance, which has the property of "additive mean," and that scaling leads to a well-behaved probability distribution of this variable and to a very simple scaling law not previously given in the literature.
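The "additive mean" property of f(ρ) = ln(1 + ρ) can be checked numerically. The sketch below is not from the paper: it uses the standard series-composition rule for the Landauer resistance of two scatterers separated by a random phase, and verifies that the mean of ln(1 + ρ) for the composed system equals the sum of the individual values.

```python
import math, random

random.seed(1)

def compose(rho1, rho2, phi):
    """Series composition of two scatterers with Landauer resistance
    rho = R/T; phi is the random phase accumulated between them."""
    a = math.sqrt((1 + rho1) * (1 + rho2))
    b = math.sqrt(rho1 * rho2)
    return a * a + b * b + 2 * a * b * math.cos(phi) - 1.0

rho1, rho2 = 0.7, 1.3
n = 200_000
mean_f = sum(math.log1p(compose(rho1, rho2, random.uniform(0, 2 * math.pi)))
             for _ in range(n)) / n

# f(rho) = ln(1 + rho) should be additive in the mean:
target = math.log1p(rho1) + math.log1p(rho2)
print(mean_f, target)   # the two numbers agree to Monte Carlo accuracy
```

Averaging over the uniform phase uses the identity that the mean of ln(a² + b² + 2ab cos φ) is 2 ln max(a, b), which is what makes this variable, rather than ρ itself, the natural one for scaling.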

691 citations


Journal ArticleDOI
Kenichi Nanbu
TL;DR: In this article, the stochastic law that prescribes the velocities of simulated molecules after a small time increment is derived from the Boltzmann equation, and the resulting simulation scheme is shown to give an exact solution of the Boltzmann equation.
Abstract: The stochastic law that prescribes the velocities of simulated molecules after a small time increment was derived from the Boltzmann equation. The scheme to determine the velocity of a molecule after a small time increment is divided into three steps. The first step gives the collision probability of the molecule without specifying its collision partner. The second step gives a conditional probability distribution. If the molecule is accepted in the first step as a colliding molecule, its collision partner is sampled from this probability distribution. The last step gives a probability density from which the direction of the relative velocity after collision is sampled, and hence the step gives the post-collision velocity of the molecule. It is shown that the use of the present simulation scheme gives an exact solution of the Boltzmann equation.
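The three steps can be sketched for a spatially homogeneous hard-sphere gas. Everything below is a simplified illustration, not Nanbu's actual algorithm: the collision probability, the relative-speed weighting of partners, and the isotropic scattering are the textbook hard-sphere choices.

```python
import math, random

random.seed(0)

def nanbu_step(vels, n_density, sigma, dt):
    """One collision step of a Nanbu-type scheme for a spatially
    homogeneous hard-sphere gas.  vels is a list of 3-vectors."""
    N = len(vels)
    new = list(vels)
    for i in range(N):
        partners = [j for j in range(N) if j != i]
        g = [math.dist(vels[i], vels[j]) for j in partners]
        # Step 1: collision probability of molecule i, without yet
        # specifying its partner (mean relative speed over candidates).
        p_coll = n_density * sigma * (sum(g) / len(g)) * dt
        if random.random() >= p_coll:
            continue
        # Step 2: sample the partner j from the conditional
        # distribution, weighting each candidate by relative speed.
        r = random.random() * sum(g)
        acc, j = 0.0, partners[-1]
        for k, gk in zip(partners, g):
            acc += gk
            if acc >= r:
                j = k
                break
        # Step 3: sample the post-collision direction of the relative
        # velocity isotropically; only molecule i is updated (as in
        # Nanbu's scheme, momentum is conserved in expectation).
        gij = math.dist(vels[i], vels[j])
        c = 1.0 - 2.0 * random.random()
        s = math.sqrt(1.0 - c * c)
        ph = 2.0 * math.pi * random.random()
        g_new = (gij * s * math.cos(ph), gij * s * math.sin(ph), gij * c)
        cm = [(a + b) / 2.0 for a, b in zip(vels[i], vels[j])]
        new[i] = [cm[k] + g_new[k] / 2.0 for k in range(3)]
    return new

vels = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(200)]
vels = nanbu_step(vels, n_density=1.0, sigma=0.05, dt=0.05)
```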

360 citations


Journal ArticleDOI
TL;DR: The theory and applicability of such hyperbolic distributions have been the subject of a number of recent investigations, and the purpose of the present paper is to summarize these developments with regard to the interest they may have for sedimentologists.
Abstract: The pattern of empirical distributions, in particular size distributions, is often best brought out by drawing a log-histogram. The Gaussian or 'normal' distribution furnishes a description of the empirical distribution if the log-histogram approximates to a parabola. In many cases, however, the log-histogram is far from parabolic but may be closely approximated by a hyperbola. It is therefore natural to consider those theoretical probability distributions for which the graph of the log-probability (density) function is a hyperbola. The theory and applicability of such hyperbolic distributions have been the subject of a number of recent investigations, and it is the purpose of the present paper to summarize these developments with regard to the interest they may have for sedimentologists. A precise description of the hyperbolic distributions is given and their wide applicability is indicated. Methods for fitting these distributions to data are discussed and a number of sedimentological examples are presented. Furthermore, the question of finding dynamical explanations for the occurrence of the hyperbolic shape is considered from various points of view.
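The defining feature, a log-density whose graph is a hyperbola, can be seen directly from the unnormalised hyperbolic log-density -α√(δ² + (x - μ)²) + β(x - μ), whose tails are asymptotically linear with slopes β - α and β + α. The parameter values in the sketch below are purely illustrative.

```python
import math

# Unnormalised log-density of the hyperbolic distribution: a hyperbola
# with asymptotic slope (beta - alpha) on the right and (alpha + beta)
# on the left, versus the parabola of the Gaussian log-density.
alpha, beta, delta, mu = 2.0, 0.5, 1.0, 0.0

def log_density(x):
    return -alpha * math.sqrt(delta**2 + (x - mu)**2) + beta * (x - mu)

# Far in the tails the log-density is nearly linear (log-linear tails),
# unlike the Gaussian, whose log-density keeps steepening:
slope_right = log_density(51) - log_density(50)
slope_left = log_density(-50) - log_density(-51)
print(slope_right, beta - alpha)    # both ≈ -1.5
print(slope_left, alpha + beta)     # both ≈ +2.5
```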

220 citations


Journal ArticleDOI
TL;DR: The results are compared with previous work on probabilistic relaxation labeling, examples are given from the image segmentation domain, and references are given to applications of the new scheme in text processing.
Abstract: Let a vector of probabilities be associated with every node of a graph. These probabilities define a random variable representing the possible labels of the node. Probabilities at neighboring nodes are used iteratively to update the probabilities at a given node based on statistical relations among node labels. The results are compared with previous work on probabilistic relaxation labeling, and examples are given from the image segmentation domain. References are also given to applications of the new scheme in text processing.
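A minimal sketch of such an iterative update follows. The compatibility-weighted form used here is a common choice in the relaxation-labeling literature, not necessarily the paper's exact rule, and the graph, labels, and coefficients are invented for illustration.

```python
def relax(probs, neighbors, compat, iters=10):
    """probs[i][l]: probability of label l at node i.
    compat[l][m]: statistical compatibility of label l at a node with
    label m at a neighbor.  Each iteration updates every node from the
    current label probabilities of its neighbors, then renormalises."""
    n_labels = len(compat)
    for _ in range(iters):
        new = []
        for i, p in enumerate(probs):
            support = [sum(compat[l][m] * probs[j][m]
                           for j in neighbors[i] for m in range(n_labels))
                       for l in range(n_labels)]
            q = [p[l] * support[l] for l in range(n_labels)]
            z = sum(q) or 1.0
            new.append([x / z for x in q])
        probs = new
    return probs

# Two labels on a 3-node chain; each label is compatible with itself.
compat = [[0.9, 0.1], [0.1, 0.9]]
neighbors = {0: [1], 1: [0, 2], 2: [1]}
probs = [[0.7, 0.3], [0.5, 0.5], [0.6, 0.4]]
probs = relax(probs, neighbors, compat)
print(probs)  # the ambiguous middle node is pulled toward label 0
```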

201 citations


Journal ArticleDOI
TL;DR: The Bayesian rule of conditionalisation, as well as its extension by R. C. Jeffrey, is exhibited as a special case, and general conditions under which the principle yields a unique prescription are also studied.
Abstract: The use of the principle of minimum information, or equivalently the principle of maximum entropy, has been advocated by a number of authors over recent years, both in statistical physics and more generally in statistical inference. It has perhaps not been sufficiently appreciated by philosophers, however, that this principle, when properly understood, affords a rule of inductive inference of the widest generality. The purpose of this paper is to draw attention to the generality of the principle. Thus the Bayesian rule of conditionalisation, as well as its extension by R. C. Jeffrey, will be exhibited as special cases. General conditions under which it yields a unique prescription will also be studied. Detailed treatment will be restricted to the finite-dimensional case, but an outline of the general case is given in the Appendix. The underlying idea of maximum entropy inference is this. Suppose P to be a probability distribution assigning probabilities p_1, ..., p_n to n mutually exclusive and jointly exhaustive events. Then the information-theoretic entropy of the distribution is defined by H(P) = -(p_1 log p_1 + ... + p_n log p_n).
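A standard worked instance of the principle (the "Brandeis dice" example, a textbook illustration rather than anything from the paper): among all distributions on {1, ..., 6} with mean 4.5, the entropy maximiser has the exponential form p_i ∝ exp(λi), with the multiplier λ found here by bisection.

```python
import math

def maxent_mean(values, target):
    """Maximum-entropy distribution on `values` subject to a mean
    constraint: p_i proportional to exp(lam * v_i), with the Lagrange
    multiplier lam located by bisection on the (monotone) mean."""
    def mean(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(v * x for v, x in zip(values, w)) / z
    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [x / z for x in w]

p = maxent_mean([1, 2, 3, 4, 5, 6], 4.5)
print([round(x, 4) for x in p])  # increasing probabilities, mean 4.5
```

With a mean constraint alone the prescription is unique, illustrating the "unique prescription" conditions the paper studies.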

192 citations


Book
01 Aug 1980
TL;DR: This book covers basic probability, random variables and probability distributions, and statistical topics ranging from sampling and estimation theory to tests of hypotheses and significance and nonparametric tests.
Abstract: Part I: Probability Chapter 1: Basic Probability Chapter 2: Random Variables and Probability Distributions Chapter 3: Mathematical Expectation Chapter 4: Special Probability Distributions Part II: Statistics Chapter 5: Sampling Theory Chapter 6: Estimation Theory Chapter 7: Tests of Hypotheses and Significance Chapter 8: Curve Fitting, Regression, and Correlation Chapter 9: Analysis of Variance Chapter 10: Nonparametric Tests Appendices Index Index for Solved Problems

176 citations



Journal ArticleDOI
TL;DR: In this paper, an approximate method for determining estimates of stationary response statistics for a non-linear oscillator driven by wide-band random excitation is described, where the differential equation of the oscillator is used to generate relations between unknown response statistics.
Abstract: An approximate method for determining estimates of stationary response statistics for a non-linear oscillator driven by wide-band random excitation is described. The differential equation of the oscillator is used to generate relations between unknown response statistics. These relations are then used to fix a corresponding number of unknown parameters in a non-Gaussian probability distribution for the response. The method is illustrated by application to the Duffing oscillator. Elementary methods for evaluating the necessary correlations between excitation and response are discussed in an Appendix.

138 citations


Journal ArticleDOI
TL;DR: In this paper, the surface elevation probability density function and associated statistical properties for a wind-generated wave field were compared with the results derived from the Longuet-Higgins (1963) theory.
Abstract: Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

Journal ArticleDOI
TL;DR: A new statistic, the modified coefficient of variation, is proposed for measuring the inequality of publication and citation counts; its advantages include scale invariance, equal sensitivity at all levels of the distribution, and a close relationship to a probability distribution which fits observed productivity distributions.
Abstract: A new statistic — the modified coefficient of variation - is proposed for measuring the inequality of publication and citation counts. Its advantages include scale invariance, equal sensitivity at all levels of the distribution, allowance for random variation in publication or citation counts, and a close relationship to a probability distribution which fits observed productivity distributions. Also discussed are disciplinary differences in the distribution of productivity and the functional relationship between productivity and its determinants.

Journal ArticleDOI
TL;DR: The results show that the class of probability distributions on a cpo is itself a cpo, and that every probability distribution is the lub of an increasing sequence of 'finite' probability distributions.

Journal ArticleDOI
Akira Kimura
29 Jan 1980
TL;DR: In this paper, the statistical properties of the group formation of random waves determined by the zero-up-cross method were analyzed, and very good agreement between the data and the theoretical distributions was obtained.
Abstract: This study deals with the statistical properties of the group formation of random waves determined by the zero-up-cross method. Probability distributions for (1) the run of high waves, (2) the total run, and (3) the run of resonant wave period are derived theoretically, provided that the time series of wave height and wave period form a Markov chain. Transition probabilities are given by the 2-dimensional Rayleigh distribution for the wave height train and the 2-dimensional Weibull distribution for the wave period train, and very good agreement between the data and the theoretical distributions has been obtained. The paper then discusses those parameters which affect the statistical properties of the runs and shows that the spectrum peakedness parameter for the run of wave height and the spectrum width parameter for the run of wave period are the most predominant.
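If successive wave heights form a Markov chain, a run of j high waves has the geometric distribution p22^(j-1) (1 - p22), where p22 is the probability that the next wave is also high given that the current one is. A short simulation confirms this; the persistence probabilities below are invented, whereas in the paper they come from the 2-dimensional Rayleigh distribution of successive wave heights.

```python
import random

random.seed(2)

p22, p11 = 0.7, 0.8     # persistence of the high and low states
state, runs, run = 0, [], 0
for _ in range(100_000):          # each step is one wave
    if state == 1:
        run += 1                  # one more high wave in the current run
        if random.random() >= p22:
            runs.append(run)      # run of high waves ends
            run = 0
            state = 0
    else:
        if random.random() >= p11:
            state = 1             # a new run will start next wave

mean_run = sum(runs) / len(runs)
print(mean_run, 1 / (1 - p22))   # geometric mean run length ≈ 3.33
```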

Journal ArticleDOI
TL;DR: For a polymer chain in a good solvent, the author calculates the probability distribution functions between an endpoint and an interior point, and between two interior points, by using exact enumeration to study a lattice self-avoiding walk model.
Abstract: For a polymer chain in a good solvent, the author calculates the probability distribution functions between an endpoint and an interior point, and between two interior points, by using exact enumeration to study a lattice self-avoiding walk model. These distribution functions are different from the usual distribution function between endpoints. At small distance scales, the probability of nearest-neighbour contacts between two interior points is smaller than the probability of contact between two endpoints. FCC and triangular lattices are considered.

Journal ArticleDOI
TL;DR: The authors considered procedures for combining individual probability distributions that belong to some "family" into a "group" probability distribution that belongs to the same family, and applied these results to models of reaction time in psychological experiments.

Journal ArticleDOI
TL;DR: In this article, a computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight; Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity, and total solar radiation.
Abstract: A computer simulation model was constructed to supply daily weather data to a plant disease management model for potato late blight. In the weather model Monte Carlo techniques were employed to generate daily values of precipitation, maximum temperature, minimum temperature, minimum relative humidity and total solar radiation. Each weather variable is described by a known theoretical probability distribution but the values of the parameters describing each distribution are dependent on the occurrence of rainfall. Precipitation occurrence is described by a first-order Markov chain. The amount of rain, given that rain has occurred, is described by a gamma probability distribution. Maximum and minimum temperature are simulated with a trivariate normal probability distribution involving maximum temperature on the previous day, maximum temperature on the current day and minimum temperature on the current day. Parameter values for this distribution are dependent on the occurrence of rain on the previous day.
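The precipitation component of such a generator can be sketched in a few lines. All parameter values below are invented for illustration: a first-order Markov chain decides whether a day is wet, and a gamma variate gives the amount on wet days.

```python
import random

random.seed(3)

P_WET_GIVEN_WET = 0.6     # Markov-chain transition probabilities
P_WET_GIVEN_DRY = 0.25
GAMMA_SHAPE, GAMMA_SCALE = 0.8, 6.0   # mm of rain on a wet day

def simulate_rain(n_days, wet=False):
    """Daily rainfall series: Markov occurrence + gamma amounts."""
    series = []
    for _ in range(n_days):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = random.random() < p
        series.append(random.gammavariate(GAMMA_SHAPE, GAMMA_SCALE)
                      if wet else 0.0)
    return series

rain = simulate_rain(365)
wet_days = sum(1 for r in rain if r > 0)
print(wet_days, sum(rain))   # number of wet days and annual total (mm)
```

Conditioning the temperature distributions on the wet/dry outcome of this chain is what couples the variables in the full model.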

Book ChapterDOI
01 Jan 1980
TL;DR: It is worth noting that if P is a probability distribution, and if for any A and B, P_B(A) = P(B > A), then P_B is a probability distribution too, excepting the absurd case.
Abstract: It does seem to me worth noting that if P is a probability distribution, and if for any A and B, P_B(A) = P(B > A), then P_B is a probability distribution too (excepting the absurd case). What it is good for, I would like to suggest, is deliberation — the calculation of expected utilities.

Journal ArticleDOI
TL;DR: In this paper, the low moments of the resistance for models of random one-dimensional conductors are given, where the fluctuations in resistance grow with length more rapidly than the average resistance, so that the latter is not representative of the probability distribution.
Abstract: Exact calculations of the low moments of the resistance for models of random one-dimensional conductors are given. The fluctuations in resistance grow with length more rapidly than the average resistance, so that the latter is not representative of the probability distribution. The probability distribution is calculated in the limit of large disorder.

Journal ArticleDOI
TL;DR: The present paper gives some exact relations between complexity and randomness: expressions in terms of Kolmogorov's and other complexities of the initial segments of a sequence whose difference from Martin-Löf's test of randomness is bounded by a constant.
Abstract: For a computable probability distribution P over the set of infinite binary sequences, Martin-Löf defined a test d(x|P) measuring the degree of nonrandomness of the sequence x with respect to P. We give some expressions in terms of Kolmogorov's and other complexities of the initial segments of x whose difference from d(x|P) is bounded by a constant.

Journal Article
TL;DR: Mathematical descriptions of several common sample designs used to estimate age-of-onset distribution parameters are developed, because failure to describe the sample space adequately can lead to erroneous genetic analyses.
Abstract: A necessary item of information in many genetic analyses of complex disorders with late onset is the cumulative probability of onset by a given age. The effect of sample design upon the estimation of age-of-onset probability distribution parameters is discussed. Mathematical descriptions of several common sample designs used to estimate the distribution parameters are developed here. Failure to describe the sample space adequately can lead to erroneous genetic analyses because the cumulative probability of onset is incorrectly estimated. In genetic counseling, the errors would usually result in an underestimate of the true risk.

Journal ArticleDOI
TL;DR: The Computer-Assisted Search Planning (CASP) system developed for the United States Coast Guard is based on Monte Carlo simulation to obtain an initial probability distribution for target location and to update this distribution to account for drift due to currents and winds.
Abstract: This paper provides an overview of the Computer-Assisted Search Planning (CASP) system developed for the United States Coast Guard. The CASP information processing methodology is based upon Monte Carlo simulation to obtain an initial probability distribution for target location and to update this distribution to account for drift due to currents and winds. A multiple scenario approach is employed to generate the initial probability distribution. Bayesian updating is used to reflect negative information obtained from unsuccessful search. The principal output of the CASP system is a sequence of probability “maps” which display the current target location probability distributions throughout the time period of interest. CASP also provides guidance for allocating search effort based upon optimal search theory.
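A toy version of the CASP information flow, with all numbers invented: Monte Carlo particles represent the target-location distribution, a drift step moves them, and Bayesian updating incorporates the negative information from an unsuccessful search by downweighting particles in the searched region.

```python
import random

random.seed(4)

N = 10_000
particles = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(N)]
weights = [1.0] * N

# Drift due to current/wind plus diffusion.
for p in particles:
    p[0] += 0.5 + random.gauss(0, 0.2)
    p[1] += 0.1 + random.gauss(0, 0.2)

# Unsuccessful search of the box [0,1] x [-1,1] with detection
# probability 0.8: multiply each weight by P(not detected | location).
P_DETECT = 0.8
for i, (x, y) in enumerate(particles):
    if 0.0 <= x <= 1.0 and -1.0 <= y <= 1.0:
        weights[i] *= 1.0 - P_DETECT

z = sum(weights)
prob_in_box = sum(w for (x, y), w in zip(particles, weights)
                  if 0.0 <= x <= 1.0 and -1.0 <= y <= 1.0) / z
print(prob_in_box)   # posterior mass in the searched box drops
```

Binning the weighted particles on a grid would give the probability "maps" the paper describes.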

Journal ArticleDOI
TL;DR: A method is presented which provides a criterion for detecting a change in the structure of a model generating a stochastic sequence based on the transformation of the observed sequence into a sequence of partial sums of the general innovations.
Abstract: A method is presented which provides a criterion for detecting a change in the structure of a model generating a stochastic sequence. Models that can be represented by a sequence of predictive probability distributions are considered. The method is based on the transformation of the observed sequence {x_n} into a sequence of partial sums of the general innovations, computed for the sequence {-log f(x_n | x_{n-1}, x_{n-2}, ..., x_0)}. If no change occurs the transformed sequence behaves like a Wiener process, but its mean will exhibit a monotonic growth after the process changes. Based on the properties of this transformation, fixed sample size and sequential tests for the change are constructed. The technique is applied to test for a change in the mean vector in a sequence of (generally dependent) Gaussian random variables, a change of coefficients of an autoregressive process, and a change of distribution in a sequence of discrete independent identically distributed random variables.
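The transformation can be sketched for the simplest case of an i.i.d. N(0,1) reference model (the paper treats general predictive densities): the centred partial sums of -log f(x_n) fluctuate like a zero-drift random walk while the model is correct, and acquire a positive drift once the data-generating mean shifts. All numbers below are illustrative.

```python
import math, random

random.seed(5)

def neg_log_f(x):                      # -log of the N(0,1) density
    return 0.5 * x * x + 0.5 * math.log(2 * math.pi)

expected = 0.5 + 0.5 * math.log(2 * math.pi)   # E[-log f(X)], X ~ N(0,1)

xs = [random.gauss(0, 1) for _ in range(500)]       # in control
xs += [random.gauss(1.5, 1) for _ in range(500)]    # mean shifts at n=500

s, partial_sums = 0.0, []
for x in xs:
    s += neg_log_f(x) - expected       # centred general innovation
    partial_sums.append(s)

print(partial_sums[499], partial_sums[-1])  # drift appears after the change
```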

Journal ArticleDOI
01 Mar 1980
TL;DR: In this paper, two new methods are presented for the estimation of the frequencies of closely spaced complex-valued sinusoidal signals in the presence of noise; the most effective is a computationally efficient realization of maximum likelihood or maximum posterior probability estimates of the frequencies.
Abstract: Two new methods are presented for the estimation of the frequencies of closely spaced complex valued sinusoidal signals in the presence of noise. The most effective method is a computationally efficient method for realization of maximum likelihood or maximum posterior probability estimates of the frequencies. The second method is a class of algorithms for removing some of the deficiencies of present adaptive filtering and correlation-estimation approaches to estimation of frequencies, such as the forward-backward linear prediction method. In both of these new methods one is fitting a signal model to data. In method 1 the data are the observed samples of two complex sinusoids plus noise. In the second method the data are elements of an estimated correlation matrix, or of some of its eigenvectors, obtained from the observed samples.

Journal ArticleDOI
TL;DR: In this paper, an improved geometric-series method is presented for converting continuous time models to equivalent discrete time models, and a direct truncation method, a matrix continued fraction method and a geometric series method are presented.

Journal ArticleDOI
TL;DR: In this paper, a new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies and an efficient computational strategy is proposed for random variate generation.
Abstract: A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution.

Journal ArticleDOI
TL;DR: This paper generalizes certain analytical formulas for yield and yield sensitivities so that design centering and yield optimization can be effectively carried out employing given statistical parameter distributions.
Abstract: This paper generalizes certain analytical formulas for yield and yield sensitivities so that design centering and yield optimization can be effectively carried out employing given statistical parameter distributions. The tolerance region of possible outcomes is discretized into a set of orthotopic cells. A suitable weight is assigned to each cell in conjunction with an assumed uniform distribution on the cell. Explicit formulas for yield and its sensitivities w.r.t. nominal parameter values and component tolerances are presented for linear cuts and sensitivities of these cuts based upon approximations of the boundary of the constraint region. To avoid unnecessary evaluations of circuit responses, e.g., integrations for nonlinear circuits, multidimensional quadratic interpolation is performed. Sparsity is exploited in the determination of these quadratic models leading to reduced computation as well as increased accuracy.

Journal ArticleDOI
TL;DR: In this article, the authors describe a method by which a variety of statistical design problems can be solved by a linear program, which is based on the correspondence between the level contours of a given probability density function and a particular norm, which they call a pdf-norm.
Abstract: We describe a method by which a variety of statistical design problems can be solved by a linear program. We describe three key aspects of this approach. 1) The correspondence between the level contours of a given probability density function and a particular norm, which we shall call a pdf-norm. 2) The expression of distance in this norm from a given set of hyperplanes in terms of the dual of the pdf-norm. 3) The use of a linear program to inscribe a maximal pdf-norm-body into a simplicial approximation to the feasible region of a given statistical design problem. This work thus extends the applicability of a previously published algorithm to the case of arbitrary pdf-norms and consequently to a wide variety of statistical design problems, including the common mixed worst-case-yield maximization problem.

Journal ArticleDOI
Hideo Aoki
TL;DR: In this article, a real-space renormalisation-group method is developed for electron systems, where the equation for the Green function is reduced to that for the Green function in a decimated space to obtain the renormalised Hamiltonian.
Abstract: A new real-space renormalisation-group method is developed for electron systems. The equation for the Green function is reduced to that for the Green function in a decimated space to obtain the renormalised Hamiltonian. This formalism is applied to two-dimensional disordered electron systems in the Anderson model. It is shown that the whole probability distribution P_V(V_ij^(n); |r_i - r_j|) of the renormalised transfer energies V_ij^(n) can be characterised by Gaussian distributions on a logarithmic scale. In particular, the root mean square <(V_ij^(n))^2>^(1/2) varies as exp(-gamma^(n)(E) |r_i - r_j|), where the decay coefficient gamma^(n)(E) tends to zero as one approaches the extended regime. The transformation property of the system under decimation is discussed in terms of the flow diagram of the probability distribution functions for the renormalised Hamiltonian.

Journal ArticleDOI
TL;DR: In this article, the authors measured the probability distributions of wind pressures acting on points or areas of low-rise building models in turbulent boundary layer flow simulating wind over open country and suburban terrains.
Abstract: Probability distributions of wind pressures acting on points or areas of low-rise building models have been measured in turbulent boundary layer flow simulating wind over open country and suburban terrains. Results show extensive positive and negative tails for high negative and positive mean pressure coefficients, respectively. Probability density functions of wind speed have also been found positively skewed at lower heights. It is shown that the Gaussian distribution underestimates the high-pressure peak factors, whereas a Weibull model fits the data adequately. Analytical expressions for the number of upcrossings of any pressure level and the extreme value distribution are presented. Comparison with experimental data shows good agreement.