
Showing papers on "Probability distribution published in 1981"


Journal ArticleDOI
TL;DR: An exponential family of distributions that can be used for analyzing directed graph data is described, and several special cases are discussed along with some possible substantive interpretations.
Abstract: Directed graph (or digraph) data arise in many fields, especially in contemporary research on structures of social relationships. We describe an exponential family of distributions that can be used for analyzing such data. A substantive rationale for the general model is presented, and several special cases are discussed along with some possible substantive interpretations. A computational algorithm based on iterative scaling procedures for use in fitting data is described, as are the results of a pilot simulation study. An example using previously reported empirical data is worked out in detail. An extension to multiple relationship data is discussed briefly.

1,238 citations



Journal ArticleDOI
TL;DR: In this paper, a general probability distribution transformation is developed with which complex structural reliability problems involving non-normal, dependent uncertainty vectors can be reduced to the standard case of first-order reliability, i.e. the problem of determining the failure probability or the reliability index in the space of independent, standard normal variates.
Abstract: A general probability distribution transformation is developed with which complex structural reliability problems involving non-normal, dependent uncertainty vectors can be reduced to the standard case of first-order reliability, i.e. the problem of determining the failure probability or the reliability index in the space of independent, standard normal variates. The method requires the knowledge of the joint cumulative distribution function or a certain set of conditional distribution functions of the original vector. Some basic properties of the transformation are discussed. Details of the transformation technique are given. Approximations must be introduced for the shape of the safe domain such that its probability content can easily be evaluated, which may involve numerical inversion of distribution functions. A suitable algorithm for computing reliability measures is proposed. The field of potential applications is indicated by a number of examples.

530 citations
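The transformation described above maps a dependent, non-normal random vector to independent standard normal variates through its marginal and conditional distribution functions. A minimal Python sketch for an assumed bivariate example, where X1 is exponential and X2 given X1 = x1 is exponential with rate 1 + x1 (the marginals, the dependence structure, and the bisection tolerances are all illustrative choices, not taken from the paper):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def phi_inv(u, lo=-10.0, hi=10.0, tol=1e-12):
    """Inverse standard normal CDF by bisection (adequate for a sketch)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if phi(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical dependent pair:
#   X1 ~ Exp(1),  X2 | X1 = x1 ~ Exp(rate = 1 + x1)
def F1(x1):
    return 1.0 - math.exp(-x1)

def F2_cond(x2, x1):
    return 1.0 - math.exp(-(1.0 + x1) * x2)

def to_standard_normal(x1, x2):
    """Map (x1, x2) to independent standard normal variates (z1, z2)
    using the marginal and conditional distribution functions."""
    z1 = phi_inv(F1(x1))
    z2 = phi_inv(F2_cond(x2, x1))
    return z1, z2
```

In first-order reliability work the safe domain would then be approximated in the transformed (z1, z2) space, where the variates are independent and standard normal.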


Journal ArticleDOI
TL;DR: Inferences or decisions in the face of uncertainty should be based on all available information, as when probability distributions for an uncertain quantity are obtained from experts or models.
Abstract: Inferences or decisions in the face of uncertainty should be based on all available information. Thus, when probability distributions for an uncertain quantity are obtained from experts, models, or...

456 citations


Journal ArticleDOI
TL;DR: A highly efficient recursive algorithm is defined for simultaneously convolving an image (or other two-dimensional function) with a set of kernels which differ in width but not in shape, so that the algorithm generates a set of low-pass or band-pass versions of the image.

442 citations


Journal ArticleDOI
TL;DR: The assumption that finding the consensus distribution commutes with marginalization of the distributions implies that the consensus must be found using a linear opinion pool (weighted average), provided the space being considered contains three or more points as discussed by the authors.
Abstract: Suppose a decision maker has asked a group of experts to assess subjective probability distributions over some space, and he wishes to find a consensus probability distribution as a function of these. The assumption that finding the consensus distribution commutes with marginalization of the distributions implies that the consensus must be found using a linear opinion pool (weighted average), provided the space being considered contains three or more points. Some of the consequences of this result are discussed.

195 citations
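The linear opinion pool the theorem singles out is simply a weighted average of the experts' distributions over the common space. A small sketch over a hypothetical three-point space (the expert pmfs and the decision maker's weights are made up for illustration):

```python
def linear_opinion_pool(expert_pmfs, weights):
    """Consensus distribution as the weighted average of expert pmfs
    over a common finite space (each pmf is a dict: point -> probability)."""
    points = set().union(*(pmf.keys() for pmf in expert_pmfs))
    return {x: sum(w * pmf.get(x, 0.0) for w, pmf in zip(weights, expert_pmfs))
            for x in points}

# Two hypothetical experts over a three-point space, with weights 0.6 / 0.4
p_a = {"low": 0.5, "mid": 0.3, "high": 0.2}
p_b = {"low": 0.1, "mid": 0.4, "high": 0.5}
consensus = linear_opinion_pool([p_a, p_b], [0.6, 0.4])
```

Because the pool is linear, marginalizing the consensus over any partition of the space gives the same result as pooling the marginalized expert distributions, which is the commutativity property the theorem starts from.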


Journal ArticleDOI
TL;DR: In this article, it was shown that the distribution function gives the proper probabilities for the position and momentum variables (actually only the former is needed) and that its connection with the wave function it represents has the natural invariances.

154 citations


Journal ArticleDOI
TL;DR: In this article, a sequence of convergent upper bounds is developed for the probability distribution of strength of composite materials, based on the occurrence of k or more adjacent broken fibers in a bundle, an event which is necessary but not sufficient for the failure of the material.
Abstract: A sequence of convergent upper bounds is developed for the probability distribution of strength of composite materials. The analysis is based on the well-known chain-of-bundles model, and local load sharing is assumed for the nonfailed fiber elements in each bundle. The bounds are based on the occurrence of k or more adjacent broken fibers in a bundle, an event which is necessary but not sufficient for the failure of the material. However, we find that given the load L on the composite, some value of k denoted k*(L) is critical in that a group of failed elements, once reaching size k*(L), will catastrophically increase in size with virtual certainty. If L happens to be approximately the median strength of the composite, then the bound based on k = k*(L) adjacent breaks is virtually identical to the true probability distribution of composite strength; indeed, the convergence of the sequence of bounds becomes virtually complete as k exceeds k*(L). We show that the strength distribution for the composite essentially has weakest-link structure in terms of a characteristic distribution function W(x), x ≥ 0, which depends on the load sharing and on the probability distribution for fiber element strength. Typical cases are considered under a Weibull distribution for fiber strength and under a double version which has the effect of putting a ceiling on fiber strength. We show that in typical situations, predictions using the double Weibull distribution are not what one might guess, and its use is unjustified in many cases.

141 citations


Journal ArticleDOI
TL;DR: In this paper, the authors propose a numerical approach to probabilistic calculations of great simplicity, power, and elegance; the philosophy and technique of the approach are described, some pitfalls are pointed out, and an application to seismic risk assessment is outlined.
Abstract: If the point of view is adopted that in calculations of real-world phenomena we almost invariably have significant uncertainty in the numerical values of our parameters, then, in these calculations, numerical quantities should be replaced by probability distributions and mathematical operations between these quantities should be replaced by analogous operations between probability distributions. Also, practical calculations one way or another always require discretization or truncation. Combining these two thoughts leads to a numerical approach to probabilistic calculations having great simplicity, power, and elegance. The philosophy and technique of this approach is described, some pitfalls are pointed out, and an application to seismic risk assessment is outlined.

116 citations
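The approach described, replacing numerical quantities by discretized probability distributions and arithmetic operations by analogous operations between distributions, can be sketched with a brute-force combination of two independent discrete distributions (the example values are illustrative; a practical implementation would also re-discretize the growing support, one of the pitfalls the paper alludes to):

```python
from collections import defaultdict

def combine(p, q, op):
    """Apply a binary operation to two independent discretized
    distributions, each a dict: value -> probability."""
    out = defaultdict(float)
    for x, px in p.items():
        for y, qy in q.items():
            out[op(x, y)] += px * qy
    return dict(out)

# Uncertain parameters as small discrete distributions (illustrative values)
a = {1.0: 0.25, 2.0: 0.50, 3.0: 0.25}
b = {10.0: 0.5, 20.0: 0.5}

total = combine(a, b, lambda x, y: x + y)     # distribution of a + b
product = combine(a, b, lambda x, y: x * y)   # distribution of a * b
```

Chaining such operations propagates parameter uncertainty through an entire calculation, which is the core of the seismic-risk application the abstract mentions.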


Journal ArticleDOI
TL;DR: A general method to synthesize a stochastic process from its a priori given second-order statistics and some new results, including a new conjecture dealing with visual discrimination of stochastic texture fields, are given.
Abstract: We present a general method to synthesize a stochastic process from its a priori given second-order statistics. Some new results, including a new conjecture dealing with visual discrimination of stochastic texture fields, are also given.

114 citations
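One standard way to synthesize a Gaussian process matching prescribed second-order statistics (a common textbook construction, not necessarily the authors' method) is to factor the target covariance matrix with a Cholesky decomposition and color white noise with it. A self-contained sketch, with an assumed exponentially decaying autocovariance:

```python
import math
import random

def cholesky(C):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def synthesize(acov, n, rng):
    """Sample a zero-mean Gaussian process with autocovariance acov(lag)
    at n equally spaced points, by coloring white Gaussian noise."""
    C = [[acov(abs(i - j)) for j in range(n)] for i in range(n)]
    L = cholesky(C)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

# Illustrative target: exponentially decaying correlation 0.8**lag
samples = synthesize(lambda lag: 0.8 ** lag, 50, random.Random(0))
```

The same idea extends to two-dimensional texture fields by building the covariance matrix over pixel pairs, though the O(n^3) factorization quickly motivates spectral methods for large fields.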


Posted Content
TL;DR: The most commonly used risk-return model is the mean-variance model in which risk is measured as the variance and return by the mean of the probability distribution over outcomes as discussed by the authors.
Abstract: Two-attribute risk and return models are very popular in the economics and finance literature for analyzing decisions under uncertainty. Their popularity stems primarily from the intuitive appeal of the dichotomy into risk and return, and from the ease with which the concepts can be diagrammed in two dimensions. The most commonly used risk-return model is the mean-variance model in which risk is measured as the variance and return by the mean of the probability distribution over outcomes. The mean-variance model has a number of shortcomings which are widely known, but of particular concern here are the facts that 1) mean-variance dominance is neither necessary nor sufficient for second-degree stochastic dominance; 2) unless the form of the probability distribution is restricted, mean-variance is consistent with von Neumann-Morgenstern utility theory only if the utility function is quadratic; and 3) as Peter Fishburn has noted,


Journal ArticleDOI
TL;DR: In this article, the authors considered the Markov diffusion process ξε(t), transforming when ε = 0 into the solution of an ordinary differential equation with a turning point of hyperbolic type.
Abstract: We consider the Markov diffusion process ξε(t), transforming when ε = 0 into the solution of an ordinary differential equation with a turning point of hyperbolic type. The asymptotic behavior as ε → 0 of the exit time, of its expectation, and of the probability distribution of exit points for the process ξε(t) is studied. These indicate also the asymptotic behavior of solutions of the corresponding singularly perturbed elliptic boundary value problems.

Journal ArticleDOI
TL;DR: A treatment of errors in electron imaging has been developed that makes possible the systematic combination of data from several image areas for phase determination by multiple isomorphous replacement of the purple membrane.

Journal ArticleDOI
TL;DR: In this article, an upper bound on the probability distribution for the strength of composite materials is obtained based on the occurrence of two or more adjacent broken fibers in a bundle, and local load sharing is assumed for the non-failed fiber elements in each bundle.
Abstract: An upper bound is obtained on the probability distribution for the strength of composite materials. The analysis is based on the chain-of-bundles probability model, and local load sharing is assumed for the nonfailed fiber elements in each bundle. The bound is based on the occurrence of two or more adjacent broken fibers in a bundle. This event is necessary but not sufficient for the failure of the material. Two distributions are assumed for fiber strength: the usual Weibull distribution and a more realistic double version which has much the effect of putting a ceiling on fiber strength. For large composite materials, the upper bound becomes a Weibull distribution but with a shape parameter which is twice that for the individual fibers. The bound is always conservative, but it is extremely tight when the variability in fiber strength is low. In typical cases, the use of the double Weibull distribution for fiber strength is shown not to affect the behavior of the bound significantly. In view of the additional experimental and computational labor involved, its use in practice may not be justified in such cases. However, its use does shed light on fracture processes in composite materials.
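The closing observation, that for large composites the bound becomes a Weibull distribution with twice the fiber shape parameter, can be checked numerically: at loads well below the scale parameter, the probability that two given adjacent fiber elements have both failed is very close to a Weibull CDF with doubled shape. A sketch with illustrative parameter values (not taken from the paper):

```python
import math

def weibull_cdf(x, shape, scale=1.0):
    """CDF of a Weibull distribution."""
    return 1.0 - math.exp(-((x / scale) ** shape))

m = 5.0   # illustrative fiber Weibull shape parameter
x = 0.2   # load level well below the scale parameter
# Probability that two given adjacent fiber elements have both failed:
p_pair = weibull_cdf(x, m) ** 2
# Weibull CDF with the doubled shape parameter at the same load:
p_doubled = weibull_cdf(x, 2.0 * m)
```

At this load the two probabilities agree to well within 0.1%, since for small x the Weibull CDF behaves like (x/scale)**shape and squaring it doubles the exponent.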

Journal ArticleDOI
TL;DR: In this article, the effective medium approximation for uncorrelated random linear networks is studied for various probability distributions of conductance, and the magnitude of the correction terms is assessed by a mixture of analytical and numerical arguments.
Abstract: The author studies the effective medium approximation for uncorrelated random linear networks. For various probability distributions of conductance he compares the effective medium average value to the direct numerical average and studies the magnitude of the correction terms by a mixture of analytical and numerical arguments. He argues that for all 'non-critical' conductance distributions (for systems not dominated by very low values of conductance and not near the conduction threshold) the effective medium approximation is quite accurate, with errors of order 1%. For any conductance distribution, effective medium theory is shown to be exact in the two limits of coordination number σ = 2 and σ → ∞. Some remarks on the critical path approximation are presented.

Journal ArticleDOI
TL;DR: In this paper, a real-space renormalization group transformation of the free energy is studied, including the existence of oscillatory terms multiplying the non-analytic part of free energy.
Abstract: We review the properties of a real-space renormalization group transformation of the free energy, including the existence of oscillatory terms multiplying the non-analytic part of the free energy. We then construct stochastic processes which incorporate into probability distributions the features of the free energy scaling equation. (The essential information is obtainable from the scaling equation and a direct solution for a probability is not necessary.) These random processes are shown to be generated directly from Cantor sets. In a spatial representation, the ensuing random process exhibits a transition between Gaussian and fractal behavior. In the fractal regime, the trajectories will, in an average sense, form self-similar clusters. In a temporal representation, the random process exhibits a transition between an asymptotically constant renewal rate and fractal behavior. The fractal regime represents a frozen state with only transient effects allowed and is related to charge transport in glasses.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the distribution function obtained by smoothing the original distribution function with a ground-state harmonic oscillator wave function, derive its time dependence, and show that the field-free result does not correspond to the classical result.


Journal ArticleDOI
TL;DR: This paper reviewed the literature relevant to opinion aggregation when forecasts are expressed as subjective probability distributions and described an experimental comparison of the four procedures listed above using a subjective probability forecasting task and concluded that subjective probability forecasts can be substantially improved by aggregating the opinions of a group of experts rather than relying on a single expert.

Journal ArticleDOI
TL;DR: In this paper, various steps involved in the estimation of an extreme design wave are reviewed, including methods of data collection, the use of a plotting formula, selection of a suitable distribution and its parameters, the plotting of confidence bands about the best-fit line, and the selection of the design wave corresponding to a prescribed return period or encounter probability.
Abstract: The various steps involved in the estimation of an extreme design wave are reviewed. These include the following: methods of data collection, the use of a plotting formula, the selection of a suitable distribution and its parameters, the plotting of confidence bands about the best-fit line, and the selection of a design wave corresponding to a prescribed return period or encounter probability. The probability distributions commonly used, including the log-normal and Extremal Types I, II, and III distributions (Gumbel, Fréchet, and Weibull respectively), are described, and their properties are summarized in tabular form for ready reference.
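For the Extremal Type I (Gumbel) case, the design value for a prescribed return period follows from inverting the fitted CDF at the non-exceedance probability 1 - 1/T. A sketch with assumed, not source-derived, parameter values for annual maximum wave height:

```python
import math

def gumbel_design_value(mu, beta, return_period):
    """Invert the Extremal Type I (Gumbel) CDF,
    F(x) = exp(-exp(-(x - mu)/beta)),
    at the non-exceedance probability p = 1 - 1/T."""
    p = 1.0 - 1.0 / return_period
    return mu - beta * math.log(-math.log(p))

# Hypothetical Gumbel fit to annual maximum wave heights (metres)
h100 = gumbel_design_value(mu=6.0, beta=1.2, return_period=100.0)
```

With these assumed parameters the 100-year design wave comes out near 11.5 m; the confidence bands discussed in the abstract would widen this point estimate.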

Journal Article
TL;DR: This article showed that rank and frequency have a surprisingly constrained relationship in natural-language corpora, and that rank-ordered frequency counts of the words in a corpus serve moderately well as an illustration of that constrained relationship.
Abstract: Let us start by considering a basic form of Zipf's law. Suppose one has a natural-language corpus, e.g., a book written in English. Next, suppose one makes a frequency count of the words in the corpus, i.e., counts the number of occurrences of the, and, of, etc. Finally, suppose one arranges the words in decreasing order of frequency so that the most frequent word has rank 1; the next most frequent, rank 2; and so on. For example, a frequency count of the 75 word-types (i.e., dictionary entries) represented by the 112 word-tokens (i.e., distinct occurrences) in the two preceding paragraphs yields the partial results shown in table 1. This set of rank-ordered frequency counts, though quite small for the purpose, serves moderately well as an illustration of the fact that rank and frequency have a surprisingly constrained relationship in natural-language corpora. The values of the products of rank r and frequency f fall in the relatively limited range 27-30 in the middle of table 1, and we may note that there was no a priori reason for us to expect that the middle products rf would fall within so limited a range.
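The rank-frequency count described above is straightforward to reproduce. A sketch on a made-up snippet of text (the corpus, and hence the products r*f, are purely illustrative):

```python
import re
from collections import Counter

def rank_frequency(text):
    """Rank-ordered frequency counts of the word-types in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    ranked = Counter(words).most_common()
    return [(rank, word, freq) for rank, (word, freq) in enumerate(ranked, 1)]

# A made-up miniature corpus; real checks would use a book-length text
text = ("the quick brown fox jumps over the lazy dog and the dog "
        "barks at the fox while the fox runs")
table = rank_frequency(text)
products = [rank * freq for rank, _, freq in table]
```

A corpus this small mainly illustrates the bookkeeping; the near-constancy of the middle products r*f only emerges on realistically large corpora like the one the paper analyses.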

Journal ArticleDOI
TL;DR: The relationship between the instantaneous frequency and the mean frequency of ultrasonic Doppler signals in blood velocity measurements and the effect of hard limiting of the signal on the estimator performance is discussed.
Abstract: The relationship between the instantaneous frequency and the mean frequency of ultrasonic Doppler signals in blood velocity measurements is discussed. The probability distribution density for the instantaneous frequency is calculated. The time interval histogram (TIH), which has been used to characterize the Doppler signal, is found to be an approximation of this probability density. The probability density will also describe the output of phase-lock loop analysis of the Doppler signal. The variance of mean frequency estimators is calculated, and the implication of this for practical estimators is discussed. The effect of hard limiting of the signal on the estimator performance is also discussed.

Journal ArticleDOI
TL;DR: In this paper, the dynamic climatology of a simple model of barotropic stochastically forced β-plane flow over topography is studied, where a random forcing is added in order to incorporate crudely the impact of the truncated flow modes on those retained in the model.
Abstract: The dynamic climatology of a simple model of barotropic stochastically forced β-plane flow over topography is studied. Except for the forcing, the model is similar to the three-component systems studied by Charney and DeVore (1979) and Hart (1979). In certain regions of parameter space there are two stable equilibria, a high-index flow with strong zonal winds and a low-index flow with a pronounced wave component. A random forcing is added in order to incorporate crudely the impact of the truncated flow modes on those retained in the model. The Fokker-Planck equation for this system is solved numerically and the steady-state probability distribution of the system is evaluated. It is found that the probability density distribution has maxima at the equilibria but that there also is a finite probability for intermediate states. This situation corresponds to that in the atmosphere where certain types of circulation like a high-index flow are met more frequently than others. It is also found that the ...

Journal ArticleDOI
TL;DR: A characterization theorem shows that a Pareto distribution of the second kind, or equivalently a gamma mixture of exponentials, is the unique distribution with a linearly increasing median residual lifetime.
Abstract: The concept of the median residual lifetime for a probability distribution is outlined and shown to have several advantages over its more common counterpart, the mean residual lifetime. A characterization theorem shows that a Pareto distribution of the second kind, or equivalently a gamma mixture of exponentials, is the unique distribution with a linearly increasing median residual lifetime. An empirical example from the literature on strike durations is presented.
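The characterization can be checked numerically: for a Pareto distribution of the second kind (Lomax), with survival function S(x) = (1 + x/lam)**(-alpha), the median residual lifetime m(t) defined by S(t + m) = S(t)/2 works out to (lam + t) * (2**(1/alpha) - 1), which is linear in t. A sketch with illustrative parameter values:

```python
def lomax_survival(x, alpha, lam):
    """Survival function of a Pareto distribution of the second kind."""
    return (1.0 + x / lam) ** (-alpha)

def median_residual_life(t, survival, hi=1e6, tol=1e-9):
    """Median residual lifetime at age t: the m with S(t + m) = S(t)/2,
    found by bisection on the (decreasing) survival function."""
    target = 0.5 * survival(t)
    lo_m, hi_m = 0.0, hi
    while hi_m - lo_m > tol:
        mid = 0.5 * (lo_m + hi_m)
        if survival(t + mid) > target:
            lo_m = mid
        else:
            hi_m = mid
    return 0.5 * (lo_m + hi_m)

alpha, lam = 2.0, 1.0          # illustrative parameters
s = lambda x: lomax_survival(x, alpha, lam)
m0 = median_residual_life(0.0, s)
m1 = median_residual_life(1.0, s)
```

Here m1 is exactly twice m0, matching the linear closed form, whereas the mean residual lifetime of this distribution is infinite for alpha <= 1, one of the advantages of the median version the abstract points to.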

Journal ArticleDOI
TL;DR: In this paper, a general sufficient condition for stochastic ordering of probability distributions is proved and the question of existence of upper and lower bounds in the class of all distributions with given marginals is considered.
Abstract: Some characterizations for the stochastic ordering of probability distributions are given. Especially a general sufficient condition for stochastic ordering is proved and the question of existence of upper and lower bounds in the class of all distributions with given marginals is considered. Stochastic ordering is applied to prove a general theorem on monotonicity of the OC-function of sequential probability ratio tests in stochastic processes.


Journal ArticleDOI
TL;DR: In this article, the authors present an asymptotic probability distribution for the passage time of a representative observable passing through a fixed threshold value, which is valid when the threshold is set sufficiently far from the initial state.
Abstract: The decay of unstable equilibrium states is accompanied by large-scale fluctuations. The statistical properties of such processes can be characterized by using the time at which a representative observable first passes through a fixed threshold value. We present an asymptotic probability distribution for that passage time which is valid when the threshold is set sufficiently far from the initial state. For the simplest example of linear isotropic amplification of an $n$-component vector we calculate both the exact first-passage time distribution and our asymptotic distribution. We verify that the asymptotic distribution coincides with the exact one in the appropriate limit. We then evaluate our asymptotic distribution for a number of more complicated systems including one in which an $n$-component vector field in $d$ spatial dimensions departs from an unstable equilibrium state. The resulting expression has a considerable degree of universality. Its form is independent of $d$ and of details of the field dynamics. It is insensitive, in particular, to whether the underlying field considered is conserved or not. Our procedure is applicable to a wide variety of problems in which an order parameter departs spontaneously from an unstable initial value.

Journal ArticleDOI
TL;DR: The Neyman type-A and Thomas counting distributions turn out to provide a good description for the counting of photons generated by multiplied Poisson processes, as long as the time course of the multiplication is short compared with the counting time.
Abstract: The Neyman type-A and Thomas counting distributions provide a useful description for a broad variety of phenomena from the distribution of larvae on small plots of land to the distribution of galaxies in space. They turn out to provide a good description for the counting of photons generated by multiplied Poisson processes, as long as the time course of the multiplication is short compared with the counting time. Analytic expressions are presented for the probability distributions, moment generating functions, moments, and variance-to-mean ratios. Sums of Neyman type-A and Thomas random variables are shown to retain their form under the constraint of constant multiplication parameter. Conditions under which the Neyman type-A and Thomas converge in distribution to the fixed multiplicative Poisson and to the Gaussian are presented. This latter result is most important, for it provides a ready solution to likelihood-ratio detection, estimation, and discrimination problems in the presence of many kinds of signal and noise. The doubly stochastic Neyman type-A, Thomas, and fixed multiplicative Poisson distributions are also considered. A number of explicit applications are presented. These include (1) the photon counting scintillation detection of nuclear particles, when the particle flux is low, (2) the photon counting detection of weak optical signals in the presence of ionizing radiation, (3) the design of a star-scanner spacecraft guidance system for the hostile environment of space, (4) the neural pulse counting distribution in the cat retinal ganglion cell at low light levels, and (5) the transfer of visual signal to the cortex in a classical psychophysics experiment. A number of more complex contagious distributions arising from multiplicative processes are also discussed, with particular emphasis on photon counting and direct-detection optical communications.
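The clustered structure of the Neyman type-A distribution, a Poisson number of clusters each contributing an independent Poisson count, is easy to simulate, and the variance-to-mean ratio of 1 + b (for cluster-size mean b) can be checked by Monte Carlo. A sketch with illustrative parameters a = 5 clusters on average, b = 2 counts per cluster:

```python
import math
import random

def poisson(rng, mean):
    """Poisson variate by Knuth's method (fine for small means in a sketch)."""
    limit, k, prod = math.exp(-mean), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def neyman_type_a(rng, a, b):
    """One Neyman type-A variate: a Poisson(a) number of clusters,
    each contributing an independent Poisson(b) count."""
    return sum(poisson(rng, b) for _ in range(poisson(rng, a)))

rng = random.Random(42)
a, b = 5.0, 2.0
n = 100_000
draws = [neyman_type_a(rng, a, b) for _ in range(n)]
mean = sum(draws) / n
var = sum((d - mean) ** 2 for d in draws) / n
ratio = var / mean  # theory: variance-to-mean ratio = 1 + b
```

With these parameters the theoretical mean is a*b = 10 and the variance-to-mean ratio is 1 + b = 3; the simulated values land close to both, in contrast to the ratio of exactly 1 for an ordinary Poisson count.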

Journal ArticleDOI
TL;DR: In this paper, a method for computing the probability distribution of combined, time varying loading on a structure is presented, based on finding a close upper bound to the mean upcrossing rate of a linear combination of scalar load processes.
Abstract: A method for computing the probability distribution of combined, time varying loading on a structure is presented. The point crossing method is based on finding a close upper bound to the mean upcrossing rate of a linear combination of scalar load processes. Both smooth, continuous processes and pulse-type processes are included. The solution is a series of single convolutions of two natural descriptors of the load effect process: the first-order or arbitrary-point-in-time probability distribution and the mean upcrossing rate function. An example of the combination of three load processes is included, and the method is compared with others. Implications for load modeling and a format for combined loads in deterministic design codes are analyzed.
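The point crossing bound amounts to a pair of single convolutions: the mean upcrossing rate of X + Y over level a is bounded by E[nu_X(a - Y)] + E[nu_Y(a - X)], combining each process's upcrossing-rate function with the other's arbitrary-point-in-time density. A numerical sketch for two assumed independent stationary standard Gaussian load effects, using Rice's formula with an illustrative reference rate nu0 = 1 (all inputs here are stand-ins, not the paper's example):

```python
import math

def point_crossing_bound(a, nu_x, f_x, nu_y, f_y, lo=-10.0, hi=10.0, n=2000):
    """Upper bound nu_{X+Y}(a) <= E[nu_X(a - Y)] + E[nu_Y(a - X)],
    evaluated by midpoint-rule convolution of each upcrossing-rate
    function with the other process's point-in-time density."""
    dz = (hi - lo) / n
    grid = [lo + (i + 0.5) * dz for i in range(n)]
    term_x = sum(nu_x(a - z) * f_y(z) for z in grid) * dz
    term_y = sum(nu_y(a - z) * f_x(z) for z in grid) * dz
    return term_x + term_y

# Illustrative inputs: standard Gaussian point-in-time density, and
# Rice's formula nu(a) = nu0 * exp(-a**2 / 2) with an assumed nu0 = 1
f = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
nu = lambda a: math.exp(-0.5 * a * a)
bound = point_crossing_bound(3.0, nu, f, nu, f)
```

For these Gaussian inputs the two convolutions have the closed form e**(-a**2/4) / sqrt(2) each, so the numerical bound can be checked against sqrt(2) * e**(-a**2/4).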