
Showing papers on "Probability distribution published in 1982"


Journal ArticleDOI
TL;DR: In this article, a unified approach to fitting two-stage random-effects models, based on a combination of empirical Bayes and maximum likelihood estimation of model parameters and using the EM algorithm, is discussed.
Abstract: Models for the analysis of longitudinal data must recognize the relationship between serial observations on the same unit. Multivariate models with general covariance structure are often difficult to apply to highly unbalanced data, whereas two-stage random-effects models can be used easily. In two-stage models, the probability distributions for the response vectors of different individuals belong to a single family, but some random-effects parameters vary across individuals, with a distribution specified at the second stage. A general family of models is discussed, which includes both growth models and repeated-measures models as special cases. A unified approach to fitting these models, based on a combination of empirical Bayes and maximum likelihood estimation of model parameters and using the EM algorithm, is discussed. Two examples are taken from a current epidemiological study of the health effects of air pollution.
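A minimal sketch of the EM idea for the simplest two-stage model, a random-intercept model y_ij = mu + b_i + e_ij with b_i ~ N(0, tau2) and e_ij ~ N(0, sig2) (all names, values, and the simulated data are illustrative, not from the paper): the E-step computes each subject's empirical Bayes posterior for b_i, and the M-step re-estimates mu, tau2, and sig2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate unbalanced longitudinal data: y_ij = mu + b_i + e_ij,
# where b_i ~ N(0, tau2) is a subject-level random intercept.
mu_true, tau2_true, sig2_true = 5.0, 2.0, 1.0
n_subj = 200
sizes = rng.integers(2, 8, n_subj)             # unbalanced: 2-7 obs per subject
b = rng.normal(0.0, np.sqrt(tau2_true), n_subj)
data = [mu_true + b[i] + rng.normal(0.0, np.sqrt(sig2_true), sizes[i])
        for i in range(n_subj)]

# EM: E-step gives the posterior (empirical Bayes) mean and variance of
# each b_i; M-step updates the model parameters.
mu, tau2, sig2 = 0.0, 1.0, 1.0
for _ in range(200):
    v = 1.0 / (1.0 / tau2 + sizes / sig2)                     # posterior var of b_i
    m = v * np.array([np.sum(y - mu) for y in data]) / sig2   # posterior mean of b_i
    mu = np.mean(np.concatenate([y - m[i] for i, y in enumerate(data)]))
    tau2 = np.mean(m**2 + v)
    sig2 = (sum(np.sum((y - mu - m[i])**2) for i, y in enumerate(data))
            + np.sum(sizes * v)) / np.sum(sizes)

print(mu, tau2, sig2)
```

With 200 subjects the estimates land close to the simulation truth (5.0, 2.0, 1.0).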

8,410 citations


Book
01 Jan 1982
TL;DR: In this book, the authors present a comprehensive introduction to probability and statistics, covering descriptive statistics, discrete and continuous probability distributions, point estimation, confidence intervals, hypothesis testing, regression, analysis of variance, and quality control methods.
Abstract: 1. OVERVIEW AND DESCRIPTIVE STATISTICS. Populations, Samples, and Processes. Pictorial and Tabular Methods in Descriptive Statistics. Measures of Location. Measures of Variability. 2. PROBABILITY. Sample Spaces and Events. Axioms, Interpretations, and Properties of Probability. Counting Techniques. Conditional Probability. Independence. 3. DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Random Variables. Probability Distributions for Discrete Random Variables. Expected Values. The Binomial Probability Distribution. Hypergeometric and Negative Binomial Distributions. The Poisson Probability Distribution. 4. CONTINUOUS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Probability Density Functions. Cumulative Distribution Functions and Expected Values. The Normal Distribution. The Exponential and Gamma Distributions. Other Continuous Distributions. Probability Plots. 5. JOINT PROBABILITY DISTRIBUTIONS AND RANDOM SAMPLES. Jointly Distributed Random Variables. Expected Values, Covariance, and Correlation. Statistics and Their Distributions. The Distribution of the Sample Mean. The Distribution of a Linear Combination. 6. POINT ESTIMATION. Some General Concepts of Point Estimation. Methods of Point Estimation. 7. STATISTICAL INTERVALS BASED ON A SINGLE SAMPLE. Basic Properties of Confidence Intervals. Large-Sample Confidence Intervals for a Population Mean and Proportion. Intervals Based on a Normal Population Distribution. Confidence Intervals for the Variance and Standard Deviation of a Normal Population. 8. TESTS OF HYPOTHESIS BASED ON A SINGLE SAMPLE. Hypotheses and Test Procedures. z Tests for Hypotheses About a Population Mean. The One-Sample t Test. Tests Concerning a Population Proportion. Further Aspects of Hypothesis Testing. 9. INFERENCES BASED ON TWO SAMPLES. z Tests and Confidence Intervals for a Difference between Two Population Means. The Two-Sample t Test and Confidence Interval. Analysis of Paired Data. 
Inferences Concerning a Difference between Population Proportions. Inferences Concerning Two Population Variances. 10. THE ANALYSIS OF VARIANCE. Single-Factor ANOVA. Multiple Comparisons in ANOVA. More on Single-Factor ANOVA. 11. MULTIFACTOR ANALYSIS OF VARIANCE. Two-Factor ANOVA with Kij = 1. Two-Factor ANOVA with Kij > 1. Three-Factor ANOVA. 2^p Factorial Experiments. 12. SIMPLE LINEAR REGRESSION AND CORRELATION. The Simple Linear Regression Model. Estimating Model Parameters. Inferences About the Slope Parameter β1. Inferences Concerning μ_Y·x* and the Prediction of Future Y Values. Correlation. 13. NONLINEAR AND MULTIPLE REGRESSION. Assessing Model Adequacy. Regression with Transformed Variables. Polynomial Regression. Multiple Regression Analysis. Other Issues in Multiple Regression. 14. GOODNESS-OF-FIT TESTS AND CATEGORICAL DATA ANALYSIS. Goodness-of-Fit Tests When Category Probabilities Are Completely Specified. Goodness-of-Fit Tests for Composite Hypotheses. Two-Way Contingency Tables. 15. DISTRIBUTION-FREE PROCEDURES. The Wilcoxon Signed-Rank Test. The Wilcoxon Rank-Sum Test. Distribution-Free Confidence Intervals. Distribution-Free ANOVA. 16. QUALITY CONTROL METHODS. General Comments on Control Charts. Control Charts for Process Location. Control Charts for Process Variation. Control Charts for Attributes. CUSUM Procedures. Acceptance Sampling.

2,313 citations


Journal ArticleDOI
TL;DR: This approach permits the impact of skewness and kurtosis of the underlying stock's distribution on the option price to be evaluated; the results show how a given probability distribution can be approximated by an arbitrary distribution in terms of a series expansion involving second and higher moments.

649 citations


Journal ArticleDOI
TL;DR: The probability distribution of the phase angle between two vectors perturbed by correlated Gaussian noises is studied in detail, and its asymptotic behavior for large signal-to-noise is found for "small," "near π/2," and "large" angles.
Abstract: The probability distribution of the phase angle between two vectors perturbed by correlated Gaussian noises is studied in detail. Definite integral expressions are derived for the distribution function, and its asymptotic behavior for large signal-to-noise is found for "small," "near π/2," and "large" angles. The results are applied to obtain new formulas for the symbol error rate in MDPSK, to calculate the distribution of instantaneous frequency, to study the error rate in digital FM with partial-bit integration in the postdetection filter, and to obtain a simplified expression for the error rate in DPSK with a phase error in the reference signal. In the degenerate case in which one of the vectors is noise free, the results lead to the symbol error rate in MPSK.
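A Monte Carlo sketch of the setup, under simplifying assumptions of this illustration (uncorrelated noises, and "snr" taken as amplitude divided by noise sigma, which is not necessarily the paper's parameterization): at high signal-to-noise the phase angle between the two perturbed vectors concentrates sharply around the true angle.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two fixed 2-D vectors at a known relative angle, each perturbed by
# additive Gaussian noise; the noises are independent in this sketch.
def phase_angle_samples(snr, n=200_000, true_angle=np.pi / 4):
    a = np.array([snr, 0.0])                              # reference vector
    b = snr * np.array([np.cos(true_angle), np.sin(true_angle)])
    va = a + rng.normal(size=(n, 2))
    vb = b + rng.normal(size=(n, 2))
    d = np.arctan2(vb[:, 1], vb[:, 0]) - np.arctan2(va[:, 1], va[:, 0])
    return np.angle(np.exp(1j * d))                       # wrap to (-pi, pi]

lo = phase_angle_samples(snr=2.0)
hi = phase_angle_samples(snr=20.0)
spread_lo, spread_hi = np.std(lo), np.std(hi)
print(spread_lo, spread_hi)
```

The spread at snr = 20 is roughly an order of magnitude smaller than at snr = 2, with the sample mean pinned at the true angle π/4.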

452 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that the interface free energy between bulk phases with a macroscopically flat interface can be estimated from the variation of certain probability distribution functions of finite blocks with block size.
Abstract: It is suggested that the interface free energy between bulk phases with a macroscopically flat interface can be estimated from the variation of certain probability distribution functions of finite blocks with block size. For a liquid-gas system the probability distribution of the density would have to be used. The method is particularly suitable for the critical region where other methods are hard to apply. As a test case, the two-dimensional lattice-gas model is treated and it is shown that already, from rather small blocks, one obtains results consistent with the exact solution of Onsager for the surface tension, by performing appropriate extrapolations. The surface tension of the three-dimensional lattice-gas model is also estimated and found to be reasonably consistent with the expected critical behavior. The universal amplitude of the surface tension of fluids near their critical point is estimated and shown to be in significantly better agreement with experimental data than the results of Fisk and Widom and the first-order 4-d renormalization-group expansion. Also the universal amplitude ratio used in nucleation theory near the critical point is estimated.

379 citations


Journal ArticleDOI
TL;DR: In this paper, mathematical renewal theory is used to make a general model of the combined processes of search, encounter, capture, and handling, and a model based on minimization of the probability of death due to an energetic shortfall is presented.
Abstract: Some simple stochastic models of optimal foraging are considered. Firstly, mathematical renewal theory is used to make a general model of the combined processes of search, encounter, capture and handling. In the case where patches or prey items are encountered according to a Poisson process the limiting probability distribution of energy gain is found. This distribution is found to be normal and its mean and variance are specified. This result supports the use of Holling's disc equation to specify the rate of energy intake in foraging models. Secondly, a model based on minimization of the probability of death due to an energetic shortfall is presented. The model gives a graphical solution to the problem of optimal choices when mean and variance are related. Thirdly, a worked example using these results is presented. This example suggests that there may be natural relationships between mean and variance which make solutions to the problems of ‘energy maximization’ and ‘minimization of the probability of starvation’ similar. Finally, current trends in stochastic modeling of foraging behavior are critically discussed.
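The Poisson-encounter result above is a compound Poisson calculation: with encounter rate lam over a foraging time T and i.i.d. per-item gains g, total gain has mean lam*T*E[g] and variance lam*T*E[g^2], and is approximately normal for large lam*T. A simulation sketch (all parameter names and values are illustrative, not the paper's notation):

```python
import numpy as np

rng = np.random.default_rng(2)

lam, T = 3.0, 50.0            # encounter rate and foraging time
mean_g, var_g = 2.0, 1.5      # per-item energy gain (gamma-distributed here)

def total_gain(n_sim=20_000):
    counts = rng.poisson(lam * T, n_sim)       # encounters per foraging bout
    shape = mean_g**2 / var_g                  # gamma gains with chosen moments
    scale = var_g / mean_g
    return np.array([rng.gamma(shape, scale, c).sum() for c in counts])

g = total_gain()
m_theory = lam * T * mean_g                    # compound Poisson mean
v_theory = lam * T * (var_g + mean_g**2)       # lam*T*E[g^2]
print(g.mean(), g.var(), m_theory, v_theory)
```

The simulated mean and variance match the compound Poisson formulas, supporting the normal limit quoted in the abstract.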

343 citations


Book ChapterDOI
TL;DR: The Rational Expectation Hypothesis (REH) as discussed by the authors assumes that information exists and is available for processing by all decision makers, and that the information, consisting primarily of quantitative time series data, is a finite realization of a stochastic process, from which the probability distribution of actual outcomes today and for all future dates can be estimated.
Abstract: Proponents of the rational expectations hypothesis (hereafter REH) claim they have developed a general theory of how expectations are formed. To assure that these rational expectations generate efficient, unbiased forecasts which do not display any persistent errors when compared to the actual outcome over time, REH theorists assume that information exists and is available for processing by all decision makers. This information, consisting primarily of quantitative time series data, it is assumed, is a finite realization of a stochastic process; from this data the probability distribution of actual outcomes today and for all future dates can be estimated. Or as John Muth puts it, ‘the hypothesis can be rephrased a little more precisely as follows: that expectations of firms (or more generally, the “subjective” probability of outcomes) tend to be distributed, for the same information set, about the prediction of the theory (or the “objective” probability distribution of outcomes)’ (1961, p. 316).

326 citations


Journal ArticleDOI
TL;DR: The continuous-time random walk of Montroll and Weiss has been modified by Scher and Lax to include a coupled spatial-temporal memory as mentioned in this paper, and the asymptotic properties of the probability distribution for being at any lattice site as a function of time and its variance are calculated.
Abstract: The continuous-time random walk of Montroll and Weiss has been modified by Scher and Lax to include a coupled spatial-temporal memory. We treat novel cases for the random walk and the corresponding generalized master equation when combinations of both spatial, and temporal moments of the memory are infinite. The asymptotic properties of the probability distribution for being at any lattice site as a function of time and its variance are calculated. The resulting behavior includes localized, diffusive, wavelike, and Levy's stable laws for the appropriate scaled variable. We show that an infinite mean waiting time can lead to long time diffusive behavior, while a finite mean waiting time is not sufficient to ensure the same.
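A simulation sketch of the waiting-time effect described above, under illustrative assumptions (unit ±1 jumps, Pareto versus exponential waits; not the paper's coupled spatial-temporal memory): with infinite-mean waiting times the walk spreads far more slowly by a fixed time T than with finite-mean waits.

```python
import numpy as np

rng = np.random.default_rng(3)

# Continuous-time random walk: unit +/-1 jumps separated by random waits.
def ctrw_positions(wait_sampler, T=500.0, n_walkers=1000, n_steps=2000):
    waits = wait_sampler((n_walkers, n_steps))
    times = np.cumsum(waits, axis=1)
    n_jumps = (times <= T).sum(axis=1)          # jumps completed by time T
    steps = rng.choice([-1, 1], size=(n_walkers, n_steps))
    csteps = np.cumsum(steps, axis=1)
    idx = np.clip(n_jumps - 1, 0, n_steps - 1)
    pos = np.where(n_jumps > 0, csteps[np.arange(n_walkers), idx], 0)
    return pos

heavy = ctrw_positions(lambda s: rng.pareto(0.5, s) + 1.0)  # infinite-mean waits
light = ctrw_positions(lambda s: rng.exponential(1.0, s))   # mean-1 waits
print(heavy.var(), light.var())
```

The variance with heavy-tailed waits is a small fraction of the exponential-wait variance at the same elapsed time.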

319 citations


Journal ArticleDOI
TL;DR: A φ-entropy functional is defined on the probability space and its Hessian along a direction of the tangent space of the parameter space is taken as the metric, and the distance between two probability distributions is computed as the geodesic distance induced by the metric.

287 citations


Journal ArticleDOI
TL;DR: In this paper, a hierarchy of equations was derived for a bed or suspension of spheres in a uniform matrix, giving the Sn in terms of the s-body distribution functions ρs associated with a statistically inhomogeneous distribution PN in the matrix.
Abstract: The microstructure of a two‐phase random medium can be characterized by a set of general n‐point probability functions, which give the probability of finding a certain subset of n‐points in the matrix phase and the remainder in the particle phase. A new expression for these n‐point functions is derived in terms of the n‐point matrix probability functions which give the probability of finding all n points in the matrix phase. Certain bounds and limiting values of the Sn follow: the geometrical interpretation of the Sn and their relationship with n‐point correlation functions associated with fluctuating bulk properties is also noted. For a bed or suspension of spheres in a uniform matrix we derive a new hierarchy of equations, giving the Sn in terms of the s‐body distribution functions ρs associated with a statistically inhomogeneous distribution PN of spheres in the matrix, generalizing expressions of Weissberg and Prager for S2 and S3. It is noted that canonical ensemble of mutually impenetrable spheres a...

280 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that, under the assumption of total consistency, the only distribution of pairwise-comparison judgments which yields a Dirichlet distribution for the derived priority vector is the gamma distribution.

Journal ArticleDOI
TL;DR: In this paper, the authors present elements of a frequentist theory of statistics for concepts of upper and lower (interval-valued) probability (IVP), defined on finite event algebras.
Abstract: We present elements of a frequentist theory of statistics for concepts of upper and lower (interval-valued) probability (IVP), defined on finite event algebras. We consider IID models for unlinked repetitions of experiments described by IVP and suggest several generalizations of standard notions of independence, asymptotic certainty and estimability. Instability of relative frequencies is favoured under our IID models. Moreover, generalizations of Bernoulli's Theorem give some justification for the estimation of an underlying IVP mechanism from fluctuations of relative frequencies. Our results indicate that an objectivist, frequency- or propensity-oriented, view of probability does not necessitate an additive probability concept, and that IVP models can represent a type of indeterminacy not captured by additive probability.

Journal ArticleDOI
TL;DR: In this article, a functional probability distribution on the set of trajectories which are obtained as output of the continual observation is constructed in the form of a Feynman integral, with connections with the theory of dynamical semi-groups.
Abstract: Starting from the idea of generalized observables, related to effect-valued measures, as introduced by Ludwig, some examples of continual observations in quantum mechanics are discussed. A functional probability distribution, on the set of the trajectories which are obtained as output of the continual observation, is constructed in the form of a Feynman integral. Interesting connections with the theory of dynamical semi-groups are pointed out. The examples refer to small systems, but they are interesting for the light they may shed on the problem of the connections between the quantum and the macroscopic levels of description for a large body; the idea of continuous trajectories indeed seems to be essential for the macroscopic level of description.

Journal ArticleDOI
TL;DR: In this article, the authors introduce the notion of a field of all possible worlds and a set of all worlds in which an event takes place, which is defined as a function assigning numerical values (probabilities) to events.
Abstract: The basic concept underlying probability theory and statistics is a function assigning numerical values (probabilities) to events. An “event” in this context is any conceivable state of affairs including the so-called “empty event”—an a priori impossible state. Informally, events are described in everyday language (e.g. “by playing this strategy I shall win $1000 before going broke”). But in the current mathematical framework (first proposed by Kolmogoroff [Ko 1]) they are identified with subsets of some all-inclusive set Q. The family of all events constitutes a field, or σ-field, and the logical connectives ‘and’, ‘or’ and ‘not’ are translated into the set-theoretical operations of intersection, union and complementation. The points of Q can be regarded as possible worlds and an event as the set of all worlds in which it takes place. The concept of a field of sets is wide enough to accommodate all cases and to allow for a general abstract foundation of the theory. On the other hand it does not reflect distinctions that arise out of the linguistic structure which goes into the description of our events. Since events are always described in some language they can be identified with the sentences that describe them and the probability function can be regarded as an assignment of values to sentences. The extensive accumulated knowledge concerning formal languages makes such a project feasible. The study of probability functions defined over the sentences of a rich enough formal language yields interesting insights in more than one direction. Our present approach is not an alternative to the accepted Kolmogoroff axiomatics. In fact, given some formal language L, we can consider a rich enough set, say Q, of models for L (called also in this work “worlds”) and we can associate with every sentence the set of all worlds in Q in which the sentence is true. Thus our probabilities can be considered also as measures over some field of sets.
But the introduction of the language adds mathematical structure and makes for distinctions expressing basic intuitions that cannot be otherwise expressed. As an example we mention here the concept of a random sequence or, more generally, a random world, or a world which is typical to a certain probability distribution.

Book
01 Jan 1982
TL;DR: In this book, the authors cover data analysis and summarization, measures of central tendency and variability, probability, statistical decision analysis, discrete and continuous probability distributions, sampling methods and sampling distributions, statistical inference (estimation and hypothesis testing), simple linear regression and correlation, multiple and curvilinear regression, chi-square tests for independence and goodness-of-fit, analysis of variance, nonparametric tests, and index numbers.
Abstract: Data analysis. Data summarization: frequency distributions. Measures of central tendency and variability. Probability. Introduction to statistical decision analysis. Discrete probability distributions. The normal distribution and other continuous probability distributions. Sampling methods and sampling distributions. Statistical inference: estimation and hypothesis testing. Simple linear regression and correlation. Multiple and curvilinear regression. Chi-square tests for independence and goodness-of-fit. Analysis of variance. Non-parametric tests. Index numbers.

Book
01 Jan 1982
TL;DR: This book surveys probability concepts, discrete and continuous probability distributions, sampling methods and the central limit theorem, estimation and confidence intervals, tests of hypothesis, analysis of variance, regression and correlation, and chi-square applications.
Abstract: Brief Contents: 1. What Is Statistics? 2. Describing Data: Frequency Distributions and Graphic Presentation. 3. Describing Data: Numerical Measures. 4. Describing Data: Displaying and Exploring Data. 5. A Survey of Probability Concepts. 6. Discrete Probability Distributions. 7. Continuous Probability Distributions. 8. Sampling Methods and the Central Limit Theorem. 9. Estimation and Confidence Intervals. 10. One-Sample Tests of Hypothesis. 11. Two-Sample Tests of Hypothesis. 12. Analysis of Variance. 13. Linear Regression and Correlation. 14. Multiple Regression and Correlation Analysis. 15. Chi-Square Applications. Appendixes. Answers to Odd-Numbered Chapter Exercises.

Journal ArticleDOI
TL;DR: A simple model of classical diffusion on a random chain is studied in this paper, where the velocities to the right and to the left are calculated and the exponent is calculated for a simple example.
Abstract: A simple model of classical diffusion on a random chain is studied. The velocities to the right and to the left are calculated. When one changes continuously the probability distribution ρ of the hopping rates, a whole region is found where these two velocities vanish. In this region, the distance R covered by a particle during the time t behaves like R ~ t^x, where x depends continuously on ρ. The exponent x is calculated for a simple example.

Journal ArticleDOI
TL;DR: In this paper, the authors unify a number of results from the probability literature which enable one to prove, under very general conditions, the existence of an invariant distribution and the convergence of the corresponding Markov process.
Abstract: Equilibria in stochastic economic models are often time series which fluctuate in complex ways. But it is sometimes possible to summarize the long run, average characteristics of these fluctuations. For example, if the law of motion determined by economic interactions is Markovian and if the equilibrium time series converges in a specific probabilistic sense then the long run behavior is completely determined by an invariant probability distribution. This paper develops and unifies a number of results found in the probability literature which enable one to prove, under very general conditions, the existence of an invariant distribution and the convergence of the corresponding Markov process. VIRTUALLY ALL OF ECONOMIC THEORY focuses upon the study of economic equilibrium. This concept has recently undergone several subtle elaborations. No longer must a system of markets in equilibrium be thought of as one at rest in a static steady state. Instead there is a growing body of literature (e.g., [4, 5, 12, 16, 20, 21]) which defines equilibrium as a stochastic process of market clearing prices and quantities which is consistent with the self-interested behavior of economic agents. Needless to say equilibrium stochastic processes can be very complex time series which fluctuate in irregular ways. For theoretical and econometric purposes it is useful to have a convenient way of summarizing the "average" behavior of such processes over time. This paper draws together and unifies a number of fundamental results from the probability literature which enable one to do this for discrete time, Markov processes on general state spaces. The starting point of the analysis is a set S of economic states (e.g., prices and/or quantities). The only technical restriction placed upon S is that it be a Borel subset of a complete, separable metric space. The second datum is a transition probability P(s, ·) on S.
The number P(s, A) records the probability that the economic system moves from the state s to some state in the Borel subset A of S during one unit of elapsed time. In economic applications the transition probability is usually derived from hypotheses about market clearing and the maximizing behavior of economic agents. The transition probability (together with an initial probability measure on S) defines a discrete time Markov process. One way of summarizing the dynamic behavior implied by P is to look for an invariant probability. A probability measure λ on S is invariant for P if for all Borel subsets A of S one has the equality ∫ P(s, A) λ(ds) = λ(A). An invariant probability is a kind of probabilistic steady state for the dynamics defined by P. Of course there may be no invariant probability for P at all; and even if one exists it may convey no information about the average behavior of the process over time except under very special initial conditions. There is a second way of summarizing the behavior of Markov processes defined by the transition probability P. Let P^n(s, A) denote the n-step transition
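On a finite state space the invariant probability is easy to exhibit: iterating the distribution forward under the transition matrix converges to a vector that is left unchanged by it. A minimal sketch with an arbitrary illustrative 3-state chain:

```python
import numpy as np

# An irreducible, aperiodic transition matrix on 3 states; each row sums to 1.
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

pi = np.array([1.0, 0.0, 0.0])      # arbitrary initial distribution
for _ in range(500):
    pi = pi @ P                     # forward iteration: pi_{k+1} = pi_k P

print(pi)                           # the invariant distribution
```

The limit satisfies the invariance equation pi = pi P, the finite-state analogue of ∫ P(s, A) λ(ds) = λ(A); the paper's contribution is conditions under which this picture extends to general Borel state spaces.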

Journal ArticleDOI
TL;DR: In this article, a three-parameter normal tail approximation to a non-normal distribution function is proposed, where the distribution function, the probability density function and its derivative are matched at the approximation point with the approximating function.

Journal ArticleDOI
01 Mar 1982
TL;DR: In this paper, the authors examined the properties and applications of a point process that arises when each event of a primary Poisson process generates a random number of subsidiary events, with a given time course.
Abstract: Multiplication effects in point processes are important in a number of areas of electrical engineering and physics. We examine the properties and applications of a point process that arises when each event of a primary Poisson process generates a random number of subsidiary events, with a given time course. The multiplication factor is assumed to obey the Poisson probability law, and the dynamics of the time delay are associated with a linear filter of arbitrary impulse response function; special attention is devoted to the rectangular and exponential cases. The process turns out to be a doubly stochastic Poisson point process whose stochastic rate is shot noise; it has application in pulse, particle, and photon detection. Explicit results are obtained for the single and multifold counting statistics (distribution of the number of events registered in a fixed counting time), the time statistics (forward recurrence time and interevent probability densities), and the power spectrum (noise properties). These statistics can provide substantial insight into the underlying physical mechanisms generating the process. An example of the applicability of the model is provided by cathodoluminescence (the phenomenon responsible for the television image) where a beam of electrons (the primary process) impinges on a phosphor, generating a shower of visible photons (the secondary process). Each electron produces a random number of photons whose emission times are determined by the (possibly random) lifetime of the phosphor, so that multiplication effects and time delay both come into play. We use our formulation to obtain the forward-recurrence-time probability density for cathodoluminescence in YVO4:Eu3+, the excess cathodoluminescence noise in ZnS:Ag, and the counting distribution for radioluminescence photons produced in a glass photomultiplier tube. Agreement with experimental data is very good in all cases.
A variety of other applications and extensions of the model are considered.
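A stripped-down sketch of the multiplication effect (ignoring the time-delay filter, so each primary's secondaries are counted in the same window; parameter values are illustrative): cascading a Poisson number of secondaries on each primary Poisson event produces counting statistics that are overdispersed relative to a plain Poisson process.

```python
import numpy as np

rng = np.random.default_rng(4)

# Each counting window contains a Poisson number of primaries; each primary
# generates a Poisson(mult_mean) number of secondaries.
def secondary_counts(rate=5.0, mult_mean=10.0, n_windows=20_000):
    primaries = rng.poisson(rate, n_windows)
    return rng.poisson(mult_mean * primaries)   # Poisson multiplication

c = secondary_counts()
fano = c.var() / c.mean()       # Fano factor: 1 for a plain Poisson process
print(c.mean(), fano)
```

The mean count is rate*mult_mean = 50, but the Fano factor is about 1 + mult_mean = 11, the excess noise signature of the cascaded process.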

Journal ArticleDOI
TL;DR: In this article, a method for selecting the member of a collection of families of distributions that best fit a set of observations is given, which is essentially the value of the density function of a scale transformation maximal invariant.
Abstract: A method is given for selecting the member of a collection of families of distributions that best fits a set of observations. This method requires a noncensored set of observations. The families considered include the exponential, gamma, Weibull, and lognormal. A selection statistic is proposed that is essentially the value of the density function of a scale transformation maximal invariant. Some properties of the selection procedures based on these statistics are stated, and results of a simulation study are reported. A set of time-to-failure data from a textile experiment is used as an example to illustrate the procedure, which is implemented by a computer program.
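A simplified stand-in for the selection idea (plain maximized log-likelihood comparison with closed-form MLEs for two of the candidate families; the paper's actual statistic is the density of a scale-transformation maximal invariant, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(5)

def select_family(x):
    """Pick exponential vs. lognormal by maximized log-likelihood."""
    n = len(x)
    # Exponential MLE: rate = 1/mean; loglik at MLE is n*log(rate) - n.
    ll_exp = n * np.log(1.0 / x.mean()) - n
    # Lognormal MLEs from the log-data.
    lx = np.log(x)
    s2 = lx.var()
    ll_ln = -0.5 * n * np.log(2 * np.pi * s2) - lx.sum() - 0.5 * n
    return "exponential" if ll_exp > ll_ln else "lognormal"

data = rng.exponential(2.0, 2000)    # simulated time-to-failure observations
choice = select_family(data)
print(choice)
```

With a large exponential sample the exponential family wins the comparison, as one would hope of any sensible selection statistic.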

Journal ArticleDOI
TL;DR: In this paper, an invariant probability distribution for a class of birth-and-death processes on the integers with phases and one or two boundaries was found by solving a non-linear matrix equation and then finding a probability distribution on the boundary states.
Abstract: The invariant probability distribution is found for a class of birth-and-death processes on the integers with phases and one or two boundaries. The invariant vector has a matrix geometric form and is found by solving a non-linear matrix equation and then finding an invariant probability distribution on the boundary states. Levy's concept of watching a Markov process in a subset is used to naturally decouple the computation of distributions on the boundary and interior states.

Journal ArticleDOI
TL;DR: In this article, a connection between multivariate total positivity (TP2) and multivariate monotone likelihood ratio (MLR) for probability measures on Rn is made.
Abstract: Karlin and Rinott (1980) introduced and investigated concepts of multivariate total positivity (TP2) and multivariate monotone likelihood ratio (MLR) for probability measures on R^n. These TP and MLR concepts are intimately related to supermodularity as discussed in Topkis (1968), (1978) and the FKG inequality of Fortuin, Kasteleyn and Ginibre (1971). This note points out connections between these concepts and uniform conditional stochastic order (ucso) as defined in Whitt (1980). ucso holds for two probability distributions if there is ordinary stochastic order for the corresponding conditional probability distributions obtained by conditioning on subsets from a specified class. The appropriate subsets to condition on for ucso appear to be the sublattices of R^n. Then MLR implies ucso, with the two orderings being equivalent when at least one of the probability measures is TP2.
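A numerical check of the one-dimensional special case of these orderings (the pmfs below are arbitrary illustrative examples): when the likelihood ratio q/p is increasing, q stochastically dominates p, i.e. its CDF lies below p's everywhere.

```python
import numpy as np

# Two pmfs on {0,...,5} with a monotone likelihood ratio: q is p tilted by
# increasing weights 1.5**k, then renormalized.
k = np.arange(6)
p = np.array([0.3, 0.25, 0.2, 0.15, 0.07, 0.03])
q = p * 1.5 ** k
q = q / q.sum()

ratio = q / p                  # should be increasing in k (MLR)
cdf_p = np.cumsum(p)
cdf_q = np.cumsum(q)           # should sit below cdf_p (stochastic order)
print(ratio, cdf_q - cdf_p)
```

The increasing ratio and the pointwise CDF dominance illustrate "MLR implies stochastic order"; the note's content is the multivariate, conditional version of this implication.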

Journal ArticleDOI
TL;DR: In this paper, a parameterization scheme for partial cloudiness is proposed to be used in the framework of higher-order models of the turbulent planetary boundary layer, based on the assumption that the total moisture and temperature fluctuations follow gamma probability density functions, which allow for a variable skewness factor, and therefore for different cloud layer regimes.
Abstract: This paper aims to develop and test a parameterization scheme for partial cloudiness, to be used in the framework of higher-order models of the turbulent planetary boundary layer. The proposed scheme is designed to be general enough and fairly accurate, although slightly at the expense of simplicity. It is based upon the assumption that the total moisture and temperature fluctuations follow gamma probability density functions, which allow for a variable skewness factor, and therefore for different cloud layer regimes. It is nevertheless believed that simpler parameterizations can be used in a number of ways, depending upon specific uses.
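A Monte Carlo sketch of the diagnostic idea (the gamma shape/scale values and the saturation threshold are illustrative, not taken from the paper): if total moisture in a grid box follows a gamma distribution, fractional cloudiness can be diagnosed as the probability that moisture exceeds the saturation value.

```python
import numpy as np

rng = np.random.default_rng(6)

# Gamma-distributed total-moisture fluctuations: mean 8, positively skewed.
shape, scale = 4.0, 2.0
q = rng.gamma(shape, scale, 100_000)

def cloud_fraction(q_sat):
    """Fraction of the grid box where moisture exceeds saturation."""
    return np.mean(q > q_sat)

f_low, f_high = cloud_fraction(6.0), cloud_fraction(14.0)
print(f_low, f_high)
```

Raising the saturation threshold lowers the diagnosed cloud fraction, and the gamma's adjustable skewness is what lets the scheme represent different cloud-layer regimes.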

Journal ArticleDOI
TL;DR: The authors conclude that non-Gaussian statistics are necessary in the science of science and other social sciences, because the sample moments appear to depend significantly on the sample size.
Abstract: Stationary distributions, i.e. distributions involving no time dependence, are considered. It is shown that all these distributions in scientometrics can be approximated by the Zipf distribution at high values of variables. The sample moments appear to depend significantly on the sample size. Accordingly, the approximation of these observational data by probability distributions converging to a stable distribution different from the normal one proves to be the only correct approximation. The conclusion is formulated that the use of non-Gaussian statistics is necessary in the science of science and other social sciences.
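The sample-size dependence of the moments can be seen deterministically for a Zipf-type pmf p(k) proportional to k^(-2) (an illustrative exponent): the mean diverges, so computing it over an ever larger support grows roughly like log K, and any empirical moment depends on how much of the tail the sample has seen.

```python
import numpy as np

def zipf_mean(K):
    """Mean of the pmf p(k) ∝ k**-2 truncated at support {1,...,K}."""
    k = np.arange(1, K + 1, dtype=float)
    w = k ** -2.0
    return np.sum(k * w) / np.sum(w)

m_small, m_large = zipf_mean(100), zipf_mean(100_000)
print(m_small, m_large)
```

The "mean" more than doubles as the support grows by three orders of magnitude, which is exactly why Gaussian (finite-variance) statistics mislead for such distributions.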

Journal ArticleDOI
TL;DR: In this paper, the upper bound of the probability distribution for the strength of a fibrous material is derived based on the occurrence of two or more adjacent broken fiber elements in a bundle.
Abstract: The focus of this paper is on obtaining a conservative but tight bound on the probability distribution for the strength of a fibrous material. The model is the chain-of-bundles probability model, and local load sharing is assumed for the fiber elements in each bundle. The bound is based upon the occurrence of two or more adjacent broken fiber elements in a bundle. This event is necessary but not sufficient for failure of the material. The bound is far superior to a simple weakest link bound based upon the failure of the weakest fiber element. For large materials, the upper bound is a Weibull distribution, which is consistent with experimental observations. The upper bound is always conservative, but its tightness depends upon the variability in fiber element strength and the volume of the material. In cases where the volume of material and the variability in fiber strength are both small, the bound is believed to be virtually the same as the true distribution function for material strength. Regarding edge effects on composite strength, only when the number of fibers is very small is a correction necessary to reflect the load-sharing irregularities at the edges of the bundle.
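The paper's tight two-adjacent-breaks bound is more involved, but the simple weakest-link bound it improves upon, and the reason the Weibull form emerges for large materials, can be sketched directly: a chain of N independent Weibull links is again Weibull, with the scale reduced by N^(-1/shape) (the classical size effect). The parameters below are illustrative only:

```python
import math

def weibull_sf(x, shape, scale):
    """Weibull survival function P(X > x)."""
    return math.exp(-((x / scale) ** shape))

def chain_sf(x, n_links, shape, scale):
    """Weakest-link survival of a chain of n independent links:
    the chain survives only if every link does."""
    return weibull_sf(x, shape, scale) ** n_links

# A chain of n Weibull links is itself Weibull, with the scale
# rescaled by n**(-1/shape) -- the size effect on strength.
n, a, b = 1000, 5.0, 1.0
x = 0.4
print(chain_sf(x, n, a, b), weibull_sf(x, a, b * n ** (-1 / a)))
```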

Journal ArticleDOI
TL;DR: A seven-parameter family of bivariate probability distributions is developed which allows for any gamma marginal distributions, any associated correlation (positive or negative), and a range of regression curves.
Abstract: A seven-parameter family of bivariate probability distributions is developed which allows for any gamma marginal distributions, any associated correlation (positive or negative), and a range of regression curves. The form of the family, which relies on the reproducibility property of the gamma distribution, is motivated by the search for tractable parameter estimation, general dependency structure, and straightforward computer sampling for simulation modeling. A modification with closed-form parameter estimation, but less general dependency structure, is also given. Finally, the use of these distributions in the form of first order autoregressive time series is discussed.

Journal ArticleDOI
TL;DR: In this paper, the diffusion coefficient of a particle performing a random walk on a lattice with random jump rates was investigated and it was shown that the mean square displacement of the particle is a linear function of time for all times when the initial probability distribution corresponds to a stationary distribution.
Abstract: We have investigated the diffusion coefficient of a particle performing a random walk on a lattice with random jump rates. The mean-square displacement of the particle is a linear function of time for all times when the initial probability distribution corresponds to a stationary distribution. We discuss the implication of this result on the description of diffusion in disordered media by averaged equations.
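One standard illustration of why naive averaged equations can mislead here (the classical 1-D random-barrier result, offered as background rather than as this paper's model): the long-time diffusion coefficient is governed by the harmonic mean of the jump rates, whereas naive averaging over disorder would use the larger arithmetic mean.

```python
import random

random.seed(7)

# Random jump rates on a 1-D lattice (log-uniform on [0.1, 10], illustrative).
rates = [10 ** random.uniform(-1, 1) for _ in range(10000)]

arithmetic = sum(rates) / len(rates)
harmonic = len(rates) / sum(1 / w for w in rates)

# In 1-D the effective diffusion coefficient scales with the harmonic
# mean of the rates (slow bonds dominate); the arithmetic mean of a
# naively averaged equation overestimates it.
print(arithmetic, harmonic)
```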

Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of finding maximum likelihood estimates of stochastically ordered survival functions for the cases of one survival function being fixed in advance and estimating two survival functions when the data include censored observations.
Abstract: Many times, aspects of populations exist that must satisfy a stochastic ordering requirement. Nevertheless, estimates may not bear out this stochastic ordering because of the inherent variability of the observations. This article will consider the problem of finding maximum likelihood estimates of stochastically ordered survival functions for the cases (a) one survival function being fixed in advance and (b) estimating two survival functions when the data include censored observations. A numerical example is discussed in detail to illustrate the solution to this problem.
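The paper's constrained MLE handles censoring with a specialized algorithm; purely as a hedged illustration (a naive pointwise repair, not the authors' estimator), the sketch below shows how unconstrained empirical survival curves can violate the required ordering S1 ≥ S2, and one simple way to restore it by averaging at the crossings:

```python
def empirical_survival(sample, grid):
    """S(t) = fraction of observations strictly exceeding t."""
    n = len(sample)
    return [sum(x > t for x in sample) / n for t in grid]

def enforce_order(s1, s2):
    """Naive pointwise repair of the constraint S1 >= S2:
    wherever the estimates cross, replace both by their average."""
    out1, out2 = [], []
    for a, b in zip(s1, s2):
        if a < b:
            a = b = (a + b) / 2
        out1.append(a)
        out2.append(b)
    return out1, out2

grid = [0, 1, 2, 3]
s1 = empirical_survival([1, 2, 4, 5], grid)   # required to dominate
s2 = empirical_survival([2, 3, 3, 6], grid)   # crosses above s1 at t = 1, 2
t1, t2 = enforce_order(s1, s2)
print(t1, t2)
```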

Journal ArticleDOI
TL;DR: In this paper, a lightness scale is derived from a theoretical estimate of the probability distribution of image intensities for natural scenes, which is a scale similar to that used in photography or by the nervous system.
Abstract: A lightness scale is derived from a theoretical estimate of the probability distribution of image intensities for natural scenes. The derived image intensity distribution considers three factors: reflectance; surface orientation and illumination; and surface texture (or roughness). The convolution of the effects of these three factors yields the theoretical probability distribution of image intensities. A useful lightness scale should be the integral of this probability density function, for then equal intervals along the scale are equally probable and carry equal information. The result is a scale similar to that used in photography or by the nervous system as its transfer function.
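The key step — taking the scale to be the integral of the intensity density, so that equal scale intervals are equally probable — is the same operation as histogram equalization. A minimal sketch with a hypothetical discretized density (the bin probabilities below are invented for illustration):

```python
def lightness_scale(pdf):
    """Cumulative integral of an already-normalized discrete intensity
    density: equal steps along the resulting scale are equally probable,
    as in histogram equalization."""
    total, scale = 0.0, []
    for p in pdf:
        total += p
        scale.append(total)
    return scale

# Hypothetical skewed intensity density over 5 equal-width bins.
pdf = [0.4, 0.3, 0.15, 0.1, 0.05]
print(lightness_scale(pdf))   # ≈ [0.4, 0.7, 0.85, 0.95, 1.0]
```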