
Showing papers on "Poisson distribution published in 1987"


Journal ArticleDOI
27 Feb 1987-Science
TL;DR: A novel foam structure is presented, which exhibits a negative Poisson's ratio, and such a material expands laterally when stretched, in contrast to ordinary materials.
Abstract: A novel foam structure is presented, which exhibits a negative Poisson's ratio. Such a material expands laterally when stretched, in contrast to ordinary materials.

2,871 citations


Book
01 Sep 1987
TL;DR: This book discusses Random Variables and Their Probability Distribution, Quality Control Charts and Acceptance Sampling, and Hypothesis Tests for Independence and Goodness-of-Fit.
Abstract: Diagrams and Tables. Measures of Location. Measures of Dispersion and Skewness. Basic Ideas of Probability. Random Variables and Their Probability Distribution. Some Standard Discrete and Continuous Probability Distributions. Approximations to the Binomial and Poisson Distributions. Linear Functions of Random Variables and Joint Distributions. Sample Populations and Point Estimation. Interval Estimation. Hypothesis Tests for the Mean and Variance of Normal Distributions. Hypothesis Tests for the Binomial Parameter p. Hypothesis Tests for Independence and Goodness-of-Fit. Non-Parametric Hypothesis Tests. Correlation. Regression. Elements of Experimental Design and Analysis. Quality Control Charts and Acceptance Sampling.

667 citations


Journal ArticleDOI
TL;DR: In this paper, the authors focus on situations where individuals can experience repeated events, and data on an individual consist of the number and occurrence times of events, along with concomitant variables.
Abstract: This article is directed toward situations where individuals can experience repeated events, and data on an individual consist of the number and occurrence times of events, along with concomitant variables. Methods of regression analysis are presented, based on Poisson process and proportional intensity assumptions. These include parametric and semi-parametric approaches to model fitting, model assessment, and the treatment of random effects. In addition, insight is gained as to the central role of Poisson and mixed Poisson regression analysis of counts in these methods, and as to the effects of unobserved heterogeneity on semi-parametric analyses. The methods in the article are based on the proportional intensity Poisson process model, for which an individual with given fixed covariate vector x has repeated events occur according to a nonhomogeneous Poisson process with intensity function λx(t) = λ0(t)exp(x′β). Estimation of β and the baseline intensity λ0(t) are considered when λ0(t) is specifi...

342 citations


Journal ArticleDOI
TL;DR: In this article, a model for estimating an origin-destination matrix from an observed sample matrix, when the volumes on a subset of the links of the network and/or the total productions and attractions of the zones are known, is described.
Abstract: We describe a model for estimating an origin-destination matrix from an observed sample matrix, when the volumes on a subset of the links of the network and/or the total productions and attractions of the zones are known. The elements of the observed sample matrix are assumed to be integers that are obtained from independent Poisson distributions with unknown means. A maximum likelihood model is formulated to estimate these means, yielding an estimation of the “true” origin- destination matrix which is consistent with the observed link volumes. Conditions for existence and uniqueness of a solution are discussed. A solution algorithm based on the cyclic coordinate descent method is developed and its convergence properties are analyzed. The special case of the matrix estimation problem, in which marginal totals are given instead of link volumes, is considered separately; a numerical example is used to illustrate the problem. Using results about the asymptotic behavior of the distribution of the likelihood function, tests may be derived that allow statistical inferences on the consistency of the available data. Finally, an extension of the model is studied in which the observed volumes are Poisson-distributed as well.

310 citations


Journal ArticleDOI
TL;DR: In this article, the intervals between events are modeled as iid exponential(λi), or the counts as Poisson(λi ti), for the ith item, and each individual rate parameter, λi, is presumed drawn from a fixed (super) population with density gλ(·; θ), θ being a vector parameter.
Abstract: A collection of I similar items generates point event histories; for example, machines experience failures or operators make mistakes. Suppose the intervals between events are modeled as iid exponential(λi), or the counts as Poisson(λi ti), for the ith item. Furthermore, so as to represent between-item variability, each individual rate parameter, λi, is presumed drawn from a fixed (super) population with density gλ(·; θ), θ being a vector parameter: a parametric empirical Bayes (PEB) setup. For gλ, specified alternatively as log-Student t(n) or gamma, we exhibit the results of numerical procedures for estimating superpopulation parameters θ and for describing pooled estimates of the individual rates, λi, obtained via Bayes's formula. Three data sets are analyzed, and convenient explicit approximate formulas are furnished for λi estimates. In the Student-t case, the individual estimates are seen to have a robust quality.
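In the gamma-superpopulation case the pooled Bayes estimates have a simple closed form. The sketch below (plain Python with made-up counts and exposure times; the gamma parameters a and b are assumed known here, whereas the PEB approach would estimate them from the data) illustrates the shrinkage of raw rates toward the superpopulation mean:

```python
def eb_pooled_rate(n_i, t_i, a, b):
    """Posterior-mean rate for item i under a gamma(a, b) superpopulation:
    lambda_i | data ~ gamma(a + n_i, b + t_i), so the pooled estimate is
    (a + n_i) / (b + t_i) -- the raw rate n_i/t_i shrunk toward the prior
    mean a/b."""
    return (a + n_i) / (b + t_i)

# Hypothetical failure counts and exposure times for three machines.
counts = [0, 3, 10]
times = [100.0, 100.0, 100.0]
a, b = 2.0, 50.0                      # assumed superpopulation parameters
raw = [n / t for n, t in zip(counts, times)]
pooled = [eb_pooled_rate(n, t, a, b) for n, t in zip(counts, times)]
# The item with zero observed events still gets a nonzero pooled rate,
# pulled toward the prior mean a/b = 0.04.
```

The same shrinkage behaviour, though without a closed form, is what the paper's numerical procedures produce for the log-Student t superpopulation.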

134 citations


Journal ArticleDOI
TL;DR: In this paper, an iterative statistical procedure is developed for fitting Markov-modulated Poisson processes (MMPP) having two arrival rates to observational data, motivated by maximum likelihood estimation.

110 citations


Journal ArticleDOI
TL;DR: An approximation is given that is an asymptotic upper bound, easy to compute, and, for the purposes of hypothesis testing, more accurate than other approximations presented in the literature.
Abstract: The scan statistic evaluates whether an apparent cluster of disease in time is due to chance. The statistic employs a 'moving window' of length w and finds the maximum number of cases revealed through the window as it scans or slides over the entire time period T. Computation of the probability of observing a certain size cluster, under the hypothesis of a uniform distribution, is infeasible when N, the total number of events, is large, and w is of moderate or small size relative to T. We give an approximation that is an asymptotic upper bound, easy to compute, and, for the purposes of hypothesis testing, more accurate than other approximations presented in the literature. The approximation applies both when N is fixed, and when N has a Poisson distribution. We illustrate the procedure on a data set of trisomic spontaneous abortions observed in a two year period in New York City.
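The "moving window" maximum itself is cheap to compute; only its tail probability is hard. A minimal sketch of the statistic (not of the paper's approximation), using a two-pointer sweep over sorted event times:

```python
def scan_statistic(event_times, w):
    """Maximum number of events visible through any window of length w.
    It suffices to anchor the window at each event time and count the
    events falling in [t, t + w]; a second pointer makes this O(N log N)
    overall (dominated by the sort)."""
    ts = sorted(event_times)
    best = 0
    j = 0
    for i in range(len(ts)):
        # advance j past the last event inside the window starting at ts[i]
        while j < len(ts) and ts[j] <= ts[i] + w:
            j += 1
        best = max(best, j - i)
    return best

# Hypothetical onset times (in years) with an apparent cluster near t = 1.2.
cases = [0.5, 1.1, 1.2, 1.3, 4.0, 7.5, 7.6]
```

Whether a value such as `scan_statistic(cases, 1.0)` is surprising under a uniform (or Poisson) null is exactly what the paper's upper-bound approximation is for.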

84 citations


Journal ArticleDOI
TL;DR: In this article, a reparameterization of the Sichel distribution is proposed and an algorithm for computing the maximum likelihood estimates of the new parameters is given, which can be implemented on a typical desktop microcomputer.
Abstract: The Sichel distribution is a three-parameter compound Poisson distribution. It is a versatile model for highly skewed frequency distributions of observed counts and has proved useful in fields as diverse as mining engineering, linguistics, ecology, industrial psychology, and market research. We propose a reparameterization of the Sichel distribution and give an algorithm, which can be implemented on a typical desktop microcomputer, for computing the maximum likelihood estimates of the new parameters. The reparameterization has a number of advantages over the old. In the important two-parameter special case of the Sichel distribution known as the inverse Gaussian Poisson the new parameters are the population mean and a shape parameter, and their maximum likelihood estimators are asymptotically uncorrelated. The reparameterization also lends itself to the convenient multivariate extension presented here. This distribution is well suited for modeling correlated count data whose marginal distribution...

83 citations


Journal ArticleDOI
TL;DR: In this article, the problem of choosing the base level or truncation level x 0 is addressed, and a graphical procedure for this choice, based on the equality of the mean and variance of the Poisson distribution, is proposed.

72 citations


Journal ArticleDOI
TL;DR: Recommendations are given as to the seriousness of the errors inherent in Wald's equations in relation to all of the other errors that are associated with the sampling process, and the choice between Wald's and Monte Carlo OC and ASN functions to describe the properties of a sampling plan.
Abstract: Equations for the stopping boundaries, and operating characteristic (OC) and average sample number (ASN) functions, of Wald's sequential probability ratio test (SPRT) are presented for the binomial, negative binomial, normal, and Poisson distributions. The effects of errors in Wald's OC and ASN equations due to overshooting the decision boundaries, and errors due to truncating, postponing decisions beyond the first stage, and taking more than one observation at each stage of the decision process are discussed. Monte Carlo procedures are used to show that Wald's equations overestimate the true error probabilities and underestimate the true ASN for a two-decision sampling plan based on the negative binomial distribution. A Monte Carlo procedure for modifying the decision boundaries to yield actual OC and ASN functions approximately equal to the desired ones is presented. Monte Carlo procedures are also used to examine the errors in Wald's OC and ASN functions when used to describe the OC and ASN functions of a composite three-decision sampling plan based on two single SPRT's using the negative binomial distribution. Wald's equations, in general, overestimate the true error probabilities and underestimate the true ASN even more for the three-decision case compared with the two-decision case. Recommendations are given as to the seriousness of the errors inherent in Wald's equations in relation to all of the other errors that are associated with the sampling process, and the choice between Wald's and Monte Carlo OC and ASN functions to describe the properties of a sampling plan.
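For the Poisson case, Wald's stopping boundaries follow directly from the cumulative log-likelihood ratio. A sketch under simple-vs-simple hypotheses on the rate (the error targets alpha and beta are the nominal ones which, per the abstract, Wald's theory only approximates because of boundary overshoot):

```python
import math

def poisson_sprt(counts, lam0, lam1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: rate = lam0 vs H1: rate = lam1 (lam1 > lam0),
    applied to a sequence of per-period Poisson counts.
    Returns ('H0' | 'H1' | 'continue', periods used)."""
    log_a = math.log((1 - beta) / alpha)   # accept-H1 boundary
    log_b = math.log(beta / (1 - alpha))   # accept-H0 boundary
    llr = 0.0
    for n, x in enumerate(counts, start=1):
        # log-likelihood-ratio increment for one Poisson observation
        llr += x * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= log_a:
            return "H1", n
        if llr <= log_b:
            return "H0", n
    return "continue", len(counts)
```

The Monte Carlo corrections discussed in the abstract amount to adjusting `log_a` and `log_b` until the simulated OC and ASN match the desired ones.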

69 citations


Journal ArticleDOI
TL;DR: The definition of quasi-likelihood is extended to accommodate several nested strata of variation in a generalized linear model.
Abstract: The definition of quasi-likelihood is extended to account for several nested strata of variation in a generalized linear model.

Journal ArticleDOI
TL;DR: A single-server queueing system is studied under a control-operating policy in which the server begins service only when the queue size builds up to a preassigned fixed number; the startup time required after each idle period follows a general distribution with finite mean.

Journal ArticleDOI
TL;DR: Upper bounds on the left and right tails of the Poisson distribution are given and these bounds can be easily computed in a numerically stable way, even when thePoisson parameter is large.
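The paper's own bounds are not reproduced here, but a standard Chernoff-style bound illustrates the point about numerical stability: computed entirely in log space, it stays well behaved even for very large Poisson parameters, where the naive pmf terms over- or underflow.

```python
import math

def poisson_tail_log_bound(lam, x):
    """Chernoff-style log upper bound for Poisson tails:
    log P(X >= x) <= x - lam - x*log(x/lam)   for x > lam,
    and the same expression bounds log P(X <= x) for x < lam.
    Everything stays in log space, so there is no overflow for large lam."""
    if x == 0:
        return -lam
    return x - lam - x * math.log(x / lam)

# For lam = 1e6 the naive exp(-lam) * (e*lam/x)**x over/underflows,
# but the log-space bound is tame:
lb = poisson_tail_log_bound(1e6, 1.1e6)   # log P(X >= 1.1e6)
```

For small parameters the bound can be checked directly against the exact tail sum.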

Journal ArticleDOI
TL;DR: In this paper, summer storm rainfall fields are analyzed to yield the first two moments, the spatial correlation and variance functions, and the spatial distribution of total rainfall for each storm, and the spatial Poisson process models of Rodriguez-Iturbe et al. (1986) are evaluated against these observations.
Abstract: Eight years of summer storm rainfall observations from 93 stations in and around the 154 km2 Walnut Gulch catchment of the Agricultural Research Service, U.S. Department of Agriculture, in Arizona are processed to yield the total station depths of 428 storms. Statistical analysis of these random fields yields the first two moments, the spatial correlation and variance functions, and the spatial distribution of total rainfall for each storm. The sample is then split, and half is used to estimate, for each storm day, the distributions of the three parameters of each of the three conceptual spatial Poisson process models proposed previously by Rodriguez-Iturbe et al. (1986). The absolute and relative worth of the three Poisson models are evaluated by comparing their prediction of the spatial distribution of storm rainfall with observations from the second half of the sample. The effect of interstorm parameter variation is examined.

Journal ArticleDOI
TL;DR: Barbour and Hall as discussed by the authors used the Stein-Chen method to obtain the best known upper bounds for the discrepancy between the Poisson distribution and that of a sum of independent 0-1 random variables.
Abstract: Asymptotic expansions for the distributions of sums of independent nonnegative integer random variables in the neighbourhood of the Poisson distribution are derived, together with explicit estimates of the truncation error. Expansions are also derived for the expectations of at most polynomially growing functions of such sums. Applications to the Poisson binomial and Poisson negative binomial approximations are considered. The method used is an adaptation of the Stein-Chen approach. 1. Introduction. In Chen (1975), Stein's method of obtaining error bounds for normal approximations was introduced in the Poisson context, and was used to obtain rates of convergence in total variation to the Poisson distribution for sums of stationary sequences of 0-1 random variables. Chen also observed that Stein's method is in principle suited to developing asymptotic expansions as well as obtaining error estimates, and used it for independent 0-1 summands to derive the second term in such an expansion, together with an estimate of the remaining error: Kerstan (1964) had previously derived a similar result, with a sharper error estimate, by a quite different technique. The Stein-Chen method was refined in Barbour and Hall (1984) to yield the best known upper bounds for the discrepancy between the Poisson distribution and that of a sum of independent 0-1 random variables, as well as complementary lower bounds, and new estimates for the error remaining after the second term in the asymptotic expansion were also established. However, although Stein's method was potentially applicable also to higher-order expansions, it proved in practice too cumbersome to use. In this paper, it is shown how, by means of a simple identity, the Stein-Chen method can be made to yield a full asymptotic expansion with a minimum of difficulty. The error estimates obtained perform well in comparison with other estimates that are available. 
For sums of independent 0-1 random variables, they agree with those of Barbour and Hall, when the series is truncated after one term or after two. For the more general case of sums of independent nonnegative integer valued random variables, the error estimates obtained after two terms in the expansion differ only slightly from those in Barbour and Hall. Comparison with Kerstan's estimates, for integral nonnegative summands, of the error after one term of the expansion is more complicated. By and large, the techniques of this paper give better approximations, except when the summands themselves are almost precisely Poisson distributed. In Section 3, expansions are also obtained for the expectations of at most polynomially growing functions of the
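The first-order Barbour and Hall (1984) bound for sums of independent 0-1 variables is easy to check numerically against the exact total variation distance; the sketch below uses an arbitrary small set of success probabilities:

```python
import math

def poisson_binomial_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += (1 - p) * q      # this trial fails
            new[k + 1] += p * q        # this trial succeeds
        pmf = new
    return pmf

ps = [0.1, 0.2, 0.05, 0.15]            # arbitrary success probabilities
lam = sum(ps)                          # matching Poisson mean
pb = poisson_binomial_pmf(ps)
po = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(len(pb))]

# Total variation distance; Poisson mass beyond len(pb)-1 successes is
# all excess over the (finite-support) Poisson binomial, so it enters
# the distance directly.
tv = 0.5 * sum(abs(a - b) for a, b in zip(pb, po)) + 0.5 * (1.0 - sum(po))
# First-order Stein-Chen bound of Barbour and Hall (1984).
bound = (1 - math.exp(-lam)) / lam * sum(p * p for p in ps)
```

The higher-order expansions of this paper refine `bound` by subtracting explicit correction terms before estimating the remainder.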

Journal ArticleDOI
TL;DR: A procedure of genetic evaluation of reproductive traits such as litter size and survival in a polytocous species under the assumption of polygenic inheritance is described, using the concept of generalized linear models.
Abstract: A procedure of genetic evaluation of reproductive traits such as litter size and survival in a polytocous species under the assumption of polygenic inheritance is described. Conditional distributions of these traits are assumed to be Poisson and Bernoulli, respectively. Using the concept of generalized linear models, logarithmic (litter size) and probit (survival) functions are described as linear combinations of “nuisance” environmental effects and of transmitting abilities of sires or individual breeding values. The liability of survival is expressed conditionally to the logarithm of litter size. Inferences on location parameters are based on the mode of their joint posterior density assuming a prior multivariate normal distribution. A method of estimation of the dispersion parameters is also presented. The use of a “truncated” Poisson distribution is suggested to account for missing records on litter size.
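The "truncated" Poisson suggested at the end is the zero-truncated form, which conditions on at least one offspring (a recorded litter cannot have size zero). A small sketch of its pmf and mean:

```python
import math

def zt_poisson_pmf(lam, k):
    """Zero-truncated Poisson pmf: P(X = k | X >= 1), k = 1, 2, ...
    Obtained by renormalizing the Poisson pmf over the positive integers."""
    if k < 1:
        return 0.0
    return math.exp(-lam) * lam**k / (math.factorial(k) * (1.0 - math.exp(-lam)))

def zt_poisson_mean(lam):
    """Mean of the zero-truncated Poisson: lam / (1 - exp(-lam)),
    always larger than lam since the zero class has been removed."""
    return lam / (1.0 - math.exp(-lam))
```

With this pmf in the likelihood, litters of size zero are treated as structurally unobservable rather than as missing small counts.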

Journal ArticleDOI
TL;DR: In this article, for the case of a portfolio with identical claim amount distributions, Gerber's error bound for the compound Poisson approximation is improved (in the case λ ⩾ 1) and the result can also be applied to more general portfolios by partitioning them into homogeneous subportfolios.
Abstract: Abstract For the case of a portfolio with identical claim amount distributions, Gerber's error bound for the compound Poisson approximation is improved (in the case λ ⩾ 1). The result can also be applied to more general portfolios by partitioning them into homogeneous subportfolios.

Journal ArticleDOI
TL;DR: In this article, the linearization problem for the linear (Lie-Poisson) structure on the dual of a Lie algebra is studied.
Abstract: If g is a Lie algebra (over R), then the dual space g* carries a linear Poisson structure π0, called the Lie-Poisson structure. The linearization problem is studied.

Journal ArticleDOI
Svante Janson1
TL;DR: In this article, a new sufficient condition for convergence to a Poisson distribution of a sequence of sums of dependent variables was given, which allows each summand to depend strongly on a few of the other variables and to depend weakly on the remaining ones.

Journal ArticleDOI
TL;DR: In this article, the authors considered the full-information best choice problem with a random number of observations and investigated the structure of the stopping set and the theoretical solution for the monotone case.

Journal ArticleDOI
TL;DR: In this article, an asymptotic theory for estimating the relative variance (relvariance) of an estimator of a total T by using a model with the form a + b/T is presented.
Abstract: Generalized variance functions (GVF's) are used in a number of sample surveys as a convenient method of publishing sampling errors. The method consists of estimating the relative variance (relvariance) of an estimator of a total T by using a model with the form a + b/T. Using the prediction approach to finite population sampling, some asymptotic theory is presented for estimators of totals that are linear combinations of sample cluster means from stratified, two-stage cluster samples. One choice of GVF estimator is shown to be consistent under a particular class of prediction models. The theory is illustrated by an empirical study in which two-stage stratified samples are selected from a population of households. The prediction model is one in which units within a stratum have a common mean and variance, units in the same cluster are correlated but units in different clusters are not, and in which the common variance is a quadratic function of the common mean in a stratum. Bernoulli and Poisson r...


Journal ArticleDOI
TL;DR: In this article, the authors extended the secretary problem to the case of inhomogeneous Poisson processes with unknown intensity functions, where the intensity function is either known or unknown up to a multiplicative constant.
Abstract: Cowan and Zabczyk (1978) have studied a continuous-time generalization of the so-called secretary problem, where options arise according to a homogeneous Poisson process of known intensity λ. They gave the complete strategy maximizing the probability of accepting the best option under the usual no-recall condition. In this paper, the solution is extended to the case where the intensity λ is unknown, and also to the case of an inhomogeneous Poisson process with intensity function λ(t), which is either supposed to be known or known up to a multiplicative constant.

Journal ArticleDOI
TL;DR: In this article, the authors considered a more general situation where the system failures are distributed according to nonhomogeneous Poisson processes having Power Law intensity functions with gamma distributed intensity parameter.
Abstract: A compound (mixed) Poisson distribution is sometimes used as an alternative to the Poisson distribution for count data. Such a compound distribution, which has a negative binomial form, occurs when the population consists of Poisson distributed individuals, but with intensities which have a gamma distribution. A similar situation can occur with a repairable system when failure intensities of each system are different. A more general situation is considered where the system failures are distributed according to nonhomogeneous Poisson processes having Power Law intensity functions with gamma distributed intensity parameter. If the failures of each system in a population of repairable systems are distributed according to a Power Law process, but with different intensities, then a compound Power Law process provides a suitable model. A test, based on the ratio of the sample variance to the sample mean of count data from s-independent systems, provides a convenient way to determine if a compound model is appropriate. When a compound Power Law model is indicated, the maximum likelihood estimates of the shape parameters of the individual systems can be computed and homogeneity can be tested. If equality of the shape parameters is indicated, then it is possible to test whether the systems are homogeneous Poisson processes versus a nonhomogeneous alternative. If deterioration within systems is suspected, then the alternative in which the shape parameter exceeds unity would be appropriate, while if systems are undergoing reliability growth the alternative would be that the shape parameter is less than unity.
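The variance-to-mean ratio test mentioned in the abstract is simple to sketch; the counts below are made up to show how between-system rate variation inflates the ratio above the Poisson value of 1:

```python
def dispersion_ratio(counts):
    """Sample variance-to-mean ratio for count data. For a homogeneous
    Poisson sample this is near 1; gamma-mixed (negative-binomial-like)
    counts push it well above 1, indicating a compound model."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((x - mean) ** 2 for x in counts) / (n - 1)
    return var / mean

# Hypothetical failure counts from six independent repairable systems
# observed over equal time windows.
homogeneous = [4, 5, 3, 6, 4, 5]   # consistent with one common rate
mixed = [1, 0, 2, 12, 15, 0]       # strong between-system rate variation
```

A ratio far above 1 is the cue, in the paper's procedure, to move from a single Power Law process to the compound Power Law model.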

Journal ArticleDOI
TL;DR: In this paper, three temporal rainfall models are investigated: Poisson rectangular pulse (PRP), Neyman-Scott white noise (NSWN), and Neyman-Scott rectangular pulse (NSRP).
Abstract: Characteristics and moment estimators of temporal rainfall models such as Poisson rectangular pulse (PRP), Neyman-Scott white noise (NSWN), and Neyman-Scott rectangular pulse (NSRP) are investigated. It is shown that PRP and NSWN have a correlation structure like that of an autoregressive moving average (ARMA) (1,1) model whereas the NSRP has a dependence structure like that of an ARMA (2, 2). The admissible regions of lag-1 and lag-2 autocorrelations are derived to demonstrate that in general they are more restricted than their ARMA counterparts. An additional property denoted as variance ratio, which is intimately related to the scale of fluctuation of a process, is defined and used for model comparison. The bias and mean square error properties of the moment estimators are investigated with emphasis on the NSWN model, for which it is suggested that the temporal scale T, defined by βT = 1, provides the most efficient estimators of the parameters. Parameter β defines the arrival of rain bursts relative to the origin of storm systems. All three models are fitted to an extensive data set covering hourly precipitation data at 38 stations in northeastern Colorado. The correlation and variance ratio plots are used to select the appropriate model for each month. However, it is shown that the diurnal periodicity of storm occurrence is predominant during the summer months, which is a characteristic not built into any of the temporal models used here.

Journal Article
TL;DR: It is shown how to estimate the background law below the core edge energy in a way that provides the maximum signal-to-noise ratio, by using a maximum likelihood (ML) estimation technique which provides unbiased and minimum mean square error estimates of all parameters of interest.
Abstract: In quantitative electron energy loss spectrometry, it is desirable to estimate the background law below core edge energy in a way that provides the maximum signal-to-noise ratio. Assuming an inverse power background model and independently Poisson distributed measurements, it is shown how to achieve this goal by using a maximum likelihood (ML) estimation technique which provides unbiased and minimum mean square error estimates of all parameters of interest. An efficient and computationally stable implementation of this procedure is proposed. Standard logarithmic least squares estimations are then compared with the ML approach and the gain in performance due to optimal processing is quantified.



Journal ArticleDOI
TL;DR: The negative binomial, Neyman type A, and Polya-Aeppli distributions are all clustered Poisson distributions, arising when groups of individuals occur at random (i.e., are Poissonian) and individuals within a group have their own distribution as mentioned in this paper.
Abstract: Count data in entomological studies can often be described by some form of contagious distribution, such as the negative binomial, Neyman type A, or Polya-Aeppli. These are all clustered Poisson distributions (their distributional forms are given in Section 2), arising when groups of individuals occur at random (i.e., are Poissonian) and individuals within a group have their own distribution. Whilst Pahl (1969) found strong evidence that the negative binomial distribution gives good fits to his 35 samples of data on the chrysomelid beetle Paropsis atomaria, Martin and Katti (1965) fitted a number of distributions to a variety of insect, plant, and animal data, and observed, for instance, that McGuire, Brindley, and Bancroft's (1957) data on the European cornborer Pyrausta nubilalis and Beall's (1940) data on the beet webworm Loxostege sticticalis are fitted well not only by the negative binomial, but also by the Neyman type A and other distributions. In such circumstances the fits given by different distributions are often very similar and the exact choice is not critical (see Kemp and Kemp, 1965). The recent work of Tripathi, Gurland, and Bhalerao (1986) gives further guidance on the choice of an appropriate distribution when this is not clearly indicated by the ecological context. The negative binomial, Neyman type A, and Polya-Aeppli distributions all have two parameters. When there are many samples of data in one data set, Anscombe (1949) and Bliss and Owen (1958) suggested fitting a model that is parsimonious in parameters, by keeping one of them constant from sample to sample. Their use of a common exponent parameter, k, when fitting families of negative binomials was criticised by Taylor, Woiwod, and Perry (1979), who considered that k is "an unstable parameter whose relationship with aggregation is doubtful." Much empirical evidence has accumulated in support of Taylor's power law

Journal ArticleDOI
TL;DR: In this paper, Stein introduced a new method for bounding the approximation error in central limit theory for dependent variables, which was subsequently developed by Chen for Poisson approximation and has proved very successful in the areas to which it has been applied.