
Showing papers on "Poisson distribution published in 1989"


Journal ArticleDOI
TL;DR: A generalized expectation-maximization (GEM) algorithm is developed for Bayesian reconstruction, based on locally correlated Markov random-field priors in the form of Gibbs functions and on the Poisson data model; the algorithm reduces to the EM maximum-likelihood algorithm as the prior tends towards a uniform distribution.
Abstract: A generalized expectation-maximization (GEM) algorithm is developed for Bayesian reconstruction, based on locally correlated Markov random-field priors in the form of Gibbs functions and on the Poisson data model. For the M-step of the algorithm, a form of coordinate gradient ascent is derived. The algorithm reduces to the EM maximum-likelihood algorithm as the Markov random-field prior tends towards a uniform distribution. Three different Gibbs function priors are examined. Reconstructions of 3-D images obtained from the Poisson model of single-photon-emission computed tomography are presented.

674 citations
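In the limit of a uniform prior the GEM reconstruction above reduces to the classical EM maximum-likelihood (MLEM) update for Poisson data, which is easy to sketch. The system matrix, problem sizes, and iteration count below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy emission problem: 12 detector bins, 8 source pixels, random but known
# system matrix A (all values hypothetical).
A = rng.uniform(0.1, 1.0, size=(12, 8))
x_true = rng.uniform(1.0, 5.0, size=8)
y = rng.poisson(A @ x_true)      # Poisson-distributed projection counts

# MLEM update, the maximum-likelihood limit of the GEM algorithm:
#   x_j <- (x_j / s_j) * sum_i A_ij * y_i / (A x)_i,   s_j = sum_i A_ij
x = np.ones(8)
s = A.sum(axis=0)
for _ in range(500):
    x *= (A.T @ (y / np.maximum(A @ x, 1e-12))) / s

# The update preserves nonnegativity, and after each step the fitted
# counts match the observed counts in total.
print(x.round(2))
```

The multiplicative form is what makes a nonnegativity-preserving M-step possible; the paper's GEM variant adds the Gibbs prior term to this update.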


Journal ArticleDOI
TL;DR: In this paper, the authors present Chen's results in a form that is easy to use and give a multivariable extension, which yields an upper bound on the total variation distance between a sequence of dependent indicator functions and a Poisson process with the same intensity.
Abstract: Convergence to the Poisson distribution, for the number of occurrences of dependent events, can often be established by computing only first and second moments, but not higher ones. This remarkable result is due to Chen (1975). The method also provides an upper bound on the total variation distance to the Poisson distribution, and succeeds in cases where third and higher moments blow up. This paper presents Chen's results in a form that is easy to use and gives a multivariable extension, which gives an upper bound on the total variation distance between a sequence of dependent indicator functions and a Poisson process with the same intensity. A corollary of this is an upper bound on the total variation distance between a sequence of dependent indicator variables and the process having the same marginals but independent coordinates.

522 citations


Journal ArticleDOI
TL;DR: In this article, a microporous, anisotropic form of expanded polytetrafluoroethylene has been found to have a large negative major Poisson's ratio.
Abstract: A microporous, anisotropic form of expanded polytetrafluoroethylene has been found to have a large negative major Poisson's ratio. The value of Poisson's ratio varies with tensile strain and can attain values as large as -12. The microporous structure of the material is described and the mechanisms that lead to this large negative Poisson's ratio are identified. Micro-rotational degrees of freedom are observed, suggesting that a micropolar elasticity theory may be required to describe the mechanical properties.

424 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed tests for detecting extra-Poisson variation in counting data, which can be obtained as score tests against arbitrary mixed Poisson alternatives and are generalizations of tests of Fisher (1950) and Collings and Margolin (1985).
Abstract: Poisson regression models are widely used in analyzing count data. This article develops tests for detecting extra-Poisson variation in such situations. The tests can be obtained as score tests against arbitrary mixed Poisson alternatives and are generalizations of tests of Fisher (1950) and Collings and Margolin (1985). Accurate approximations for computing significance levels are given, and the power of the tests against negative binomial alternatives is compared with those of the Pearson and deviance statistics. One way to test for extra-Poisson variation is to fit models that parametrically incorporate and then test for the absence of such variation within the models; for example, negative binomial models can be used in this way (Cameron and Trivedi 1986; Lawless 1987a). The tests in this article require only the Poisson model to be fitted. Two test statistics are developed that are motivated partly by a desire to have good distributional approximations for computing significance levels. Simu...

335 citations
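In the simplest intercept-only case such a score test reduces to a standardized comparison of the sample variance with the sample mean. The statistic below (a Dean/Lawless-style form with the mean estimated by the sample average) is one common variant, shown here as a sketch rather than as the paper's exact test:

```python
import numpy as np

rng = np.random.default_rng(1)

def overdispersion_score(y):
    """Standardized score statistic for extra-Poisson variation in an i.i.d.
    sample (intercept-only model, mu estimated by the sample mean); it is
    approximately N(0, 1) under the Poisson null, with large positive
    values indicating overdispersion."""
    y = np.asarray(y, dtype=float)
    mu = y.mean()
    return ((y - mu) ** 2 - y).sum() / np.sqrt(2.0 * len(y) * mu ** 2)

poisson_sample = rng.poisson(5.0, size=2000)
# Negative binomial with the same mean (5) but variance 17.5:
overdispersed = rng.negative_binomial(n=2, p=2.0 / 7.0, size=2000)

print(overdispersion_score(poisson_sample))   # should be near 0
print(overdispersion_score(overdispersed))    # should be large and positive
```

Only the Poisson fit is needed to compute the statistic, which is the practical point the article makes: no negative binomial model has to be estimated.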


Journal ArticleDOI
Martin I. Reiman1, Alan Weiss1
TL;DR: In this article, a simple method of estimating the sensitivity of quantities obtained from simulation with respect to a class of parameters is presented, where sensitivity means the derivative of an expectation with respect to a parameter.
Abstract: We present a simple method of estimating the sensitivity of quantities obtained from simulation with respect to a class of parameters. Here sensitivity means the derivative of an expectation with respect to a parameter. The class of parameters includes, for example, Poisson rates, discrete probabilities, and the mean and variance of a normal distribution. The method is extremely well suited to regenerative simulation, and can be implemented on extant simulations with little effort, increase in running time, or memory requirements. It is based on some change-of-measure formulas derived from likelihood ratios. In addition to the theorems that underlie the technique, we present some numerical examples.

261 citations
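The likelihood-ratio idea can be sketched for a Poisson rate: differentiating the log-pmf gives the score factor N/λ − 1, and multiplying the simulation output by it yields an unbiased estimate of the derivative of the expectation. The test function f(n) = n² below is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n_rep = 2.0, 400_000

# Likelihood-ratio (score) estimator of d/dlam E[f(N)] for N ~ Poisson(lam):
# multiply the output f(N) by the score (N/lam - 1), the derivative of the
# log Poisson pmf with respect to lam.
N = rng.poisson(lam, size=n_rep)
f = N.astype(float) ** 2                     # f(n) = n^2 (illustrative choice)
deriv_est = (f * (N / lam - 1.0)).mean()

# E[N^2] = lam + lam^2, so the true derivative is 1 + 2*lam = 5 at lam = 2.
print(deriv_est)
```

No second simulation at a perturbed rate is needed, which is why the method adds almost nothing to running time.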


Journal ArticleDOI
TL;DR: In this article, the problem of variance specification in models for event count data is discussed, along with several generalizations of the Poisson regression model, presented in King (1988), that allow for substantively interesting stochastic processes that do not fit into the Poisson framework.
Abstract: This paper discusses the problem of variance specification in models for event count data. Event counts are dependent variables that can take on only nonnegative integer values, such as the number of wars or coups d'etat in a year. I discuss several generalizations of the Poisson regression model, presented in King (1988), to allow for substantively interesting stochastic processes that do not fit into the Poisson framework. Individual models that cope with, and help analyze, heterogeneity, contagion, and negative contagion are each shown to lead to specific statistical models for event count data. In addition, I derive a new generalized event count (GEC) model that enables researchers to extract significant amounts of new information from existing data by estimating features of these unobserved substantive processes. Applications of this model to congressional challenges of presidential vetoes and superpower conflict demonstrate the dramatic advantages of this approach.

259 citations


Journal ArticleDOI
TL;DR: In this article, the level of the process generating the observations changes over time and a recursion analogous to the Kalman filter is used to construct the likelihood function and to make predictions of future observations.
Abstract: Time series sometimes consist of count data in which the number of events occurring in a given time interval is recorded. Such data are necessarily nonnegative integers, and an assumption of a Poisson or negative binomial distribution is often appropriate. This article sets up a model in which the level of the process generating the observations changes over time. A recursion analogous to the Kalman filter is used to construct the likelihood function and to make predictions of future observations. Qualitative variables, based on a binomial or multinomial distribution, may be handled in a similar way. The model for count data may be extended to include explanatory variables. This enables nonstochastic slope and seasonal components to be included in the model, as well as permitting intervention analysis. The techniques are illustrated with a number of applications, and an attempt is made to develop a model-selection strategy along the lines of that used for Gaussian structural time series models. The appli...

220 citations
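A minimal filter in this spirit is the conjugate gamma-Poisson recursion with a discount factor, whose level estimate adapts to a drifting mean. This sketch is a simplified stand-in for the paper's model; the discount factor, priors, and data below are all assumptions:

```python
import numpy as np

def discounted_gamma_poisson(y, omega=0.9, a0=1.0, b0=1.0):
    """One-step-ahead forecasts for a count series whose underlying level
    drifts over time.  The gamma posterior (a, b) for the Poisson level is
    discounted by omega each step, so old observations are gradually
    forgotten; the forecast mean is a/b before seeing each new count."""
    a, b = a0, b0
    forecasts = []
    for yt in y:
        a, b = omega * a, omega * b      # spread the prior: level may have moved
        forecasts.append(a / b)          # predictive mean for this period
        a, b = a + yt, b + 1.0           # conjugate Poisson update
    return np.array(forecasts)

y = np.array([2, 3, 1, 4, 2, 6, 7, 9, 8, 10])   # level drifting upward
fc = discounted_gamma_poisson(y)
print(fc.round(2))
```

Discounting widens the predictive distribution without moving its mean, so the forecast tracks the recent level rather than the whole-sample average.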


Journal ArticleDOI
TL;DR: This paper focuses on the method of back-calculation from AIDS incidence data through use of the incubation period distribution to obtain estimates of the numbers previously infected to obtain short-term projections of AIDS incidence in the United States.
Abstract: Short-term projections of AIDS incidence are critical for assessing future health care needs. This paper focuses on the method of back-calculation for obtaining short-term projections. The approach consists of back-calculating from AIDS incidence data, through use of the incubation period distribution, to obtain estimates of the numbers previously infected. The numbers previously infected are then projected forward to obtain short-term projections. An approach is suggested for accounting for new infections in short-term projections of AIDS incidence. Back-calculation requires accurate AIDS incidence data. A method which is computationally easy to implement is proposed for estimating the distribution of the delays in reporting AIDS cases. It was found that the reporting delay distribution in the United States varies by geographic region of diagnosis. Back-calculation also requires a reliable estimate of the incubation period distribution. Statistical issues associated with estimating the incubation period distribution are considered. The methods are applied to obtain short-term projections of AIDS incidence in the United States. The projected cumulative AIDS incidence in the US by the end of 1992 was 287,100 under the assumption that there are no new infections after 1 July 1987, and 330,600 under the assumption that the infection rate remains constant. These projections do not account for the new broadened AIDS surveillance definitions or the underreporting of AIDS cases to the Centers for Disease Control.

146 citations
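Back-calculation can be sketched as a Poisson deconvolution: expected diagnoses are past infections convolved with the incubation-period distribution, and an EM iteration (Richardson-Lucy form) recovers the infection curve from observed counts. All numbers below (incubation pmf, infection curve, horizon) are synthetic illustrations, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(3)

T = 20
# Hypothetical infection curve and discrete incubation-period pmf.
infections = np.concatenate([np.linspace(5, 60, 12), np.full(8, 60.0)])
incubation = np.array([0.02, 0.05, 0.08, 0.12, 0.15, 0.18, 0.15, 0.12, 0.08, 0.05])

def expected_cases(inf, pmf, T):
    """Expected diagnoses: infections convolved with the incubation pmf."""
    mu = np.zeros(T)
    for s in range(T):
        for d in range(len(pmf)):
            if s + d < T:
                mu[s + d] += inf[s] * pmf[d]
    return mu

cases = rng.poisson(expected_cases(infections, incubation, T))

# Back-calculation as Poisson-EM deconvolution (Richardson-Lucy form).
est = np.full(T, cases.sum() / T)
for _ in range(200):
    mu = np.maximum(expected_cases(est, incubation, T), 1e-12)
    upd = np.zeros(T)   # how well each infection cohort explains the counts
    norm = np.zeros(T)  # probability an infection in cohort s is observed by T
    for s in range(T):
        for d in range(len(incubation)):
            if s + d < T:
                upd[s] += incubation[d] * cases[s + d] / mu[s + d]
                norm[s] += incubation[d]
    est *= upd / norm
print(est.round(1))
```

As the abstract notes, the most recent cohorts are poorly identified (few of their cases have been diagnosed yet), which is exactly why back-calculation is a short-term projection tool.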


Journal ArticleDOI
TL;DR: In this article, the use of Poisson regression in the computer package GLIM is described with an example from historical geography, in which apprentice migration to Edinburgh is regressed on a combination of categorical, count, and continuous explanatory variables.
Abstract: In geographical research the data of interest are often in the form of counts. Standard regression analysis is inappropriate for such data, but if certain assumptions are met, a form of regression based on the Poisson distribution can be used. This paper illustrates the use of Poisson regression in the computer package GLIM with an example from historical geography. Apprentice migration to Edinburgh is regressed on a combination of categorical, count, and continuous explanatory variables.

131 citations
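A Poisson log-linear fit of the kind GLIM performs can be reproduced with a short IRLS (Fisher scoring) loop. The data here are simulated, and the code is a sketch of the standard algorithm rather than of GLIM itself:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: one continuous covariate, log-linear Poisson mean.
n = 1000
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 1.2])
y = rng.poisson(np.exp(X @ beta_true))

# IRLS / Fisher scoring for the Poisson log-linear model, the same fit
# a GLM package produces for a Poisson error with log link.
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    W = mu                            # Poisson: Var = mean, so weights = mu
    z = X @ beta + (y - mu) / mu      # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(beta.round(3))   # close to [0.5, 1.2]
```

At convergence the score equations X'(y − mu) = 0 hold, which is the Poisson analogue of the normal equations in ordinary regression.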


Journal ArticleDOI
TL;DR: Focusing initially on Poisson arrival processes, a method is provided to calculate f^(n)(0) for any 'admissible' function f, and a large class of queueing networks is described for which several standard quantities of interest are shown to be admissible.
Abstract: Many quantities of interest in open queueing systems are expected values which can be viewed as functions of the arrival rate to the system. We are thus led to consider f(λ), where λ is the arrival rate and f represents some quantity of interest. The aim of this paper is to investigate the behavior of f(λ) for λ near zero. This 'light traffic' information is obtained in the form of f(0) and its derivatives, f^(n)(0), n ≥ 1. Focusing initially on Poisson arrival processes, we provide a method to calculate f^(n)(0) for any 'admissible' function f. We describe a large class of queueing networks for which we show several standard quantities of interest to be admissible. The proof that the method yields the correct values for the derivatives involves an interchange of limits, whose justification requires a great deal of effort. The determination of f^(n)(0) involves consideration of sample paths with n + 1 or fewer arrivals over all of time. These calculations are illustrated via several simple examples. These results can be extended to arrival processes which, although not Poisson, are 'driven' by a Poisson process. We carry out the details for phase type renewal processes and nonstationary Poisson processes.

122 citations


Journal ArticleDOI
TL;DR: In this paper, the authors introduce quadratic Poisson structures on Lie groups associated with a class of solutions of the modified Yang-Baxter equation and apply them to the Hamiltonian description of Lax systems.
Abstract: We introduce quadratic Poisson structures on Lie groups associated with a class of solutions of the modified Yang-Baxter equation and apply them to the Hamiltonian description of Lax systems. The formal analog of these brackets on associative algebras provides second structures for certain integrable equations. In particular, the integrals of the Toda flow on generic orbits are shown to satisfy recursion relations. Finally, we exhibit a third-order Poisson bracket for which the r-matrix approach is feasible.

Journal ArticleDOI
TL;DR: Comparisons of the a priori uniform and nonuniform Bayesian algorithms to the maximum-likelihood algorithm are carried out using computer-generated noise-free and Poisson randomized projections.
Abstract: A method that incorporates a priori uniform or nonuniform source distribution probabilistic information and data fluctuations of a Poisson nature is presented. The source distributions are modeled in terms of a priori source probability density functions. Maximum a posteriori probability solutions, as determined by a system of equations, are given. Iterative Bayesian imaging algorithms for the solutions are derived using an expectation maximization technique. Comparisons of the a priori uniform and nonuniform Bayesian algorithms to the maximum-likelihood algorithm are carried out using computer-generated noise-free and Poisson randomized projections. Improvement in image reconstruction from projections with the Bayesian algorithm is demonstrated. Superior results are obtained using the a priori nonuniform source distribution.

Journal ArticleDOI
01 Feb 1989-Ecology
TL;DR: This method of fitting the data from a set of species observations to the negative exponential model provided an estimate of the total number of species for a taxonomic category (rare vascular plants) and a region (southern Appalachians).
Abstract: Four data sets containing randomly distributed species observations were computer generated. Each data set was characterized by a different species-abundance distribution: (1) the distribution of rare vascular plants in the southern Appalachians, (2) a uniform distribution, (3) a Poisson distribution, and (4) a truncated lognormal distribution. A computerized sampling methodology (Miller 1986) recorded the statistically significant and unique species-area relations distinguishing each data set. The most robust species-area relations (i.e., highest adjusted r² values) typified the rare southern Appalachian plant distribution and the truncated canonical lognormal distribution. A method is presented for obtaining a reliable estimate of the total number of species characterizing a taxonomic group within a region. This method is based on analyses of the accumulation of new species vs. the accumulation of observations recorded from a sampling of each of the data sets. A statistically robust negative exponential model was fitted to each of these accumulation distributions. From this model, 195 rare vascular plant species were predicted to inhabit the central southern Appalachians region. This estimate was in close agreement with the actual total of 188 species determined from 150 yr of field observations. Thus, this method of fitting the data from a set of species observations to the negative exponential model provided an estimate of the total number of species for a taxonomic category (rare vascular plants) and a region (southern Appalachians).

Journal ArticleDOI
TL;DR: It is shown that it is possible to do asymptotic likelihood inference for software reliability models based on order statistics or nonhomogeneous Poisson processes, with asymptotic confidence levels for interval estimates of parameters, including the conditional failure rate of the software.
Abstract: There are many software reliability models that are based on the times of occurrences of errors in the debugging of software. It is shown that it is possible to do asymptotic likelihood inference for software reliability models based on order statistics or nonhomogeneous Poisson processes, with asymptotic confidence levels for interval estimates of parameters. In particular, interval estimates from these models are obtained for the conditional failure rate of the software, given the data from the debugging process. The data can be grouped or ungrouped. For someone making a decision about when to market software, the conditional failure rate is an important parameter. The use of interval estimates is demonstrated for two data sets that have appeared in the literature.

Journal ArticleDOI
TL;DR: In this article, the Stein-Chen method and suitable couplings are used to obtain general upper bounds for the variational distance between a sum of Bernoulli random variables and a Poisson random variable having the same mean.
Abstract: Let W be a sum of Bernoulli random variables and U_λ a Poisson random variable having the same mean λ = EW. Using the Stein-Chen method and suitable couplings, general upper bounds for the variational distance between W and U_λ are given. These bounds are applied to problems of occupancy, using sampling with and without replacement and Pólya sampling, of capture-recapture, of spacings, and of matching and ménage.
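For independent Bernoulli summands the flavour of such bounds is easy to check numerically: the exact distribution of W can be built by convolution, and the classical Stein-Chen (Barbour-Hall) bound min(1, 1/λ)·Σp_i² dominates the true variational distance. The probabilities below are arbitrary illustrative values:

```python
import numpy as np
from math import exp, factorial

p = np.array([0.05, 0.10, 0.02, 0.08, 0.12, 0.04, 0.06, 0.09])
lam = p.sum()

# Exact pmf of W = sum of independent Bernoulli(p_i), built by convolution.
pmf = np.array([1.0])
for pi in p:
    new = np.zeros(len(pmf) + 1)
    new[:-1] += pmf * (1 - pi)
    new[1:] += pmf * pi
    pmf = new

# Total variation distance to Poisson(lam); the Poisson tail beyond the
# support of W is counted explicitly.
pois = np.array([exp(-lam) * lam ** k / factorial(k) for k in range(len(pmf))])
tv = 0.5 * np.abs(pmf - pois).sum() + 0.5 * (1.0 - pois.sum())

# Classical Stein-Chen (Barbour-Hall) bound for independent summands.
bound = min(1.0, 1.0 / lam) * (p ** 2).sum()
print(tv, bound)
```

The paper's couplings extend this kind of bound to dependent indicators, which is what the occupancy and capture-recapture applications require.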

Journal ArticleDOI
TL;DR: In this article, the total claims distribution over a fixed period of time with time dependent claim amounts is considered and a representation for the associated density function is found under certain conditions, including the important case with Poisson or mixed Poisson claim number processes and constant inflation.
Abstract: The total claims distribution over a fixed period of time with time dependent claim amounts is considered. A representation for the associated density function is found under certain conditions, including the important case with Poisson or mixed Poisson claim number processes and constant inflation. Methods of evaluation of this density are considered, and the cases with exponential claim sizes and regular variation of the tail are discussed in more detail.

Journal Article
TL;DR: In this article, the authors examined refined approximations to the distribution of sums of nonnegative integer valued random variables, near a Poisson limit, and derived asymptotic expansions for point and tail probabilities.
Abstract: The paper examines refined approximations to the distribution of sums of independent non-negative integer valued random variables, near a Poisson limit. Asymptotic expansions are derived for point and tail probabilities, and explicit estimates of the relative error of approximation are given. For Bernoulli summands approximated by an lth order expansion, the relative error in the approximation to point probabilities in the body of the distribution is shown to be of order min(1, 1/Σᵢ pᵢ) · Σᵢ pᵢ^(l+1).

Proceedings ArticleDOI
27 Nov 1989
TL;DR: Multipath profiles obtained from propagation measurements done on several manufacturing floors and college campus laboratories at 910 MHz are analyzed and a suitable model for the distribution of the gain coefficients is found to be the log-normal distribution.
Abstract: Multipath profiles obtained from propagation measurements done on several manufacturing floors and college campus laboratories at 910 MHz are analyzed to model the arrival of the paths, the respective amplitudes of the paths, and the received power. The discrepancies between the empirical distribution of the arriving paths and Poisson arrivals are discovered. The modified Poisson process is shown to fit the arriving paths closely. Rayleigh, Weibull, Nakagami, log-normal, and Suzuki distributions are considered as potential models for the amplitude of the arriving paths. The path amplitudes are shown to closely follow a log-normal rather than a Rayleigh distribution. The delay power spectrum is shown to fit an exponential function closely. The statistics of RMS multipath delay spread and the values of the distance/power law gradient are also computed and compared for these experimental sites. The interarrival times of the signals were modeled by the Weibull distribution. A model for the distribution of the number of signals was presented using the modified Beta distribution. Finally, for the data with no threshold at the receiver, a suitable model for the distribution of the gain coefficients is found to be the log-normal distribution.

Journal ArticleDOI
TL;DR: An alternative model for defect data that exhibits clustering is employed and used to design acceptance sampling plans and a dramatic difference is shown between these plans and those determined under the Poisson distribution assumption is shown.
Abstract: Whenever defect data is encountered in industrial quality control applications, the Poisson distribution is generally assumed to be the underlying distribution. It has been widely reported that the defect distributions in integrated circuit fabrication exhibit clustering behavior, a condition which invalidates the Poisson distribution assumption. In this paper we employ an alternative model for defect data that exhibits clustering. To demonstrate the impact of the proposed model, we use it to design acceptance sampling plans and show a dramatic difference between these plans and those determined under the Poisson distribution assumption. Although advances in quality control and techniques such as design for manufacture have eliminated the need for acceptance sampling in many areas, integrated circuit fabrication still involves processes with high variability even using state of the art equipment. Thus acceptance sampling is widely used at various stages of manufacture to ensure specified quality levels when 100 percent inspection is infeasible.
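The practical impact of clustering can be seen by comparing acceptance probabilities under a Poisson model and a negative binomial model with the same mean defect count (a standard way to represent clustering, though not necessarily the paper's exact model). The lot parameters below are illustrative:

```python
from math import exp

def accept_prob_poisson(mu, c):
    """P(D <= c) for defect count D ~ Poisson(mu)."""
    p, total = exp(-mu), 0.0
    for d in range(c + 1):
        total += p
        p *= mu / (d + 1)
    return total

def accept_prob_negbin(mu, k, c):
    """P(D <= c) for D ~ negative binomial with mean mu and clustering
    parameter k (variance mu + mu^2/k); small k means strong clustering."""
    q = mu / (mu + k)
    p, total = (1 - q) ** k, 0.0
    for d in range(c + 1):
        total += p
        p *= q * (d + k) / (d + 1)
    return total

mu, c = 4.0, 2          # mean defects per sample, acceptance number
pa_poisson = accept_prob_poisson(mu, c)
pa_clustered = accept_prob_negbin(mu, k=0.5, c=c)
print(pa_poisson, pa_clustered)
```

With clustering, defects pile up on a few items, so at the same mean the chance of seeing few defects in a sample is much higher; a plan designed under the Poisson assumption therefore misstates the true acceptance probability.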

Journal ArticleDOI
TL;DR: In this article, a recursion relation, together with a generalized method of steepest descent, was developed to evaluate numerically the photon-counting distributions and their factorial moments with excellent accuracy.
Abstract: The K distribution is used in a number of areas of scientific endeavor. In optics, it provides a useful statistical description for fluctuations of the irradiance (and the electric field) of light that has been scattered or transmitted through random media (e.g., the turbulent atmosphere). The Poisson transform of the K distribution describes the photon-counting statistics of light whose irradiance is K distributed. The K-distribution family can be represented in a multiply stochastic (compound) form whereby the mean of a gamma distribution is itself stochastic and is described by a member of the gamma family of distributions. Similarly, the family of Poisson transforms of the K distributions can be represented as a family of negative-binomial transforms of the gamma distributions or as Whittaker distributions. The K distributions have heretofore had their origins in random-walk models; the multiply stochastic representations provide an alternative interpretation of the genesis of these distributions and their Poisson transforms. By multiple compounding, we have developed a new transform pair as a possibly useful addition to the K-distribution family. All these distributions decay slowly and are difficult to calculate accurately by conventional formulas. A recursion relation, together with a generalized method of steepest descent, has been developed to evaluate numerically the photon-counting distributions and their factorial moments with excellent accuracy.

Journal ArticleDOI
TL;DR: The Markov formulation of the RMR model uses the same biological hypotheses as the original version with two statistical approximations deleted, and a basis is provided for an alternative approach to calculating survival probabilities.
Abstract: Tobias' repair-misrepair (RMR) model of cell survival is formulated as a Markov process, a sequence of discrete repair steps occurring at random times, and the probability of a sequence of viable repairs is calculated. The Markov formulation describes the time evolution of the probability distribution for the number of lesions in a cell. The probability of cell survival is calculated from the distribution of the initial number of lesions and the probabilities of the repair events. The production of lesions is formulated in accordance with the principles of microdosimetry, and the distribution of the initial number of lesions is obtained as an approximation for high and low linear energy transfer cases. The Markov formulation of the RMR model uses the same biological hypotheses as the original version with two statistical approximations deleted. These approximations are the neglect of the effect of statistical fluctuations in calculating the average rate of repair of lesions and the assumption that the final number of unrepaired and lethally misrepaired lesions has a Poisson distribution. The quantitative effect of these approximations is calculated, and a basis is provided for an alternative approach to calculating survival probabilities.

01 Jan 1989
TL;DR: In this article, the influence of the permanent customers on queue length and sojourn times of the Poisson customers is studied using results from queuing theory and from the theory of branching processes.
Abstract: The authors examine an M/G/1 FCFS (first come, first served) queue with two types of customers: ordinary customers, who arrive according to a Poisson process, and permanent customers, who immediately return to the end of the queue after having received a service. The influence of the permanent customers on queue length and sojourn times of the Poisson customers is studied using results from queuing theory and from the theory of branching processes. In particular, it is shown that, when the service time distributions of the Poisson customers and all K permanent customers are negative exponential with identical means, the queue length and sojourn time distributions of the Poisson customers are the (K+1)-fold convolution of those for the case without permanent customers.

Journal ArticleDOI
TL;DR: In this article, the design quantiles of storms are derived directly in terms of design life and the risk of occurrence for Poisson and negative binomial storm arrival processes, and the use of Bootstrap methods is demonstrated.
Abstract: The design quantiles of storms are derived directly in terms of design life and the risk of occurrence. Estimators for these quantiles based on exceedance series are discussed for the cases of Poisson and negative binomial storm arrival processes. The use of Bootstrap methods to estimate confidence intervals for the design quantiles is demonstrated.
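For Poisson storm arrivals the quantile-risk-design-life relation has a simple closed form once a model for exceedance magnitudes is fixed. The sketch below assumes exponentially distributed excesses over a threshold, which is an illustrative assumption rather than the paper's estimator:

```python
from math import log, exp

def design_quantile(lam, u, beta, life, risk):
    """Storm magnitude whose exceedance risk over the design life equals
    `risk`, assuming Poisson exceedances of threshold u at rate `lam` per
    year with exponentially distributed excesses of mean `beta` (a common
    simple model; the paper's estimators also cover negative binomial
    arrivals)."""
    # P(at least one exceedance of x in `life` years) = 1 - exp(-life * lam(x)),
    # with lam(x) = lam * exp(-(x - u) / beta); solve for x at the given risk.
    return u + beta * log(life * lam / -log(1.0 - risk))

x = design_quantile(lam=3.0, u=50.0, beta=10.0, life=50, risk=0.10)
print(round(x, 2))
```

Working directly in terms of design life and risk, as here, avoids the detour through an annual return period.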

Journal ArticleDOI
TL;DR: In this article, the extremal properties of the shot noise process {X(t)} are investigated in a natural way through a discrete-time process which records the states of {X(t)} at the points of the underlying renewal process.
Abstract: Consider the shot noise process X(t) := Σᵢ h(t − τᵢ), t ≥ 0, where h is a bounded positive non-increasing function supported on a finite interval, and the τᵢ's are the points of a renewal process π on [0, ∞). In this paper, the extremal properties of {X(t)} are studied. It is shown that these properties can be investigated in a natural way through a discrete-time process which records the states of {X(t)} at the points of π. The important special case where π is Poisson is treated in detail, and a domain-of-attraction result for the compound Poisson distribution is obtained as a by-product. Keywords: extreme value; point process; renewal process.

Journal ArticleDOI
TL;DR: In this article, a Poisson regression model of the macroeconomic determinants of the total number of these petitions yearly is developed and estimated based on the decision to file an escape clause petition by a firm.
Abstract: Based on the decision to file an escape clause petition by a firm, a Poisson regression model of the macroeconomic determinants of the total number of these petitions yearly is developed and estimated. The Poisson specification conforms to the fact that the number of escape clause petitions is a non-negative integer with a skewed probability distribution. In addition, the number of potential petitioners is controlled for in the specification. The empirical results suggest that both domestic and international factors affect the decision to file an escape clause petition. The legal environment is found to be a determinant as well. Copyright 1989 by MIT Press.

Journal ArticleDOI
TL;DR: Heterogeneity of Ki-67 labelling in histologically benign meningiomas is indicated; more attention should be paid to this heterogeneity in cell kinetic studies of brain tumours, and one should be aware that regrowth is not determined by the proliferative capacity alone.
Abstract: Ki-67 monoclonal antibody reacts with a nuclear protein only expressed in proliferating cells. In order to evaluate the applicability of Ki-67 in meningiomas we determined the index of labelled cells in 52 biopsies. Two counting methods were used to calculate the labelling index. The mean Ki-67 index was determined by the number of positive Ki-67 cells in three randomly chosen fields divided by the total number of cells in the three fields. The preparations were also screened for the area with the highest percentage of positive Ki-67 cells, yielding the 'highest' Ki-index. The three patients who developed a recurrence did not show an elevated Ki-67 index by either of the two counting methods. Because of the differences observed between the two counting methods and the differences of the Ki-67 indices in the three randomly chosen fields we performed a statistical analysis to evaluate the possibility of a heterogeneous distribution of Ki-67 positive cells in meningiomas. Homogeneity was rejected if the data showed no Poisson distribution. We found that five of the 13 tumours which had a mean Ki-67 index greater than 1.00% were heterogeneous (38.4%). One of the highest differences between the two counting methods (5.05%) was noted in the only tumour with local signs of malignancy. Our results indicate a heterogeneity of the labelling of Ki-67 in histologically benign meningiomas and emphasise the importance of representative sampling. More attention should be paid to this heterogeneity for cell kinetic studies in brain tumours, and one should be aware that regrowth is not only determined by the proliferative capacity.

Journal ArticleDOI
TL;DR: In this article, the elasticity of bank failures with respect to a set of explanatory variables is estimated using a maximum-likelihood Poisson estimator, and the resulting estimates are compared with their OLS counterparts.

Journal ArticleDOI
TL;DR: In this paper, an asymptotic expansion is given for the distribution of the sum of independent zero-one random variables in the case where this sum has variance σₙ² → ∞.

Journal ArticleDOI
TL;DR: Simulation results indicate that the proposed method performs quite well, and it is apparently superior to the approach of Hochberg (1981, Communications in Statistics--Theory and Methods A10, 1719-1732) for values of zeta far from 1/2.
Abstract: Halperin, Gilbert, and Lachin (1987, Biometrics 43, 71-80) obtain confidence intervals for Pr(X < Y) based on the two-sample Wilcoxon statistic for continuous data. Their approach is applied here to ordered categorical data and right-censored continuous data, using the generalization zeta = Pr(X < Y) + (1/2)Pr(X = Y) to account for ties. Deviations from nominal coverage probability for various sample sizes and values of zeta are obtained via simulation of either three or six ordered categories based on underlying Poisson or exponential distributions. The simulation results indicate that the proposed method performs quite well, and it is apparently superior to the approach of Hochberg (1981, Communications in Statistics--Theory and Methods A10, 1719-1732) for values of zeta far from 1/2.

Journal ArticleDOI
TL;DR: In this paper, the ruin probability for a compound Poisson risk process with a general premium rate p(r) depending on the reserve r is calculated using a simple numerical method.
Abstract: The purpose of this paper is to show how the ruin probability can be found for a compound Poisson risk process with a general premium rate p(r) depending on the reserve r, and it is illustrated how the probability of ruin can be calculated using a simple numerical method.
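A simple way to approximate such ruin probabilities numerically is Monte Carlo simulation of the reserve path between claims. The sketch below allows a reserve-dependent premium rate (treated as locally constant between claims, which is exact only when the premium is constant) and checks itself against the classical closed form for a constant premium with exponential claims:

```python
import numpy as np

rng = np.random.default_rng(5)

def ruin_prob_mc(u, premium, lam=1.0, claim_mean=1.0,
                 horizon=100.0, n_paths=20_000):
    """Finite-horizon Monte Carlo ruin probability for a compound Poisson
    risk process.  `premium(r)` may depend on the reserve r; it is applied
    as a constant rate over each interarrival interval, an approximation
    for genuinely reserve-dependent premiums."""
    ruined = 0
    for _ in range(n_paths):
        r, t = u, 0.0
        while t < horizon:
            w = rng.exponential(1.0 / lam)    # time until the next claim
            r += premium(r) * w               # premium income between claims
            t += w
            r -= rng.exponential(claim_mean)  # pay the claim
            if r < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# Sanity check against the classical closed form for constant premium c and
# exponential(mean 1) claims: psi(u) = (lam/c) * exp(-(1 - lam/c) * u).
c, u = 1.5, 2.0
psi_exact = (1.0 / c) * np.exp(-(1.0 - 1.0 / c) * u)
psi_mc = ruin_prob_mc(u, premium=lambda r: c)
print(psi_mc, psi_exact)
```

The finite horizon slightly underestimates the infinite-horizon probability, but with a positive safety loading almost all ruin occurs early, so the bias is small.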