
Showing papers on "Poisson distribution" published in 2007


Journal ArticleDOI
TL;DR: In this paper, the effects of Poisson's ratio on the elastic deformation of materials, intact rocks, and rock masses are briefly reviewed, and the reported values of Poisson's ratio for some elements, materials, and minerals are compiled, while typical ranges of values are presented for some rocks and granular soils.

669 citations


Journal ArticleDOI
TL;DR: In this article, a simple and compact analytical formula is proposed for approximating the Voronoi cell's size-distribution function in the practically important 2D and 3D cases. Denoting the dimensionality of the space by d (d = 1, 2, 3), the compact form f(y) = Const * y^((3d-1)/2) * exp(-(3d+1)y/2) is suggested for the normalized cell-size distribution function.
Abstract: Poisson Voronoi diagrams are useful for modeling and describing various natural patterns and for generating random lattices. Although this particular space tessellation is intensively studied by mathematicians, in two- and three-dimensional (3D) spaces there is no exact result known for the size distribution of Voronoi cells. Motivated by the simple form of the distribution function in the 1D case, a simple and compact analytical formula is proposed for approximating the Voronoi cell's size-distribution function in the practically important 2D and 3D cases as well. Denoting the dimensionality of the space by d (d = 1, 2, 3), the compact form f(y) = Const * y^((3d-1)/2) * exp(-(3d+1)y/2) is suggested for the normalized cell-size distribution function. By using large-scale computer simulations the viability of the proposed distribution function is studied and critically discussed.

517 citations
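Up to its normalizing constant, the proposed f(y) is a gamma density with shape and rate both equal to (3d+1)/2, which fixes Const analytically and gives the normalized distribution unit mean. A minimal Python sketch (assuming NumPy/SciPy; illustrative, not code from the paper) that evaluates the formula and checks these properties numerically:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

def voronoi_cell_size_pdf(y, d):
    """Proposed cell-size density f(y) = C * y**((3d-1)/2) * exp(-(3d+1)*y/2).

    This is a gamma density with shape k = (3d+1)/2 and rate k, so
    C = k**k / Gamma(k) and the mean equals 1 (normalized cell sizes).
    """
    k = (3 * d + 1) / 2.0
    c = k**k / gamma_fn(k)
    return c * y ** (k - 1) * np.exp(-k * y)

# Sanity checks: unit mass and unit mean for d = 1, 2, 3.
for d in (1, 2, 3):
    mass, _ = quad(lambda y: voronoi_cell_size_pdf(y, d), 0, np.inf)
    mean, _ = quad(lambda y: y * voronoi_cell_size_pdf(y, d), 0, np.inf)
    print(f"d={d}: mass={mass:.6f}, mean={mean:.6f}")
```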


Posted Content
TL;DR: This paper reviews the existing literature on Poisson mixtures, bringing together a great number of properties while providing tangential information on general mixtures.
Abstract: Mixed Poisson distributions have been used in a wide range of scientific fields for modelling non-homogeneous populations. This paper aims at reviewing the existing literature on Poisson mixtures by bringing together a great number of properties, while, at the same time, providing tangential information on general mixtures. A selective presentation of some of the most prominent members of the family of Poisson mixtures is made.

285 citations
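One of the most prominent members of this family is the negative binomial, which arises when the Poisson rate is itself gamma-distributed. A short simulation sketch of that standard fact (the parameter values are arbitrary and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
shape, scale, n = 3.0, 2.0, 200_000

# Mixed Poisson: draw a gamma rate for each observation, then a Poisson count.
lam = rng.gamma(shape, scale, size=n)
mixed = rng.poisson(lam)

# The gamma-mixed Poisson is negative binomial with r = shape, p = 1/(1 + scale).
nb = rng.negative_binomial(shape, 1.0 / (1.0 + scale), size=n)

print("gamma-mixed Poisson mean/var:", mixed.mean(), mixed.var())
print("negative binomial   mean/var:", nb.mean(), nb.var())
# Both show mean ~ 6 and variance ~ 18: overdispersion relative to the Poisson.
```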


Journal ArticleDOI
TL;DR: This article reviews the existing literature on Poisson mixtures, bringing together a great number of properties while providing tangential information on general mixtures.
Abstract: Summary Mixed Poisson distributions have been used in a wide range of scientific fields for modeling non homogeneous populations. This paper aims at reviewing the existing literature on Poisson mixtures by bringing together a great number of properties, while, at the same time, providing tangential information on general mixtures. A selective presentation of some of the most prominent members of the family of Poisson mixtures is made.

273 citations


Posted Content
TL;DR: In this paper, it is shown that SIR dynamics on a random network can be modeled with a system of three nonlinear ODEs, using the probability generating function (PGF) formalism to represent the degree distribution of the network.
Abstract: Random networks with specified degree distributions have been proposed as realistic models of population structure, yet the problem of dynamically modeling SIR-type epidemics in random networks remains complex. I resolve this dilemma by showing how the SIR dynamics can be modeled with a system of three nonlinear ODEs. The method makes use of the probability generating function (PGF) formalism for representing the degree distribution of a random network, and it works with network-centric quantities such as the number of edges in a well-defined category rather than node-centric quantities such as the number of infecteds or susceptibles. The PGF provides a simple means of translating between network and node-centric variables and determining the epidemic incidence at any time. The theory also provides a simple means of tracking the evolution of the degree distribution among susceptibles or infecteds. The equations are used to demonstrate the dramatic effect that the degree distribution has on the final size of an epidemic as well as the speed with which it spreads through the population. Power law degree distributions are observed to generate an almost immediate expansion phase yet have a smaller final size compared to homogeneous degree distributions such as the Poisson. The equations are compared to stochastic simulations, which show good agreement with the theory. Finally, the dynamic equations provide an alternative way of determining the epidemic threshold where large-scale epidemics are expected to occur, and below which epidemic behavior is limited to finite-sized outbreaks.

273 citations
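On configuration-model networks of the kind considered here, the degree distribution enters through its PGF g(x); in particular, the mean excess degree g''(1)/g'(1) controls the epidemic threshold. A small sketch of that standard PGF computation (not the paper's full ODE system), contrasting a Poisson degree distribution with a power law:

```python
import numpy as np
from scipy.stats import poisson

def pgf_moments(pk):
    """For a degree distribution p_k (k = 0, 1, ...), return the mean degree
    g'(1) and the mean excess degree g''(1)/g'(1), which governs the epidemic
    threshold on a configuration-model network."""
    k = np.arange(len(pk))
    g1 = np.sum(k * pk)            # g'(1)
    g2 = np.sum(k * (k - 1) * pk)  # g''(1)
    return g1, g2 / g1

# Poisson degree distribution with mean 4 (truncated deep in the tail):
# g(x) = exp(mu*(x - 1)), so mean degree and mean excess degree are both mu.
print(pgf_moments(poisson.pmf(np.arange(200), mu=4.0)))

# Truncated power-law degree distribution on the same support:
k = np.arange(1, 200)
pk_power = np.concatenate(([0.0], k**-2.5 / np.sum(k**-2.5)))
print(pgf_moments(pk_power))  # small mean degree, large excess degree
```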


Journal ArticleDOI
TL;DR: The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled, and that model specification may be improved by testing extra-variation functions for significance.

243 citations


Journal ArticleDOI
TL;DR: The zero-inflated negative binomial (ZINB) regression model with smoothing is introduced for modeling count data with many zero-valued observations, and its use is illustrated with shark bycatch data from the eastern Pacific Ocean tuna purse-seine fishery for 1994-2004.

223 citations


Journal ArticleDOI
TL;DR: In this article, the authors develop a new approach to change-point modeling that allows the number of change-points in the observed sample to be unknown, assuming that regime durations have a Poisson distribution.
Abstract: This paper develops a new approach to change-point modelling that allows the number of change-points in the observed sample to be unknown. The model we develop assumes that regime durations have a Poisson distribution. It approximately nests the two most common approaches: the time-varying parameter (TVP) model with a change-point every period and the change-point model with a small number of regimes. We focus considerable attention on the construction of reasonable hierarchical priors both for regime durations and for the parameters that characterize each regime. A Markov chain Monte Carlo posterior sampler is constructed to estimate a version of our model, which allows for change in conditional means and variances. We show how real-time forecasting can be done in an efficient manner using sequential importance sampling. Our techniques are found to work well in an empirical exercise involving U.S. GDP growth and inflation. Empirical results suggest that the number of change-points is larger than previously estimated in these series and the implied model is similar to a TVP (with stochastic volatility) model.

213 citations
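A toy forward simulation of the generative side of such a model, with Poisson-distributed regime durations and regime-specific means and volatilities (the hierarchical priors, MCMC sampler, and forecasting machinery of the paper are not sketched; every numeric value below is invented for illustration):

```python
import numpy as np

def simulate_changepoint_series(n, mean_duration, rng):
    """Simulate a series whose regime durations are Poisson-distributed and
    whose conditional mean and variance change from regime to regime."""
    y, t = [], 0
    while t < n:
        dur = 1 + rng.poisson(mean_duration)   # regime duration (at least 1)
        mu = rng.normal(0.0, 2.0)              # regime-specific mean
        sigma = np.exp(rng.normal(0.0, 0.5))   # regime-specific volatility
        y.extend(rng.normal(mu, sigma, size=dur))
        t += dur
    return np.asarray(y[:n])

rng = np.random.default_rng(1)
series = simulate_changepoint_series(400, mean_duration=60, rng=rng)
print(series[:5])
```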


Journal ArticleDOI
TL;DR: A spatial scan statistic based on an exponential model is proposed to handle either uncensored or censored continuous survival data; it performs well for different survival distribution functions, including the exponential, gamma, and log-normal.
Abstract: Spatial scan statistics with Bernoulli and Poisson models are commonly used for geographical disease surveillance and cluster detection. These models, suitable for count data, were not designed for data with continuous outcomes. We propose a spatial scan statistic based on an exponential model to handle either uncensored or censored continuous survival data. The power and sensitivity of the developed model are investigated through intensive simulations. The method performs well for different survival distribution functions including the exponential, gamma, and log-normal distributions. We also present a method to adjust the analysis for covariates. The cluster detection method is illustrated using survival data for men diagnosed with prostate cancer in Connecticut from 1984 to 1995.

186 citations


Book ChapterDOI
TL;DR: In this paper, the notion of connection in the context of Courant algebroids was extended to obtain a new characterization of generalized Kaehler geometry, and a new notion of isomorphism between holomorphic Poisson manifolds was established.
Abstract: We first extend the notion of connection in the context of Courant algebroids to obtain a new characterization of generalized Kaehler geometry. We then establish a new notion of isomorphism between holomorphic Poisson manifolds, which is non-holomorphic in nature. Finally we show an equivalence between certain configurations of branes on Poisson varieties and generalized Kaehler structures, and use this to construct explicitly new families of generalized Kaehler structures on compact holomorphic Poisson manifolds equipped with positive Poisson line bundles (e.g. Fano manifolds). We end with some speculations concerning the connection to non-commutative algebraic geometry.

162 citations


Journal ArticleDOI
TL;DR: In this article, the authors present several extensions of the most familiar models for count data, the Poisson and negative binomial models, and develop an encompassing model for two well-known variants of the negative binomial model (the NB1 and NB2 forms).
Abstract: This study presents several extensions of the most familiar models for count data, the Poisson and negative binomial models. We develop an encompassing model for two well known variants of the negative binomial model (the NB1 and NB2 forms). We then propose some alternative approaches to the standard log gamma model for introducing heterogeneity into the loglinear conditional means for these models. The lognormal model provides a versatile alternative specification that is more flexible (and more natural) than the log gamma form, and provides a platform for several “two part” extensions, including zero inflation, hurdle and sample selection models. We also resolve some features in Hausman, Hall and Griliches’s (1984) widely used panel data treatments for the Poisson and negative binomial models that appear to conflict with more familiar models of fixed and random effects. Finally, we consider a bivariate Poisson model that is also based on the lognormal heterogeneity model. Two recent applications have used this model. We suggest that the correlation estimated in their model frameworks is an ambiguous measure of the correlation of the variables of interest, and may substantially overstate it. We conclude with a detailed application of the proposed methods using the data employed in one of the two aforementioned bivariate Poisson studies.
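As a rough illustration of the NB1/NB2 distinction discussed above, the sketch below fits both variants alongside a Poisson model to overdispersed simulated data using statsmodels, which exposes the two forms through its loglike_method argument (the data and coefficients are invented for illustration; this is not the paper's encompassing model):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5_000
x = rng.normal(size=n)
X = sm.add_constant(x)

# Gamma heterogeneity in the loglinear conditional mean induces overdispersion.
mu = np.exp(0.5 + 0.8 * x)
counts = rng.poisson(mu * rng.gamma(2.0, 0.5, size=n))

fits = {
    "Poisson": sm.Poisson(counts, X).fit(disp=0),
    "NB1":     sm.NegativeBinomial(counts, X, loglike_method="nb1").fit(disp=0),
    "NB2":     sm.NegativeBinomial(counts, X, loglike_method="nb2").fit(disp=0),
}
for name, res in fits.items():
    print(f"{name:8s} loglik={res.llf:10.1f}  AIC={res.aic:10.1f}")
```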

Journal ArticleDOI
TL;DR: The nonparametric Poisson intensity and density estimation methods studied in this paper offer near-minimax convergence rates for broad classes of densities and intensities with arbitrary levels of smoothness, and platelet-based estimators in two dimensions are demonstrated to exhibit similar near-optimal error convergence rates.
Abstract: The nonparametric Poisson intensity and density estimation methods studied in this paper offer near minimax convergence rates for broad classes of densities and intensities with arbitrary levels of smoothness. The methods and theory presented here share many of the desirable features associated with wavelet-based estimators: computational speed, spatial adaptivity, and the capability of detecting discontinuities and singularities with high resolution. Unlike traditional wavelet-based approaches, which impose an upper bound on the degree of smoothness to which they can adapt, the estimators studied here guarantee nonnegativity and do not require any a priori knowledge of the underlying signal's smoothness to guarantee near-optimal performance. At the heart of these methods lie multiscale decompositions based on free-knot, free-degree piecewise-polynomial functions and penalized likelihood estimation. The degrees as well as the locations of the polynomial pieces can be adapted to the observed data, resulting in near-minimax optimal convergence rates. For piecewise-analytic signals, in particular, the error of this estimator converges at nearly the parametric rate. These methods can be further refined in two dimensions, and it is demonstrated that platelet-based estimators in two dimensions exhibit similar near-optimal error convergence rates for images consisting of smooth surfaces separated by smooth boundaries.


Journal ArticleDOI
TL;DR: In this article, it is shown that the estimated age-specific components of the Lee–Carter and Poisson log-bilinear models used to forecast future mortality rates can be smoothed, which matters for public policy as well as for the management of financial institutions.
Abstract: Mortality improvements pose a challenge for the planning of public retirement systems as well as for the private life annuities business. For public policy, as well as for the management of financial institutions, it is important to forecast future mortality rates. Standard models for mortality forecasting assume that the force of mortality at age x in calendar year t is of the form exp(α_x + s_x κ_t). The log of the time series of age-specific death rates is thus expressed as the sum of an age-specific component α_x that is independent of time and another component that is the product of a time-varying parameter κ_t, reflecting the general level of mortality, and an age-specific component s_x that represents how rapidly or slowly mortality at each age varies when the general level of mortality changes. This model is fitted to historical data. The resulting estimated κ_t's are then modeled and projected as stochastic time series using standard Box–Jenkins methods. However, the estimated s_x's exhibit an irregular pattern in most cases, and this produces irregular projected life tables. This article demonstrates that it is possible to smooth the estimated s_x's in the Lee–Carter and Poisson log-bilinear models for mortality projection. To this end, penalized least-squares/maximum likelihood analysis is performed. The optimal value of the smoothing parameter is selected with the help of cross validation.
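For reference, the basic (unsmoothed) Lee–Carter fit is classically obtained from a singular value decomposition of the centered log death rates; a minimal sketch of that step, without the penalized least-squares/maximum likelihood smoothing of the s_x's proposed in the article:

```python
import numpy as np

def fit_lee_carter(log_m):
    """Basic Lee-Carter fit via SVD. log_m has shape (n_ages, n_years) and
    holds log age-specific death rates: log m(x,t) = alpha_x + s_x*kappa_t + eps."""
    alpha = log_m.mean(axis=1)                  # alpha_x: average log rate by age
    U, S, Vt = np.linalg.svd(log_m - alpha[:, None], full_matrices=False)
    s_x, kappa = U[:, 0], S[0] * Vt[0]
    # Usual identification constraints: sum_x s_x = 1 and sum_t kappa_t = 0.
    kappa = kappa * s_x.sum()
    s_x = s_x / s_x.sum()
    alpha = alpha + s_x * kappa.mean()
    kappa = kappa - kappa.mean()
    return alpha, s_x, kappa

# Example with synthetic rates: 10 ages, 30 years (values invented).
rng = np.random.default_rng(3)
log_m = (-6.0 + 0.08 * np.arange(10)[:, None]
         - 0.02 * np.arange(30)[None, :] + 0.01 * rng.normal(size=(10, 30)))
alpha, s_x, kappa = fit_lee_carter(log_m)
print(s_x.sum(), kappa.sum())  # 1.0 and ~0.0 by construction
```

The estimated s_x from such a fit is exactly the quantity the article proposes to smooth before projecting κ_t with Box–Jenkins methods.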

Posted Content
TL;DR: In this article, a non-Markovian renewal process with a waiting time distribution described by the Mittag-Leffler function is analyzed, and it is shown that this distribution plays a fundamental role in the infinite thinning procedure of a generic renewal process governed by a power asymptotic waiting time.
Abstract: It is our intention to provide via fractional calculus a generalization of the pure and compound Poisson processes, which are known to play a fundamental role in renewal theory, without and with reward, respectively. We first recall the basic renewal theory including its fundamental concepts like waiting time between events, the survival probability, the counting function. If the waiting time is exponentially distributed we have a Poisson process, which is Markovian. However, other waiting time distributions are also relevant in applications, in particular such ones with a fat tail caused by a power law decay of its density. In this context we analyze a non-Markovian renewal process with a waiting time distribution described by the Mittag-Leffler function. This distribution, containing the exponential as particular case, is shown to play a fundamental role in the infinite thinning procedure of a generic renewal process governed by a power asymptotic waiting time. We then consider the renewal theory with reward that implies a random walk subordinated to a renewal process.
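Mittag-Leffler waiting times can be sampled directly from two uniforms via a known transformation (due to Kozubowski and Rachev, and used by Fulger, Scalas, and Germano to simulate exactly this kind of renewal process); a hedged sketch, which reduces to exponential (Poisson-process) waiting times at beta = 1:

```python
import numpy as np

def mittag_leffler_waiting_times(beta, scale, size, rng):
    """Sample waiting times with Mittag-Leffler survival function
    E_beta(-(t/scale)**beta), 0 < beta <= 1. For beta = 1 this is the
    exponential distribution, i.e. the Markovian Poisson process; for
    beta < 1 the density has a power-law (fat) tail."""
    u, v = rng.uniform(size=size), rng.uniform(size=size)
    if beta == 1.0:
        return -scale * np.log(u)
    factor = (np.sin(beta * np.pi) / np.tan(beta * np.pi * v)
              - np.cos(beta * np.pi)) ** (1.0 / beta)
    return -scale * np.log(u) * factor

rng = np.random.default_rng(4)
tau_ml = mittag_leffler_waiting_times(0.9, 1.0, 100_000, rng)
tau_exp = mittag_leffler_waiting_times(1.0, 1.0, 100_000, rng)
print(np.mean(tau_ml > 20), np.mean(tau_exp > 20))  # fat vs exponential tail
```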

Journal ArticleDOI
TL;DR: The authors consider the difference between the Hausman, Hall and Griliches (HHG) FENB model and a more conventional negative binomial model using a log gamma heterogeneity term, and consider the lognormal model as an alternative RENB model in which the common effect appears in a natural index-function form.
Abstract: The most familiar fixed effects (FE) and random effects (RE) panel data treatments for count data were proposed by Hausman, Hall and Griliches (HHG) (1984). The Poisson FE model is particularly simple and is one of the few known models in which the incidental parameters problem is, in fact, not a problem. The same is not true of the negative binomial (NB) model. Researchers are sometimes surprised to find that the HHG formulation of the FENB model allows an overall constant – a quirk that has also been documented elsewhere. We resolve the source of the ambiguity, and consider the difference between the HHG FENB model and a ‘true’ FENB model that appears in the familiar index function form. The familiar RE Poisson model using a log gamma heterogeneity term produces the NB model. The HHG RE NB model is also unlike what might seem the natural application in which the heterogeneity term appears as an additive common effect in the conditional mean. We consider the lognormal model as an alternative RENB model in which the common effect appears in a natural index function form.

Posted Content
TL;DR: In this paper, the authors proposed a particle filter scheme for a class of partially-observed multivariate diffusions, which does not require approximations of the transition and/or the observation density using timediscretisations.
Abstract: In this paper we introduce a novel particle filter scheme for a class of partially-observed multivariate diffusions. We consider a variety of observation schemes, including diffusion observed with error, observation of a subset of the components of the multivariate diffusion, and arrival times of a Poisson process whose intensity is a known function of the diffusion (Cox process). Unlike currently available methods, our particle filters do not require approximations of the transition and/or the observation density using time-discretisations. Instead, they build on recent methodology for the exact simulation of the diffusion process and the unbiased estimation of the transition density as described in Beskos et al. (2006). We introduce the Generalised Poisson Estimator, which generalises the Poisson Estimator of Beskos et al. (2006). A central limit theorem is given for our particle filter scheme.

Journal ArticleDOI
TL;DR: In this article, the authors introduce the multivariate autoregressive conditional double Poisson model to deal with the discreteness, overdispersion, and both auto- and cross-correlation arising with multivariate counts.

Journal ArticleDOI
TL;DR: In this article, the authors examine finite mixtures of multivariate Poisson distributions as an alternative class of models for multivariate count data; these models allow for both overdispersion in the marginal distributions and negative correlation, while remaining computationally tractable using standard ideas from finite mixture modelling.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the Poisson distribution maximises entropy in the class of ultra log-concave distributions, extending a result of Harremoës, using ideas concerning log-concavity and a semigroup action involving adding Poisson variables and thinning.

Proceedings Article
03 Dec 2007
TL;DR: This model can play the role of the prior in nonparametric Bayesian learning scenarios where multiple latent features are associated with the observed data and each feature can have multiple appearances or occurrences within each data point.
Abstract: We present a probability distribution over non-negative integer valued matrices with possibly an infinite number of columns. We also derive a stochastic process that reproduces this distribution over equivalence classes. This model can play the role of the prior in nonparametric Bayesian learning scenarios where multiple latent features are associated with the observed data and each feature can have multiple appearances or occurrences within each data point. Such data arise naturally when learning visual object recognition systems from unlabelled images. Together with the nonparametric prior we consider a likelihood model that explains the visual appearance and location of local image patches. Inference with this model is carried out using a Markov chain Monte Carlo algorithm.

Journal ArticleDOI
TL;DR: In this paper, scalar and sample path large deviation principles are proved for a large class of Poisson cluster processes and, as a consequence, a large deviation principle is provided for ergodic Hawkes point processes.
Abstract: In this paper we prove scalar and sample path large deviation principles for a large class of Poisson cluster processes. As a consequence, we provide a large deviation principle for ergodic Hawkes point processes.

Book
08 Aug 2007
TL;DR: In this paper, the authors present several extensions of the most familiar models for count data, the Poisson and negative binomial models, and develop an encompassing model for two well-known variants of the negative binomial model (the NB1 and NB2 forms).
Abstract: This study presents several extensions of the most familiar models for count data, the Poisson and negative binomial models. We develop an encompassing model for two well known variants of the negative binomial model (the NB1 and NB2 forms). We then propose some alternative approaches to the standard log gamma model for introducing heterogeneity into the loglinear conditional means for these models. The lognormal model provides a versatile alternative specification that is more flexible (and more natural) than the log gamma form, and provides a platform for several “two part” extensions, including zero inflation, hurdle and sample selection models. We also resolve some features in Hausman, Hall and Griliches’s (1984) widely used panel data treatments for the Poisson and negative binomial models that appear to conflict with more familiar models of fixed and random effects. Finally, we consider a bivariate Poisson model that is also based on the lognormal heterogeneity model. Two recent applications have used this model. We suggest that the correlation estimated in their model frameworks is an ambiguous measure of the correlation of the variables of interest, and may substantially overstate it. We conclude with a detailed application of the proposed methods using the data employed in one of the two aforementioned bivariate Poisson studies.

Journal ArticleDOI
TL;DR: In this article, a quantitative assessment of the occurrence probability of intense geomagnetic storms (peak Dst < −280 nT) is presented, based on extreme value modeling, which provides more accurate statistics for extreme behavior.
Abstract: A quantitative assessment of the occurrence probability of intense geomagnetic storms (peak Dst < −280 nT) is presented. The mathematical tool used to determine this type of PDF is extreme value modeling, which provides more accurate statistics for extreme behavior. Our results estimate S60 ≈ 589, indicating that the March 1989 storm (the event with the largest |Dst| in the database) corresponds to an event expected to occur only once every 60 years. The other parameter, λt, gives the average occurrence rate of storm events. We have tested the null hypothesis that the storm occurrence pattern can be modeled as a Poisson process represented by λt, where different λt exist for the active and quiet periods of the solar cycle. Ordinary χ² goodness-of-fit tests cannot reject this hypothesis, except within the periods that include extremely frequent occurrences. The rate λt is approximately 2.3 (0.7) per 3 months in the active (quiet) period. A future practical application of this work is that the resultant Poisson probability will enable us to calculate the expected damage due to storms, which represent potential risks in space activities.
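With the reported rates, the Poisson model yields occurrence probabilities directly; a worked example for a single 3-month window using λ = 2.3 (active period) and λ = 0.7 (quiet period):

```python
from math import exp

# P(N = 0) = exp(-lambda) for a Poisson count N, so the chance of at least
# one storm event in a 3-month window is 1 - exp(-lambda).
for label, lam in [("active", 2.3), ("quiet", 0.7)]:
    p_none = exp(-lam)
    print(f"{label}: P(no event) = {p_none:.3f}, P(>= 1 event) = {1 - p_none:.3f}")
```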

Journal ArticleDOI
11 Dec 2007
TL;DR: In this paper, a power-law upper bound is shown for the matching distance from a typical point to its partner under the Gale-Shapley stable marriage, a translation-invariant scheme for matching red and blue Poisson points.
Abstract: Suppose that red and blue points occur as independent homogeneous Poisson processes in R^d. We investigate translation-invariant schemes for perfectly matching the red points to the blue points. For any such scheme in dimensions d = 1, 2, the matching distance X from a typical point to its partner must have infinite d/2-th moment, while in dimensions d ≥ 3 there exist schemes where X has finite exponential moments. The Gale-Shapley stable marriage is one natural matching scheme, obtained by iteratively matching mutually closest pairs. A principal result of this paper is a power law upper bound on the matching distance X for this scheme. A power law lower bound holds also. In particular, stable marriage is close to optimal (in tail behavior) in d = 1, but far from optimal in d ≥ 3. For the problem of matching Poisson points of a single color to each other, in d = 1 there exist schemes where X has finite exponential moments, but if we insist that the matching is a deterministic factor of the point process then X must have infinite mean.
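A naive 1D simulation of the iterative mutually-closest-pair construction illustrates the heavy tail of the stable-marriage matching distance X (boundary effects are ignored and no attempt is made at efficiency; all sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
L = 500.0
red = list(np.sort(rng.uniform(0, L, rng.poisson(L))))   # unit-rate Poisson points
blue = list(np.sort(rng.uniform(0, L, rng.poisson(L))))

dists = []
# Stable marriage: repeatedly match mutually closest red-blue pairs, remove
# them, and continue until one colour is exhausted.
while red and blue:
    r, b = np.asarray(red), np.asarray(blue)
    nb = np.abs(r[:, None] - b[None, :]).argmin(axis=1)   # nearest blue per red
    nr = np.abs(b[:, None] - r[None, :]).argmin(axis=1)   # nearest red per blue
    mutual = [i for i in range(len(r)) if nr[nb[i]] == i] # mutually nearest pairs
    dists += [abs(r[i] - b[nb[i]]) for i in mutual]
    for i in sorted(mutual, reverse=True):
        del red[i]
    for j in sorted((nb[i] for i in mutual), reverse=True):
        del blue[j]

dists = np.asarray(dists)
print(f"{dists.size} pairs; median X = {np.median(dists):.3f}, "
      f"99th percentile = {np.quantile(dists, 0.99):.2f}")   # heavy tail in d = 1
```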

Journal ArticleDOI
TL;DR: Building on the INAR(1) model, this work suggests four approaches to control count processes with Poisson marginals arising in the context of statistical quality control.
Abstract: The class of INARMA models is well suited to model the autocorrelation structure of processes with Poisson marginals arising in the context of statistical quality control. After reviewing briefly the basic principles and important members of this broad family of models, we concentrate on the INAR(1) model, which is of particular relevance for quality control. We suggest four approaches to control such count processes, and compare their run length performance in a simulation study. Results show that only some of the out-of-control situations considered can be controlled effectively with the discussed control schemes. Copyright © 2007 John Wiley & Sons, Ltd.
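The INAR(1) model referred to here combines binomial thinning of the previous count with Poisson innovations; a short simulation sketch showing how Poisson marginals arise (the control charts themselves and their run-length comparison are not reproduced):

```python
import numpy as np

def simulate_inar1(n, alpha, lam, rng):
    """Simulate X_t = alpha o X_{t-1} + e_t, where 'o' is binomial thinning
    and e_t ~ Poisson(lam). The stationary marginal is Poisson with mean
    mu = lam / (1 - alpha), the case relevant for attribute charts."""
    x = np.empty(n, dtype=np.int64)
    x[0] = rng.poisson(lam / (1.0 - alpha))        # start at the stationary mean
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning
        x[t] = survivors + rng.poisson(lam)
    return x

rng = np.random.default_rng(6)
x = simulate_inar1(50_000, alpha=0.5, lam=2.0, rng=rng)
print(x.mean(), x.var())   # both close to mu = 4, as for Poisson marginals
# A naive Shewhart-type upper limit would then be mu + 3*sqrt(mu).
```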

Journal ArticleDOI
TL;DR: In this article, a new proof is given of correlation estimates for arbitrary moments of the resolvent of random Schrödinger operators on the lattice, generalizing and extending Minami's correlation estimate for the second moment.
Abstract: We give a new proof of correlation estimates for arbitrary moments of the resolvent of random Schrödinger operators on the lattice that generalizes and extends the correlation estimate of Minami for the second moment. We apply this moment bound to obtain a new n-level Wegner-type estimate that measures eigenvalue correlations through an upper bound on the probability that a local Hamiltonian has at least n eigenvalues in a given energy interval. Another consequence of the correlation estimates is that the results on the Poisson statistics of energy level spacing and the simplicity of the eigenvalues in the strong localization regime hold for a wide class of translation-invariant, selfadjoint, lattice operators with decaying off-diagonal terms and random potentials.

Journal ArticleDOI
TL;DR: In this article, an extension of zero-inflated generalized Poisson (ZIGP) regression models for count data is discussed.
Abstract: This paper focuses on an extension of zero-inflated generalized Poisson (ZIGP) regression models for count data. We discuss generalized Poisson (GP) models where dispersion is modelled by an additi...

Journal ArticleDOI
TL;DR: The primary goal of this paper is to introduce a multiple-testing-based approach to the problem of selecting hotspots, incorporating both the posterior distribution of accident frequency and the posterior distribution of ranks.

Journal ArticleDOI
TL;DR: Standard Poisson models provided a poor fit for alcohol consumption data from the motivating example, and did not preserve Type-I error rates for the randomized group comparison when the true distribution was over-dispersed Poisson.
Abstract: Alcohol consumption is commonly used as a primary outcome in randomized alcohol treatment studies. The distribution of alcohol consumption is highly skewed, particularly in subjects with alcohol dependence. In this paper, we will consider the use of count models for outcomes in a randomized clinical trial setting. These include the Poisson, over-dispersed Poisson, negative binomial, zero-inflated Poisson and zero-inflated negative binomial. We compare the Type-I error rate of these methods in a series of simulation studies of a randomized clinical trial, and apply the methods to the ASAP (Addressing the Spectrum of Alcohol Problems) trial. Standard Poisson models provide a poor fit for alcohol consumption data from our motivating example, and did not preserve Type-I error rates for the randomized group comparison when the true distribution was over-dispersed Poisson. For the ASAP trial, where the distribution of alcohol consumption featured extensive over-dispersion, there was little indication of significant randomization group differences, except when the standard Poisson model was fit. As with any analysis, it is important to choose appropriate statistical models. In simulation studies and in the motivating example, the standard Poisson was not robust when fit to over-dispersed count data, and did not maintain the appropriate Type-I error rate. To appropriately model alcohol consumption, more flexible count models should be routinely employed.
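A minimal sketch of this kind of model comparison using statsmodels (the simulated outcome, with structural zeros, gamma heterogeneity, and no true group effect, is invented for illustration and is not the ASAP data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(7)
n = 2_000
group = rng.integers(0, 2, n).astype(float)         # randomized arm indicator
X = pd.DataFrame({"const": 1.0, "group": group})

# Zero-inflated, overdispersed counts with no true group difference.
structural_zero = rng.uniform(size=n) < 0.4
mu = np.exp(1.5) * rng.gamma(1.0, 1.0, size=n)      # gamma heterogeneity
y = np.where(structural_zero, 0, rng.poisson(mu))

fits = {
    "Poisson": sm.Poisson(y, X).fit(disp=0),
    "NegBin":  sm.NegativeBinomial(y, X).fit(disp=0),
    "ZIP":     ZeroInflatedPoisson(y, X).fit(disp=0),
}
# Under overdispersion the standard Poisson p-value for 'group' tends to be
# anti-conservative, mirroring the Type-I error inflation reported above.
for name, res in fits.items():
    print(f"{name:8s} AIC={res.aic:9.1f}  p(group)={res.pvalues['group']:.3f}")
```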