
Showing papers on "Poisson distribution published in 1995"


Journal ArticleDOI
TL;DR: The key idea is to use point estimates based on Poisson models and to develop robust variance estimates that are valid more generally and illustrated on reliability and warranty data.
Abstract: Nelson discussed a method of estimating the cumulative mean function for identically distributed processes of recurrent events. We show that a similar approach can be used with more general models, including regression. The key idea is to use point estimates based on Poisson models and to develop robust variance estimates that are valid more generally. The methods are illustrated on reliability and warranty data.
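
As a rough illustration of that idea (a minimal sketch, not the authors' estimator; the data and variable names are made up), a Poisson regression can be paired with a sandwich variance that stays valid when the Poisson variance assumption fails:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                              # hypothetical covariate
exposure = rng.uniform(0.5, 2.0, size=n)            # observation time per unit
counts = rng.poisson(exposure * np.exp(0.3 * x))    # recurrent-event counts

X = sm.add_constant(x)
model = sm.GLM(counts, X, family=sm.families.Poisson(), exposure=exposure)
naive = model.fit()                  # model-based (Poisson) standard errors
robust = model.fit(cov_type="HC0")   # sandwich variance, valid more generally
print(naive.bse)
print(robust.bse)
```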

458 citations


Journal ArticleDOI
TL;DR: In this paper, statistical methods for modeling individual behavior when the endogenous variable is a nonnegative integer are discussed with a focus on specification, estimation, and testing, and an application to labor mobility data illustrates the gain obtained by carefully taking into account the specific structure of the data.
Abstract: This paper deals with statistical methods for modelling individual behavior when the endogenous variable is a nonnegative integer. Examples are the number of children, the number of job changes or the number of shopping trips in a given period. Several approaches—Poisson, robust Poisson, negative binomial (NEGBIN), NEGBINk, hurdle Poisson, truncated-at-zero Poisson—are discussed with a focus on specification, estimation, and testing. An application to labor mobility data illustrates the gain obtained by carefully taking into account the specific structure of the data.
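
For readers who want to try this style of comparison, a hedged sketch (simulated stand-in data, not the labor mobility data) fitting the Poisson and NEGBIN variants with statsmodels; the fitted dispersion parameter signals overdispersion relative to the Poisson:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
mu = np.exp(0.5 + 0.4 * x)
# Gamma-mixed Poisson draws => overdispersed (negative binomial) counts
y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

X = sm.add_constant(x)
pois = sm.Poisson(y, X).fit(disp=False)
negb = sm.NegativeBinomial(y, X).fit(disp=False)
print(pois.llf, negb.llf)      # log-likelihoods for comparing fit
print(negb.params[-1])         # alpha > 0 indicates overdispersion
```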

291 citations


Book
01 Sep 1995
TL;DR: This book develops large deviations techniques for the performance analysis of queueing and communication systems, using models such as the M/M/1 queue, the Flatto-Hahn-Wright model, and Erlang's model as examples.
Abstract: What this is, and what it is not / Large deviations of random variables / General principles / Random walks, branching processes / Poisson and related processes / Large deviations for processes / Freidlin-Wentzell theory / Boundary theory / Allocating independent subtasks / Parallel algorithms: rollback / Accelerated simulation / The M/M/1 queue / The Flatto-Hahn-Wright model / Erlang's model / The Anick-Mitra-Sondhi model / ALOHA / Priority queues / Flatto's models / Analysis and probability / Discrete-space Markov processes / Calculus of variations / Large deviations techniques.

262 citations


Journal ArticleDOI
TL;DR: When analyzing Poisson count data, an excess of zeros is sometimes observed; a score test is presented to test whether the number of zeros is too large for a Poisson distribution to fit the data well.
Abstract: When analyzing Poisson count data, one sometimes observes a large number of zeros. When there are too many zeros, a zero-inflated Poisson distribution can be used. A score test is presented to test whether the number of zeros is too large for a Poisson distribution to fit the data well.
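
A minimal sketch of such a score test, assuming the statistic takes the usual form based on observed versus expected zeros under the fitted Poisson (treat this as a reconstruction, not the paper's verbatim formula):

```python
import numpy as np
from scipy.stats import chi2

def zip_score_test(y):
    y = np.asarray(y)
    n = y.size
    lam = y.mean()                  # MLE of the Poisson mean
    p0 = np.exp(-lam)               # Poisson probability of a zero
    n0 = np.sum(y == 0)             # observed zeros
    stat = (n0 - n * p0) ** 2 / (n * p0 * (1 - p0) - n * lam * p0 ** 2)
    return stat, chi2.sf(stat, df=1)   # compare against chi-square(1)

rng = np.random.default_rng(2)
y = np.where(rng.random(300) < 0.2, 0, rng.poisson(2.0, 300))  # zero-inflated
print(zip_score_test(y))
```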

249 citations


Journal ArticleDOI
TL;DR: Individual activated events in slowly fluctuating environments are now accessible to study by single molecule spectroscopies; a simple example illustrates the ideas for environmental variables that relax exponentially or follow a stretched exponential law, as in glasses or biomolecules.
Abstract: Individual activated events in slowly fluctuating environments are now accessible to study by single molecule spectroscopies. The statistics of such events should exhibit intermittency and will not always obey the Poisson law. For short times, the high-order moments are given by the corresponding power of the average survival probability. For long times, the high-order moments decay much more slowly than Poisson statistics indicate. A simple example illustrates the ideas for environmental variables that relax exponentially or follow a stretched exponential law, as in glasses or biomolecules.

191 citations


Journal ArticleDOI
TL;DR: In this paper, the distribution and timing of areal basaltic volcanism are modeled using three nonhomogeneous methods: spatio-temporal nearest neighbor, kernel, and nearest-neighbor kernel.
Abstract: The distribution and timing of areal basaltic volcanism are modeled using three nonhomogeneous methods: spatio-temporal nearest neighbor, kernel, and nearest-neighbor kernel. These models give nonparametric estimates of spatial or spatio-temporal recurrence rate based on the positions and ages of cinder cones and related vent structures and can account for migration and shifts in locus, volcano clustering, and development of regional vent alignments. The three methods are advantageous because (1) recurrence rate and probability maps can be made, facilitating comparison with other geological information; (2) the need to define areas or zones of volcanic activity, required in homogeneous approaches, is eliminated; and (3) the impact of uncertainty in the timing and distribution of individual events is particularly easy to assess. The models are applied to the Yucca Mountain region (YMR), Nevada, the site of a proposed high-level radioactive waste repository. Application of the Hopkins F test, Clark-Evans test, and K function indicates volcanoes cluster in the YMR at the >95% confidence level. Weighted-centroid cluster analysis indicates that Plio-Quaternary volcanoes are distributed in four clusters: three of these clusters include cinder cones formed <1 Ma. Probability of disruption within the 8 km2 area of the proposed repository by formation of a new basaltic vent is calculated to be between 1 × 10−4 and 5 × 10−4 in 104 years (the kernel and nearest-neighbor kernel methods give a maximum probability of 5 × 10−4 in 104 years), assuming regional recurrence rates of 5–10 volcanoes/m.y. An additional finding, illustrating the strength of nonhomogeneous methods, is that maps of the probability of volcanic eruption for the YMR indicate the proposed repository lies on a steep probability gradient: volcanism recurrence rate varies by more than 2 orders of magnitude within 20 km. Insight into this spatial scale of probability variation is a distinct benefit of application of these methods to hazard analysis in areal volcanic fields.
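
The kernel part of this approach is easy to sketch; the following toy example (made-up vent coordinates and record length, and a plain Gaussian kernel rather than the paper's exact estimator) turns vent locations into a spatial recurrence-rate surface:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
vents = rng.normal(0.0, 5.0, size=(2, 40))   # (x; y) vent coordinates, km
span_myr = 5.0                               # length of the volcanic record, m.y.

kde = gaussian_kde(vents)                    # density per km^2 (integrates to 1)

def recurrence_rate(points):
    """Volcanoes per km^2 per m.y. at the given (2, m) points."""
    return kde(points) * vents.shape[1] / span_myr

print(recurrence_rate(np.array([[0.0], [0.0]])))  # rate at the origin
```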

173 citations


Journal ArticleDOI
TL;DR: In this article, econometric methods for the estimation of infrastructure deterioration models and associated transition probabilities from inspection data are presented, based on the Poisson regression model and following directly from the Markovian behavior of infrastructure degradation.
Abstract: Markovian transition probabilities have been used extensively in the field of infrastructure management, to provide forecasts of facility conditions. However, existing approaches used to estimate these transition probabilities from inspection data are mostly ad hoc and suffer from several statistical limitations. In this paper, econometric methods for the estimation of infrastructure deterioration models and associated transition probabilities from inspection data are presented. The first method is based on the Poisson regression model and follows directly from the Markovian behavior of infrastructure deterioration. The negative binomial regression, a generalization of the Poisson model that relaxes the assumption of equality of mean and variance, is also presented. An empirical case study, using a bridge inspection data set from Indiana, demonstrates the capabilities of the two methods.
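
The link between the Poisson model and the transition probabilities can be sketched directly: if the drop in condition rating over an inspection interval is Poisson with mean lam (in practice predicted from a regression on facility attributes), the Markov transition matrix follows from the Poisson pmf. The 0-9 rating scale and the rate below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import poisson

def transition_matrix(lam, n_states=10):
    """Rows = current rating (0 worst, n_states-1 best); ratings only drop."""
    P = np.zeros((n_states, n_states))
    for i in range(n_states):
        for j in range(i + 1):
            P[i, j] = poisson.pmf(i - j, lam)   # probability of dropping i - j
        P[i, 0] += poisson.sf(i, lam)           # lump larger drops into state 0
    return P

P = transition_matrix(lam=0.4)
print(P.sum(axis=1))  # each row sums to 1
```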

168 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider explosive Poisson shot noise processes as natural extensions of the classical compound Poisson process and investigate their asymptotic properties under regularity conditions on the moment and covariance functions of the shot noise process.
Abstract: We consider explosive Poisson shot noise processes as natural extensions of the classical compound Poisson process and investigate their asymptotic properties. Our main result is a functional central limit theorem with a self-similar Gaussian limit process which, in the classical case, is Brownian motion. The theorems are derived under regularity conditions on the moment and covariance functions of the shot noise process. The crucial condition is regular variation of the covariance function which implies the self-similarity of the limit process. The model is applied to delay in claim settlement in insurance portfolios. In this context we discuss some specific models and their properties. We also use the asymptotic theory for studying the ruin time and ruin probability for a risk process which is based on the Poisson shot noise process.
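
A quick simulation sketch of a (non-explosive) Poisson shot noise process, with an assumed exponential shot shape; the rate, horizon, and decay are placeholders:

```python
import numpy as np

rng = np.random.default_rng(4)
rate, horizon = 2.0, 50.0
n_events = rng.poisson(rate * horizon)
arrivals = np.sort(rng.uniform(0.0, horizon, n_events))  # Poisson event times

def shot_noise(t, arrivals, decay=1.0):
    """S(t) = sum over past events of g(t - t_i), with g(u) = exp(-decay*u)."""
    dt = t - arrivals[arrivals <= t]
    return np.exp(-decay * dt).sum()

grid = np.linspace(0.0, horizon, 501)
S = np.array([shot_noise(t, arrivals) for t in grid])
print(S.mean())  # long-run mean ~ rate * integral of g = 2.0 here
```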

152 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a hybrid Poisson/polynomial objective function that uses the exact Poisson log-likelihood for detector measurements with low counts, but use computationally efficient quadratic or cubic approximations for the high-count detector measurements.
Abstract: This paper describes rapidly converging algorithms for computing attenuation maps from Poisson transmission measurements using penalized-likelihood objective functions. We demonstrate that an under-relaxed cyclic coordinate-ascent algorithm converges faster than the convex algorithm of Lange (see ibid., vol.4, no.10, p.1430-1438, 1995), which in turn converges faster than the expectation-maximization (EM) algorithm for transmission tomography. To further reduce computation, one could replace the log-likelihood objective with a quadratic approximation. However, we show with simulations and analysis that the quadratic objective function leads to biased estimates for low-count measurements. Therefore we introduce hybrid Poisson/polynomial objective functions that use the exact Poisson log-likelihood for detector measurements with low counts, but use computationally efficient quadratic or cubic approximations for the high-count detector measurements. We demonstrate that the hybrid objective functions reduce computation time without increasing estimation bias.
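
The hybrid idea can be sketched in a few lines; here the high-count branch uses a second-order Taylor expansion of the Poisson log-likelihood about the measurement (the threshold and the quadratic form are assumptions for illustration, not the paper's exact choices):

```python
import numpy as np

def hybrid_loglik(y, ybar, threshold=30):
    """y: measured counts; ybar: model-predicted means."""
    y = np.asarray(y, float)
    ybar = np.asarray(ybar, float)
    exact = y * np.log(ybar) - ybar          # Poisson log-lik (y! term dropped)
    # 2nd-order Taylor expansion of the exact term about ybar = y:
    quad = (y * np.log(y) - y) - (y - ybar) ** 2 / (2 * y)
    return np.where(y < threshold, exact, quad).sum()

y = np.array([3, 7, 120, 450])
print(hybrid_loglik(y, ybar=np.array([4.0, 6.0, 118.0, 455.0])))
```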

134 citations


Journal ArticleDOI
TL;DR: Statistical models of the partial volume effect are incorporated into finite mixture densities in order to more accurately model the distribution of image pixel values; such modeling is shown to be useful when one of the materials is present in images mainly as a pixel component.
Abstract: Statistical models of partial volume effect for systems with various types of noise or pixel value distributions are developed and probability density functions are derived. The models assume either Gaussian system sampling noise or intrinsic material variances with Gaussian or Poisson statistics. In particular, a material can be viewed as having a distinct value that has been corrupted by additive noise either before or after partial volume mixing, or the material could have nondistinct values with a Poisson distribution as might be the case in nuclear medicine images. General forms of the probability density functions are presented for the N material cases and particular forms for two- and three-material cases are derived. These models are incorporated into finite mixture densities in order to more accurately model the distribution of image pixel values. Examples are presented using simulated histograms to demonstrate the efficacy of the models for quantification. Modeling of partial volume effect is shown to be useful when one of the materials is present in images mainly as a pixel component.

128 citations


Book
07 Aug 1995
TL;DR: This textbook covers random variables and their distributions, expectation, special continuous and discrete models including the Poisson distribution and the Poisson process, dependence, conditional distributions, normal models, linear regression and the F test, and binomial, Poisson, and logistic regression models.
Abstract: 1. RANDOM VARIABLES AND THEIR DISTRIBUTION: Introduction / Sample Distributions / Distributions / Random Variables / Probability Functions and Density Functions / Distribution Functions and Quantiles / Univariate Transformations / Independence
2. EXPECTATION: Introduction / Properties of Expectation / Variance / Weak Law of Large Numbers / Simulation and the Monte Carlo Method
3. SPECIAL CONTINUOUS MODELS: Gamma and Beta Distributions / The Normal Distribution / Normal Approximation and the Central Limit Theorem
4. SPECIAL DISCRETE MODELS: Combinatorics / The Binomial Distribution / The Multinomial Distribution / The Poisson Distribution / The Poisson Process
5. DEPENDENCE: Covariance, Linear Prediction, and Correlation / Multivariate Expectation / Covariance and Variance-Covariance Matrices / Multiple Linear Prediction / Multivariate Density Functions / Invertible Transformations / The Multivariate Normal Distribution
6. CONDITIONAL DISTRIBUTIONS: Sampling Without Replacement / Hypergeometric Distribution / Conditional Density Functions / Conditional Expectation / Prediction / Conditioning and the Multivariate Normal Distribution / Random Parameters
7. NORMAL MODELS: Introduction / Chi-Square, t, and F Distributions / Confidence Intervals / The t Test of an Inequality / The t Test of an Equality
8. THE F TEST: Introduction to Linear Regression / The Method of Least Squares / Factorial Experiments / Input-Response and Experimental Models
9. LINEAR ANALYSIS: Linear Spaces / Identifiability / Saturated Spaces / Inner Products / Orthogonal Projections / Normal Equations
10. LINEAR REGRESSION: Least-Squares Estimation / Sums of Squares / Distribution Theory / Sugar Beet Experiment / Lube Oil Experiment / The t Test / Submodels / The F Test
11. ORTHOGONAL ARRAYS: Main Effects / Interactions / Experiments with Factors Having Three Levels / Randomization, Blocking, and Covariates
12. BINOMIAL AND POISSON MODELS: Nominal Confidence Intervals and Tests / Exact P-values / One-Parameter Exponential Families
13. LOGISTIC REGRESSION AND POISSON REGRESSION: Input-Response and Experimental Models / Maximum-Likelihood Estimation / Existence and Uniqueness of the Maximum-Likelihood Estimate / Iteratively Reweighted Least-Squares Method / Normal Approximation / The Likelihood-Ratio Test
APPENDICES: A. Properties of Vectors and Matrices / B. Summary of Probability / C. Summary of Statistics / D. Hints and Answers / E. Tables / INDEX

Journal ArticleDOI
TL;DR: A strong indication is deduced that a main part of the variability of childhood leukaemia incidence in Great Britain is accounted for by a local neighbourhood 'clustering' structure, which is relatively stable over the 15 year period for the lymphocytic leukaemias.
Abstract: This paper describes an analysis of the geographical variation of childhood leukaemia incidence in Great Britain over a 15 year period in relation to natural radiation (gamma and radon). Data at the level of the 459 district level local authorities in England, Wales and regional districts in Scotland are analysed in two complementary ways: first, by Poisson regressions with the inclusion of environmental covariates and a smooth spatial structure; secondly, by a hierarchical Bayesian model in which extra-Poisson variability is modelled explicitly in terms of spatial and non-spatial components. From this analysis, we deduce a strong indication that a main part of the variability is accounted for by a local neighbourhood 'clustering' structure. This structure is furthermore relatively stable over the 15 year period for the lymphocytic leukaemias which make up the majority of observed cases. We found no evidence of a positive association of childhood leukaemia incidence with outdoor or indoor gamma radiation levels. There is no consistent evidence of any association with radon levels. Indeed, in the Poisson regressions, a significant positive association was only observed for one 5-year period, a result which is not compatible with a stable environmental effect. Moreover, this positive association became clearly non-significant when over-dispersion relative to the Poisson distribution was taken into account.

Journal ArticleDOI
TL;DR: In this article, an analytic study of Poisson's ratio of reentrant foam materials with negative Poisson effect is presented. But the analysis is limited to the case of polyhedral unit cells.
Abstract: This article contains an analytic study of Poisson's ratio of re-entrant foam materials with negative Poisson's ratio. These materials get fatter when stretched and thinner when compressed. The Poisson effect is so fundamentally important to the properties of a material that a large change in the value of the ratio will have significant effects on the material's mechanical performance. Isotropic foam structures with negative Poisson's ratio have been fabricated through a permanent volumetric transformation. The cells were converted from the convex polyhedral shape of conventional foam cells to a concave or "reentrant" shape. Mechanical behavior of a re-entrant open cell foam material will differ from that of a conventional foam in ways not addressed by existing theoretical treatment. Poisson's ratio as a function of strain is obtained by modeling the three-dimensional unit cell as an idealized polyhedron unit cell. Poisson's ratio is predicted to approach the isotropic limit of −1 with increasing permanen...

Journal ArticleDOI
TL;DR: In the low-temperature limit, an expression for the distribution of charge transmitted over a finite time interval is derived using a result from the random matrix theory of quasi-one-dimensional disordered conductors; the peak of the distribution is found to be Gaussian with negligible sample-to-sample variations.
Abstract: In the low-temperature limit, we study the electron counting statistics of a disordered conductor. We derive an expression for the distribution of charge transmitted over a finite time interval by using a result from the random matrix theory of quasi-one-dimensional disordered conductors. In the metallic regime, we find that the peak of the distribution is Gaussian and shows negligible sample-to-sample variations. We also find that the tails of the distribution are neither Gaussian nor Poisson and exhibit strong sample-to-sample variations.

Journal ArticleDOI
TL;DR: In this paper, the authors examine the errors on counts in cells extracted from galaxy surveys and identify three contributions to the cosmic error: a finite volume effect, proportional to the average of the two-point correlation function over the whole survey; an edge effect, related to the geometry of the survey; and a discreteness (shot noise) effect, due to sampling the underlying smooth random field with a finite number of objects.
Abstract: We examine the errors on counts in cells extracted from galaxy surveys. The measurement error, related to the finite number of sampling cells, is disentangled from the ``cosmic error'', due to the finiteness of the survey. Using the hierarchical model and assuming locally Poisson behavior, we identify three contributions to the cosmic error: The finite volume effect is proportional to the average of the two-point correlation function over the whole survey. It accounts for possible fluctuations of the density field at scales larger than the sample size. The edge effect is related to the geometry of the survey. It accounts for the fact that objects near the boundary carry less statistical weight than those further away from it. The discreteness effect is due to the fact that the underlying smooth random field is sampled with a finite number of objects. This is the ``shot noise'' error. Measurements of errors in artificial hierarchical samples showed excellent agreement with our predictions. The probability distribution of errors is increasingly skewed when the order $N$ and/or the cell size increases. The Gaussian approximation is valid only in the weakly non-linear regime; otherwise it severely underestimates the true errors. We study the concept of the ``number of statistically independent cells''. This number is found to depend highly on the statistical object under study and is generally quite different from the number of cells needed to cover the survey volume. In light of these findings, we advocate high oversampling for measurements of counts in cells.
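
A bare-bones sketch of an oversampled counts-in-cells measurement on a toy two-dimensional Poisson sample (survey size, cell size, and counts are placeholders); with clustering present, the higher moments of the counts would depart from the Poisson expectations:

```python
import numpy as np

rng = np.random.default_rng(5)
L, cell = 100.0, 5.0                 # survey side and cell side
n_gal, n_cells = 5_000, 2_000       # galaxies; heavily oversampled cells
gal = rng.uniform(0, L, size=(n_gal, 2))
lo = rng.uniform(0, L - cell, size=(n_cells, 2))   # cell lower corners

counts = np.array([
    np.sum(np.all((gal >= c) & (gal < c + cell), axis=1)) for c in lo
])
nbar = n_gal * cell**2 / L**2
print(counts.mean(), nbar)   # agree for a Poisson sample; clustering would
                             # inflate the variance of `counts` above nbar
```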

Journal ArticleDOI
TL;DR: In this article, it was shown that the multiplicativity condition for Poisson groupoids can be generalized to the case of Lie bialgebroid morphisms, and an explicit lifting construction was given for a special class of morphisms.
Abstract: Some important properties of Poisson groupoids are discussed. In particular, we obtain a useful formula for the Poisson tensor of an arbitrary Poisson groupoid, which generalizes the well-known multiplicativity condition for Poisson groups. Morphisms between Poisson groupoids and between Lie bialgebroids are also discussed. In particular, for a special class of Lie bialgebroid morphisms, we give an explicit lifting construction. As an application, we prove that a Poisson group action on a Poisson manifold lifts to a Poisson action on its α-simply connected symplectic groupoid.

Journal ArticleDOI
TL;DR: In this paper, the first-passage probability of differentiable non-narrow-band processes is calculated from higher-order threshold crossings, using factorial moments of the number of crossings into the failure region.

Journal ArticleDOI
TL;DR: In this article, it is shown that the choice of a Poisson structure on a surface S canonically determines a Poisson structure on the moduli space of stable sheaves on S, generalizing previous results of Mukai and Tyurin.
Abstract: We introduce and study the notion of Poisson surface. We prove that the choice of a Poisson structure on a surface S canonically determines a Poisson structure on the moduli space ℳ of stable sheaves on S. This result generalizes previous results obtained by Mukai [14], for abelian or K3 surfaces, and by Tyurin [16].

Journal ArticleDOI
TL;DR: In this paper, the effective Poisson's ratio of elastic solids weakened by porosity and micro-cracks was studied using the Mori-Tanaka mean-field approach as applied to macroscopically isotropic solids containing spheroidal pores.
Abstract: We present a theoretical study of the effective Poisson’s ratio of elastic solids weakened by porosity and microcracks. Explicit expressions of the effective Poisson’s ratio are obtained using the Mori-Tanaka mean-field approach as applied to macroscopically isotropic solids containing randomly distributed and randomly oriented spheroidal pores. We focus on the influence of pore shape and concentration and devote special attention to the limiting cases of spherical, penny-shape, and needle-shape pores. A key result of this study is that the effective Poisson’s ratio depends only on pore concentration, pore shape, and Poisson’s ratio of the bulk solid. In other words, it is independent of any other elastic constants of the bulk solid. Also, the ratio of the shear and bulk moduli behaves similarly. Unlike other elastic constants which monotonically decrease with pore concentration, Poisson’s ratio may increase, decrease, or remain unchanged as a function of pore concentration, depending on the pore shape and Poisson’s ratio of the bulk solid. We discuss ramifications of these findings with regard to the elastic constants of oxide superconductors, especially the bismuth cuprates, which show unusually low Poisson’s ratios. We also discuss these low Poisson’s ratios, including the possibility of negative Poisson’s ratios.

Journal ArticleDOI
TL;DR: In this article, the authors show that the frequencies of big shot events are lower than those expected by random distributions for the first several seconds before and/or after each big event, and that this low-frequency interval, "waiting time", is longer for the shots with larger peaks.
Abstract: X-rays from stellar black hole candidates, such as Cygnus X-1, are known to exhibit rapid time variations. Previously, these variations have seemed to occur at random. As a result, the "shot-noise model," which assumes superpositions of randomly occurring identical shots, has often been utilized to interpret the observed time variabilities. Recent Ginga data of Cyg X-1 display evidence, however, which does not support a Poisson distribution in the time intervals between the shots, i.e., a random temporal distribution of the shots. The observations show that the frequencies of big shot events are lower than those expected by random distributions for the first several seconds before and/or after each big event. Moreover, this low-frequency interval, "waiting time", is longer for the shots with larger peaks. We also find an exponential peak-intensity distribution of X-ray shots. These observational features strongly suggest the presence of numerous reservoirs with different capacities for triggering X-ray fluctuations, a key assumption of the model based on self-organized criticality.
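
As a hedged illustration of the kind of statistical check involved (simulated intervals, not the Ginga data): under a Poisson shot model the waiting times are exponential, so a Kolmogorov-Smirnov test against the fitted exponential flags departures. Note that estimating the scale from the same data makes the p-value only approximate:

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(6)
intervals = rng.weibull(1.5, size=500)   # non-exponential stand-in intervals
stat, p = kstest(intervals, "expon", args=(0, intervals.mean()))
print(stat, p)                            # small p => not Poisson-like timing
```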

Book
25 Aug 1995
TL;DR: The second edition of this very successful and authoritative set of tables still benefits from clear typesetting, which makes the figures easy to read and use; it adds new tables of Bayesian confidence limits for the binomial and Poisson distributions and for the square of the multiple correlation coefficient.
Abstract: The second edition of this very successful and authoritative set of tables still benefits from clear typesetting, which makes the figures easy to read and use. It has, however, been improved by the addition of new tables that provide Bayesian confidence limits for the binomial and Poisson distributions, and for the square of the multiple correlation coefficient, which have not been previously available. The intervals are the shortest possible, consistent with the requirement on probability. Great care has been taken to ensure that it is clear just what is being tabulated and how the values may be used; the tables are generally capable of easy interpolation. The book contains all the tables likely to be required for elementary statistical methods in the social, business and natural sciences. It will be an essential aid for teachers, researchers and students in those subjects where statistical analysis is not wholly carried out by computers.
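
As a small indication of what such a table contains (a sketch under simple assumptions, not the book's exact values): with a flat prior, the posterior for a Poisson mean given x observed events is Gamma(x + 1, 1), so equal-tailed 95% limits come from gamma quantiles. The book tabulates the shortest intervals, which differ slightly:

```python
from scipy.stats import gamma

x = 7  # observed count (illustrative)
lo, hi = gamma.ppf([0.025, 0.975], a=x + 1)   # equal-tailed posterior limits
print(lo, hi)
```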

Journal ArticleDOI
TL;DR: For the binomial and Poisson distributions, the distance between the mean and the median is always less than ln 2, and this paper shows that ln 2 is the smallest such uniform bound.
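
A quick numerical check of the bound for the Poisson case (scipy's median is exact for the discrete distribution):

```python
import numpy as np
from scipy.stats import poisson

mus = np.linspace(0.1, 50.0, 500)
gap = np.abs(mus - poisson.median(mus))   # |mean - median| across means
print(gap.max(), np.log(2))               # max gap stays below ln 2 ~ 0.6931
```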

Book ChapterDOI
01 Jan 1995
TL;DR: A duality formula is established for the chaotic derivative operator on the canonical Poisson space, and the adjoint of this operator is shown to coincide with stochastic integration on predictable processes.
Abstract: We establish a duality formula for the chaotic derivative operator on the canonical Poisson space. The adjoint of this operator is proved to coincide with the stochastic integration on predictable processes.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the Radon-Nikodym cocycles of two quotients of the Poisson boundary are cohomologous iff these quotients coincide.
Abstract: Covering Markov operators are a measure theoretical generalization of both random walks on groups and the Brownian motion on covering manifolds. In this general setup we obtain several results on ergodic properties of their Poisson boundaries, in particular, that the Poisson boundary is always infinite if the deck group is non-amenable, and that the deck group action on the Poisson boundary is amenable. For corecurrent operators we show that the Radon-Nikodym cocycles of two quotients of the Poisson boundary are cohomologous iff these quotients coincide. It implies that the Poisson boundary is either purely non-atomic or trivial, and that the action of any normal subgroup of the deck group on the Poisson boundary is conservative. We show that the Poisson boundary is trivial for any corecurrent covering operator with a nilpotent (or, more generally, hypercentral) deck group. Other applications and examples are discussed.

Journal ArticleDOI
TL;DR: The basic methodology of Poisson regression analysis and its application to clinical research are described, and overdispersion, model diagnostics, and sample size issues are discussed.
Abstract: Generalized linear models (GLM) are now widely used in analyzing data from clinical trials and in epidemiological studies. In Poisson regression, which fits in the framework of a GLM, the response variable is a count that follows the Poisson distribution. This article describes the basic methodology of Poisson regression analysis and its application to clinical research. Overdispersion, model diagnostics, and sample size issues are discussed. The methodology is illustrated on a data set from a clinical trial for the treatment of bladder cancer, using a new procedure (PROC GENMOD) in the statistical package SAS.
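
A minimal Python analogue of the PROC GENMOD fit described (fabricated data, not the bladder cancer trial), including the Pearson chi-square/df ratio commonly used as an overdispersion diagnostic:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
treat = rng.integers(0, 2, 100)                 # hypothetical treatment arm
y = rng.poisson(np.exp(0.8 - 0.5 * treat))      # counts of recurrences

X = sm.add_constant(treat)
res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
dispersion = res.pearson_chi2 / res.df_resid    # ~1 if no overdispersion
print(res.params, dispersion)
```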

Journal ArticleDOI
TL;DR: In this article, the authors derived a formula for the probability that a randomly selected n-person matrix game has exactly k equilibria and showed that for all n ≥ 2, this probability converges to e−1/k! as the sizes of the strategy sets of at least two players increase without bound.

Journal ArticleDOI
TL;DR: Intensive field surveys were conducted in eastern Nebraska to determine the frequency distribution model and associated parameters of broadleaf and grass weed seedling populations and the k parameter of the negative binomial distribution was not stable across field sites, and must be estimated at the time of sampling.
Abstract: Intensive field surveys were conducted in eastern Nebraska to determine the frequency distribution model and associated parameters of broadleaf and grass weed seedling populations. The negative binomial distribution consistently fit the data over time (1992 to 1993) and space (fields) for both the inter and intrarow broadleaf and grass weed seedling populations. The other distributions tested (Poisson with zeros, Neyman type A, logarithmic with zeros, and Poisson-binomial) did not fit the data as consistently as the negative binomial distribution. Associated with the negative binomial distribution is a k parameter. k is a nonspatial aggregation parameter related to the variance at a given mean value. The k parameter of the negative binomial distribution was consistent across weed density for individual weed species in a given field except for foxtail spp. populations. Stability of the k parameter across field sites was assessed using the likelihood ratio test. There was no stable or common k value across field sites and years for all weed species populations. The lack of stability in k across field sites is of concern, because this parameter is used extensively in the development of parametric sequential sampling procedures. Because k is not stable across field sites, k must be estimated at the time of sampling. Understanding the variability in k is critical to the development of parametric sequential sampling strategies and understanding the dynamics of weed species in the field. Nomenclature: Corn, Zea mays L.; soybean, Glycine max L. Merr. Additional index words: Negative binomial distribution, k parameter, likelihood ratio test, sampling strategies.
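
Estimating k at sampling time can be sketched with the method of moments (a simple stand-in for the likelihood-based fits used in such surveys), using the negative binomial relation var = m + m^2/k:

```python
import numpy as np

def k_moment(counts):
    """Method-of-moments k from quadrat counts; var = m + m^2/k."""
    m, v = np.mean(counts), np.var(counts, ddof=1)
    if v <= m:
        return np.inf        # no aggregation detected (Poisson-like)
    return m**2 / (v - m)

rng = np.random.default_rng(8)
# numpy parametrization: p = k / (k + m); here true k = 2, mean m = 5
counts = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + 5.0), size=200)
print(k_moment(counts))
```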

Journal ArticleDOI
TL;DR: This paper describes a method for planning the duration of a randomized parallel group study in which the response of interest is a potentially recurrent event, and examines the frequency properties of two non-parametric tests recently proposed by Lawless and Nadeau for trials based on these design criteria.
Abstract: This paper describes a method for planning the duration of a randomized parallel group study in which the response of interest is a potentially recurrent event. At the design stage we assume patients accrue at a constant rate, we model events via a homogeneous Poisson process, and we utilize an independent exponential censoring mechanism to reflect loss to follow-up. We derive the appropriate study duration to ensure satisfaction of power requirements for the effect size of interest under a Poisson regression model. An application to a kidney transplant study illustrates the potential savings of the Poisson-based design relative to a design based on the time to the first event. Revised design criteria are also derived to accommodate overdispersed Poisson count data. We examine the frequency properties of two non-parametric tests recently proposed by Lawless and Nadeau for trials based on the above design criteria. In simulation studies involving homogeneous and non-homogeneous Poisson processes they performed well with respect to their type I error rate and power. Results from supplementary simulation studies indicate that these tests are also robust to extra-Poisson variation and to clustering in the event times, making these tests attractive in their generality. We illustrate both tests by application to data from a completed kidney transplant study.
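
A back-of-envelope version of this kind of design calculation, using a normal approximation to the log rate ratio under constant accrual (the paper's exact derivation, censoring mechanism, and overdispersion adjustment are omitted, so the numbers are indicative only):

```python
import numpy as np
from scipy.stats import norm

alpha, power, rr = 0.05, 0.80, 0.75   # two-sided test, target rate ratio
za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
# With e expected control-arm events (and rr*e treatment-arm events):
# Var(log RR_hat) ~ 1/e + 1/(rr*e); solve the power requirement for e.
e = (za + zb) ** 2 * (1 + 1 / rr) / np.log(rr) ** 2
rate, accrual = 0.5, 100.0            # events/patient-year; patients/year/arm
# Under uniform accrual over (0, t), control-arm person-years ~ accrual*t^2/2,
# so expected events ~ rate*accrual*t^2/2; solve for the study duration t.
t = np.sqrt(2 * e / (rate * accrual))
print(e, t)
```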

Journal ArticleDOI
TL;DR: A method for estimating the distribution of true rate ratios is applied to a data set of perinatal mortality in 515 small areas in the North West Thames Health Region, England, in the period 1986-1990, rejecting the hypothesis that the true rates are homogeneous.
Abstract: The observed variability between mortality or morbidity rates in epidemiologic studies is partly due to random fluctuations. The same is true for rate ratios relative to reference rates. A method for estimating the distribution of true rate ratios is applied to a data set of perinatal mortality in 515 small areas in the North West Thames Health Region, England, in the period 1986-1990. Combining the random Poisson variability with the assumption that the true rate ratios are drawn from a gamma distribution (a family of positive unimodal distributions) produces a negative binomial log-likelihood for the dispersion parameter of the gamma. The maximum likelihood estimate of this parameter and its confidence interval are then found via direct numerical methods; alternatively, the hypothesis of no heterogeneity is tested by a likelihood ratio. The standardized mortality ratios (SMRs) for the data have an empirical distribution with 5th percentile at 0 and 95th percentile at 1.92, but their true variability, as described by the 5th to 95th percentiles of the fitted gamma distribution, is from 0.72 to 1.32. The likelihood ratio test confirmed this result, rejecting the hypothesis that the true rates are homogeneous (p = 0.015). The method requires only modest computing resources and is useful when assessing the need for more detailed study.
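
A compact sketch of the estimation step described, with simulated observed/expected counts standing in for the perinatal mortality data: true rate ratios are Gamma(a, a) (mean 1), the marginal likelihood is negative binomial, and the dispersion parameter a is maximized numerically:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(9)
E = rng.uniform(1.0, 10.0, 515)                          # expected counts
theta = rng.gamma(shape=8.0, scale=1 / 8.0, size=515)    # true rate ratios
O = rng.poisson(E * theta)                               # observed counts

def negloglik(a):
    # Negative binomial marginal of Poisson(E*theta), theta ~ Gamma(a, a);
    # the O!-dependent constant is dropped.
    return -np.sum(gammaln(a + O) - gammaln(a)
                   + a * np.log(a / (a + E)) + O * np.log(E / (a + E)))

fit = minimize_scalar(negloglik, bounds=(0.01, 100.0), method="bounded")
print(fit.x)   # dispersion parameter a (larger => more homogeneous rates)
```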