
Showing papers on "Poisson distribution published in 1978"


Journal ArticleDOI
TL;DR: In this article, a simple and relatively efficient method for simulating one-dimensional and two-dimensional nonhomogeneous Poisson processes is presented, which is applicable for any rate function and is based on controlled deletion of points in a Poisson process whose rate function dominates the given rate function.
Abstract: A simple and relatively efficient method for simulating one-dimensional and two-dimensional nonhomogeneous Poisson processes is presented. The method is applicable for any rate function and is based on controlled deletion of points in a Poisson process whose rate function dominates the given rate function. In its simplest implementation, the method obviates the need for numerical integration of the rate function, for ordering of points, and for generation of Poisson variates.
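A minimal sketch of the thinning method described above, in one dimension; the function names and parameter values are illustrative, not the paper's notation, and a constant dominating rate lam_max ≥ λ(t) on [0, t_max] is assumed:

```python
import numpy as np

rng = np.random.default_rng(42)

def thin_nhpp(rate_fn, lam_max, t_max):
    """Simulate a nonhomogeneous Poisson process on [0, t_max] by thinning:
    generate a homogeneous process of rate lam_max, then keep each point t
    independently with probability rate_fn(t) / lam_max."""
    t, points = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)   # next candidate point
        if t > t_max:
            return np.array(points)
        if rng.random() < rate_fn(t) / lam_max:
            points.append(t)                  # accept (controlled deletion)

# Example: lambda(t) = 5(1 + sin t), dominated by lam_max = 10.
pts = thin_nhpp(lambda t: 5.0 * (1.0 + np.sin(t)), lam_max=10.0, t_max=20.0)
print(len(pts))  # expected count is the integral of lambda, about 100
```

Note that this needs no numerical integration of λ, no ordering of points, and no Poisson variate generation, exactly as the abstract claims.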

890 citations


Journal ArticleDOI
TL;DR: A host-parasitoid model is presented which is intermediate in complexity between the Nicholson-Bailey model and complicated models for incorporating environmental patchiness and has proved useful in sorting out ideas in the related disciplines of epidemiology and parasitology.
Abstract: SUMMARY (1) A host-parasitoid model is presented which is intermediate in complexity between the Nicholson-Bailey model (in which the parasitoids search independently and randomly in a homogeneous environment) and complicated models for incorporating environmental patchiness (in which the overall distribution of parasitoid attacks is derived from detailed assumptions about their searching behaviour and about the spatial distribution of the hosts). The model assumes the overall distribution of parasitoid attacks per host to be of negative binomial form. There are consequently three biological parameters: two are the usual parasitoid 'area of discovery', a, and the host 'rate of increase', F; the third is the negative binomial clumping parameter, k. Such intermediate-level models have proved useful in sorting out ideas in the related disciplines of epidemiology and parasitology. (2) Empirical and theoretical arguments for using the negative binomial to give a phenomenological description of the essential consequences of spatial patchiness in models are surveyed. (3) A biological interpretation of the parameter k in host-parasitoid models is offered. If the parasitoids are distributed among patches according to some arbitrary distribution which has a coefficient of variation CVp, and if the parasitoid attack distribution within a patch is Poisson, then the ensuing compound distribution can be approximated by a negative binomial which will have the same variance as the exact distribution provided k is identified as k = (1/CVp)^2. (4) Expressions are obtained for the equilibrium values of host and parasitoid populations. These equilibria are stable if, and only if, k < 1; that is, provided there is sufficient clumping. (5) The dynamical effects of parasitoid aggregation in some respects mimic those introduced by mutual interference among parasitoids; the appropriate coefficient of 'pseudo-interference' is calculated.
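A quick numerical check of the identification k = (1/CVp)^2 is possible, since a gamma mixture of within-patch Poisson attacks is exactly negative binomial; the mixing distribution and parameter values below are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Parasitoid density per patch: gamma with shape alpha, so CVp = 1/sqrt(alpha).
alpha, mean_attacks = 2.0, 3.0
parasitoids = rng.gamma(alpha, mean_attacks / alpha, size=1_000_000)
attacks = rng.poisson(parasitoids)   # Poisson attack distribution within a patch

cvp = parasitoids.std() / parasitoids.mean()
k = (1.0 / cvp) ** 2                 # the identification offered in (3)
# A negative binomial with clumping k and mean m has variance m + m**2 / k.
print(attacks.var(), mean_attacks + mean_attacks**2 / k)  # nearly equal
```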

471 citations


Journal ArticleDOI
TL;DR: In this paper, the cumulative distribution function for normalized annual precipitation is derived in terms of two parameters of the storm sequence, the mean number of storms per year and the order of the gamma distribution.
Abstract: Point precipitation is represented by Poisson arrivals of rectangular intensity pulses that have random depth and duration. By assuming the storm depths to be independent and identically gamma distributed, the cumulative distribution function for normalized annual precipitation is derived in terms of two parameters of the storm sequence, the mean number of storms per year and the order of the gamma distribution. In comparison with long-term observations in a subhumid and an arid climate it is demonstrated that when working with only 5 years of storm observations this method tends to improve the estimate of the variance of the distribution of the normalized annual values over that obtained by conventional hydrologic methods which utilize only the observed annual totals.
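A sketch of the storm-sequence model with assumed parameter values: the annual total is a sum over a Poisson number of storms of independent, identically gamma-distributed depths.

```python
import numpy as np

rng = np.random.default_rng(1)

m, kappa, scale = 40, 0.5, 10.0   # storms/year, gamma order, depth scale (assumed)
n_years = 100_000

n_storms = rng.poisson(m, size=n_years)
# The sum of n iid Gamma(kappa, scale) depths is Gamma(n * kappa, scale).
annual = rng.gamma(n_storms * kappa, scale)

normalized = annual / annual.mean()
# Theory for the normalized annual total: variance = (1 + kappa) / (m * kappa).
print(normalized.var(), (1 + kappa) / (m * kappa))
```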

354 citations


Journal ArticleDOI
TL;DR: A model is presented for the distribution of protein molecules between the cells in a microbial population during steady-state growth; this distribution is found to be very broad, especially for small protein numbers, and is definitely not a Poisson distribution.

296 citations


Journal ArticleDOI
TL;DR: A type of correlated binomial model is proposed for use in certain toxicological experiments with laboratory animals where the outcome of interest is the occurrence of dead or malformed fetuses in a litter.
Abstract: In certain toxicological experiments with laboratory animals, the outcome of interest is the occurrence of dead or malformed fetuses in a litter. Previous investigations have shown that the simple one-parameter binomial and Poisson models generally provide poor fits to this type of binary data. In this paper, a type of correlated binomial model is proposed for use in this situation. First, the model is described in detail and is compared to a beta-binomial model proposed by Williams (1975). These two-parameter models are then contrasted for goodness of fit to some real-life data. Finally, numerical examples are given in which likelihood ratio tests based on these models are employed to assess the significance of treatment-control differences.
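The paper's correlated binomial model is not reproduced here; as a sketch of the extra-binomial variation that both two-parameter models capture, compare the beta-binomial of Williams (1975) with a plain binomial of the same mean (litter size and parameters assumed):

```python
from scipy.stats import betabinom, binom

litter_size = 10
a, b = 2.0, 8.0                  # beta-binomial with mean p = a / (a + b) = 0.2
p = a / (a + b)

print("binomial variance:     ", binom(litter_size, p).var())
print("beta-binomial variance:", betabinom(litter_size, a, b).var())
# The inflated variance is what one-parameter binomial/Poisson fits miss
# in litter data, where fetuses within a litter tend to respond alike.
```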

249 citations


Journal ArticleDOI
TL;DR: In this paper, some general Poisson limit theorems for the U-statistics of Hoeffding (1948) were applied to tests of clustering or collinearities in plane data; nearest neighbour distances are also considered.
Abstract: Motivated by problems in the analysis of spatial data, we prove some general Poisson limit theorems for the U-statistics of Hoeffding (1948). The theorems are applied to tests of clustering or collinearities in plane data; nearest neighbour distances are also considered.

134 citations


Journal ArticleDOI
TL;DR: In this paper, likelihood ratio tests are considered for hypotheses that a collection of parameters satisfies an order restriction; equality of the means is shown to be the least favorable subhypothesis of the null hypothesis, i.e., the one yielding the largest type I error probability.
Abstract: This paper considers likelihood ratio tests for testing hypotheses that a collection of parameters satisfies some order restriction. The first problem considered is to test a hypothesis specifying an order restriction on a collection of means of normal distributions. Equality of the means is the subhypothesis of the null hypothesis which yields the largest type I error probability (i.e., is least favorable). Furthermore, the distribution of $T = -\ln(\text{likelihood ratio})$ is similar to that of a likelihood ratio statistic for testing the equality of a set of ordered normal means. The least favorable status of homogeneity is a consequence of a result that if $X$ is a point and $A$ a closed convex cone in a Hilbert space and if $Z \in A$, then the distance from $X + Z$ to $A$ is no larger than the distance from $X$ to $A$. The results of a Monte Carlo study of the power of the likelihood ratio statistic are discussed. The distribution of $T$ is also shown to serve as the asymptotic distribution for likelihood ratio statistics for testing trend when the sampled distributions belong to an exponential family. An application of this result is given for underlying Poisson distributions.
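A minimal sketch of the statistic for one version of the problem, assuming one N(μ_i, 1) observation per mean and using isotonic regression (pool-adjacent-violators) for the restricted MLE; this is an illustration, not the paper's derivation:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def ordered_means_lr_stat(y):
    """-2 ln(likelihood ratio) for H0: mu_1 <= ... <= mu_k against an
    unrestricted alternative, one N(mu_i, 1) observation per mean."""
    y = np.asarray(y, dtype=float)
    fit = IsotonicRegression().fit_transform(np.arange(len(y)), y)
    return np.sum((y - fit) ** 2)   # squared distance from y to the order cone

print(ordered_means_lr_stat([0.3, 1.2, 0.9, 2.1]))  # small when y is near the cone
```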

129 citations



Journal ArticleDOI
TL;DR: In this paper, the authors consider single-server infinite-capacity queueing models in which the arrival process is a non-stationary process with an intensity function Λ(t), t ≥ 0, which is itself a random process.
Abstract: One of the major difficulties in attempting to apply known queueing theory results to real problems is that almost always these results assume a time-stationary Poisson arrival process, whereas in practice the actual process is almost invariably non-stationary. In this paper we consider single-server infinite-capacity queueing models in which the arrival process is a non-stationary process with an intensity function Λ(t), t ≥ 0, which is itself a random process. We suppose that the average value of the intensity function exists and is equal to some constant, call it λ, with probability 1. We make a conjecture to the effect that 'the closer {Λ(t), t ≥ 0} is to the stationary Poisson process with rate λ', then the smaller is the average customer delay, and then we verify the conjecture in the special case where the arrival process is an interrupted Poisson process.

90 citations


Journal ArticleDOI
TL;DR: In this article, the effect of Poisson mixtures, especially as represented by the negative binomial distribution, on statistical inference is examined and the main finding is that probabilities of rejection are increased, sometimes considerably.
Abstract: SUMMARY Data presented as contingency tables and classified by qualitative or quantitative methods are usually analysed on the basis of a Poisson log linear model. We examine the effect of Poisson mixtures, especially as represented by the negative binomial distribution, on statistical inference. The main finding is that probabilities of rejection are increased, sometimes considerably. In the case of heterogeneous binary data, attention is also given to the problems of analysis implied by Poisson mixtures.
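A small simulation of the main finding, with assumed parameter values: counts that are actually negative binomial (a gamma mixture of Poissons) but analysed with a Poisson-based chi-square test reject a true homogeneity hypothesis far more often than the nominal level.

```python
import numpy as np
from scipy.stats import nbinom, chi2

rng = np.random.default_rng(2)

mean, k, n_cells, n_sims = 20.0, 5.0, 10, 5000
p = k / (k + mean)                 # scipy's nbinom parameterization
rejections = 0
for _ in range(n_sims):
    counts = nbinom.rvs(k, p, size=n_cells, random_state=rng)
    expected = counts.mean()
    x2 = np.sum((counts - expected) ** 2 / expected)   # Poisson-based statistic
    rejections += x2 > chi2.ppf(0.95, df=n_cells - 1)
print(rejections / n_sims)   # far above the nominal 0.05
```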

85 citations


Journal ArticleDOI
TL;DR: A simple model of logistic growth with additive Poisson disasters of fixed magnitude is considered and the dependence of the persistence time of a colonizing species on the parameters of the model is discussed.
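A sketch of such a model under assumed parameter values: deterministic logistic growth, with disasters arriving as a Poisson process and each removing a fixed amount eps; persistence time is the first passage to zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def persistence_time(r, K, disaster_rate, eps, n0, dt=0.01, t_max=500.0):
    """Logistic growth dN/dt = r N (1 - N/K) with Poisson disasters of
    fixed magnitude eps; returns the first time N hits zero (or t_max)."""
    n, t = float(n0), 0.0
    while t < t_max:
        n += r * n * (1.0 - n / K) * dt        # deterministic growth step
        if rng.random() < disaster_rate * dt:  # Poisson disaster in (t, t + dt)
            n -= eps
        if n <= 0.0:
            return t
        t += dt
    return t_max  # censored: still persisting at t_max

times = [persistence_time(r=1.0, K=10.0, disaster_rate=1.0, eps=6.0, n0=5.0)
         for _ in range(100)]
print(np.mean(times))  # mean persistence time under these parameters
```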

Journal ArticleDOI
TL;DR: Algorithms are developed that allow considerable savings in computer storage as well as execution speed for fast Poisson solvers for certain applications where data is sparse and the solution is only required at relatively few mesh points.
Abstract: Fast Poisson solvers, which provide the numerical solution of Poisson's equation on regions that permit the separation of variables, have proven very useful in many applications. In certain of these applications the data is sparse and the solution is only required at relatively few mesh points. For such problems this paper develops algorithms that allow considerable savings in computer storage as well as execution speed. Results of numerical experiments are given.
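The paper's sparse-data algorithms are not reproduced here; for context, a minimal standard fast Poisson solver on the unit square (homogeneous Dirichlet data, 5-point Laplacian, diagonalized by the type-I sine transform):

```python
import numpy as np
from scipy.fft import dstn, idstn

def fast_poisson_dirichlet(f):
    """Solve u_xx + u_yy = f on the unit square with u = 0 on the boundary.
    f holds right-hand-side values at the n x n interior points, h = 1/(n+1)."""
    n = f.shape[0]
    h = 1.0 / (n + 1)
    # The discrete sine transform diagonalizes the 5-point Laplacian.
    fhat = dstn(f, type=1)
    j = np.arange(1, n + 1)
    lam = (2.0 * np.cos(np.pi * j * h) - 2.0) / h**2   # 1-D eigenvalues
    uhat = fhat / (lam[:, None] + lam[None, :])
    return idstn(uhat, type=1)

# Example: f = -2 pi^2 sin(pi x) sin(pi y), exact solution sin(pi x) sin(pi y).
n = 63
x = np.linspace(0, 1, n + 2)[1:-1]
X, Y = np.meshgrid(x, x, indexing="ij")
u = fast_poisson_dirichlet(-2 * np.pi**2 * np.sin(np.pi * X) * np.sin(np.pi * Y))
print(np.abs(u - np.sin(np.pi * X) * np.sin(np.pi * Y)).max())  # small O(h^2) error
```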

Journal ArticleDOI
TL;DR: In this article, the numerical solution of the Navier equations discretized by finite elements is studied by various forms of pre-conditioned conjugate gradient methods, and the dependence of the number of iterations is examined as a function of Poisson's ratio.
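As a generic illustration only (the paper's finite-element Navier matrices are not reproduced), a diagonal-preconditioned conjugate gradient solve on a stand-in symmetric positive-definite system:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# Stand-in SPD system: 1-D Laplacian stiffness matrix.
n = 200
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Jacobi (diagonal) preconditioner: M approximates A^{-1}.
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

iterations = []
x, info = cg(A, b, M=M, callback=lambda xk: iterations.append(1))
print(info, len(iterations))  # info == 0 on convergence; iteration count
```

In the paper's setting, the interest is in how such an iteration count grows as Poisson's ratio approaches the incompressible limit.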

Journal ArticleDOI
TL;DR: In this article, a maximum-likelihood restoring formula is derived for images composed of a smooth Poisson background plus a Poisson foreground of point and line sources; results are consistent with the maximum-likelihood and Poisson hypotheses, in that where the background is high and consequently contributes much noise to the observed image, a restored star is broader and smoother than where the background is low.
Abstract: The maximum entropy (ME) restoring formalism has previously been derived under the assumptions of (i) zero background and (ii) additive noise in the image. However, the noise in the signals from many modern image detectors is actually Poisson, i.e., dominated by single-photon statistics. Hence, the noise is no longer additive. Particularly in astronomy, it is often accurate to model the image as being composed of two fundamental Poisson features: (i) a component due to a smoothly varying background image, such as caused by interstellar dust, plus (ii) a superimposed component due to an unknown array of point and line sources (stars, galactic arms, etc.). The latter is termed the “foreground image” since it contains the principal object information sought by the viewer. We include in the background all physical backgrounds, such as the night sky, as well as the mathematical background formed by lower-frequency components of the principal image structure. The role played by the background, which may be separately and easily estimated since it is smooth, is to pointwise modify the known noise statistics in the foreground image according to how strong the background is. Given the estimated background, a maximum-likelihood restoring formula was derived for the foreground image. We applied this approach to some one-dimensional simulations and to some real astronomical imagery. Results are consistent with the maximum-likelihood and Poisson hypotheses: i.e., where the background is high and consequently contributes much noise to the observed image, a restored star is broader and smoother than where the background is low. This nonisoplanatic behavior is desirable since it permits extra resolution only where the noise is sufficiently low to reliably permit it.
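The paper's background-modified restoring formula is not reproduced here; as a related illustration of maximum-likelihood restoration under Poisson noise with a separately estimated background, a minimal Richardson-Lucy (EM) iteration (all names and arguments are illustrative):

```python
import numpy as np
from scipy.signal import fftconvolve

def ml_poisson_restore(observed, psf, background, n_iter=50):
    """Richardson-Lucy iteration: estimates a nonnegative foreground x
    such that observed ~ Poisson(psf * x + background)."""
    psf_flip = psf[::-1, ::-1]               # adjoint of the blur operator
    x = np.full(observed.shape, observed.mean(), dtype=float)
    for _ in range(n_iter):
        model = fftconvolve(x, psf, mode="same") + background
        ratio = observed / np.maximum(model, 1e-12)
        x *= fftconvolve(ratio, psf_flip, mode="same")
    return x
```

Consistent with the abstract, a large background term flattens the ratio image, so foreground detail is recovered more conservatively where the background (and hence the noise) is high.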

Journal ArticleDOI
TL;DR: In this paper, the mean and variance of the number of events in a fixed sampling time for a nonparalyzable dead-time counter are derived for a Poisson input process with a rate that is a known function of time.

Journal ArticleDOI
TL;DR: In this article, basic formulas for the two-time correlation functions are derived using the Poisson representation method, and the relation of this method to the Glauber-Sudarshan P-representation used in quantum optics is discussed.
Abstract: Basic formulas for the two-time correlation functions are derived using the Poisson representation method. The formulas for the chemical system in thermodynamic equilibrium are shown to relate directly to the fluctuation-dissipation theorems, which may be derived from equilibrium statistical mechanical considerations. For nonequilibrium systems, the formulas are shown to be generalizations of these fluctuation-dissipation theorems, but containing an extra term which arises entirely from the nonequilibrium nature of the system. These formulas are applied to two representative examples of equilibrium reactions (without spatial diffusion) and to a nonequilibrium chemical reaction model (including the process of spatial diffusion) for which the first two terms in a systematic expansion for the two-time correlation functions are calculated. The relation between the Poisson representation method and the Glauber-Sudarshan P-representation used in quantum optics is discussed.

Journal ArticleDOI
TL;DR: In this paper, a modified maximum likelihood estimator was developed for estimating the zero class from a truncated Poisson sample when the available sample size itself is a random variable, and a modified estimator appeared to be best with respect to the chosen criteria.
Abstract: Maximum likelihood estimators and a modified maximum likelihood estimator are developed for estimating the zero class from a truncated Poisson sample when the available sample size itself is a random variable. All the estimators considered here are asymptotically equivalent in the usual sense; hence their asymptotic properties are investigated in some detail theoretically as well as by making use of Monte Carlo experiments. One modified estimator appears to be best with respect to the chosen criteria. An example is given to illustrate the results obtained.
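A sketch of the basic (unmodified) maximum likelihood calculation, assuming the sample mean exceeds 1: for a zero-truncated Poisson sample the MLE solves x̄ = λ/(1 − e^(−λ)), after which the zero class is estimated from p₀ = e^(−λ).

```python
import numpy as np
from scipy.optimize import brentq

def truncated_poisson_zero_class(sample):
    """MLE of lambda and of the unseen zero class from a zero-truncated
    Poisson sample (all observed counts >= 1; requires mean(sample) > 1)."""
    xbar = float(np.mean(sample))
    n = len(sample)
    lam = brentq(lambda l: l / (1.0 - np.exp(-l)) - xbar, 1e-9, xbar + 10.0)
    p0 = np.exp(-lam)
    n0_hat = n * p0 / (1.0 - p0)   # estimated number of unobserved zeros
    return lam, n0_hat

print(truncated_poisson_zero_class([1, 1, 2, 1, 3, 2, 1, 4]))
```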

Journal ArticleDOI
TL;DR: In this article, a detailed inquiry reveals that the Poisson distribution can predict almost all the observed variation in the frequency distribution of multiples collected by Merton, and by Ogburn and Thomas.
Abstract: Social determinists have argued that the occurrence of independent discoveries and inventions demonstrates the inevitability of techno-scientific progress. Yet the frequency of such multiples may be adequately predicted by a probabilistic model, especially the Poisson model suggested by Price. A detailed inquiry reveals that the Poisson distribution can predict almost all the observed variation in the frequency distribution of multiples collected by Merton, and by Ogburn and Thomas. This study further indicates that: (a) the number of observed multiples may be greatly underestimated, particularly those involving few independent contributors, (b) discoveries and inventions are not sufficiently probable to avoid a large proportion of total failures, and hence techno-scientific advance is to a large measure indeterminate; (c) chance or 'luck' seems to play such a major part that the 'great genius' theory is no more tenable than the social deterministic theory.

Journal ArticleDOI
TL;DR: Although this study arose in the design of a buffer for digital voice-data systems, the queueing model developed is quite general and may be useful for other industrial applications.
Abstract: A queueing model with finite buffer size, Poisson arrival process, synchronous transmission and server interruptions is studied through a Bernoulli sequence of independent random variables. An integrated digital voice-data system with Synchronous Time-Division Multiplexing (STDM) for voice sources and Poisson arrival process for data messages is considered as an application for this model. The relationships among overflow probabilities, buffer size and expected queueing delay due to buffering for various traffic intensities are obtained. The results of this study are portrayed on graphs and may be used as guide lines for the buffer design in digital voice-data systems. Although this study arose in the design of a buffer for digital voice-data systems, the queueing model developed is quite general and may be useful for other industrial applications.
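A stripped-down slotted simulation of the buffer (ignoring the Bernoulli voice interruptions of the full model; values assumed): Poisson message arrivals per slot, synchronous service of one message per slot, losses when the buffer is full.

```python
import numpy as np

rng = np.random.default_rng(7)

def overflow_probability(arrival_rate, buffer_size, n_slots=200_000):
    """Fraction of messages lost in a finite buffer with synchronous
    (one-per-slot) transmission and Poisson arrivals per slot."""
    queue, lost, offered = 0, 0, 0
    for _ in range(n_slots):
        arrivals = rng.poisson(arrival_rate)
        offered += arrivals
        lost += max(arrivals - (buffer_size - queue), 0)
        queue = min(queue + arrivals, buffer_size)
        if queue > 0:
            queue -= 1   # one synchronous transmission per slot
    return lost / offered

print(overflow_probability(arrival_rate=0.8, buffer_size=16))
```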

Journal ArticleDOI
TL;DR: In this article, Tyan and Thomas have given a characterization of a class of bivariate Poisson processes and gave some properties and examples of such processes, which is a special case of a special class of Poisson distributions.


Journal ArticleDOI
TL;DR: In this article, the authors developed two systems of data collection provided with different controls; analysis of the results using the Pearson χ² test shows that the radioactive decay events follow a Poisson process.

Journal ArticleDOI
TL;DR: In this article, the authors explore models and methods for analyzing disease-incidence data treated as Poisson variates; specific cases are the estimation and testing of ratios and cross-product ratios, both simple and stratified, assuming the Poisson means are exponential functions of the relevant parameters.
Abstract: The incidence of most diseases is low enough that in large populations the number of new cases may be considered a Poisson variate. This paper explores models and methods for analyzing such data. Specific cases are the estimation and testing of ratios and the cross-product ratios, both simple and stratified. We assume the Poisson means are exponential functions of the relevant parameters. The resulting sets of sufficient statistics are partitioned into a test statistic and a vector of statistics related to the nuisance parameters. The methods derived are based on the conditional distribution of the test statistic given the other sufficient statistics. The analyses of stratified cross-product ratios are seen to be analogues of the noncentral distribution associated with the analysis of the common odds ratio in several 2×2 tables. The various methods are illustrated in numerical examples involving incidence rates of cancer in two metropolitan areas adjusting for both age and sex.
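A minimal instance of the conditional approach (not the paper's stratified machinery): for independent counts x₁ ~ Poisson(μ₁) and x₂ ~ Poisson(μ₂) with person-time denominators t₁, t₂, conditioning on x₁ + x₂ reduces testing equality of the two rates to a binomial problem.

```python
from scipy.stats import binomtest

def rate_ratio_pvalue(x1, t1, x2, t2):
    """Conditional test of H0: rate1 == rate2. Given x1 + x2, x1 is
    Binomial(x1 + x2, p0) with p0 = t1 / (t1 + t2) under H0."""
    p0 = t1 / (t1 + t2)
    return binomtest(x1, n=x1 + x2, p=p0).pvalue

# e.g. 30 cases in 10,000 person-years versus 15 cases in 12,000 person-years
print(rate_ratio_pvalue(30, 10_000, 15, 12_000))
```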

Proceedings ArticleDOI
01 Jan 1978
TL;DR: A simple and relatively efficient new method for simulation of one-dimensional and two-dimensional non-homogeneous Poisson processes, based on controlled deletion of points in a Poisson process with a rate function that dominates the given rate function, is described.
Abstract: The nonhomogeneous Poisson process is a widely used model for a series of events (stochastic point process) in which the “rate” or “intensity” of occurrence of points varies, usually with time. The process has the characteristic properties that the number of points in any finite set of nonoverlapping intervals are mutually independent random variables, and that the number of points in any of these intervals has a Poisson distribution. In this paper we first discuss several general methods for simulation of the one-dimensional non-homogeneous Poisson process; these include time-scale transformation of a homogeneous (rate one) Poisson process via the inverse of the integrated rate function, generation of the individual intervals between points, and generation of a Poisson number of order statistics from a fixed density function. We then state a particular and very efficient method for simulation of nonhomogeneous Poisson processes with log-linear rate function. The method is based on an identity relating the nonhomogeneous Poisson process to the gap statistics from a random number of exponential random variables with suitably chosen parameters. This method can also be used, at the cost of programming complexity and some memory, as the basis for a very efficient technique for simulation of nonhomogeneous Poisson processes with more complicated rate functions such as a log-quadratic rate function. Finally, we describe a simple and relatively efficient new method for simulation of one-dimensional and two-dimensional non-homogeneous Poisson processes. The method is applicable for any given rate function and is based on controlled deletion of points in a Poisson process with a rate function that dominates the given rate function. In its simplest implementation, the method obviates the need for numerical integration of the rate function, for ordering of points, and for generation of Poisson variates. The thinning method is also applicable to the generation of individual intervals between points, as is required in many programs for discrete-event simulations.
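A sketch of the first general method mentioned, time-scale transformation, assuming the integrated rate Λ has a closed-form inverse (names and the example rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def nhpp_by_inversion(cum_rate_inv, rate_total):
    """Simulate a 1-D nonhomogeneous Poisson process by transforming a
    unit-rate process: if S_1 < S_2 < ... are points of a rate-one Poisson
    process, then Lambda^{-1}(S_i) are points with rate function lambda(t).

    cum_rate_inv: inverse of the integrated rate Lambda
    rate_total:   Lambda(t_max), the expected total number of points
    """
    s, points = 0.0, []
    while True:
        s += rng.exponential(1.0)   # next point of the unit-rate process
        if s > rate_total:
            return np.array(points)
        points.append(cum_rate_inv(s))

# Example: lambda(t) = 2t on [0, 10], so Lambda(t) = t**2 and the inverse is sqrt.
pts = nhpp_by_inversion(np.sqrt, rate_total=100.0)
print(len(pts))  # expected about Lambda(10) = 100
```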

Journal ArticleDOI
TL;DR: It is proved that traffic on all exit arcs of a network in equilibrium is Poisson; moreover, the customer streams leaving any exit set are mutually independent.
Abstract: Burke and Reich independently showed that the output of an M/M/1 queue in equilibrium is a Poisson process. Consequently, analysis of series (tandem) exponential servers with a Poisson input stream can be reduced to consideration of a series of M/M/1 queues. This work generalizes the above results to so-called Jackson networks, consisting of exponential servers with mutually independent Poisson exogenous inputs and random customer routings permitting customer feedback. We prove that traffic on all exit arcs of a network in equilibrium is Poisson; moreover, the customer streams leaving any exit set are mutually independent. Here an exit arc is a path from server node i such that a customer moving along the arc cannot return to i; an exit set V is a set of server nodes such that customers departing V can never re-enter V. As a special case, the traffic streams leaving a Jackson network in equilibrium are mutually independent Poisson processes. In contrast, traffic on non-exit arcs is non-Poisson, and indeed...

Journal ArticleDOI
TL;DR: In this paper, the detection of a fluctuating signal in the presence of noise is considered for a doubly stochastic Poisson counting system that is subject to fixed nonparalyzable detector deadtime.
Abstract: The detection of a fluctuating signal in the presence of noise is considered for a doubly stochastic Poisson counting system that is subject to fixed nonparalyzable detector deadtime. Explicit expressions are obtained for the likelihood-ratio detection of a modulated source of arbitrary statistics in the presence of Poisson noise counts. Receiver operating characteristics (ROC curves) are presented for an unmodulated (amplitude-stabilized) source with detector dead-time as a parameter; increasing deadtime causes a decrease in the probability of detection for a fixed false-alarm rate. Probability of error curves are presented for an amplitude-stabilized source, both in the absence of modulation and in the presence of triangular modulation, illustrating the deleterious effects of modulation, noise, and deadtime on receiver performance. Expressions for the average mutual information and channel capacity of the system are obtained and graphically presented for the simple counting receiver and for the maximum-likelihood counting receiver; the channel capacity decreases with decreasing signal level and with increasing deadtime and modulation depth. Representative examples of the appropriate counting distributions are provided. Finally, a maximum-likelihood estimate of the mean signal level is obtained for a simple image detection system with a deadtime-perturbed counting array. An expression for the statistical confidence level of the estimate is also obtained. The results are valid for an arbitrary deadtime-perturbed doubly stochastic Poisson counting system and as such are expected to find application in a broad variety of disciplines including photon counting and lightwave communications, operations research, nuclear particle counting, and neural counting and psychophysics.

Journal Article
TL;DR: The contention in the remainder of the paper is that almost any hospital medical/surgical unit should operate well in excess of that model's widely used occupancy recommendations.
Abstract: Minimum cost operation of a hospital requires the correct number of beds to meet the demand placed on the facility. An excess of beds results in inflated operating and construction costs, while a bed shortage is unacceptable for a variety of reasons, such as the lack of quality care to the community. Several models have been developed and used to assist planners in finding the correct bed size to meet a given demand. These models are inadequate in an environment where stringent cost containment is second only in importance to quality of care. The previous models, such as the Hill-Burton formulas and the Poisson approximation, are inadequate because they are incompatible with contemporary admissions scheduling systems. These systems alter the behavior of the census so that analytical models based upon the Poisson assumption do not fit the results and, along with normative models such as the Hill-Burton program, allow too many beds. The implications of the Poisson assumption will be discussed at length here because the contention in the remainder of the paper is that almost any hospital medical/surgical (M/S) unit should operate well in excess of that model's widely used occupancy recommendations. In most cases this also leads to operation in excess of the state Hill-Burton recommendations for medical/surgical units, thus Hill-Burton is rejected as a consequence.1 The Poisson assumption for hospital census was first used in the 1940s,2 and has been applied worldwide to determine the size of hospital facilities.3 An appealing feature of the Poisson assumption is the simplicity of the parameters; the census mean is equal to the census variance. The factor which has varied from application to application has been the amount of that variance to allow in sizing a particular facility. The first models used from three to four standard deviations which corresponded to at least the 99.9 percentage point of the Poisson distribution. Later model builders reduced this coefficient. One low value used is 2.06 standard deviations, corresponding to the 98 percentage point of the normal approximation to the Poisson distribution:
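A worked instance of the sizing rule just described, with illustrative numbers: under the Poisson assumption the census variance equals the census mean m, so a unit sized at m + z·√m beds covers the corresponding percentile of the normal approximation.

```python
from math import sqrt, ceil

mean_census = 100    # average daily census (illustrative)
z = 2.06             # 98th percentile of the normal approximation
beds = ceil(mean_census + z * sqrt(mean_census))
print(beds)          # 121 beds, since the variance equals the mean
```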

Journal ArticleDOI
TL;DR: In this paper, it was shown that by using the simplest construction of discrete dipoles, the operation count for solving the Dirichlet problem of Poisson's equation by the capacitance matrix method does not exceed a constant times n^2 log n, where n = 1/h.
Abstract: It is shown that by using the simplest construction of discrete dipoles, the operation count for solving the Dirichlet problem of Poisson's equation by the capacitance matrix method does not exceed a constant times n^2 log n, where n = 1/h. Certain first and second order schemes of interpolating boundary conditions are considered.


Journal ArticleDOI
TL;DR: In this paper, the most extreme observation in either the counting or timing mode is considered, where the free parameters are constrained to ranges determined from other experiments (magnitude estimation, reaction time).