
Showing papers on "Probability density function published in 1978"


Journal ArticleDOI
TL;DR: In this article, a hybrid density function is given for describing wind-speed distributions having nonzero probability of "calm" and a Weibull probability graph is designed specifically for plotting wind speed distributions.
Abstract: A hybrid density function is given for describing wind-speed distributions having nonzero probability of “calm.” A Weibull probability graph paper designed specifically for plotting wind-speed distributions is used to determine distribution parameters to within a few percent of values obtained by the maximum likelihood technique. Data from the National Weather Service are used to demonstrate the use of the hybrid density function and the Weibull graph paper.

194 citations
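The hybrid model described above amounts to a point mass at zero ("calm") plus a Weibull tail over positive speeds. A minimal sketch of fitting it by maximum likelihood with SciPy rather than by the paper's graph-paper method (the function name, calm threshold, and synthetic data are illustrative assumptions, not from the paper):

```python
import numpy as np
from scipy import stats

def fit_hybrid_wind_density(speeds, calm_threshold=0.0):
    """Fit the hybrid density F0*delta(v) + (1 - F0)*Weibull(k, c).

    F0 is the observed fraction of calms; the Weibull shape k and
    scale c are fit by maximum likelihood to the non-calm speeds.
    """
    speeds = np.asarray(speeds, dtype=float)
    calm = speeds <= calm_threshold
    f0 = calm.mean()                          # probability of "calm"
    k, _, c = stats.weibull_min.fit(speeds[~calm], floc=0)
    return f0, k, c

# Synthetic check: 10% calms plus Weibull(k=2, c=6) wind speeds.
rng = np.random.default_rng(0)
n = 5000
calms = np.zeros(int(0.1 * n))
winds = 6.0 * rng.weibull(2.0, size=n - calms.size)
f0, k, c = fit_hybrid_wind_density(np.concatenate([calms, winds]))
print(f"F0={f0:.3f}  k={k:.2f}  c={c:.2f}")
```

Fixing `floc=0` keeps the fit in the two-parameter Weibull family the paper uses.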


Journal ArticleDOI
TL;DR: In this paper, a logistic density transform and a reproducing inner product from the first-order autoregressive stochastic process are employed to represent prior information that the derivative of the transform is unlikely to change radically within small intervals.
Abstract: SUMMARY A method is proposed for the non-parametric estimation of a probability density, based upon a finite number of observations and prior information about the smoothness of the density. A logistic density transform and a reproducing inner product from the first-order autoregressive stochastic process are employed to represent prior information that the derivative of the transform is unlikely to change radically within small intervals. The posterior estimate of the density possesses a continuous second derivative; it typically satisfies the frequentist property of asymptotic consistency. A direct analogy is demonstrated with a smoothing method for the time-dependent Poisson process; this is similar in spirit to the normal theory Kalman filter. A procedure for grouped observations in a histogram provides an alternative to the histospline method of Boneva, Kendall and Stefanov. Five practical examples are presented, including two investigations of normality, an analysis of pedestrian arrivals at a Pelican crossing and a histogram smoothing method for mine explosions data.

185 citations


Journal ArticleDOI
TL;DR: The expected statistical distributions of intercept length are derived in terms of geometrical probability density functions pertaining to plates with known thickness penetrated by lines with random orientation to provide arithmetic and graphical solutions for obtaining distributions of membrane thickness and reciprocal membrane thickness from empirical distribution of intercept lengths.
Abstract: SUMMARY The expected statistical distributions of intercept length are derived in terms of geometrical probability density functions pertaining to plates with known thickness penetrated by lines with random orientation. These expressions provide arithmetic and graphical solutions for obtaining distributions of membrane thickness and reciprocal membrane thickness from empirical distributions of intercept lengths. Furthermore, general relationships between probability density functions of distributions of intercept length and membrane thickness are derived as well as those between their moments. Examples of the application of the method to biological samples are given, and estimated distributions of glomerular basement membrane thickness are compared to those obtained by an independent, direct method. Various sources of bias, which in practice may occur due to departures from the sample model, are discussed and the influence of some of them is estimated. The knowledge of the probability density function of reciprocal intercepts makes it possible to perform a correction of the distributions of measured intercept length, which to some extent eliminates bias.

129 citations


Journal ArticleDOI
TL;DR: A model is proposed that permits the prediction of contrast detection thresholds for arbitrary spatial patterns and an explanation is offered for certain invariance properties of spatial contrast detection that seems to possess promising generality.
Abstract: A model is proposed that permits the prediction of contrast detection thresholds for arbitrary spatial patterns. The influence of the inhomogeneous structure of the visual field and a form of spatial integration are incorporated in the model. A hypothetical density function for the spatial sampling units, which specifies the distribution of these units with respect to both size and location, is described. The density function is compared with anatomical and electrophysiological knowledge of the density of retinal and cortical receptive fields. This density function permits a particularly lucid interpretation in terms of pattern processing. It can be considered as a system that permits simultaneous global and focal views of the surroundings. The density function, together with a schematized adaptation behaviour of single units, and an incoherent summation rule permit us to calculate a measure of the mass response, and consequently the threshold function. Predictions of the model are compared with recently obtained psychophysical data. In particular an explanation is offered for certain invariance properties of spatial contrast detection that seems to possess promising generality.

114 citations


Journal ArticleDOI
TL;DR: In this article, a method for estimating definite and indefinite integrals over a density function, such as a local density of states, defined by a three-term recurrence relation, is presented.
Abstract: A method is presented for estimating definite and indefinite integrals over a density function, such as a local density of states, defined by a three-term recurrence relation. Such a relation may be generated, for example, by the 'recursion method' applied to some Hamiltonian; properties of the approximation are given, and the results are derived, in that context.

114 citations


Journal ArticleDOI
TL;DR: In this paper, the Philip infiltration equation is integrated over the duration of a rainstorm of uniform intensity to give the depth of point surface runoff from such an event on a natural surface in terms of random variables defining the initial soil moisture, the rainfall intensity, and the storm duration.
Abstract: The Philip infiltration equation is integrated over the duration of a rainstorm of uniform intensity to give the depth of point surface runoff from such an event on a natural surface in terms of random variables defining the initial soil moisture, the rainfall intensity, and the storm duration. In a zeroth-order approximation the initial soil moisture is fixed at its climatic space and time average, whereupon by using exponential probability density functions for storm intensity and duration, the probability density function of point storm rainfall excess is derived. This distribution is used to define the annual average depth of point surface runoff and to derive the flood volume frequency relation, both in terms of a set of physically meaningful climate-soil parameters.

110 citations


Journal ArticleDOI
TL;DR: In this paper, a generalized steepest-ascent (deflected-gradient) iterative procedure is introduced that reduces to the successive-approximations procedure known in the literature when the step-size is taken to be 1, and its local convergence, with probability 1 as the sample size grows large, to the strongly consistent maximum-likelihood estimate is established.
Abstract: The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.

98 citations
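When the step-size is 1, the procedure coincides with the familiar fixed-point iteration on the likelihood equations (what is now called EM). A self-contained sketch for a two-component univariate normal mixture (function name and synthetic data are illustrative, not from the paper):

```python
import numpy as np

def em_step(x, w, mu, sigma):
    """One fixed-point (step-size 1) update of the likelihood
    equations for a two-component univariate normal mixture."""
    pdf = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    r1 = w * pdf(mu[0], sigma[0])
    r2 = (1 - w) * pdf(mu[1], sigma[1])
    g = r1 / (r1 + r2)                    # responsibilities for component 1
    w_new = g.mean()
    mu_new = [np.sum(g * x) / g.sum(), np.sum((1 - g) * x) / (1 - g).sum()]
    sig_new = [np.sqrt(np.sum(g * (x - mu_new[0]) ** 2) / g.sum()),
               np.sqrt(np.sum((1 - g) * (x - mu_new[1]) ** 2) / (1 - g).sum())]
    return w_new, mu_new, sig_new

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 600), rng.normal(3, 1, 400)])
w, mu, sigma = 0.5, [-1.0, 1.0], [1.0, 1.0]
for _ in range(200):
    w, mu, sigma = em_step(x, w, mu, sigma)
print(round(w, 2), [round(m, 1) for m in mu])
```

Well-separated components, as the paper notes, are exactly the case in which this step-size-1 iteration converges quickly.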


Journal ArticleDOI
TL;DR: The analysis provides a mathematical foundation for the current practice of superimposing a probability distribution function on a biological time scale to describe the development of individuals from a population.

82 citations


Journal ArticleDOI
TL;DR: In this paper, a detailed accuracy analysis is presented for moments, up to order four, of both velocity (horizontal u and vertical w) and scalar (temperature θ and humidity q) fluctuations, as well as of the products uw, wθ and wq, in the atmospheric surface layer.
Abstract: A detailed accuracy analysis is presented for moments, up to order four, of both velocity (horizontal u and vertical w) and scalar (temperature θ and humidity q) fluctuations, as well as of the products uw, wθ and wq, in the atmospheric surface layer. The high-order moments and integral time scales required for this analysis are evaluated from data obtained at a height of about 5 m above the ocean surface under stability conditions corresponding to z/L ≈ −0.05. Measured moments and probability density functions of some of the individual fluctuations show departures from Gaussianity, but these are sufficiently small to enable good estimates to be obtained using Gaussian instead of measured moments. For the products, the assumption of joint Gaussianity for individual fluctuations provides a reasonable, though somewhat conservative, estimate for the integration times required. The concept of Reynolds number similarity implies that differences in integration time requirements for flows at different Reynolds numbers arise exclusively from differences in integral time scales. A first approximation to the integral time scales relevant to atmospheric flows is presented.

72 citations


Journal ArticleDOI
TL;DR: Because of the rules used to delineate census tracts, unweighted estimation of an urban population density function using census tract observations leads to a severe upward bias in the estimated function, and a weighted estimation procedure which leads to an unbiased estimate is proposed.

64 citations


Proceedings ArticleDOI
29 Jan 1978
TL;DR: In this paper, the transformation of waves crossing a coral reef in Hawaii including the probability density function of the wave heights and periods and the shape of the spectrum is discussed using spectral analysis and the zero up-crossing procedure.
Abstract: The transformation of waves crossing a coral reef in Hawaii including the probability density function of the wave heights and periods and the shape of the spectrum is discussed. The energy attenuation and the change of height and period statistics are examined using spectral analysis and the zero up-crossing procedure. Measurements of waves at seven points along a 1650 ft transect in depths from 1 to 3.5 ft on the reef and 35 ft offshore were made. The heights were tested for Rayleigh, truncated Rayleigh and Weibull distributions. A symmetrical distribution presented by Longuet-Higgins (1975) and the Weibull distribution were compared to the wave period density function. In both cases the Weibull probability density function fitted with a high degree of correlation. Simple procedures to obtain Weibull coefficients are given. Fourier spectra were generated and contours of cumulative energy against each position on the reef show the shifting of energy from the peak as the waves move into shallow water. A design spectrum, with the shape of the Weibull distribution, is presented with procedures given to obtain the coefficients which govern the distribution peakedness. Normalized non-dimensional frequency and period spectra were recommended for engineering applications for both reef and offshore locations. A zero up-crossing spectrum (ZUS) constructed from the zero up-crossing heights and periods is defined and compared with the Fourier spectrum. Also discussed are the benefits and disadvantages of the ZUS, particularly for non-linear wave environments in shallow water. Both the ZUS and Fourier spectra are used to test the adequacy of formulae which estimate individual wave parameters. Cross-spectral analysis was performed to obtain the gain function and squared coherency for time series between two adjacent positions. It was found that the squared coherency is close to unity near the peak frequency.
This means that the output time series can be predicted from the input by applying the gain function. However, the squared coherency was extremely small for other frequencies above 0.25 Hz.

Journal ArticleDOI
TL;DR: These results are used to analyze recordings of single-unit activity in the eighth cranial nerve and describe the conditions under which a PST histogram can serve as an unbiased estimate of the ensemble average of a spike train's intensity and an interval histogram can serve as an unbiased estimate of the probability density function of the interspike intervals.

Book ChapterDOI
01 Jan 1978

Journal ArticleDOI
TL;DR: The optimum (minimum mean-squared-error criterion) and optimum uniform quantizer characteristics for signals characterized by the Laplacian amplitude probability density function are given in tabular form.
Abstract: The optimum (minimum mean-squared-error criterion) and optimum uniform quantizer characteristics for signals characterized by the Laplacian amplitude probability density function are given in tabular form. These results correct and extend previously published results.
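Entries of such tables can be reproduced numerically with a Lloyd-Max iteration: place boundaries at level midpoints, then move each level to the centroid of its cell. A sketch for a 4-level quantizer of the Laplacian density with unit rate parameter (the truncation limits, iteration count, and starting levels are arbitrary assumptions):

```python
import numpy as np
from scipy.integrate import quad

def lloyd_max(pdf, levels, iters=200, lo=-30.0, hi=30.0):
    """Lloyd-Max iteration: alternate midpoint boundaries and
    centroid (conditional-mean) output levels for the given density."""
    y = np.array(levels, dtype=float)
    for _ in range(iters):
        b = np.concatenate(([lo], (y[:-1] + y[1:]) / 2, [hi]))
        for i in range(len(y)):
            mass = quad(pdf, b[i], b[i + 1])[0]
            mean = quad(lambda x: x * pdf(x), b[i], b[i + 1])[0]
            if mass > 0:
                y[i] = mean / mass
    return y

lap = lambda x: 0.5 * np.exp(-abs(x))     # Laplacian density, lambda = 1
y = lloyd_max(lap, [-3.0, -1.0, 1.0, 3.0])  # 4-level (2-bit) quantizer
print(np.round(y, 3))
```

By symmetry of the Laplacian, the converged levels come out as plus/minus pairs.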

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the role of molecular diffusivity in the evolution of the r.m.s. temperature fluctuation in a specially constructed wind tunnel and found that about 60% of the heat transport is accomplished by the low wavenumber components having length scales equal to or larger than the integral scale.
Abstract: Turbulence produced by a grid which simultaneously imparts a mean temperature profile varying linearly with height was investigated in a specially constructed wind tunnel. While the mean temperature profile is preserved downstream of the grid in accordance with the theory of Corrsin (1952), the downstream evolution of the r.m.s. temperature fluctuation is at variance with his prediction. The reason for this discrepancy is shown to lie in the neglect of molecular diffusivity, which leads to unbounded growth of the fluctuations. Along with conventional correlations and spectra, the filtered heat-transfer correlation is presented. About 60% of the heat transport is accomplished by the low wavenumber components having length scales equal to or larger than the integral scale. An intriguing feature of the present experiments is the presence of an inertial-convective subrange for the temperature field notwithstanding the low Reynolds number and the consequent absence of an inertial subrange for the velocity field. Experimental results show that the temperature has a positive skewness everywhere in contrast to the velocity components, which are symmetrically distributed. Measurements of the joint probability density function of the vertical component of the velocity and the temperature indicate that, while the assumption of joint normality is not uniformly valid, the conditional expectations nearly follow the normal law. Marginal and joint moments of up to fourth order are presented. Odd-order joint moments are clearly sensitive to the skewness of the temperature.

Journal ArticleDOI
TL;DR: In this article, the moments of C (moments of coverage) are found by solving a recursive integral equation, and a formula is derived for the cumulative distribution function, which is related to the exponential distribution.
Abstract: Place n arcs of equal lengths randomly on the circumference of a circle, and let C denote the proportion covered. The moments of C (moments of coverage) are found by solving a recursive integral equation, and a formula is derived for the cumulative distribution function. The asymptotic distribution of C for large n is explored, and is shown to be related to the exponential distribution.
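The moments of coverage can be checked by simulation. A minimal Monte Carlo sketch, normalizing the circumference to 1 and using the classical first moment E[C] = 1 − (1 − a)^n as a reference value (the arc length and sample sizes are illustrative choices):

```python
import numpy as np

def coverage_proportion(n, arc, rng):
    """Proportion of the unit circumference covered by n random
    arcs of length `arc` (circumference normalized to 1)."""
    starts = np.sort(rng.random(n))
    # gap from each arc start to the next start, wrapping around
    gaps = np.diff(np.concatenate([starts, [starts[0] + 1.0]]))
    uncovered = np.clip(gaps - arc, 0.0, None).sum()
    return 1.0 - uncovered

rng = np.random.default_rng(2)
n, arc = 10, 0.05
sims = [coverage_proportion(n, arc, rng) for _ in range(20000)]
# A point is missed by all n arcs with probability (1 - arc)^n,
# so E[C] = 1 - (1 - arc)^n.
print(np.mean(sims), 1 - (1 - arc) ** n)
```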

Journal ArticleDOI
J. Limb1, C. Rubinstein
TL;DR: An alternative approach to determining the visibility function is considered that obviates the need for repeated picture-dependent subjective tests and the role of probability is found to be weaker where the viewer has more opportunity to scrutinize the picture.
Abstract: Visibility functions measure the relative visibility of noise added to a picture at those points where some measure of local activity exceeds a given threshold. The functions are obtained from a series of subjective experiments and vary with the content of the picture. Visibility functions have been used to design quantizing characteristics for DPCM coding of monochrome and color signals and for three-dimensional transform coding. We consider an alternative approach to determining the visibility function that obviates the need for repeated picture-dependent subjective tests. The visibility function is assumed to consist of two parts, a picture-dependent component and viewer-dependent component (referred to as the masking function). The visibility function may be approximated by the quotient of a probability density function raised to a power and the masking function. The role of probability is found to be weaker where the viewer has more opportunity to scrutinize the picture.

Journal ArticleDOI
TL;DR: In this article, a technique for simultaneous measurement of the local number and velocity probability densities of a dilute two-phase suspension which has a distribution of particle sizes and a predominate direction of flow orientation is presented.

Journal ArticleDOI
TL;DR: In this paper, the empirical density function, a simple modification and improvement of the usual histogram, is defined and its properties are studied, and an analysis is presented which enables the interval width to be chosen.
Abstract: The empirical density function, a simple modification and improvement of the usual histogram, is defined and its properties are studied. An analysis is presented which enables the interval width to be chosen. The estimators are modified for the important practical case of bounded random variables. Finally, the problems of writing a programme to compute the functions are considered along with some Monte Carlo examples and a practical example from the National Uranium Resource Evaluation study conducted by the United States Energy Research and Development Administration. It is recommended that these techniques be introduced at all levels of statistical courses so that they will become more widely utilized.
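The paper's specific interval-width analysis is not reproduced here, but the basic estimator it builds on, a histogram normalized so that it integrates to one for a chosen width h, can be sketched as (function name and data are illustrative):

```python
import numpy as np

def empirical_density(x, h):
    """Histogram-type density estimate with interval width h:
    piecewise constant and integrating to one."""
    x = np.asarray(x, dtype=float)
    edges = np.arange(x.min(), x.max() + h, h)
    counts, edges = np.histogram(x, bins=edges)
    heights = counts / (len(x) * h)       # normalize counts to a density
    return edges, heights

rng = np.random.default_rng(3)
x = rng.normal(0, 1, 10000)
edges, f = empirical_density(x, h=0.25)
# The estimate should integrate to 1 and peak near the mode at 0,
# where the standard normal density is about 0.40.
print((f * 0.25).sum(), f.max())
```

Too small an h makes the estimate noisy, too large an h over-smooths; choosing h is exactly the analysis the paper supplies.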

Journal ArticleDOI
TL;DR: In this paper, the nonstationary random vibration of a lightly damped linear structure subjected to white noise is considered and it is shown that the probability density function of the amplitude of the structural response can be approximated by a Rayleigh distribution.
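The near-Rayleigh amplitude can be checked by simulating a lightly damped oscillator under white noise; the sketch below uses an Euler-Maruyama scheme with arbitrary illustrative parameters (the damping ratio, frequency, and step size are assumptions, not values from the paper):

```python
import numpy as np

# White-noise-driven, lightly damped linear oscillator:
#   x'' + 2*zeta*omega*x' + omega^2 * x = w(t).
# The claim being checked: the response amplitude
# sqrt(x^2 + (x'/omega)^2) is approximately Rayleigh distributed.
rng = np.random.default_rng(4)
zeta, omega = 0.05, 2 * np.pi
dt, n = 0.005, 400_000
dW = rng.standard_normal(n) * np.sqrt(dt)
x = v = 0.0
amps = []
for i in range(n):
    v += (-2 * zeta * omega * v - omega**2 * x) * dt + dW[i]
    x += v * dt
    if i > 20_000 and i % 10 == 0:        # discard transient, thin samples
        amps.append(np.hypot(x, v / omega))
amps = np.asarray(amps)
# For a Rayleigh distribution, mean/std = sqrt(pi/2)/sqrt(2 - pi/2) ~ 1.91.
print(amps.mean() / amps.std())
```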

Journal ArticleDOI
TL;DR: In this article, statistical characteristics of speckle fields propagated through the turbulent atmosphere were investigated and the results of measurements of the probability distribution function and their moments, and the spatial covariance function were compared with a recent theoretical treatment of the problem.
Abstract: Statistical characteristics of speckle fields propagated through the turbulent atmosphere are experimentally investigated. The results of measurements of the probability distribution function and its moments, and of the spatial covariance function, are compared with a recent theoretical treatment of the problem. The results are in good agreement in weak turbulence conditions, while the stronger turbulence results differ significantly from the theory and suggest the need for additional analytical work.

Journal ArticleDOI
TL;DR: In this article, it was shown that the random walk theory of Gissler and Rother is equivalent to a master equation with jumps to further neighbor sites, and that the theory may be applied to any lattice type with a general time probability distribution for jumps.

Journal ArticleDOI
TL;DR: In this paper, the probability density of spacings between neighboring vortices is used to measure the probability of vortex merging in a two-dimensional mixing layer, where the mixing layer is assumed to be statistically uniform along the streamwise coordinate and not to change with time.
Abstract: The phenomenon of vortex merging in a two-dimensional mixing layer is treated statistically by introducing the probability density of spacings between neighboring vortices. The rate of merging is the other statistical variable and is a function of the spacing. The mixing layer is assumed to be statistically uniform along the streamwise coordinate and not to change with time. The following further assumptions were made: no correlation between adjacent spacings, similarity of the probability density at all times, and that the rate of merging for a single pair of vortices is inversely proportional to the square of their distance. The probability density was obtained by solving the governing equation. The result agrees well with the data measured by Brown and Roshko.

Journal ArticleDOI
TL;DR: In this paper, the amplitude probability density characteristic of a stretched string subject to a random external force was analyzed by means of Stratonovich's quasi-static averaging method and the transition probability density function was estimated by a polynomial approximation.

Journal ArticleDOI
TL;DR: In this paper, a new approach is presented for analyzing probabilistic cash flow profiles, in which integral transform theory is used to obtain a complete analytic characterization of the probability density function of the net present worth of such profiles.
Abstract: A new approach is presented for analyzing probabilistic cash flow profiles. Integral transform theory is used to obtain a complete analytic characterization of the probability density function of the net present worth of such cash flow profiles. This represents an extension of the current techniques of risk analysis. The methodology includes the usual evaluation and use of the expected value and the second, third and fourth central moments of the density of present worth. However, these descriptive measures do not always provide sufficient information for a complete managerial analysis. The method presented here enables management to fully differentiate between competing investment alternatives using existing methods, such as stochastic dominance, which are based on a knowledge of the form of the associated probability density functions. Simple formulae for evaluating moments of any or all orders are given. Illustrative examples are included to accompany the analytic development for several repre...

Journal ArticleDOI
TL;DR: The time-dependent distribution of the number of cycle slips in positive and negative directions, and the correlation of their time spacings, are derived from a new statistical model of an (N + 1)-order phase tracking system.
Abstract: The time-dependent distribution of the number of cycle slips in positive and negative directions, and the correlation of their time spacings, are derived from a new statistical model of an (N + 1)-order phase tracking system. The probability density of the phase error and the other system variables are shown to agree with known results. Relations for the steady state are obtained in a relatively simple form. Some limiting conditions are mentioned under which the model reduces to a computationally much simpler renewal model described earlier.

Journal ArticleDOI
TL;DR: The effects of log-normal amplitude fluctuations and Gaussian phase perturbations, in addition to local oscillator shot noise, are considered for both passive receivers and those employing active tilt-tracking systems to eliminate angle-of-arrival fluctuations; in Part 2, experimental results are presented that verify the density functions developed here.
Abstract: Approximate expressions are derived for the probability density functions of the i.f. signal magnitudes from optical heterodyne detection systems operating in the presence of clear air turbulence. The effects of log-normal amplitude fluctuations and Gaussian phase perturbations, in addition to local oscillator shot noise, are considered for both passive receivers and those employing active tilt-tracking systems to eliminate angle-of-arrival fluctuations. In Part 2, experimental results are presented that verify the density functions developed here.

Journal ArticleDOI
TL;DR: The ML number-parameter estimation theory is put into the form of an efficient algorithm which proves to be superior when compared to other processing methods such as Fourier maps.
Abstract: The optimum estimation of the number, directions, and strengths of multiple point radio sources is considered when the mutual coherence function of the sources' radiation is spatially sampled at M baselines by a variable baseline correlation interferometer. The measurements are corrupted by the effects of additive background noise (including receiver noise) and a finite correlation time. Statistically approached, the problem is considered as a combination of parameter estimation and goodness of fit with the maximum likelihood (ML) principle being the basic criterion used. First the measurements' probability density function is derived, assuming the sources' number is known. Then the ML estimator (MLE) of the sources' parameters is obtained. The MLE's asymptotic optimum performance (unbiasedness with minimum variance) is then shown to be achieved when the number of measurements exceeds the number of sources by a threshold that is small (or zero) for most signal-to-noise ratios of interest. Next the number of sources is estimated according to a likelihood probability that measures the tenability of the MLE associated with every possible number of sources with respect to the measurements. The ML number-parameter estimation theory is then put into the form of an efficient algorithm which proves to be superior when compared to other processing methods such as Fourier maps.

Journal ArticleDOI
TL;DR: In this article, the second-order statistics for speckle patterns are derived through the use of an integral equation which determines the moment generating function, and the inversion problem of determining the field correlation function from measurements of the integrated intensities is examined in the context of singular-value decomposition.
Abstract: The second-order statistics for speckle patterns are derived through the use of an integral equation which determines the moment generating function. A specific geometry of apertures is treated as an example of the techniques developed. The inversion problem of determining the field correlation function from measurements of the integrated intensities is examined in the context of singular-value decomposition. The joint probability density function for the integrated intensities is evaluated.

Journal ArticleDOI
TL;DR: In this paper, angular momentum and orientational probability density functions are derived for independent dynamical variables of a general system under the action of (non-markovian) gaussian force or torque.
Abstract: Starting from the linear Mori-Kubo generalized Langevin equation, conditional probability density functions are derived for independent dynamical variables of a general system under the action of (non-markovian) gaussian force or torque. The special techniques developed by Adelman (1976, 1977) for brownian oscillators are here extended to the general case. As applications of the general theory, angular momentum and orientational probability density functions are computed for models of molecular rotation in fluids. The rotational dynamics of each molecule is described by an itinerant oscillator/librator. The periodicity of spatial orientation is treated in terms of wrapped distributions, and hence the planar-reorientational counterpart of the translational self van Hove function is derived. It is shown that both momentum and orientational probability density functions can exhibit widely varying time-decay properties, which are directly interpretable in terms of the structure and dynamics of the molecular f...