
Showing papers on "Probability density function published in 1987"


Journal ArticleDOI
TL;DR: The properties of a set of even-order tensors, used to describe the probability distribution function of fiber orientation in suspensions and composites containing short rigid fibers, are reviewed in this paper.
Abstract: The properties of a set of even-order tensors, used to describe the probability distribution function of fiber orientation in suspensions and composites containing short rigid fibers, are reviewed. These tensors are related to the coefficients of a Fourier series expansion of the probability distribution function. If an n-th-order tensor property of a composite can be found from a linear average of a transversely isotropic tensor over the distribution function, then predicting that property only requires knowledge of the n-th-order orientation tensor. Equations of change for the second- and fourth-order tensors are derived; these can be used to predict the orientation of fibers by flow during processing. A closure approximation is required in the equations of change. A hybrid closure approximation, combining previous linear and quadratic forms, performs best in the equations of change for planar orientation. The accuracy of closure approximations is also explored by calculating the mechanical properties o…

1,460 citations
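The second-order orientation tensor described above is simply the average of outer products of unit fiber-orientation vectors. A minimal numerical sketch (sample size and the uniform planar distribution are illustrative assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample planar fiber orientation angles; uniform angles model a random-in-plane state.
angles = rng.uniform(0.0, np.pi, size=100_000)
p = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit orientation vectors

# Second-order orientation tensor a_ij = <p_i p_j> (average of outer products).
a2 = np.einsum("ni,nj->nij", p, p).mean(axis=0)

# Its trace is exactly 1 by construction; for a uniform planar
# distribution it approaches 0.5 * identity.
print(a2)
```

A fully aligned distribution would instead give a rank-one tensor with a single unit eigenvalue, which is why the tensor compactly summarizes the orientation state.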


Journal ArticleDOI
B.J. Worton
TL;DR: A comparison of the properties of all the models for home range considered here shows that both the Fourier Transform method proposed by D.J. Anderson and the kernel method are good because of their flexibility.

673 citations


Journal ArticleDOI
TL;DR: An expression for the output of the receiver is obtained for the case of random signature sequences, and the corresponding characteristic function is determined to study the density function of the multiple-access interference and to determine arbitrarily tight upper and lower bounds on the average probability of error.
Abstract: Binary direct-sequence spread-spectrum multiple-access communications, an additive white Gaussian noise channel, and a coherent correlation receiver are considered. An expression for the output of the receiver is obtained for the case of random signature sequences, and the corresponding characteristic function is determined. The expression is used to study the density function of the multiple-access interference and to determine arbitrarily tight upper and lower bounds on the average probability of error. The bounds, which are obtained without making a Gaussian approximation, are compared to results obtained using a Gaussian approximation. The effects of transmitter power, the length of the signature sequences, and the number of interfering transmitters are illustrated. Each transmitter is assumed to have the same power, although the general approach can accommodate the case of transmitters with unequal powers.

402 citations
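The paper's exact bounds are compared against the standard Gaussian approximation, in which the multiple-access interference is treated as extra noise. A hedged sketch of that approximation only (the common equal-power formula with K users and sequence length N; not the paper's exact characteristic-function method):

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ber_gaussian_approx(K: int, N: int, eb_n0: float) -> float:
    """Standard Gaussian approximation for the average error probability of
    binary DS/SSMA with K equal-power users, sequence length N and Eb/N0
    (linear). Multiple-access interference is modeled as Gaussian noise."""
    snr = 1.0 / ((K - 1) / (3.0 * N) + 1.0 / (2.0 * eb_n0))
    return q_function(math.sqrt(snr))

# More interferers raise the error rate; longer sequences lower it.
print(ber_gaussian_approx(K=10, N=31, eb_n0=10.0))
print(ber_gaussian_approx(K=10, N=127, eb_n0=10.0))
```

With K = 1 the expression collapses to the single-user bound Q(√(2·Eb/N0)), a useful sanity check.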


DOI
01 Apr 1987
TL;DR: A statistical characterisation of clutter as a complex random process is needed in the design of optimum detection schemes; in this article, clutter is modelled as a spherically invariant random process (SIRP), assuming that its PDFs can be expressed as non-negative definite quadratic forms, a generalisation of a Gaussian process.
Abstract: A statistical characterisation of clutter as a complex random process is needed in the design of optimum detection schemes. The paper considers modelling complex clutter as a spherically invariant random process (SIRP), namely assuming that its PDFs can be expressed as non-negative definite quadratic forms, a generalisation of a Gaussian process. Relevant properties of SIRPs are summarised, and shown to comply with basic requirements such as circular symmetry of the joint PDF of the in-quadrature components or, equivalently, the uniformity of the phase distribution. A constraint of admissibility must be imposed on the envelope distribution, but most commonly used envelope distributions, including Weibull, contaminated Rayleigh and K-distribution are shown to be admissible. Although a general SIRP is not ergodic, a characterisation of the clutter process as an SIRP scanned in the ensemble is finally proposed, which restores ergodicity. The interpretation of this model in the light of already proposed composite scattering models is also discussed.

330 citations
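An SIRP can be generated as compound-Gaussian clutter: a complex Gaussian "speckle" process scaled by an independent non-negative "texture". A gamma texture yields a K-distributed envelope, one of the admissible cases named in the abstract. A minimal sketch (the shape parameter and sample size are assumed values):

```python
import numpy as np

rng = np.random.default_rng(1)
n, nu = 200_000, 0.5   # nu: K-distribution shape parameter (assumed value)

# SIRP / compound-Gaussian sample: unit-power complex Gaussian speckle
# scaled by the square root of a gamma texture with unit mean.
speckle = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2.0)
texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)
clutter = np.sqrt(texture) * speckle

envelope = np.abs(clutter)
# The envelope is K-distributed; mean power equals the mean texture (= 1),
# and the phase is uniform (circular symmetry), as required of an SIRP.
print(envelope.mean(), (envelope ** 2).mean())
```

Because the texture multiplies a whole Gaussian vector in the full model, the joint PDFs remain quadratic-form functions of the data, which is exactly the SIRP property.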


Journal ArticleDOI
TL;DR: A physical approach is proposed, which provides lifetime distribution functions for the tryptophan decay in proteins, and it is shown that real lifetime distributions can be probed using multimodal probability density functions.

277 citations


Journal ArticleDOI
TL;DR: In this paper, kernel density estimators are used for the estimation of integrals of various squared derivatives of a probability density, and rates of convergence in mean squared error are calculated, which show that appropriate values of the smoothing parameter are much smaller than those for ordinary density estimation.

257 citations
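The simplest member of this family of density functionals is θ = ∫ f(x)² dx (the zeroth-derivative case), which has a direct pairwise-kernel estimator. A sketch of that special case only (bandwidth and sample size are hand-picked assumptions; the paper's point is that good smoothing parameters here are smaller than for ordinary density estimation):

```python
import numpy as np

rng = np.random.default_rng(2)
n, h = 2000, 0.3  # sample size and smoothing parameter (illustrative values)
x = rng.standard_normal(n)

# Pairwise Gaussian-kernel estimator of theta = integral of f(x)^2 dx:
# theta_hat = (1 / (n (n-1))) * sum_{i != j} K_h(X_i - X_j).
diff = x[:, None] - x[None, :]
k = np.exp(-0.5 * (diff / h) ** 2) / (h * np.sqrt(2 * np.pi))
np.fill_diagonal(k, 0.0)  # drop i == j terms to reduce bias
theta_hat = k.sum() / (n * (n - 1))

print(theta_hat)  # true value for N(0,1): 1/(2*sqrt(pi)) ~ 0.2821
```

Estimators for ∫ (f″)² and other squared-derivative functionals replace the kernel by its derivatives, but the pairwise structure is the same.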


Journal ArticleDOI
TL;DR: Nine pictorial displays for communicating quantitative information about the value of an uncertain quantity, x, were evaluated for their ability to communicate x̄, p(x > a) and p(b > x > a) to well-educated semi- and nontechnical subjects.
Abstract: Nine pictorial displays for communicating quantitative information about the value of an uncertain quantity, x, were evaluated for their ability to communicate x̄, p(x > a) and p(b > x > a) to well-educated semi- and nontechnical subjects. Different displays performed best in different applications. Cumulative distribution functions alone can severely mislead some subjects in estimating the mean. A “rusty” knowledge of statistics did not improve performance, and even people with a good basic knowledge of statistics did not perform as well as one would like. Until further experiments are performed, the authors recommend the use of a cumulative distribution function plotted directly above a probability density function with the same horizontal scale, and with the location of the mean clearly marked on both curves.

168 citations


Journal ArticleDOI
TL;DR: In this article, the notion of crossproduct ratio for discrete two-way contingency tables is extended to the case of continuous bivariate densities, which results in the local dependence function that measues the margin-free dependence between bivariate random variables.
Abstract: The notion of cross-product ratio for discrete two-way contingency table is extended to the case of continuous bivariate densities. This results in the “local dependence function” that measues the margin-free dependence between bivariate random variables. Properties and examples of the dependence function are discussed. The bivariate normal density plays a special role since it has constant dependence. Continuous bivariate densities can be constructed by specifying the dependence function along with two marginals in analogy to the construction of two-way contingency tables given marginals and patterns of interaction. The dependence function provides a partial ordering on bivariate dependence.

153 citations
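The local dependence function is the mixed second partial of the log-density, γ(x, y) = ∂²log f/∂x∂y. A small numerical check of the bivariate normal's constant-dependence property (the correlation value is an arbitrary example):

```python
import math

rho = 0.6  # correlation of a standard bivariate normal (example value)

def log_f(x: float, y: float) -> float:
    """Log-density of the standard bivariate normal with correlation rho."""
    q = (x * x - 2 * rho * x * y + y * y) / (1 - rho ** 2)
    return -0.5 * q - math.log(2 * math.pi * math.sqrt(1 - rho ** 2))

def local_dependence(x: float, y: float, eps: float = 1e-4) -> float:
    """Mixed second partial of log f via central finite differences."""
    return (log_f(x + eps, y + eps) - log_f(x + eps, y - eps)
            - log_f(x - eps, y + eps) + log_f(x - eps, y - eps)) / (4 * eps * eps)

# The bivariate normal has *constant* local dependence rho / (1 - rho^2),
# the same at every point of the plane.
for x, y in [(0.0, 0.0), (1.0, -2.0), (0.5, 3.0)]:
    print(local_dependence(x, y))
```

For non-normal densities γ varies over the plane, which is what makes it a local, margin-free measure of dependence.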


Journal ArticleDOI
TL;DR: The log-normal Rician probability density function as discussed by the authors is based on the following paradigm for the optical field after propagation through atmospheric turbulence: a field with reduced coherence that obeys Rice-Nakagami statistics is modulated by a multiplicative factor which obeys log normal statistics.
Abstract: The log-normal Rician probability-density function is based on the following paradigm for the optical field after propagation through atmospheric turbulence: a field with reduced coherence that obeys Rice–Nakagami statistics is modulated by a multiplicative factor that obeys log-normal statistics. The larger eddies in the turbulent medium produce the log-normal statistics, and the smaller ones produce the Gaussian statistics. On the basis of this model all the parameters required by the density function can be calculated by using physical parameters such as turbulence strength, inner scale, and propagation configuration. The heuristic density function is consistent with available data at low and at high turbulence levels.

151 citations
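The paradigm above can be sampled directly: a Rice-Nakagami field (coherent component plus circular Gaussian scatter) multiplied by a log-normal factor. A sketch with assumed parameter values and a unit-mean normalization of the modulation (a modeling convenience, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Rice-Nakagami field: coherent component a plus circular Gaussian scatter.
a, sigma = 1.0, 0.5                       # illustrative parameters
scatter = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
rician = a + scatter

# Multiplicative log-normal modulation from the large eddies; the log-amplitude
# mean is chosen so that E[m^2] = 1 (unit-mean intensity modulation, assumed).
var_chi = 0.1                              # log-amplitude variance (assumed)
chi = rng.normal(-var_chi, np.sqrt(var_chi), n)
field = np.exp(chi) * rician

intensity = np.abs(field) ** 2
print(intensity.mean())   # ~ a^2 + sigma^2 under the unit-mean normalization
```

Histogramming `intensity` reproduces the heavy-tailed behavior that pure Rician statistics miss at high turbulence levels.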


Journal ArticleDOI
TL;DR: In this paper, spherically invariant random processes (SIRPs) are used as stationary models for speech signals in telephone channels and a comprehensive mathematical treatment is achieved by means of Meijer's G-functions.

143 citations


Journal ArticleDOI
TL;DR: In this paper, the distribution of error eigenvalues resulting from principal component analysis is deduced by considering the decomposition of an error matrix in which the errors are uniformly distributed.
Abstract: The distribution of error eigenvalues resulting from principal component analysis is deduced by considering the decomposition of an error matrix in which the errors are uniformly distributed. The derived probability function is expressed in terms of λ0j, the jth error eigenvalue; r and c, the numbers of rows and columns in the data matrix; and N, the normalization constant. This expression is tested and validated by investigations involving model data. The distribution function is used to determine the number of factors responsible for various sets of spectroscopic data taken from the chemical literature (including nuclear magnetic resonance, infrared and mass spectra).

Journal ArticleDOI
TL;DR: In this article, a comparison of the Kullback-Leibler and the least-squares cross-validation methods of smoothing parameter selection is made for nonparametric multivariate density estimation.
Abstract: In the setting of nonparametric multivariate density estimation, theorems are established which allow a comparison of the Kullback-Leibler and the least-squares cross-validation methods of smoothing parameter selection. The family of delta sequence estimators (including kernel, orthogonal series, histogram and histospline estimators) is considered. These theorems also show that either type of cross-validation can be used to compare different estimators (e.g., kernel versus orthogonal series). 1. Introduction. Consider the problem of trying to estimate a d-dimensional probability density function, f(x), using a random sample, X1, …, Xn, from f. Most proposed estimators of f depend on a "smoothing parameter," say λ ∈ ℝ+, whose selection is crucial to the performance of the estimator. In this paper, for the large class of delta sequence estimators, theorems are obtained which allow comparison of two smoothing parameter selectors which are known to be asymptotically optimal. An important consequence of these results is that either smoothing parameter selector may be used for a data-based comparison of two density estimators, for example, kernel versus orthogonal series. Another attractive feature of these results is that they are set in a quite general framework, special cases of which provide simpler proofs of several recent asymptotic optimality results. In Sections 2 and 3 the family of delta sequence estimators and the smoothing parameter selectors are given. The theorems are stated in Section 4, with some remarks in Section 5. The rest of the paper consists of proofs.
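For the kernel special case, the least-squares cross-validation score has a closed form with a Gaussian kernel. A one-dimensional sketch (sample size, kernel, and search grid are assumptions; the paper's results cover the much broader delta-sequence family):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
x = rng.standard_normal(n)
diff = x[:, None] - x[None, :]

def normal_pdf(d, s):
    return np.exp(-0.5 * (d / s) ** 2) / (s * np.sqrt(2 * np.pi))

def lscv(h: float) -> float:
    """Least-squares cross-validation score for a Gaussian kernel estimator:
    integral of fhat^2 minus twice the average leave-one-out density."""
    int_f2 = normal_pdf(diff, h * np.sqrt(2)).sum() / n ** 2
    loo = normal_pdf(diff, h)
    np.fill_diagonal(loo, 0.0)
    return int_f2 - 2.0 * loo.sum() / (n * (n - 1))

grid = np.linspace(0.05, 1.5, 60)
best_h = grid[np.argmin([lscv(h) for h in grid])]
print(best_h)
```

The Kullback-Leibler (likelihood) variant instead maximizes the sum of leave-one-out log-densities; the theorems in the paper compare the two selectors' asymptotic behavior.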

Journal ArticleDOI
Aristotelis Mantoglou
TL;DR: The spectral turning bands method (STBM) as discussed by the authors can be used to simulate many classes of multidimensional stochastic processes, such as anisotropic two-dimensional processes and spatial averaged processes.
Abstract: The space domain version of the turning bands method can simulate multidimensional stochastic processes (random fields) having particular forms of covariance functions. To alleviate this limitation, a spectral representation of the turning bands method was developed for the two-dimensional case, showing that the spectral approach allows simulation of isotropic two-dimensional processes having any covariance or spectral density function. The present paper extends the spectral turning bands method (STBM) even further for simulation of much more general classes of multidimensional stochastic processes. Particular extensions include: (i) simulation of three-dimensional processes using STBM, (ii) simulation of anisotropic two- or three-dimensional stochastic processes, (iii) simulation of multivariate stochastic processes, and (iv) simulation of spatially averaged (integrated) processes. The turning bands method transforms the multidimensional simulation problem into a sum of a series of one-dimensional simulations. Explicit and simple expressions relating the cross-spectral density functions of the one-dimensional processes to the cross-spectral density function of the multidimensional process are derived. Using such expressions the one-dimensional processes can be simulated using a simple one-dimensional spectral method. Examples illustrating that the spectral turning bands method preserves the theoretical statistics are presented. The spectral turning bands method is inexpensive in terms of computer time compared to other multidimensional simulation methods. In fact, the cost of the turning bands method grows as the square root or the cube root of the number of points simulated in the discretized random field, in the two- or three-dimensional case, respectively, whereas the cost of other multidimensional methods grows linearly with the number of simulated points. The spectral turning bands method currently is being used in hydrologic applications.
This method is also applicable to other fields where multidimensional simulations are needed, e.g., mining, oil reservoir modeling, geophysics, remote sensing, etc.

Journal ArticleDOI
TL;DR: In this article, the probability of a brittle crack formation in an elastic solid with fluctuating strength is considered, and a set of all possible crack trajectories reflecting the fluctuation of the strength field is introduced.
Abstract: Probability of a brittle crack formation in an elastic solid with fluctuating strength is considered. A set Omega of all possible crack trajectories reflecting the fluctuation of the strength field is introduced. The probability P(X) that crack penetration depth exceeds X is expressed as a functional integral over Omega of a conditional probability of the same event taking place along a particular path. Various techniques are considered to evaluate the integral. Under rather nonrestrictive assumptions, the integral is reduced to solving a diffusion-type equation. A new characteristic of fracture process, 'crack diffusion coefficient', is introduced. An illustrative example is then considered where the integration is reduced to solving an ordinary differential equation. The effect of the crack diffusion coefficient and of the magnitude of strength fluctuations on probability density of crack penetration depth is presented. Practical implications of the proposed model are discussed.

Journal ArticleDOI
Sverre Haver
TL;DR: In this article, the adequacy of a simple theoretical model for the joint distribution under stationary conditions is investigated using measurements obtained during storms in the northern North Sea. In a slightly modified form, the model is found to be reasonably accurate where the highest waves are of interest.

Journal ArticleDOI
TL;DR: In this article, it is shown that the generalized cumulative distribution curves of Liu and Jordan are not suitable for tropical locations and the problem of estimating the value of the maximum clearness index has also been addressed and a simple model is proposed to evaluate it.

Journal ArticleDOI
TL;DR: In this paper, the authors considered the recursive estimation of the first-order univariate probability density f of a vector-valued stationary process, and established sharp rates for the almost sure convergence of kernel-type estimators.

Journal ArticleDOI
TL;DR: In this article, a statistical analysis of unidirectional non-linear random waves is presented which is based on second-order random wave theory; the analysis technique is similar to a method available for the statistical analysis of the two-term Volterra series.

Journal ArticleDOI
TL;DR: In this article, a generalized probabilistic model of harmonic current injection and propagation is presented, which takes into account random variations of both the operating modes as well as the configuration of nonlinear loads connected to a distribution feeder.
Abstract: A generalized procedure to obtain the probabilistic model of power system harmonic current injection and propagation is presented. The model takes into account random variations of both the operating modes as well as the configuration of nonlinear loads connected to a distribution feeder. The method is illustrated by an example of a bus having linear as well as nonlinear loads of different types. Potential applications of the model are also discussed.
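The random operating modes and configurations described above can be illustrated by Monte Carlo summation of harmonic current phasors: each nonlinear load is connected with some probability and injects a harmonic current of random phase. All parameter values below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
trials, n_loads = 50_000, 8

# Each nonlinear load is connected with probability p and, when on, injects a
# harmonic current of fixed magnitude but random phase (all assumed values).
p, mag = 0.7, 1.0
on = rng.random((trials, n_loads)) < p
phase = rng.uniform(0, 2 * np.pi, (trials, n_loads))
total = (on * mag * np.exp(1j * phase)).sum(axis=1)

magnitude = np.abs(total)
# With independent uniform phases the expected *power* adds:
# E[|I|^2] = n_loads * p * mag^2, well below the worst case (n_loads * mag)^2.
print(magnitude.mean(), (magnitude ** 2).mean())
```

Histogramming `magnitude` gives the kind of probabilistic harmonic-level model the abstract describes, in place of a pessimistic arithmetic-sum bound.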

Journal ArticleDOI
TL;DR: In this article, a chance-constrained stochastic programming model is developed for water quality optimization, which determines the least cost allocation of waste treatment plant biochemical oxygen demand (BOD) removal efficiencies, subject to probabilistic restrictions on maximum allowable instream dissolved oxygen deficit.
Abstract: A chance-constrained stochastic programming model is developed for water quality optimization. It determines the least cost allocation of waste treatment plant biochemical oxygen demand (BOD) removal efficiencies, subject to probabilistic restrictions on maximum allowable instream dissolved oxygen deficit. The new model extends well beyond traditional approaches that assume streamflow is the sole random variable. In addition to streamflow, other random variables in the model are initial in-stream BOD level and dissolved oxygen (DO) deficit; waste outfall flow rates, BOD levels and DO deficits; deoxygenation k1, reaeration k2, and sedimentation-scour rate k3 coefficients of the Streeter-Phelps DO sag model; photosynthetic input-benthic depletion rates Ai, and nonpoint source BOD input rate Pi for the Camp-Dobbins extensions to the Streeter-Phelps model. These random variables appear in more highly aggregated terms which in turn form part of the probabilistic constraints of the water quality optimization model. Stochastic simulation procedures for estimating the probability density functions and covariances of these aggregated terms are discussed. A new chance-constrained programming variant, imbedded chance constraints, is presented along with an example application. In effect, this method imbeds a chance constraint within a chance constraint in a manner which is loosely associated with the distribution-free method of chance-constrained programming. It permits the selection of nonexpected value realizations of the mean and variance estimates employed in the deterministic equivalents of traditional chance-constrained models. As well, it provides a convenient mechanism for generating constraint probability response surfaces. A joint chance-constrained formulation is also presented which illustrates the possibility for prescription of an overall system reliability level, rather than reach-by-reach reliability assignment.
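The building block underneath such models is the deterministic equivalent of a single chance constraint under normality: P(random load ≤ designed capacity) ≥ α becomes a constraint on the mean plus a quantile multiple of the standard deviation. A minimal sketch (the numbers are invented, and this is the textbook form, not the paper's imbedded variant):

```python
from statistics import NormalDist

# Chance constraint P(demand <= capacity * x) >= alpha with Gaussian demand.
# Deterministic equivalent: capacity * x >= mu + z_alpha * sigma.
mu, sigma, alpha = 100.0, 15.0, 0.95   # illustrative data
z = NormalDist().inv_cdf(alpha)

capacity = 10.0
x_min = (mu + z * sigma) / capacity    # smallest feasible decision
print(x_min)

# Check: with x = x_min the constraint holds with probability exactly alpha.
prob = NormalDist(mu, sigma).cdf(capacity * x_min)
print(prob)
```

The imbedded variant in the paper goes a step further by treating the mean and variance estimates themselves as uncertain rather than plugging in point values.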

Journal ArticleDOI
TL;DR: In this paper, the authors consider the motion of a classical particle in a random isotropic potential arising from uniformly distributed scatterers in two dimensions and prove that in the weak coupling limit the velocity process of the particle converges in distribution to Brownian motion on a surface of constant speed.
Abstract: We consider the motion of a classical particle in a random isotropic potential arising from uniformly distributed scatterers in two dimensions. We prove that in the weak coupling limit the velocity process of the particle converges in distribution to Brownian motion on a surface of constant speed, i.e. on the circle. The resulting equation for the probability density of the particle is related to the Landau equation in plasmas.

Posted Content
TL;DR: In this article, the limiting cumulative distribution and probability density functions of the least squares estimator in a first-order autoregressive regression are tabulated for the case where the true model is near-integrated in the sense of Phillips (1986a).
Abstract: We tabulate the limiting cumulative distribution and probability density functions of the least squares estimator in a first-order autoregressive regression when the true model is near-integrated in the sense of Phillips (1986a). The results are obtained using an exact numerical method which integrates the appropriate limiting moment generating function. The adequacy of the approximation is examined by Monte Carlo methods for various first-order autoregressive processes with a root close to unity.

Journal ArticleDOI
TL;DR: The fleet deployment problem for the one origin, one destination fixed-price contract requiring the transport of a given total amount of cargo within a given period is formulated and solved for the case that one or more cost components are given staircase functions of time.
Abstract: The fleet deployment problem for the one-origin, one-destination fixed-price contract requiring the transport of a given total amount of cargo within a given period is formulated and solved for the case that one or more cost components are given staircase functions of time. A computer program has been developed to implement the solution of this problem. The fleet deployment problem with one or more costs being random variables with known probability density functions is also formulated. Analytical expressions for the basic probabilistic quantities, i.e., the probability density function, the mean and the variance of the total operating cost, are presented. Finally, sample results are presented and discussed and some extensions for further research are suggested.

Journal ArticleDOI
TL;DR: The first use of the digital-image-analysis method to determine the probability distribution function of concentration in binary mixtures undergoing phase separation and polystyrene undergoing spinodal decomposition is reported.
Abstract: We report here the first use of the digital image analysis method to determine the probability distribution function of concentration in binary mixtures undergoing phase separation. This method has been applied to the mixture of poly(vinyl methyl ether) and polystyrene undergoing spinodal decomposition. The distribution function is initially Gaussian, but it becomes broader and turns bimodal upon decomposition. This transformation of distribution is related to the change of phase structure and, particularly, to the sharpening of the interfacial region.

Journal ArticleDOI
TL;DR: In this paper, two estimators of the residuals or innovations ɛt of a discrete time series are presented, based on a θ estimator which is root-T consistent with respect to a wide class of ɛt distributions, such as a Gaussian estimator.
Abstract: . A linear stationary and invertible process yt models the second-order properties of T observations on a discrete time series, up to finitely many unknown parameters θ. Two estimators of the residuals or innovations ɛt of yt are presented, based on a θ estimator which is root-T consistent with respect to a wide class of ɛt distributions, such as a Gaussian estimator. One sets unobserved yt equal to their mean, the other treats yt as a circulant and may be best computed via two passes of the fast Fourier transform. The convergence of both estimators to ɛt is investigated. We apply the estimated ɛt to estimate the probability density function of ɛt. Kernel density estimators are shown to converge uniformly in probability to the true density. A new sub-class of linear time series models is motivated.

Journal ArticleDOI
TL;DR: In this paper, a population of identical nonlinear oscillators, subject to random forces and coupled via a mean-field interaction, is studied in the thermodynamic limit, and the model presents a nonequilibrium phase transition from a stationary to a time-periodic probability density.
Abstract: A population of identical nonlinear oscillators, subject to random forces and coupled via a mean-field interaction, is studied in the thermodynamic limit. The model presents a nonequilibrium phase transition from a stationary to a time-periodic probability density. Below the transition line, the population of oscillators is in a quiescent state with order parameter equal to zero. Above the transition line, there is a state of collective rhythmicity characterized by a time-periodic behavior of the order parameter and all moments of the probability distribution. The information entropy of the ensemble is a constant both below and above the critical line. Analytical and numerical analyses of the model are provided.

Journal ArticleDOI
TL;DR: Existing methods of hydrologic network design are reviewed and a formulation based on Shannon’s information theory is presented, which involves the computation of joint entropy terms which can be computed by discretizinghydrologic time series data collected at station locations.
Abstract: Existing methods of hydrologic network design are reviewed and a formulation based on Shannon’s information theory is presented. This type of formulation involves the computation of joint entropy terms which can be computed by discretizing hydrologic time series data collected at station locations. The computation of discrete entropy terms is straightforward but in handling large numbers of stations enormous computation time and storage is required. In order to minimize these problems, bivariate and multivariate continuous distributions are used to derive entropy terms. The information transmission at bivariate level is derived for normal, lognormal, gamma, exponential, and extreme value distributions. At the multivariate level, multivariate forms of normal and lognormal probability density functions are used. In order to illustrate the applicability of the derived information relationship for various bivariate and multivariate probability distributions, daily precipitation data for a period of two years co...
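For the bivariate normal case, the information transmitted between two stations has the closed form T(X; Y) = −½ ln(1 − ρ²) nats, which avoids discretizing the series. A quick check of that formula against a simulated record (the correlation value is an arbitrary example):

```python
import numpy as np

rho = 0.8
# Information transmission (mutual information) between two stations whose
# data follow a bivariate normal with correlation rho, in nats.
t_exact = -0.5 * np.log(1 - rho ** 2)

# Monte Carlo check: estimate rho from simulated pairs and plug it in.
rng = np.random.default_rng(8)
x = rng.standard_normal(100_000)
y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(100_000)
rho_hat = np.corrcoef(x, y)[0, 1]
t_hat = -0.5 * np.log(1 - rho_hat ** 2)

print(t_exact, t_hat)
```

Stations with high information transmission are redundant, which is the criterion such network-design formulations use to thin a gauge network.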

Journal ArticleDOI
TL;DR: A simple and rapid algorithm is presented for numerically evaluating the probability density function (pdf) of intersymbol interference (ISI) in digital transmission systems; the bit-error rate calculated from the pdf's of ISI and of Gaussian noise agrees with upper and lower bounds published by many authors.
Abstract: The paper presents a simple and rapid algorithm for numerically evaluating the probability density function (pdf) of intersymbol interference (ISI) in digital transmission systems. The results coincide with the analytical solutions available for very few cases only. The bit-error rate calculated from the pdf's of ISI and of Gaussian noise agrees with upper and lower bounds as published by many authors. Examples of the pdf of ISI in some cases of practical interest are given for various QAM-modulation schemes.
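One simple numerical route to the pdf of ISI (not necessarily the paper's exact algorithm) is to convolve the two-point distributions contributed by each interfering symbol on a fixed amplitude grid. A binary sketch with invented tap amplitudes:

```python
import numpy as np

# Interfering-tap amplitudes of a sampled pulse (illustrative values).
taps = [0.30, 0.15, 0.08, 0.04]

# Build the discrete pdf of ISI = sum_k d_k * taps[k], d_k = ±1 equiprobable,
# by convolving the two-point pmfs on a fixed amplitude grid.
step = 0.01
half = int(round(sum(taps) / step)) + 1
grid = np.arange(-half, half + 1) * step      # symmetric amplitude grid

pdf = np.zeros(len(grid))
pdf[half] = 1.0                                # start: ISI = 0 with prob. 1
for c in taps:
    shift = int(round(c / step))
    pdf = 0.5 * (np.roll(pdf, shift) + np.roll(pdf, -shift))

print(pdf.sum())   # total probability stays 1 after every convolution
```

The bit-error rate then follows by integrating the Gaussian-noise tail probability Q((d − isi)/σ) against this pdf, which is how the pdf-based rate in the abstract is formed.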

Journal ArticleDOI
TL;DR: In this article, the authors give a uniformity condition which controls variation of the function f across rays, which is somewhat more flexible than the usual regularity condition of monotonicity.

Journal ArticleDOI
TL;DR: In this paper, an analytical and adjustable model is proposed, by which the probability density of hourly global irradiance may be derived from long-term mean global irradiation, and it is tuned to a developmental data sample and verified against an independent data base.