
Showing papers on "K-distribution published in 2001"


Journal ArticleDOI
TL;DR: In this article, the authors developed a model for the probability density function (pdf) of the irradiance fluctuations of an optical wave propagating through a turbulent medium, which is a two-parameter distribution that is based on a doubly stochastic theory of scintillation.
Abstract: We develop a model for the probability density function (pdf) of the irradiance fluctuations of an optical wave propagating through a turbulent medium. The model is a two-parameter distribution that is based on a doubly stochastic theory of scintillation that assumes that small-scale irradiance fluctuations are modulated by large-scale irradiance fluctuations of the propagating wave, both governed by independent gamma distributions. The resulting irradiance pdf takes the form of a generalized K distribution that we term the gamma-gamma distribution. The two parameters of the gamma-gamma pdf are determined using a recently published theory of scintillation, using only values of the refractive-index structure parameter Cn^2 (or Rytov variance) and inner scale l0 provided with the simulation data. This enables us to directly calculate various log-irradiance moments that are necessary in the scaled plots. We make a number of comparisons with published plane wave and spherical wave simulation data over a wide range of turbulence conditions (weak to strong) that includes inner scale effects. The gamma-gamma pdf is found to generally provide a good fit to the simulation data in nearly all cases tested. © 2001 Society of Photo-Optical Instrumentation Engineers.
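The gamma-gamma pdf described above has a simple closed form, p(I) = 2(ab)^((a+b)/2)/(Γ(a)Γ(b)) · I^((a+b)/2−1) · K_(a−b)(2√(abI)), where a and b are the large- and small-scale gamma parameters. A minimal sketch for evaluating it (the values a=4, b=2 are illustrative choices, not taken from the paper):

```python
import numpy as np
from scipy.special import gammaln, kv

def gamma_gamma_pdf(I, alpha, beta):
    # p(I) = 2 (ab)^((a+b)/2) / (Gamma(a) Gamma(b))
    #        * I^((a+b)/2 - 1) * K_(a-b)(2 sqrt(ab I))
    I = np.asarray(I, dtype=float)
    log_c = (np.log(2.0) + 0.5 * (alpha + beta) * np.log(alpha * beta)
             - gammaln(alpha) - gammaln(beta))
    return np.exp(log_c + (0.5 * (alpha + beta) - 1.0) * np.log(I)) \
        * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

def trapz(y, x):
    # Simple trapezoidal rule (avoids NumPy-version differences).
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# The distribution is normalized and has unit mean irradiance.
x = np.linspace(1e-6, 20.0, 200000)
p = gamma_gamma_pdf(x, alpha=4.0, beta=2.0)
print(round(trapz(p, x), 3), round(trapz(x * p, x), 3))  # 1.0 1.0
```

In this standard parametrization the mean irradiance is normalized to one, so both the zeroth and first moments integrate to unity.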

1,033 citations


Journal ArticleDOI
TL;DR: A review of various bivariate gamma distribution models that are constructed from gamma marginals is presented in this paper, where the dependence of these models is directly or indirectly measured via Pearson's product-moment correlation coefficient.

221 citations


Journal ArticleDOI
TL;DR: In this paper, a Bayesian density estimation method based upon mixtures of gamma distributions is proposed, using a reversible jump technique that allows us to move from one mixture size to another.
Abstract: This article proposes a Bayesian density estimation method based upon mixtures of gamma distributions. It considers both the cases of known mixture size, using a Gibbs sampling scheme with a Metropolis step, and unknown mixture size, using a reversible jump technique that allows us to move from one mixture size to another. We illustrate our methods using a number of simulated datasets, generated from distributions covering a wide range of cases: single distributions, mixtures of distributions with equal means and different variances, mixtures of distributions with different means and small variances and, finally, a distribution contaminated by low-weighted distributions with different means and equal, small variances. An application to estimation of some quantities for an M/G/1 queue is given, using real E-mail data from CNR-IAMI.
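As a companion sketch to the model above: evaluating and sampling a finite gamma mixture is straightforward; the Gibbs and reversible-jump machinery of the paper is needed only for inference over the weights, shapes, scales, and mixture size. The two-component parameters below are hypothetical:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

weights = np.array([0.6, 0.4])   # mixing proportions
shapes  = np.array([2.0, 9.0])   # gamma shape parameters
scales  = np.array([1.0, 0.5])   # gamma scale parameters

def mixture_pdf(x):
    # Weighted sum of the component gamma densities.
    return sum(w * gamma.pdf(x, a, scale=s)
               for w, a, s in zip(weights, shapes, scales))

def mixture_sample(n):
    # Draw a component label first, then the corresponding gamma variate.
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.gamma(shapes[comp], scales[comp])

data = mixture_sample(50000)
print(round(data.mean(), 1))  # 3.0 = 0.6*2*1 + 0.4*9*0.5
```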

153 citations


Journal ArticleDOI
TL;DR: The Case I, Case II, and Case III distributions for the phase angle between two vectors perturbed by Gaussian noise are restated in terms of integrals with integrands containing only exponentials.
Abstract: The Case I, Case II, and Case III distributions for the phase angle between two vectors perturbed by Gaussian noise are restated in terms of integrals with integrands containing only exponentials. Added to these are a Case IV distribution in which one of the vectors is noise free; and Case V, Case VI, and Case VII distributions for the instantaneous radian frequency that results when the time between the two vectors goes to zero. In addition, existing and new forms for the probability density functions corresponding to each of the distributions are given. The results are applied to digital FM with a limiter/discriminator receiver to easily obtain the bit error probability without coding and the corresponding probability density function needed in a Chernoff bound approach to study the bit error probability with coding.
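The quantity underlying these distributions, the phase angle between two Gaussian-perturbed vectors, is easy to probe by Monte Carlo; a sketch (the paper's closed-form Case I-VII densities are not reproduced here, and the SNR value is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200000
true_angle = 0.5   # radians between the two noise-free vectors
snr = 10.0         # amplitude-to-noise-sigma ratio (arbitrary)

def noise():
    # Circular complex Gaussian perturbation.
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / snr

v1 = 1.0 + noise()
v2 = np.exp(1j * true_angle) + noise()

# Phase angle between the two perturbed vectors.
phase = np.angle(v2 * np.conj(v1))
print(round(phase.mean(), 2))  # 0.5: concentrated around the true angle
```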

66 citations


01 Jan 2001
TL;DR: It is shown that a g-multiplicative approximation to the entropy can be obtained in O(n^((1+z)/g^2)) time for distributions with sufficiently high entropy, where n is the size of the domain of the distribution and z is an arbitrarily small positive constant.
Abstract: We study the sample complexity of several basic statistical inference tasks as a function of the domain size for the underlying discrete probability distributions. Given access only to samples from two distributions over an n-element set, we want to distinguish identical pairs of distributions from pairs of distributions that have large statistical distance. We give an algorithm that uses O(n^(2/3) log n) independent samples from each distribution, runs in time linear in the sample size, makes no assumptions about the structure of the distributions, and distinguishes the case that the statistical distance between the distributions is small from the case that it is large. We also prove a lower bound of Ω(n^(2/3)) for the sample complexity. Under a related model, we show how to test, given access to samples from a distribution over an n-element set, whether the distribution is statistically close to an explicitly specified distribution. Our test uses O(n^(1/2)) samples, which matches the known tight bounds for the case when the explicit distribution is uniform. Given access to independent samples of a distribution A over the product space of two sets with n and m elements, respectively, we show how to test whether the distributions induced by A restricted to each component are independent, i.e., whether A is statistically close to A1 × A2 for some A1 over an n-element set and A2 over an m-element set. The sample complexity of our test is O(n^(2/3) m^(1/3)), assuming without loss of generality that m ≤ n. We also give a matching lower bound up to polylogarithmic factors. We consider the problem of approximating the entropy of a black-box discrete distribution in sublinear time. We show that a g-multiplicative approximation to the entropy can be obtained in O(n^((1+z)/g^2)) time for distributions with sufficiently high entropy, where n is the size of the domain of the distribution and z is an arbitrarily small positive constant.
We show that one cannot get a multiplicative approximation to the entropy in general. Even for the class of distributions to which our upper bound applies, we show a lower bound of Ω(n^(1/(2g^2))).
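The statistical distance being tested can be written down with a naive plug-in estimator, which needs on the order of n samples; the point of the sublinear testers above is precisely to beat this. A sketch of the plug-in quantity (the domain size and the two distributions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def empirical_tv(samples_a, samples_b, n):
    # Plug-in estimate of the statistical (total-variation) distance
    # from empirical frequencies over the domain {0, ..., n-1}.
    fa = np.bincount(samples_a, minlength=n) / len(samples_a)
    fb = np.bincount(samples_b, minlength=n) / len(samples_b)
    return 0.5 * np.abs(fa - fb).sum()

n = 100
p = np.full(n, 1.0 / n)                  # uniform
q = np.ones(n); q[: n // 2] = 3.0        # first half three times heavier
q /= q.sum()
a1 = rng.choice(n, size=20000, p=p)
a2 = rng.choice(n, size=20000, p=p)
b  = rng.choice(n, size=20000, p=q)

# Identical pair scores near 0; the distant pair scores near TV(p,q) = 0.25.
print(empirical_tv(a1, a2, n) < empirical_tv(a1, b, n))  # True
```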

43 citations


Journal ArticleDOI
01 Jul 2001
TL;DR: In this paper, the authors obtained general characterizations of probability distributions from relationships between failure rate and mean residual life from the original distribution and associated weighted distribution and extended particular results given by Gupta and Keating (1986), Jain et al. (1989) and Asadi (1998).
Abstract: In this paper we obtain general characterizations of probability distributions from relationships between failure rate and mean residual life from the original distribution and associated weighted distribution. Our characterization properties extend particular results given by Gupta and Keating (1986), Jain et al. (1989) and Asadi (1998). Using the theoretical results we obtain characterizations of some usual distributions.
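A quick numerical illustration of the two quantities being related: for a unit-mean exponential distribution the failure rate h(t) = f(t)/S(t) is constant and the mean residual life m(t) equals the mean (memorylessness), so h(t)·m(t) = 1 for all t. A sketch:

```python
import numpy as np
from scipy.stats import expon

t = np.linspace(0.0, 5.0, 501)
S = expon.sf(t)              # survival function of a unit-mean exponential
h = expon.pdf(t) / S         # failure (hazard) rate
# Mean residual life m(t) = (1/S(t)) * integral_t^inf S(u) du; for the
# exponential, h and m are both constant and h(t) * m(t) = 1.
print(np.allclose(h, 1.0))   # True
```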

41 citations


Proceedings ArticleDOI
04 Jul 2001
TL;DR: The statistics of log-compressed echo images are derived for a Nakagami distribution, which is more general than the Rayleigh distribution and has a lower computational cost than the K distribution, and the result is used to design an unsharp masking filter to reduce speckle.
Abstract: Using a good statistical model of speckle formation is important in designing an adaptive filter for speckle reduction in ultrasound B-scan images. Most clinical ultrasound imaging systems use a nonlinear logarithmic function to reduce the dynamic range of the input echo signal and emphasize objects with weak backscatter. Previously, the statistics of log-compressed images had been derived for the Rayleigh and K distributions. In this paper, the statistics of log-compressed echo images are derived for a Nakagami distribution, which is more general than the Rayleigh distribution and has a lower computational cost than the K distribution, and the result is used to design an unsharp masking filter to reduce speckle. To demonstrate the efficiency of the designed adaptive filter in removing speckle, we processed two original ultrasound images of kidney and liver.
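A minimal version of unsharp masking for speckle suppression (the paper adapts the gain from the log-compressed Nakagami statistics; here a fixed gain and a synthetic image are used purely for illustration):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def unsharp_mask(img, size=7, gain=0.5):
    # Local mean over a size x size window; the output keeps the smoothed
    # background and attenuates the detail term (gain < 1), which is
    # where the speckle lives.
    local_mean = uniform_filter(img.astype(float), size=size)
    return local_mean + gain * (img - local_mean)

rng = np.random.default_rng(2)
clean = np.outer(np.linspace(0.0, 1.0, 64), np.ones(64))  # smooth ramp
speckled = clean + 0.2 * rng.standard_normal((64, 64))
filtered = unsharp_mask(speckled)
print(filtered.var() < speckled.var())  # True: the noisy detail is damped
```

With gain > 1 the same filter sharpens instead of smoothing; the adaptive version varies the gain with the local speckle statistics.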

31 citations



Book ChapterDOI
01 Jan 2001
TL;DR: In this paper, it was shown that, in general, all closed convex classes of probability distributions are needed, not only the classes (used in interval computations) of all distributions located on a given interval.
Abstract: Traditionally, in science and engineering, measurement uncertainty is characterized by a probability distribution; however, often, we don’t know this probability distribution exactly, so we must consider classes of possible probability distributions. Interval computations deal with a very specific type of such classes: classes of all distributions which are located on a given interval. We show that in general, we need all closed convex classes of probability distributions.

11 citations


Proceedings ArticleDOI
01 Dec 2001
TL;DR: This paper describes a statistical method for the integration of an unlimited number of cues within a deformable model framework and presents a method for converting the resulting affine forms into the estimated Gaussian distributions of the generalized cue forces.
Abstract: In this paper we describe a statistical method for the integration of an unlimited number of cues within a deformable model framework. We treat each cue as a random variable, each of which is the sum of a large number of local contributions with unknown probability distribution functions. Under the assumption that these distributions are independent, the overall distributions of the generalized cue forces can be approximated with multidimensional Gaussians, as per the central limit theorem. Estimating the covariance matrix of these Gaussian distributions, however, is difficult, because the probability distributions of the local contributions are unknown. We use affine arithmetic as a novel approach toward overcoming these difficulties. It lets us track and integrate the support of bounded distributions without having to know their actual probability distributions, and without having to make assumptions about their properties. We present a method for converting the resulting affine forms into the estimated Gaussian distributions of the generalized cue forces. This method scales well with the number of cues. We apply a Kalman filter as a maximum likelihood estimator to merge all Gaussian estimates of the cues into a single best fit Gaussian. Its mean is the deterministic result of the algorithm, and its covariance matrix provides a measure of the confidence in the result. We demonstrate in experiments how to apply this framework to improve the results of a face tracking system.
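The final fusion step, merging the per-cue Gaussian estimates into a single maximum-likelihood Gaussian, reduces in the scalar case to a precision-weighted average; a sketch with hypothetical cue estimates:

```python
import numpy as np

def fuse_gaussians(means, variances):
    # Maximum-likelihood fusion of independent Gaussian estimates:
    # precision-weighted mean, with the fused variance equal to the
    # inverse of the summed precisions.
    w = 1.0 / np.asarray(variances, dtype=float)
    var = 1.0 / w.sum()
    mean = var * (w * np.asarray(means, dtype=float)).sum()
    return mean, var

# Two cues agree near 1.0; the confident one (small variance) dominates.
m, v = fuse_gaussians([1.0, 1.2], [0.01, 0.09])
print(round(m, 3), round(v, 4))  # 1.02 0.009
```

The fused variance is smaller than either input variance, which is the confidence measure the paper reads off the result.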

9 citations


Journal ArticleDOI
TL;DR: In this article, a Poisson process was used to derive the theoretical length distributions of the Strahler streams, and a digital elevation model (DEM) was adopted to calculate the stream lengths of four basins in Taiwan.
Abstract: One of the basic tasks in geomorphologic analysis is to know the probability distributions of the stream lengths of different orders. In practical applications, this information is useful for basin rainfall-runoff modelling. The objective of this study is to determine the length distributions of the Strahler streams. A Poisson process was used to derive the theoretical distributions. The result showed that the length distribution of the first-order stream is an exponential distribution and the second-order or higher order stream length is a gamma distribution. In order to verify the theoretical distributions, a digital elevation model (DEM) was adopted to calculate the stream lengths of four basins in Taiwan. Kolmogorov-Smirnov and chi-square tests were used to test the goodness-of-fit of the data. Results showed that the length distributions of the first- and second-order streams analysed by using DEM correspond with those from the derived distribution method.
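The goodness-of-fit step can be sketched with synthetic data: exponential first-order lengths, second-order lengths formed by summing two exponential links (hence gamma with shape 2), then Kolmogorov-Smirnov tests against the derived distributions. The scale parameter is an arbitrary choice, not a value from the Taiwan basins:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
scale = 2.0   # mean link length (arbitrary)

# First-order lengths: exponential, per the derived distribution.
lengths1 = rng.exponential(scale, size=500)
# Second-order lengths as sums of two exponential links -> gamma(shape 2).
lengths2 = rng.exponential(scale, size=250) + rng.exponential(scale, size=250)

# Kolmogorov-Smirnov goodness of fit against the derived distributions.
d1, p1 = stats.kstest(lengths1, "expon", args=(0.0, scale))
d2, p2 = stats.kstest(lengths2, "gamma", args=(2.0, 0.0, scale))
print(round(d1, 3), round(d2, 3))  # small KS statistics -> good fit
```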

Journal ArticleDOI
TL;DR: In this article, the inverse of the partition function of the 1D Ising model, as a function of the external field, is shown to be a product of Fourier transforms of compound geometric distributions, i.e., random sums (randomly stopped random walks) whose success probability depends only on the interaction constant K between sites.
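The partition function in question is conveniently computed via the standard 2x2 transfer matrix; a sketch (this is the textbook construction for a ring of n sites, not the paper's Fourier-transform decomposition):

```python
import numpy as np

def ising_partition(n, K, h):
    # Partition function of the 1D Ising ring via the 2x2 transfer
    # matrix T[s, s'] = exp(K s s' + h (s + s') / 2), s, s' in {+1, -1}.
    s = np.array([1.0, -1.0])
    T = np.exp(K * np.outer(s, s) + h * (s[:, None] + s[None, :]) / 2.0)
    return np.trace(np.linalg.matrix_power(T, n))

# Check against the zero-field closed form Z = (2 cosh K)^n + (2 sinh K)^n.
n, K = 8, 0.5
print(np.isclose(ising_partition(n, K, 0.0),
                 (2.0 * np.cosh(K)) ** n + (2.0 * np.sinh(K)) ** n))  # True
```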

Journal ArticleDOI
TL;DR: In this paper, the problem of ordering probability distributions which take values in a set having only an ordering is considered, and some fundamental properties required of such procedures are introduced.
Abstract: We consider the problem of ordering probability distributions which take values in a set having only an ordering. Using the cumulative distribution function, we are able to introduce some fundamental properties required of such procedures. We then consider ordering procedures that are based on the mapping of a probability distribution into a single value. Finally, we consider an ordering procedure that involves a pairwise comparison of the probability distributions.
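The CDF-based comparison mentioned above can be illustrated with first-order stochastic dominance on a finite ordered support; a sketch with made-up pmfs:

```python
import numpy as np

def dominates(pmf_a, pmf_b):
    # A first-order stochastically dominates B on an ordered support
    # iff A's CDF lies at or below B's everywhere.
    return bool(np.all(np.cumsum(pmf_a) <= np.cumsum(pmf_b) + 1e-12))

# Ordered support {low, medium, high}; A puts more mass on higher values.
a = np.array([0.1, 0.3, 0.6])
b = np.array([0.3, 0.4, 0.3])
print(dominates(a, b), dominates(b, a))  # True False
```

Since CDFs can also cross, such a pairwise comparison yields only a partial order, which is why single-value mappings are considered as well.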


Proceedings Article
01 Jan 2001
TL;DR: A rapid output probability calculation method in HMM based large vocabulary continuous speech recognition systems (LVCSRS) based on time-skipping of calculation, clustering of probability density distributions, and pruning of calculation is proposed.
Abstract: In this paper, we propose a rapid output probability calculation method for HMM-based large vocabulary continuous speech recognition systems (LVCSRS). This method is based on time-skipping of calculation, clustering of probability density distributions, and pruning of calculation. Only distributions covering input feature vectors with high probabilities are used to calculate output probabilities exactly, and representative distributions stand in for the other distributions to calculate them approximately. A skipping method for likelihood calculation is also adopted in the time domain. Using the rapid calculation method based on clustering of probability density distributions, the recognition time in an LVCSRS system was reduced by about 40%. Using a pruning method for likelihood calculations, it was further reduced by 25%. Finally, using time-skipping, the calculation time was reduced by another 15% without compromising recognition accuracy.
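The clustering-plus-pruning idea can be sketched as follows: score each cluster by a representative Gaussian, evaluate only the members of the best-scoring clusters exactly, and back off to the representative score elsewhere. All sizes and the unit-variance diagonal-Gaussian assumption below are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(4)

# 64 diagonal Gaussians grouped into 8 clusters, each cluster summarized
# by a representative (its centroid Gaussian).
dim, n_g, n_c = 10, 64, 8
means = rng.standard_normal((n_g, dim))
cluster_of = np.repeat(np.arange(n_c), n_g // n_c)
reps = np.array([means[cluster_of == c].mean(axis=0) for c in range(n_c)])

def log_gauss(x, mu):
    # Unit-variance diagonal Gaussian log-density (constant dropped).
    return -0.5 * np.sum((x - mu) ** 2, axis=-1)

def approx_scores(x, top=2):
    # Score clusters by their representatives, evaluate members of the
    # best `top` clusters exactly, and reuse the representative score
    # for all other distributions.
    rep_scores = log_gauss(x, reps)
    best = np.argsort(rep_scores)[-top:]
    out = rep_scores[cluster_of].copy()       # approximate scores
    for c in best:
        idx = cluster_of == c
        out[idx] = log_gauss(x, means[idx])   # exact scores
    return out

x = rng.standard_normal(dim)
print(approx_scores(x).shape)  # one score per distribution
```

Only 8 representative evaluations plus 16 exact ones replace 64 exact evaluations, which is the source of the reported speed-up.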

Journal ArticleDOI
TL;DR: In this article, moment inequalities, error bounds, and classification probability for a general class of distributions were obtained for the class of weighted distributions, and the results were used to compare experiments for weighted distributions.
Abstract: We obtain stochastic inequalities, error bounds, and classification probability for a general class of distributions. We introduce the notion of variability ordering via the probability functional, with comparisons made for the weighted and the original distributions. We present moment inequalities, comparisons, and applications. 2000 Mathematics Subject Classification. 62N05, 62B10. 1. Introduction. Weighted distributions are of tremendous practical importance in various aspects of reliability, biometry, survival analysis and renewal theory, to mention a few areas. In renewal theory the residual lifetime has a limiting distribution that is a weighted distribution with the weight function equal to the reciprocal of the hazard (failure) rate function. When observations are selected with probability proportional to their "length", the resulting distribution is referred to as a length-biased distribution. Length-biased distributions occur naturally in a wide variety of settings and are discussed by several authors, including but not limited to Gupta and Akman [4] and Zelen and Feinleib [6]. The problem of providing error bounds for exponential approximations to classes of life distributions, in particular the class of weighted distributions, is addressed in this paper. Keilson [5] suggested a measure of departure from exponentiality within the class of completely monotone distributions (mixtures of exponential distributions). These measures of departure are given in terms of ρ = |µ2/(2µ^2) − 1|, where µ2 = E(X^2) and µ = E(X). This is due to the fact that the exponential distribution satisfies ρ = 0. Brown [2] obtained bounds for the class of increasing mean residual life (IMRL) functions. The main objective of this paper is to obtain inequalities for weighted reliability measures and partial orderings via the probability functional, and to compare reliability measures for weighted and, in particular, length-biased distributions. This paper is organized as follows.
Section 2 contains some basic definitions, utility notions and comparisons. In Section 3 we present some moment inequalities for weighted reliability measures. In Section 4 some partial orderings via the probability functional are presented. The results are used to compare experiments for weighted distributions. Section 5 is concerned with comparisons for weighted and related distributions. The results are applied to length-biased mixtures of distributions.
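Keilson's departure measure is directly computable from samples; a sketch confirming ρ ≈ 0 for the exponential distribution and ρ = |20/32 − 1| = 0.375 for a Gamma(4) distribution (the gamma example is our illustration, not one from the paper):

```python
import numpy as np

def rho(samples):
    # Keilson's measure rho = |mu2 / (2 mu^2) - 1|; the exponential
    # distribution satisfies rho = 0.
    mu = samples.mean()
    mu2 = (samples ** 2).mean()
    return abs(mu2 / (2.0 * mu ** 2) - 1.0)

rng = np.random.default_rng(5)
e = rng.exponential(1.0, 200000)
g = rng.gamma(4.0, 1.0, 200000)
print(rho(e), rho(g))  # ~0 and ~0.375 (= |20/32 - 1| for Gamma(4))
```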

Journal ArticleDOI
TL;DR: The k distribution for the exponential band model has been known for some time but requires intensive computation, here a new expression is given that can be evaluated rapidly, and example calculations for water vapor are presented.
Abstract: The k distribution for the exponential band model has been known for some time but requires intensive computation. Here a new expression is given that can be evaluated rapidly, and example calculations for water vapor are presented.


Journal ArticleDOI
TL;DR: In this article, a practical trial to choose the permissible order of series expansion, especially by introducing a pre-established tolerance range for the deviation of the theoretical cumulative probability from the experimental one, is described.
Abstract: In the actual sound environment, the objective system often exhibits the non-Gaussian property, owing to physical, social and psychological factors. Many kinds of probability distributions have been proposed from various viewpoints to describe the probabilistic property of complicated random phenomena. Sometimes, a probability distribution of orthogonal series expansion type, which takes the well-known probability distribution (e.g., Gaussian, binomial distributions, etc.) as the first term and reflects the lower and higher order statistics in the expansion terms is also utilized. This paper describes a practical trial to choose the permissible order of series expansion, especially by introducing a pre-established tolerance range for the deviation of the theoretical cumulative probability from the experimental one. The minimal number of expansion terms can be rationally determined so as to satisfy the tolerance range. Finally, the effectiveness of this method is experimentally confirmed by applying it to the environmental noise.
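A sketch of the truncation rule using a Gram-Charlier (Hermite) expansion around a Gaussian first term: expansion terms are added until the maximum deviation of the theoretical cumulative probability from the experimental one falls inside a pre-established tolerance. The gamma test data and the tolerance value are illustrative assumptions, not the paper's environmental-noise data:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm

rng = np.random.default_rng(6)
data = rng.gamma(8.0, 1.0, 100000)
z = (data - data.mean()) / data.std()        # standardized observations

skew = np.mean(z ** 3)                       # sample skewness
exkurt = np.mean(z ** 4) - 3.0               # sample excess kurtosis

x = np.linspace(-4.0, 4.0, 2001)
emp_cdf = np.searchsorted(np.sort(z), x) / len(z)

def cdf_dev(n_terms):
    # Gram-Charlier A series: phi(x) [1 + (skew/6) He3 + (exkurt/24) He4],
    # truncated to the first n_terms Hermite coefficients.
    c = [1.0, 0.0, 0.0, skew / 6.0, exkurt / 24.0][:n_terms]
    pdf = norm.pdf(x) * hermeval(x, c)
    cdf = np.cumsum(pdf) * (x[1] - x[0])
    return np.abs(cdf - emp_cdf).max()

tol = 0.01   # pre-established tolerance on the CDF deviation
for n in (1, 4, 5):                          # Gaussian, +skew, +kurtosis
    if cdf_dev(n) <= tol:
        break
print(n, round(cdf_dev(n), 4))   # chosen order and its CDF deviation
```

Each added term uses a higher-order statistic of the data, so the loop realizes the idea of taking the minimal number of expansion terms that satisfies the tolerance range.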

Journal ArticleDOI
TL;DR: In this paper, the 4-parameter extended generalized gamma distribution function is proposed to find a property of the flood data using the more general probability distribution function which includes many types of probability distributions used for hydrological statistics.
Abstract: In hydrologic frequency analysis, a variety of probability distribution functions, such as the gamma, log-normal, extreme-value, and log-gamma, are often examined whether they are appropriate to the flood data or not by using some hypothesis testing methods, because the properties of the flood data are not well understood even now. In order to find a property of the flood data, we can use another method: using the more general probability distribution function which includes many types of probability distributions used for hydrological statistics. The 4-parameter extended generalized gamma distribution function is one of such distribution functions. Two types of the extended distributions are proposed. The parameter estimation method is also introduced.