Cumulative distribution function
About: Cumulative distribution function is a research topic. Over its lifetime, 6,049 publications have been published within this topic, receiving 145,696 citations. The topic is also known as: CDF & distribution function.
Papers published on a yearly basis
Abstract: Let $x$ and $y$ be two random variables with continuous cumulative distribution functions $f$ and $g$. A statistic $U$ depending on the relative ranks of the $x$'s and $y$'s is proposed for testing the hypothesis $f = g$. Wilcoxon proposed an equivalent test in the Biometrics Bulletin, December, 1945, but gave only a few points of the distribution of his statistic. Under the hypothesis $f = g$, the probability of obtaining a given $U$ in a sample of $n$ $x$'s and $m$ $y$'s is the solution of a certain recurrence relation involving $n$ and $m$. Using this recurrence relation, tables have been computed giving the probability of $U$ for samples up to $n = m = 8$. At this point the distribution is almost normal. From the recurrence relation, explicit expressions for the mean, variance, and fourth moment are obtained. The $2r$th moment is shown to have a certain form which enabled us to prove that the limit distribution is normal if $m, n$ go to infinity in any arbitrary manner. The test is shown to be consistent with respect to the class of alternatives $f(x) > g(x)$ for every $x$.
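The recurrence described above can be sketched in code. A minimal version, assuming the convention $U = \#\{(x_i, y_j): x_i > y_j\}$: if the largest observation is an $x$ it contributes $m$ pairs, otherwise it contributes none, giving the count recurrence $a(u; n, m) = a(u - m; n - 1, m) + a(u; n, m - 1)$. The function names and base cases below are my own choices, not from the paper.

```python
from functools import lru_cache
from math import comb

def u_counts(n, m):
    """Number of rank orderings giving each value of U = #{(x_i, y_j): x_i > y_j},
    computed from the recurrence a(u; n, m) = a(u - m; n-1, m) + a(u; n, m-1)."""
    @lru_cache(maxsize=None)
    def a(u, n_, m_):
        if u < 0:
            return 0
        if n_ == 0 or m_ == 0:
            return 1 if u == 0 else 0
        # largest observation is an x (adds m_ pairs) or a y (adds none)
        return a(u - m_, n_ - 1, m_) + a(u, n_, m_ - 1)
    return [a(u, n, m) for u in range(n * m + 1)]

# Under f = g, all C(n+m, n) rank orderings are equally likely.
counts = u_counts(2, 2)
total = comb(4, 2)
probs = [c / total for c in counts]
```

Dividing the counts by $\binom{n+m}{n}$ gives the null distribution of $U$; the mean works out to $nm/2$, consistent with the explicit moment expressions mentioned in the abstract.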
Abstract: Given a sequence of independent, identically distributed random variables with a common probability density function, the problems of estimating the probability density function and of determining the mode of a probability density function are discussed. Only estimates which are consistent and asymptotically normal are constructed.
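A minimal sketch of this kind of density estimate, using a Parzen window with a Gaussian kernel; the bandwidth `h` and the grid-search mode estimate are illustrative choices of mine, not the paper's construction.

```python
import math

def kde(sample, h):
    """Parzen-window density estimate with a Gaussian kernel of bandwidth h.
    Returns a function x -> estimated density f_hat(x)."""
    n = len(sample)
    norm = 1.0 / (n * h * math.sqrt(2 * math.pi))
    def f_hat(x):
        return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample)
    return f_hat

def mode_estimate(sample, h, grid):
    """Crude mode estimate: argmax of the density estimate over a grid."""
    f_hat = kde(sample, h)
    return max(grid, key=f_hat)

sample = [-1.0, -0.5, 0.0, 0.5, 1.0]
grid = [i / 10 for i in range(-30, 31)]
mode = mode_estimate(sample, 0.5, grid)  # symmetric sample, so the mode is 0
```

The estimate is a density (it is nonnegative and integrates to one), and under suitable bandwidth sequences $h_n \to 0$ estimators of this form are consistent and asymptotically normal, which is the property the abstract emphasizes.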
Abstract: The test is based on the maximum difference between an empirical and a hypothetical cumulative distribution. Percentage points are tabled, and a lower bound to the power function is charted. Confidence limits for a cumulative distribution are described. Examples are given. Indications that the test is superior to the chi-square test are cited.
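The test statistic described above — the maximum difference between the empirical and hypothesized cumulative distributions — can be sketched directly; the function name below is mine, and the percentage points tabled in the paper are not reproduced here.

```python
def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDF of `sample` and the
    hypothesized continuous CDF `cdf`."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        fx = cdf(x)
        # empirical CDF jumps from (i-1)/n to i/n at x, so check both sides
        d = max(d, i / n - fx, fx - (i - 1) / n)
    return d

# example: three points tested against the uniform CDF F(x) = x on [0, 1]
d = ks_statistic([0.25, 0.5, 0.75], lambda x: x)
```

The observed `d` is compared against the tabled percentage points to accept or reject the hypothesized distribution; inverting the same bound gives the confidence limits for a cumulative distribution mentioned in the abstract.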
01 Apr 1967
TL;DR: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates.
Abstract: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates. When the signal is absent, the decision statistic has a central chi-square distribution with the number of degrees of freedom equal to twice the time-bandwidth product of the input. When the signal is present, the decision statistic has a noncentral chi-square distribution with the same number of degrees of freedom and a noncentrality parameter λ equal to the ratio of signal energy to two-sided noise spectral density. Since the noncentral chi-square distribution has not been tabulated extensively enough for our purpose, an approximate form was used. This form replaces the noncentral chi-square with a modified chi-square whose degrees of freedom and threshold are determined by the noncentrality parameter and the previous degrees of freedom. Sets of receiver operating characteristic (ROC) curves are drawn for several time-bandwidth products, as well as an extended nomogram of the chi-square cumulative probability which can be used for rapid calculation of false alarm and detection probabilities. Related work in energy detection by J. I. Marcum and E. L. Kaplan is discussed.
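The signal-absent case above is easy to sketch: with an even number of degrees of freedom $2u$ (twice the time-bandwidth product), the central chi-square tail has the closed form $P(X > T) = e^{-T/2} \sum_{j=0}^{u-1} (T/2)^j / j!$, so the false-alarm probability needs no tables. This is a sketch of that closed form only; it does not reproduce the paper's noncentral chi-square approximation for the detection probability.

```python
import math

def chi2_sf_even(threshold, dof):
    """Tail probability P(X > threshold) for a central chi-square variable
    with an even number dof = 2u of degrees of freedom, via the finite
    Poisson sum exp(-T/2) * sum_{j<u} (T/2)^j / j!."""
    assert dof > 0 and dof % 2 == 0
    u = dof // 2
    t = threshold / 2.0
    term, total = 1.0, 1.0          # j = 0 term
    for j in range(1, u):
        term *= t / j               # (T/2)^j / j!, built incrementally
        total += term
    return math.exp(-t) * total

def false_alarm_prob(threshold, time_bandwidth):
    """Energy detector false-alarm probability: dof is twice the
    time-bandwidth product, as in the abstract."""
    return chi2_sf_even(threshold, 2 * time_bandwidth)
```

Sweeping the threshold and pairing each false-alarm probability with the corresponding detection probability (from the noncentral distribution) traces out the ROC curves the paper tabulates.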
TL;DR: It is shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.
Abstract: This paper investigates the maximal channel coding rate achievable at a given blocklength and error probability. For general classes of channels new achievability and converse bounds are given, which are tighter than existing bounds for wide ranges of parameters of interest, and lead to tight approximations of the maximal achievable rate for blocklengths $n$ as short as 100. It is also shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.
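The normal approximation $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$ is simple to evaluate once $C$ and $V$ are known. As an illustrative sketch, the block below uses the binary symmetric channel with crossover probability $p$, for which $C = 1 - h(p)$ and $V = p(1-p)\log_2^2\frac{1-p}{p}$ in bits; the bisection-based $Q^{-1}$ is my own stopgap for a proper inverse-Gaussian routine.

```python
import math

def q_inv(eps, lo=0.0, hi=10.0, iters=200):
    """Inverse of the Gaussian Q-function Q(x) = 0.5 * erfc(x / sqrt(2)),
    found by bisection (Q is strictly decreasing on [lo, hi])."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > eps:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def normal_approx_rate(p, n, eps):
    """C - sqrt(V/n) * Q^{-1}(eps) for a binary symmetric channel with
    crossover probability p; capacity C and dispersion V are in bits."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # binary entropy
    C = 1 - h
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2        # BSC dispersion
    return C - math.sqrt(V / n) * q_inv(eps)

rate = normal_approx_rate(p=0.11, n=1000, eps=1e-3)
```

The backoff term $\sqrt{V/n}\, Q^{-1}(\epsilon)$ shrinks like $1/\sqrt{n}$, which is why the approximation is already tight at blocklengths as short as a few hundred.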