
Cumulative distribution function

About: Cumulative distribution function is a research topic. Over its lifetime, 6,049 publications have been published within this topic, receiving 145,696 citations. The topic is also known as: CDF & distribution function.
Papers

Journal Article
Abstract: Let $x$ and $y$ be two random variables with continuous cumulative distribution functions $f$ and $g$. A statistic $U$ depending on the relative ranks of the $x$'s and $y$'s is proposed for testing the hypothesis $f = g$. Wilcoxon proposed an equivalent test in the Biometrics Bulletin, December 1945, but gave only a few points of the distribution of his statistic. Under the hypothesis $f = g$, the probability of obtaining a given $U$ in a sample of $n$ $x$'s and $m$ $y$'s is the solution of a certain recurrence relation involving $n$ and $m$. Using this recurrence relation, tables have been computed giving the probability of $U$ for samples up to $n = m = 8$. At this point the distribution is almost normal. From the recurrence relation, explicit expressions for the mean, variance, and fourth moment are obtained. The $2r$th moment is shown to have a certain form which enabled us to prove that the limit distribution is normal if $m, n$ go to infinity in any arbitrary manner. The test is shown to be consistent with respect to the class of alternatives $f(x) > g(x)$ for every $x$.

9,469 citations
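
A minimal sketch of how the null distribution of $U$ can be computed from a counting recurrence of the kind described in the abstract above. The specific recurrence used here (conditioning on whether the largest observation is an $x$ or a $y$) is the standard Mann-Whitney form and is an assumption for illustration, not quoted from the paper.

```python
from functools import lru_cache
from math import comb

# Counting recurrence (assumed standard Mann-Whitney form):
#   c(u; n, m) = c(u - m; n - 1, m) + c(u; n, m - 1)
# i.e. the largest of the n + m observations is either an x (it exceeds all
# m y's, adding m to U) or a y (adding nothing to U).

@lru_cache(maxsize=None)
def count(u: int, n: int, m: int) -> int:
    """Number of rank orderings of n x's and m y's giving statistic U = u under f = g."""
    if u < 0:
        return 0
    if n == 0 or m == 0:
        return 1 if u == 0 else 0
    return count(u - m, n - 1, m) + count(u, n, m - 1)

def prob_u(u: int, n: int, m: int) -> float:
    """P(U = u) under the null hypothesis; all C(n + m, n) orderings are equally likely."""
    return count(u, n, m) / comb(n + m, n)

if __name__ == "__main__":
    n = m = 8  # the table range mentioned in the abstract
    mean = sum(u * prob_u(u, n, m) for u in range(n * m + 1))
    print(f"E[U] = {mean} (theory: nm/2 = {n * m / 2})")
```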


Journal Article
Abstract: Given a sequence of independent, identically distributed random variables with a common probability density function, the problems of estimating the probability density function and of determining its mode are discussed. Only estimates which are consistent and asymptotically normal are constructed.

9,261 citations
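
The abstract above concerns estimating a density and its mode from i.i.d. samples. Below is a minimal kernel (Parzen-window) estimator sketch; the Gaussian kernel and the rule-of-thumb bandwidth are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

# Kernel density estimate f_n(x) = (1 / (n * h)) * sum_i K((x - X_i) / h),
# here with a standard normal kernel K (an illustrative choice).

def parzen_density(x_grid: np.ndarray, samples: np.ndarray, h: float) -> np.ndarray:
    """Evaluate the kernel density estimate on x_grid from the given samples."""
    u = (x_grid[:, None] - samples[None, :]) / h            # shape (grid points, n)
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)     # standard normal kernel
    return kernel.mean(axis=1) / h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(size=500)
    h = 1.06 * samples.std() * samples.size ** (-1 / 5)     # rule-of-thumb bandwidth (assumption)
    grid = np.linspace(-4.0, 4.0, 201)
    density = parzen_density(grid, samples, h)
    mode_estimate = grid[np.argmax(density)]                # mode estimate = argmax of the density estimate
    print(f"estimated mode: {mode_estimate:.3f}")
```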


Journal Article
Abstract: The test is based on the maximum difference between an empirical and a hypothetical cumulative distribution. Percentage points are tabled, and a lower bound to the power function is charted. Confidence limits for a cumulative distribution are described. Examples are given. Indications that the test is superior to the chi-square test are cited.

4,410 citations
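
A short sketch of the statistic described in the abstract above: the maximum difference between the empirical CDF and a hypothetical CDF. The standard normal null distribution in the example is chosen purely for illustration.

```python
import numpy as np
from scipy.stats import kstest, norm

def ks_statistic(samples: np.ndarray, cdf) -> float:
    """D_n = sup_x |F_n(x) - F(x)|, evaluated at the sorted sample points."""
    x = np.sort(samples)
    n = x.size
    f = cdf(x)
    d_plus = np.max(np.arange(1, n + 1) / n - f)   # empirical CDF above F after each jump
    d_minus = np.max(f - np.arange(0, n) / n)      # F above empirical CDF just before each jump
    return max(d_plus, d_minus)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=200)
    print("hand-rolled D_n:", ks_statistic(data, norm.cdf))
    print("scipy check:   ", kstest(data, "norm"))  # same statistic plus a p-value
```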


Journal Article
01 Apr 1967
TL;DR: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates.
Abstract: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates. When the signal is absent, the decision statistic has a central chi-square distribution with the number of degrees of freedom equal to twice the time-bandwidth product of the input. When the signal is present, the decision statistic has a noncentral chi-square distribution with the same number of degrees of freedom and a noncentrality parameter λ equal to the ratio of signal energy to two-sided noise spectral density. Since the noncentral chi-square distribution has not been tabulated extensively enough for our purpose, an approximate form was used. This form replaces the noncentral chi-square with a modified chi-square whose degrees of freedom and threshold are determined by the noncentrality parameter and the previous degrees of freedom. Sets of receiver operating characteristic (ROC) curves are drawn for several time-bandwidth products, as well as an extended nomogram of the chi-square cumulative probability which can be used for rapid calculation of false alarm and detection probabilities. Related work in energy detection by J. I. Marcum and E. L. Kaplan is discussed.

2,972 citations
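
The false-alarm and detection probabilities discussed in the abstract above follow directly from the central and noncentral chi-square distributions of the decision statistic. The sketch below evaluates them with scipy; the time-bandwidth product, noncentrality value, and thresholds are illustrative assumptions, not values from the paper.

```python
from scipy.stats import chi2, ncx2

def energy_detector_probs(threshold: float, time_bandwidth: float, noncentrality: float):
    """Return (false-alarm probability, detection probability) for one threshold.

    Under noise only, the statistic is central chi-square with 2*TW degrees of
    freedom; with the signal present it is noncentral chi-square with the same
    degrees of freedom and noncentrality lambda = Es / (N0 / 2).
    """
    dof = 2 * time_bandwidth                        # degrees of freedom = twice the time-bandwidth product
    p_fa = chi2.sf(threshold, dof)                  # P(statistic > threshold | signal absent)
    p_d = ncx2.sf(threshold, dof, noncentrality)    # P(statistic > threshold | signal present)
    return p_fa, p_d

if __name__ == "__main__":
    # Illustrative ROC points for TW = 5 (10 degrees of freedom) and lambda = 10.
    for thr in (10.0, 15.0, 20.0, 25.0):
        p_fa, p_d = energy_detector_probs(thr, time_bandwidth=5, noncentrality=10.0)
        print(f"threshold = {thr:5.1f}   Pfa = {p_fa:.4f}   Pd = {p_d:.4f}")
```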


Journal Article
TL;DR: It is shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.
Abstract: This paper investigates the maximal channel coding rate achievable at a given blocklength and error probability. For general classes of channels new achievability and converse bounds are given, which are tighter than existing bounds for wide ranges of parameters of interest, and lead to tight approximations of the maximal achievable rate for blocklengths $n$ as short as 100. It is also shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.

2,408 citations
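
A sketch of the normal approximation $C - \sqrt{V/n}\, Q^{-1}(\epsilon)$ quoted above, evaluated for a binary symmetric channel. The BSC capacity and dispersion formulas used here are standard results assumed for the example, not taken from the abstract.

```python
import numpy as np
from scipy.stats import norm

def bsc_normal_approximation(p: float, n: int, eps: float) -> float:
    """Approximate maximal rate (bits per channel use) at blocklength n and error probability eps."""
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)          # binary entropy of the crossover probability
    capacity = 1.0 - h                                       # C for the BSC
    dispersion = p * (1 - p) * np.log2((1 - p) / p) ** 2     # channel dispersion V for the BSC (assumed formula)
    q_inv = norm.isf(eps)                                    # Q^{-1}(eps), inverse complementary Gaussian CDF
    return capacity - np.sqrt(dispersion / n) * q_inv

if __name__ == "__main__":
    # The rate approaches capacity (about 0.5 bits/use for p = 0.11) as the blocklength grows.
    for n in (100, 500, 1000, 10000):
        print(n, round(bsc_normal_approximation(p=0.11, n=n, eps=1e-3), 4))
```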


Network Information
Related Topics (5)

Probability density function: 22.3K papers, 422.8K citations, 89% related
Moment-generating function: 3.1K papers, 86.6K citations, 87% related
Joint probability distribution: 11.2K papers, 343.3K citations, 87% related
Probability distribution: 40.9K papers, 1.1M citations, 87% related
Random variable: 29.1K papers, 674.6K citations, 86% related
Performance Metrics
No. of papers in the topic in previous years

Year  Papers
2022       4
2021     274
2020     353
2019     272
2018     344
2017     324

Top Attributes


Topic's top 5 most impactful authors

Mohamed-Slim Alouini: 72 papers, 3.3K citations
Mihajlo Stefanovic: 47 papers, 476 citations
Saralees Nadarajah: 29 papers, 269 citations
Chintha Tellambura: 25 papers, 778 citations
George K. Karagiannidis: 19 papers, 998 citations