
Topic

Cumulative distribution function

About: Cumulative distribution function is a research topic. Over its lifetime, 6,049 publications have been published within this topic, receiving 145,696 citations. The topic is also known as: CDF and distribution function.


Papers
Journal ArticleDOI
Abstract: Let $x$ and $y$ be two random variables with continuous cumulative distribution functions $f$ and $g$. A statistic $U$ depending on the relative ranks of the $x$'s and $y$'s is proposed for testing the hypothesis $f = g$. Wilcoxon proposed an equivalent test in the Biometrics Bulletin, December, 1945, but gave only a few points of the distribution of his statistic. Under the hypothesis $f = g$ the probability of obtaining a given $U$ in a sample of $n$ $x$'s and $m$ $y$'s is the solution of a certain recurrence relation involving $n$ and $m$. Using this recurrence relation, tables have been computed giving the probability of $U$ for samples up to $n = m = 8$. At this point the distribution is almost normal. From the recurrence relation explicit expressions for the mean, variance, and fourth moment are obtained. The $2r$th moment is shown to have a certain form which enabled us to prove that the limit distribution is normal if $m, n$ go to infinity in any arbitrary manner. The test is shown to be consistent with respect to the class of alternatives $f(x) > g(x)$ for every $x$.

9,469 citations
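For context, the $U$ statistic described above can be computed by direct pair counting, and its null mean and variance have simple closed forms. A minimal Python sketch, assuming tie-free samples; the inputs in the usage example are made up for illustration:

```python
import numpy as np

def mann_whitney_u(x, y):
    """U = number of pairs (x_i, y_j) with x_i > y_j (assumes no ties)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    u = np.sum(x[:, None] > y[None, :])
    n, m = len(x), len(y)
    mean_u = n * m / 2.0                    # E[U] under the hypothesis f = g
    var_u = n * m * (n + m + 1) / 12.0      # Var[U] under the hypothesis f = g
    return u, mean_u, var_u

# For larger samples, standardize U and use the normal limit noted in the abstract.
u, mu, var = mann_whitney_u([1.2, 3.4, 2.2], [0.5, 2.9, 1.1, 0.7])
z = (u - mu) / np.sqrt(var)
```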

Journal ArticleDOI
Abstract: Given a sequence of independent identically distributed random variables with a common probability density function, the problems of estimating the probability density function and of determining its mode are discussed. Only estimates which are consistent and asymptotically normal are constructed.

9,261 citations
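The construction this abstract refers to is the kernel (Parzen-window) density estimate. A minimal sketch with a Gaussian kernel; the bandwidth h, the evaluation grid, and the synthetic data are illustrative choices, not anything prescribed by the paper:

```python
import numpy as np

def parzen_density(samples, grid, h):
    """Gaussian-kernel (Parzen-window) density estimate on a grid of points."""
    samples = np.asarray(samples, dtype=float)[None, :]  # shape (1, n)
    grid = np.asarray(grid, dtype=float)[:, None]        # shape (m, 1)
    kernel = np.exp(-0.5 * ((grid - samples) / h) ** 2) / np.sqrt(2 * np.pi)
    return kernel.mean(axis=1) / h                       # average over the n samples

# Mode estimate: the grid point where the estimated density is largest.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)
xs = np.linspace(-2.0, 6.0, 801)
fhat = parzen_density(data, xs, h=0.3)
mode_hat = xs[np.argmax(fhat)]
```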

Journal ArticleDOI
Abstract: The test is based on the maximum difference between an empirical and a hypothetical cumulative distribution. Percentage points are tabled, and a lower bound to the power function is charted. Confidence limits for a cumulative distribution are described. Examples are given. Indications that the test is superior to the chi-square test are cited.

4,410 citations
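The statistic behind this test is the largest absolute gap between the empirical CDF of the sample and the hypothesized CDF. A minimal sketch; the standard-normal null distribution and the synthetic sample are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def ks_statistic(sample, cdf):
    """Maximum absolute difference between the empirical CDF and a given CDF."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    f0 = cdf(x)
    # The empirical CDF jumps at each order statistic, so check both sides of each jump.
    d_plus = np.max(np.arange(1, n + 1) / n - f0)
    d_minus = np.max(f0 - np.arange(0, n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(1)
d = ks_statistic(rng.normal(size=100), norm.cdf)  # compare against tabled percentage points
```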

Journal ArticleDOI
01 Apr 1967
TL;DR: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates.
Abstract: By using Shannon's sampling formula, the problem of the detection of a deterministic signal in white Gaussian noise, by means of an energy-measuring device, reduces to the consideration of the sum of the squares of statistically independent Gaussian variates. When the signal is absent, the decision statistic has a central chi-square distribution with the number of degrees of freedom equal to twice the time-bandwidth product of the input. When the signal is present, the decision statistic has a noncentral chi-square distribution with the same number of degrees of freedom and a noncentrality parameter λ equal to the ratio of signal energy to two-sided noise spectral density. Since the noncentral chi-square distribution has not been tabulated extensively enough for our purpose, an approximate form was used. This form replaces the noncentral chi-square with a modified chi-square whose degrees of freedom and threshold are determined by the noncentrality parameter and the previous degrees of freedom. Sets of receiver operating characteristic (ROC) curves are drawn for several time-bandwidth products, as well as an extended nomogram of the chi-square cumulative probability which can be used for rapid calculation of false alarm and detection probabilities. Related work in energy detection by J. I. Marcum and E. L. Kaplan is discussed.

2,972 citations
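Following the reasoning in the abstract, the false-alarm and detection probabilities are tail probabilities of a central and a noncentral chi-square distribution, respectively. A minimal sketch using SciPy; the threshold, time-bandwidth product, and noncentrality value below are illustrative, and the exact chi-square tails are used rather than the approximate form the paper adopts:

```python
from scipy.stats import chi2, ncx2

def energy_detector_probs(threshold, tw_product, lam):
    """False-alarm and detection probabilities for an energy detector.

    threshold  : decision threshold applied to the energy statistic
    tw_product : time-bandwidth product (degrees of freedom = 2 * TW)
    lam        : noncentrality parameter, i.e. signal energy divided by
                 the two-sided noise spectral density
    """
    dof = 2 * tw_product
    p_fa = chi2.sf(threshold, dof)       # signal absent: central chi-square tail
    p_d = ncx2.sf(threshold, dof, lam)   # signal present: noncentral chi-square tail
    return p_fa, p_d

# One point on a receiver operating characteristic (ROC) curve.
p_fa, p_d = energy_detector_probs(threshold=30.0, tw_product=10, lam=15.0)
```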

Journal ArticleDOI
TL;DR: It is shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\,Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.
Abstract: This paper investigates the maximal channel coding rate achievable at a given blocklength and error probability. For general classes of channels new achievability and converse bounds are given, which are tighter than existing bounds for wide ranges of parameters of interest, and lead to tight approximations of the maximal achievable rate for blocklengths $n$ as short as 100. It is also shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\,Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.

2,408 citations
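The normal approximation $C - \sqrt{V/n}\,Q^{-1}(\epsilon)$ is easy to evaluate once the capacity and dispersion of a channel are known. A minimal sketch for the binary symmetric channel with crossover probability $p$; that channel choice and its closed-form $C$ and $V$ are standard results assumed here, not taken from the abstract:

```python
import numpy as np
from scipy.stats import norm

def bsc_normal_approximation(p, n, eps):
    """Approximate maximal rate C - sqrt(V/n) * Q^{-1}(eps) for a BSC(p), in bits per use."""
    h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)         # binary entropy function
    capacity = 1.0 - h2
    dispersion = p * (1 - p) * np.log2((1 - p) / p) ** 2    # channel dispersion V
    q_inv = norm.isf(eps)                                   # Q^{-1}(eps)
    return capacity - np.sqrt(dispersion / n) * q_inv

# Approximate maximal rate at blocklength n = 500 and error probability 1e-3.
rate = bsc_normal_approximation(p=0.11, n=500, eps=1e-3)
```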

Network Information
Related Topics (5)

Estimator: 97.3K papers, 2.6M citations, 84% related
Optimization problem: 96.4K papers, 2.1M citations, 79% related
Cluster analysis: 146.5K papers, 2.9M citations, 77% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 77% related
Node (networking): 158.3K papers, 1.7M citations, 77% related
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2022    4
2021    274
2020    353
2019    272
2018    344
2017    324