A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
TLDR
In this paper, it is shown that the likelihood ratio test for fixed sample size can be reduced to this form and that, for large samples, a sample of size $n$ with the first test gives about the same probabilities of error as a sample of size $en$ with the second test.
Abstract:
In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests, $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
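The abstract's central estimate, that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum of the moment generating function of $X - a$, can be checked numerically. Below is a minimal sketch, not from the paper, assuming $X \sim \mathrm{Bernoulli}(p)$ with illustrative values $p = 0.5$, $a = 0.4$; for the Bernoulli case the minimum is also available in closed form as $\exp(-\mathrm{KL}(a \,\|\, p))$:

```python
import math

# Chernoff's estimate: P(S_n <= n*a) <= m**n, where
# m = min_t E[exp(t*(X - a))].  Numerical sketch for
# X ~ Bernoulli(p) with illustrative values (not from the paper).
p, a, n = 0.5, 0.4, 200

def mgf(t):
    # Moment generating function of X - a when X ~ Bernoulli(p).
    return math.exp(-t * a) * ((1 - p) + p * math.exp(t))

# Minimize over a grid of t <= 0 (the lower tail is controlled by t <= 0).
m = min(mgf(i / 1000.0) for i in range(-3000, 1))

# For Bernoulli, the minimum equals exp(-KL(a || p)) in closed form.
kl = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))
m_closed = math.exp(-kl)

# Exact lower-tail probability P(S_n <= n*a) for comparison.
tail = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(int(n * a) + 1))

print(m, m_closed)                      # grid minimum matches the closed form
print(tail, m**n)                       # the bound P(S_n <= na) <= m**n holds
print(math.log(tail) / n, math.log(m))  # (1/n) log P is near log m
```

As $n$ grows, $(1/n)\log P(S_n \leqq na)$ approaches $\log m$, which is why $\log \rho$ is the right scale for comparing two tests: matching error probabilities requires $n_1 \log \rho_1 \approx n_2 \log \rho_2$, giving the efficiency $e = \log \rho_1/\log \rho_2$.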
Citations
Journal Article
Bit error outage for diversity reception in shadowing environment
TL;DR: This letter addresses the problem of evaluating the bit error outage (BEO), i.e., the outage probability defined in terms of bit error probability, in a Rayleigh fading and shadowing environment and considers coherent detection of binary phase-shift keying with maximal ratio combining (MRC).
Parallel Tree Contraction Part 1: Fundamentals.
Gary L. Miller, John H. Reif, +1 more
TL;DR: This paper introduces parallel tree contraction, a new bottom-up technique for constructing parallel algorithms on trees that require only O(log n) time and O(n/log n) processors on a randomized PRAM, or O(n) processors on a deterministic PRAM.
Journal Article
Parallel hashing: an efficient implementation of shared memory
Anna R. Karlin, Eli Upfal, +1 more
TL;DR: A probabilistic scheme for implementing shared memory on a bounded-degree network of processors that enables n processors to store and retrieve an arbitrary set of n data items in O(log n) parallel steps is presented.
Journal Article
On the rate of convergence of empirical measures in ∞-transportation distance
TL;DR: In this article, the authors consider i.i.d. samples from continuous measures on bounded connected domains and prove an upper bound on the ∞-transportation distance between the measure and the empirical measure of the sample.
Journal Article
The Cauchy–Schwarz divergence and Parzen windowing: Connections to graph theory and Mercer kernels
TL;DR: This paper contributes a tutorial level discussion of some interesting properties of the recent Cauchy–Schwarz divergence measure between probability density functions, which brings together elements from several different machine learning fields.