Open Access · Journal Article · DOI

A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the sum of Observations

Herman Chernoff
01 Dec 1952 · Vol. 23, Iss. 4, pp. 493-507
TLDR
In this paper, it was shown that the likelihood ratio test for fixed sample size can be reduced to this form, and that for large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test.
Abstract
In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
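The key fact in the abstract — that $P(S_n \leqq na)$ behaves roughly like $m^n$, where $m$ is the minimum of the moment generating function of $X - a$ — can be checked numerically. The sketch below (a Bernoulli example of my own choosing, not taken from the paper) approximates the index $\rho$ by a grid search over $t \leqq 0$ and compares the resulting bound $\rho^n$ with the exact binomial tail probability.

```python
import math

# Chernoff's index: P(S_n <= n*a) behaves like rho^n, where
# rho = min_t E[exp(t*(X - a))], minimised over t <= 0 for a lower tail.
def chernoff_index(mgf, a, t_grid):
    """Grid-search approximation of min_t mgf(t) * exp(-t*a)."""
    return min(mgf(t) * math.exp(-t * a) for t in t_grid)

# Illustrative choice (not from the paper): X ~ Bernoulli(p)
p, a, n = 0.5, 0.25, 100
mgf = lambda t: (1 - p) + p * math.exp(t)  # E[e^{tX}]

t_grid = [-k * 0.001 for k in range(1, 5001)]  # t in [-5, 0)
rho = chernoff_index(mgf, a, t_grid)
bound = rho ** n  # Chernoff bound on P(S_n <= n*a)

# Exact lower-tail probability for comparison (Binomial(n, 1/2))
exact = sum(math.comb(n, k) for k in range(int(n * a) + 1)) / 2 ** n

print(f"rho = {rho:.4f}")               # matches exp(-KL(0.25 || 0.5))
print(f"bound rho^n = {bound:.3e}")
print(f"exact P(S_n <= na) = {exact:.3e}")
```

For this example the grid minimum lands near $t^* = \log\frac{a(1-p)}{p(1-a)}$, giving $\rho \approx 0.8774$; the bound $\rho^n$ is within about an order of magnitude of the exact tail, consistent with the "behaves roughly like $m^n$" claim.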


Citations
Journal ArticleDOI

Lautum Information

TL;DR: This work investigates an alternative measure of dependence: the lautum information, defined as the divergence between the product-of-marginals and joint distributions, i.e., swapping the arguments in the definition of mutual information.
Posted Content

Commitment Capacity of Discrete Memoryless Channels

TL;DR: This work introduces and solves the problem of characterising the optimal rate at which a discrete memoryless channel can be used for bit commitment, and provides a lower bound on the channel's capacity for implementing coin tossing.
Book

Lecture notes on bucket algorithms

Luc Devroye
TL;DR: The connection between the expected time of various bucket algorithms and the distribution of the data is explained, and results are illustrated on standard searching, sorting and selection problems, as well as on a variety of problems in computational geometry and operations research.
Proceedings ArticleDOI

A theory of wormhole routing in parallel computers

TL;DR: Simulation results suggest that the idea of random initial delays is not only useful for theoretical analysis but may actually improve the performance of wormhole routing algorithms.
Proceedings ArticleDOI

Reliable clustering on uncertain graphs

TL;DR: This paper examines the problem of clustering uncertain graphs using a possible-worlds model, in which the most reliable clusters are discovered in the presence of uncertainty.