A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
TL;DR
In this paper, it was shown that the likelihood ratio test for fixed sample size can be reduced to this form, and that for large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test.

Abstract
In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k$, where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests, $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
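The key quantity in the abstract, $m = \min_t E[e^{t(X-a)}]$, can be checked numerically. The sketch below (an illustration, not from the paper; the Bernoulli parameters and the grid-search minimizer are my own choices) computes $m$ for $X \sim \mathrm{Bernoulli}(p)$ with threshold $a < p$, the regime where $P(S_n \leqq na)$ is a large-deviation event, and compares $m^n$ with the exact binomial lower-tail probability.

```python
import math

def chernoff_index(mgf_x_minus_a, t_lo=-50.0, t_hi=0.0, steps=100_000):
    """Grid-search minimum of the moment generating function of X - a.

    For the lower-tail event {S_n <= n*a} with a < E[X], the minimizing
    t is negative, so searching t in [t_lo, 0] suffices.
    """
    best = float("inf")
    for i in range(steps + 1):
        t = t_lo + (t_hi - t_lo) * i / steps
        best = min(best, mgf_x_minus_a(t))
    return best

# Example (hypothetical parameters): X ~ Bernoulli(p), threshold a < p.
p, a, n = 0.5, 0.3, 200

# MGF of X - a at t: E[exp(t(X - a))] = exp(-t*a) * ((1 - p) + p*exp(t))
m = chernoff_index(lambda t: math.exp(-t * a) * ((1 - p) + p * math.exp(t)))

# Exact lower-tail probability P(S_n <= n*a) for the binomial sum.
k_max = math.floor(n * a)
p_exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(k_max + 1))

# P(S_n <= n*a) <= m**n holds exactly, and (1/n) log P(S_n <= n*a)
# approaches log m as n grows, which is the sense in which the tail
# "behaves roughly like m^n".
print(m, m**n, p_exact)
```

For the Bernoulli case the minimum has a closed form, $\log m = -[a\log(a/p) + (1-a)\log((1-a)/(1-p))]$, the Kullback–Leibler divergence between Bernoulli($a$) and Bernoulli($p$), so the grid search can be verified against it directly.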
Citations
Journal Article
Dealing with Uncertainty: A Survey of Theories and Practices
Yiping Li, Jianwen Chen, Ling Feng, +2 more
TL;DR: It is hoped that this study could provide insights to the database community on how uncertainty is managed in other disciplines, and further challenge and inspire database researchers to develop more advanced data management techniques and tools to cope with a variety of uncertainty issues in the real world.
Journal Article
Extremal properties of likelihood-ratio quantizers
TL;DR: Optimality properties of likelihood-ratio quantizers are established for a very broad class of quantization problems, including problems involving the maximization of an Ali-Silvey (1966) distance measure and the Neyman-Pearson variant of the decentralized detection problem.
Journal Article
Learning cost-sensitive active classifiers
TL;DR: In this paper, an active classifier is used to obtain the values of additional attributes and the penalty incurred if the classifier outputs the wrong classification, which can be useful when deciding whether to gather information relevant to a medical procedure or experiment.
Journal Article
Randomized methods for design of uncertain systems
TL;DR: The sample complexity of various constrained control problems is derived, showing the key role played by the binomial distribution and related tail inequalities, and providing the sample complexity which guarantees that the solutions obtained with SPV algorithms meet some pre-specified probabilistic accuracy and confidence.
Book
Information and Entropy Econometrics - A Review and Synthesis
TL;DR: A detailed survey of information-theoretic concepts and quantities used within econometrics can be found in this article with a focus on the interconnection between information theory, estimation, and inference.