Journal ArticleDOI

A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations

01 Dec 1952-Annals of Mathematical Statistics (Institute of Mathematical Statistics)-Vol. 23, Iss: 4, pp 493-507
TL;DR: In this paper, it was shown that the likelihood ratio test for fixed sample size can be reduced to this form, and that, for large samples, a sample of size $n$ with the first test gives about the same probabilities of error as a sample of size $en$ with the second test.
Abstract: In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
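
To make the index concrete, the following is a minimal numerical sketch (not from the paper): it assumes, purely for illustration, that $X$ is Bernoulli with $p = 0.5$ and that the threshold is $a = 0.3$, and compares $(1/n)\log P(S_n \leqq na)$ with $\log m$, where $m$ is the minimum of the moment generating function of $X - a$.

```python
# Illustrative sketch only: X ~ Bernoulli(p) and the threshold a are
# assumptions for this example, not values from the paper.
import numpy as np
from scipy.stats import binom

p, a, n = 0.5, 0.3, 200

# m = min_t E[exp(t (X - a))], found by a simple grid search over t.
t = np.linspace(-20.0, 20.0, 400_001)
mgf = np.exp(-t * a) * ((1.0 - p) + p * np.exp(t))
m = mgf.min()

# Exact lower-tail probability P(S_n <= n a) for the binomial sum.
exact = binom.cdf(np.floor(n * a), n, p)

print(f"log m                    = {np.log(m): .5f}")
print(f"(1/n) log P(S_n <= n a)  = {np.log(exact) / n: .5f}")
```

The two numbers agree up to a subexponential correction, which is the sense in which $P(S_n \leqq na)$ "behaves roughly like $m^n$".
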
Citations
Book ChapterDOI
16 Feb 1989
TL;DR: It is shown that for some positive constant c it is not feasible to approximate Independent Set (for graphs of n nodes) within a factor of n^c, provided Maximum 2-Satisfiability does not have a randomized polynomial time approximation scheme.
Abstract: We show that for some positive constant c it is not feasible to approximate Independent Set (for graphs of n nodes) within a factor of n^c, provided Maximum 2-Satisfiability does not have a randomized polynomial time approximation scheme. We also study reductions preserving the quality of approximations and exhibit complete problems.

36 citations

Journal ArticleDOI
TL;DR: In this article, it is shown that, under reasonable conditions on Θ, both errors of tests of the hypothesis that the law of a sample of size n belongs to Θ go to zero like exp[−αn] and exp[−βn].
Abstract: Under reasonable conditions on Θ, both errors of tests of the hypothesis that the law of a sample of size n belongs to Θ go to zero like exp[−αn] and exp[−βn]. We shall determine the best possible values for α and β and give a construction of sequences of tests for which the errors decrease at such an optimal rate.
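
For a concrete feel for such exponent pairs, here is a sketch of the classical simple-vs-simple case (an illustration under assumed Bernoulli laws, not this paper's composite-Θ construction): the tilted law P_λ(x) ∝ P0(x)^(1−λ) P1(x)^λ realizes the exponent pair (α, β) = (KL(P_λ‖P0), KL(P_λ‖P1)), and equal exponents correspond to the Chernoff information.

```python
# Sketch of the classical simple-vs-simple exponent tradeoff.
# P0 and P1 are assumed Bernoulli laws chosen only for illustration.
import numpy as np

def kl(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p)."""
    return q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))

p0, p1 = 0.3, 0.6

for lam in np.linspace(0.1, 0.9, 9):
    # Normalized geometric mixture (exponential tilt) of the two laws.
    w1 = p0 ** (1 - lam) * p1 ** lam
    w0 = (1 - p0) ** (1 - lam) * (1 - p1) ** lam
    q = w1 / (w0 + w1)                 # tilted success probability
    alpha, beta = kl(q, p0), kl(q, p1)
    print(f"lam={lam:.1f}  alpha={alpha:.4f}  beta={beta:.4f}")
```
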

36 citations

Proceedings ArticleDOI
04 Jan 1995
TL;DR: The method that is used to construct efficient packet routing schedules is based on the algorithmic form of the Lovász local lemma discovered by Beck (1991), and it is shown how to parallelize the algorithm so that it runs in NC.
Abstract: Leighton, Maggs and Rao (1988) showed that for any network and any set of packets whose paths through the network are fixed and edge-simple, there exists a schedule for routing the packets to their destinations in O(c+d) steps using constant-size queues, where c is the congestion of the paths in the network, and d is the length of the longest path (the dilation). The proof, however, used the Lovász (1975) local lemma and was not constructive. In this paper, we show how to find such a schedule in O(NE + E log^ε E) time, for any fixed ε > 0, where N is the total number of packets, and E is the number of edges in the network. We also show how to parallelize the algorithm so that it runs in NC. The method that we use to construct efficient packet routing schedules is based on the algorithmic form of the Lovász local lemma discovered by Beck (1991).
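
As a reminder of what the two parameters in the O(c + d) bound measure, the sketch below only illustrates the definitions (congestion c = maximum number of paths sharing an edge, dilation d = length of the longest path); the toy paths are assumptions for illustration, not the paper's scheduling algorithm.

```python
# Compute congestion and dilation for a fixed set of edge-simple paths.
# The paths below are made up for illustration.
from collections import Counter

paths = [
    ["a", "b", "c", "d"],
    ["e", "b", "c", "f"],
    ["g", "c", "d"],
]

edge_load = Counter()
for path in paths:
    for u, v in zip(path, path[1:]):
        edge_load[frozenset((u, v))] += 1   # undirected edges assumed here

c = max(edge_load.values())                 # congestion
d = max(len(p) - 1 for p in paths)          # dilation (edges on longest path)
print(f"congestion c = {c}, dilation d = {d}  ->  schedule length O(c + d)")
```
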

36 citations

Proceedings Article
09 Jul 2016
TL;DR: Both the direct and indirect mechanisms achieve the reliability target in a dominant-strategy equilibrium, select a small number of agents to prepare, and do so at low cost and with much lower variance in payments than the spot auction.
Abstract: We study the problem of incentivizing reliable demand-response in modern electricity grids. Each agent is uncertain about her future ability to reduce demand and is therefore unreliable. Agents who choose to participate in a demand-response scheme may be paid when they respond and penalized otherwise. The goal is to reliably achieve a demand reduction target while selecting a minimal set of agents from those willing to participate. We design incentive-aligned, direct and indirect mechanisms. The direct mechanism elicits both response probabilities and costs, while the indirect mechanism elicits willingness to accept a penalty in the case of non-response. We benchmark against a spot auction, in which demand reduction is purchased from agents when needed. Both the direct and indirect mechanisms achieve the reliability target in a dominant-strategy equilibrium, select a small number of agents to prepare, and do so at low cost and with much lower variance in payments than the spot auction.

36 citations


Cites methods from "A Measure of Asymptotic Efficiency ..."

  • ...Hence, we adopt the Fourier Transform approach in the experiments reported here. [Footnote 3:] The Chernoff bound [Chernoff, 1952] and other large deviation bounds can also be used to approximate this expression, and would...

    [...]
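
The expression being approximated is elided in the snippet above, so the following is only a generic illustration of the kind of bound the excerpt refers to: the Chernoff bound on the upper tail of a sum of independent Bernoulli variables, min over t > 0 of exp(−tk) · ∏_i E[exp(t X_i)]. All parameter values are assumptions for the example.

```python
# Generic Chernoff upper bound on P(sum_i X_i >= k) for independent
# Bernoulli X_i; the probabilities and threshold are illustrative only.
import numpy as np

def chernoff_upper_tail(probs, k, t_grid=np.linspace(1e-3, 10.0, 10_000)):
    """Chernoff bound on P(sum_i X_i >= k), X_i ~ Bernoulli(probs[i])."""
    probs = np.asarray(probs)
    # log E[exp(t X_i)] = log(1 + p_i (e^t - 1)), summed over i for each t.
    log_mgf = np.sum(np.log1p(probs[:, None] * np.expm1(t_grid[None, :])), axis=0)
    return np.exp(np.min(log_mgf - t_grid * k))

probs = np.full(100, 0.1)                  # 100 independent coins with p = 0.1
print(chernoff_upper_tail(probs, k=25))    # bound on P(at least 25 successes)
```
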
