Journal ArticleDOI

A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations

01 Dec 1952-Annals of Mathematical Statistics (Institute of Mathematical Statistics)-Vol. 23, Iss: 4, pp 493-507
TL;DR: In this paper, it is shown that the likelihood ratio test for fixed sample size can be reduced to a test based on the sum of independent observations, that each such test has an associated index $\rho$, and that for large samples, a sample of size $n$ with the first test gives about the same probabilities of error as a sample of size $en$ with the second test, where $e = \log \rho_1/\log \rho_2$.
Abstract: In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
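
The sketch below is a minimal numerical illustration of the quantities in the abstract, using our own example choices (a Bernoulli chance variable, optimizer bounds, and toy hypothesis distributions, none of which come from the paper). It evaluates $m = \inf_t E[e^{t(X - a)}]$ and compares $m^n$ with the exact tail $P(S_n \leqq na)$, and it computes an index $\rho$ for a simple discrete hypothesis pair as the Chernoff coefficient $\min_{0 \leqq t \leqq 1} \sum_x P_0(x)^{1-t} P_1(x)^t$, from which two tests would be compared via $e = \log \rho_1/\log \rho_2$.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Illustrative example only: the Bernoulli model, thresholds, and optimizer
# bounds below are our own choices, not taken from the paper.

def m_left_tail(mgf_centered):
    """m = inf_t E[exp(t*(X - a))]; for the left tail P(S_n <= n*a) the
    infimum is attained at some t <= 0, so we search over t in [-50, 0]."""
    res = minimize_scalar(mgf_centered, bounds=(-50.0, 0.0), method="bounded")
    return res.fun

# X ~ Bernoulli(p), so S_n is Binomial(n, p); take a < p (left tail is rare).
p, a, n = 0.5, 0.3, 200
mgf = lambda t: (1 - p) * np.exp(-t * a) + p * np.exp(t * (1 - a))
m = m_left_tail(mgf)
print("m^n   =", m ** n)                       # Chernoff's m^n behaviour
print("exact =", binom.cdf(int(n * a), n, p))  # exact P(S_n <= n*a)

def rho(P0, P1):
    """Index rho of the likelihood-ratio test for simple H0 vs H1 on a finite
    alphabet: rho = min_{0<=t<=1} sum_x P0(x)^(1-t) * P1(x)^t."""
    f = lambda t: float(np.sum(P0 ** (1 - t) * P1 ** t))
    return minimize_scalar(f, bounds=(0.0, 1.0), method="bounded").fun

P0 = np.array([0.5, 0.5])   # H0: fair coin
P1 = np.array([0.7, 0.3])   # H1: biased coin
print("rho   =", rho(P0, P1))
# Given two tests with indices rho1 and rho2, e = log(rho1)/log(rho2):
# n observations with test 1 give roughly the error probabilities of
# e*n observations with test 2.
```
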
Citations
Journal ArticleDOI
TL;DR: This work provides a finite-key security analysis for QKD which is valid against arbitrary information leakage from the state preparation process of the legitimate users, and evaluates the security of a leaky decoy-state BB84 protocol with biased basis choice.
Abstract: Security proofs of quantum key distribution (QKD) typically assume that the devices of the legitimate users are perfectly shielded from the eavesdropper. This assumption is, however, very hard to meet in practice, and thus the security of current QKD implementations is not guaranteed. Here, we fill this gap by providing a finite-key security analysis for QKD which is valid against arbitrary information leakage from the state preparation process of the legitimate users. For this, we extend the techniques introduced by Tamaki et al (2016 New J. Phys. 18 065008) to the finite-key regime, and we evaluate the security of a leaky decoy-state BB84 protocol with biased basis choice, which is one of the most implemented QKD schemes today. Our simulation results demonstrate the practicability of QKD over long distances and within a reasonable time frame given that the legitimate users' devices are sufficiently isolated.

28 citations

Journal ArticleDOI
TL;DR: The existence of evasive pseudorandom distributions, which are not only sparse but also have the property that no polynomial-time algorithm can find an element in their support except with negligible probability, is proved independently of any intractability assumption.
Abstract: The existence of sparse pseudorandom distributions is proved. These are probability distributions concentrated in a very small set of strings, yet it is infeasible for any polynomial‐time algorithm to distinguish between truly random coins and coins selected according to these distributions. It is shown that such distributions can be generated by (nonpolynomial) probabilistic algorithms, while probabilistic polynomial‐time algorithms cannot even approximate all the pseudorandom distributions. Moreover, we show the existence of evasive pseudorandom distributions which are not only sparse, but also have the property that no polynomial‐time algorithm may find an element in their support, except for a negligible probability. All these results are proved independently of any intractability assumption. © 1992 Wiley Periodicals, Inc.

28 citations

Posted Content
TL;DR: This work shows that the statistics of data from syndrome measurements can be used for estimation of the parameters of an error channel, including the ability to correct away the invertible part of the error channel once it is estimated, and for hypothesis testing to distinguish error channels.
Abstract: Syndrome measurements made in quantum error correction contain more information than is typically used. We show that the statistics of data from syndrome measurements can be used to do the following: (i) estimation of parameters of an error channel, including the ability to correct away the invertible part of the error channel once it is estimated; (ii) hypothesis testing (or model selection) to distinguish error channels, e.g., to determine if the errors are correlated. The unifying theme is to make use of all of the information in the statistics of the data collected from syndrome measurements using machine learning and control algorithms.
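
A hedged sketch of the hypothesis-testing use of syndrome statistics described above, under our own toy model (not the cited paper's): each round of syndrome measurement yields an independent bit that fires with probability $q_0$ under one error channel and $q_1$ under another, so distinguishing the channels reduces to a test based on the sum of observations, exactly the setting of the present paper.

```python
import numpy as np

# Toy model (our assumption, not the cited paper's): each syndrome round gives
# a bit equal to 1 with probability q0 under channel H0 and q1 under H1,
# independently across rounds.  The decision rule is then a threshold test on
# S_n = sum of syndrome bits, as in Chernoff's setting.

def decide_channel(syndrome_bits, q0, q1):
    """Likelihood-ratio test between H0 (rate q0) and H1 (rate q1)."""
    n = len(syndrome_bits)
    s = int(np.sum(syndrome_bits))
    # log-likelihood ratio log(P1/P0) for a count of s ones out of n rounds
    llr = s * np.log(q1 / q0) + (n - s) * np.log((1 - q1) / (1 - q0))
    return "H1" if llr > 0 else "H0"

rng = np.random.default_rng(0)
bits = rng.random(500) < 0.05      # simulate 500 rounds under H0 (q0 = 0.05)
print(decide_channel(bits, q0=0.05, q1=0.12))  # prints "H0" with high probability
```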

28 citations


Cites background from "A Measure of Asymptotic Efficiency ..."

  • ...The Chernoff bound [22, 23] for a binomial distribution can be used to upper bound R....

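The excerpt above does not spell out $R$, so the following is only a generic illustration of the step it refers to: upper-bounding a binomial upper tail with the multiplicative Chernoff bound $P(B \geq (1+\delta)\mu) \leq \big(e^{\delta}/(1+\delta)^{1+\delta}\big)^{\mu}$, $\mu = np$, which follows from minimizing the moment generating function as in this paper.

```python
import numpy as np
from scipy.stats import binom

# Generic illustration only (the cited excerpt does not define R): compare the
# multiplicative Chernoff bound for a Binomial(n, p) upper tail with the exact
# tail probability.

def chernoff_upper_tail(n, p, delta):
    """Upper bound on P(B >= (1 + delta) * n * p) for B ~ Binomial(n, p)."""
    mu = n * p
    return (np.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

n, p, delta = 1000, 0.1, 0.5
threshold = int(np.ceil((1 + delta) * n * p))            # = 150
print("Chernoff bound:", chernoff_upper_tail(n, p, delta))
print("exact tail    :", binom.sf(threshold - 1, n, p))  # P(B >= threshold)
```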

Proceedings ArticleDOI
01 Jun 2017
TL;DR: The main result gives the optimal expected length of a lossless compressor when the community signal is strong enough, a condition on the edge probabilities and the data distributions, which can take place below the exact recovery threshold of the SBM.
Abstract: This paper investigates the fundamental limits for compressing data on graphs, exploiting dependencies due to community structures in the graph. The source model, referred to as the data block model (DBM), is a mixture of discrete memoryless sources determined by the community structure of a stochastic block model (SBM). The main result gives the optimal expected length of a lossless compressor when the community signal is strong enough, a condition on the edge probabilities and the data distributions, which can take place below the exact recovery threshold of the SBM. This is derived in part by obtaining the threshold for exact recovery in SBMs with strong side information, a result of independent interest, which extends the CH-divergence threshold. Finally we discuss compressing data with almost exact recovery algorithms.

28 citations


Additional excerpts

  • ...[16] Define the Chernoff information $C(P_1\|P_0)$ between $P_1$ and $P_0$ as $C(P_1\|P_0) \triangleq \min_{\lambda\in[0,1]} \max\{D(P_\lambda\|P_1),\, D(P_\lambda\|P_0)\} = D(P_{\lambda^*}\|P_1) = D(P_{\lambda^*}\|P_0)$, where the tilted distribution $P_\lambda$ is defined as...

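A small numerical check of the quoted definition, using our own toy alphabet and distributions (not from the citing paper): the tilted distribution is taken as $P_\lambda \propto P_0^{1-\lambda} P_1^{\lambda}$, and at the minimizing $\lambda^*$ the two divergences coincide, their common value being the Chernoff information.

```python
import numpy as np

# Numerical check of C(P1||P0) = min_{lam in [0,1]} max{D(P_lam||P1), D(P_lam||P0)}
# with the tilted distribution P_lam(x) proportional to P0(x)**(1-lam) * P1(x)**lam.
# The toy distributions below are our own choice.

def kl(P, Q):
    """Kullback-Leibler divergence D(P || Q) in nats (finite alphabet)."""
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

def chernoff_information(P0, P1, grid=10001):
    best_val, best_lam = np.inf, None
    for lam in np.linspace(0.0, 1.0, grid):
        t = P0 ** (1 - lam) * P1 ** lam
        P_lam = t / t.sum()                      # tilted distribution
        val = max(kl(P_lam, P1), kl(P_lam, P0))
        if val < best_val:
            best_val, best_lam = val, lam
    return best_val, best_lam

P0 = np.array([0.5, 0.3, 0.2])
P1 = np.array([0.2, 0.3, 0.5])
C, lam_star = chernoff_information(P0, P1)
print(f"C(P1||P0) = {C:.4f} at lambda* = {lam_star:.3f}")
# At lambda*, D(P_lam*||P1) and D(P_lam*||P0) agree (up to grid resolution).
```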

Journal ArticleDOI
F.B. Shepherd, Adrian Vetta
TL;DR: It is proved that in multiwavelength multifiber transparent networks the cost of transparency all but disappears under moderate traffic load, which suggests that the cost savings from using wavelength converters are significant only in young networks with relatively few fibers lit.
Abstract: We consider the problem of network design in transparent, or clear channel, optical networks associated with wavelength-division multiplexing (WDM). We focus on the class of traffic engineering models known as routing, wavelength, and capacity assignment problems. Here, in contrast to traditional networks, traffic flow paths must also be assigned an end-to-end wavelength. This additional requirement means that there can be an increased cost associated with optimal capacity allocations for such WDM flows; in general, this cost can be arbitrarily worse than for traditional network designs. We argue that, in order to evaluate the benefit of different switch technologies, a good benchmark is to measure the increase in costs purely in terms of link capacity; we call this the cost of transparency. Experimental research shows that this cost is small in multifiber networks with modest switching functionality at the nodes. We present theoretical justification for why this occurs, and prove that in multiwavelength multifiber transparent networks the cost of transparency all but disappears under moderate traffic load. Our arguments are based on efficient heuristics that may also be useful for more complex network optimizations. This suggests that the cost savings from using wavelength converters are significant only in young networks with relatively few fibers lit. Such savings may, thus, be small relative to the initial capital expense involved in installing wavelength conversion.

28 citations


Cites methods from "A Measure of Asymptotic Efficiency ..."

  • ...To do this we will apply Chernoff’s Theorem; note that we are in the special case in which , for all and ....


  • ...Then, we obtain our result using Chernoff bounds....


  • ...The following theorem can be obtained via techniques due to Chernoff [5]....


References