Book

Quantum detection and estimation theory

TL;DR: Presents the optimum procedure for choosing between two hypotheses, along with an approximate procedure, valid at small signal-to-noise ratios, called threshold detection; a quantum counterpart of the Cramer-Rao inequality of conventional statistics sets a lower bound on the mean-square errors of parameter estimates.
Abstract: A review. Quantum detection theory is a reformulation, in quantum-mechanical terms, of statistical decision theory as applied to the detection of signals in random noise. Density operators take the place of the probability density functions of conventional statistics. The optimum procedure for choosing between two hypotheses, and an approximate procedure valid at small signal-to-noise ratios and called threshold detection, are presented. Quantum estimation theory seeks best estimators of parameters of a density operator. A quantum counterpart of the Cramer-Rao inequality of conventional statistics sets a lower bound to the mean-square errors of such estimates. Applications at present are primarily to the detection and estimation of signals of optical frequencies in the presence of thermal radiation.
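The optimum two-hypothesis test described above is the Helstrom bound: the minimum error probability for discriminating two density operators is set by the trace norm of their weighted difference. A minimal numerical sketch (the function name and the qubit example are illustrative, not from the book):

```python
import numpy as np

def helstrom_error(rho0, rho1, p0=0.5):
    """Minimum error probability for discriminating density matrices
    rho0 and rho1 with prior probabilities p0 and p1 = 1 - p0."""
    p1 = 1.0 - p0
    gamma = p1 * rho1 - p0 * rho0        # decision operator
    eigs = np.linalg.eigvalsh(gamma)     # Hermitian, so real eigenvalues
    trace_norm = np.abs(eigs).sum()      # || p1*rho1 - p0*rho0 ||_1
    return 0.5 * (1.0 - trace_norm)

# Two pure qubit states with overlap cos(theta)
theta = np.pi / 6
psi0 = np.array([1.0, 0.0])
psi1 = np.array([np.cos(theta), np.sin(theta)])
rho0 = np.outer(psi0, psi0)
rho1 = np.outer(psi1, psi1)
pe = helstrom_error(rho0, rho1)
# For equal priors and pure states this reduces to
# (1 - sqrt(1 - |<psi0|psi1>|^2)) / 2
```

For the overlap cos(π/6) above, both the eigenvalue computation and the closed-form pure-state expression give an error probability of 1/4.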
Citations
Journal ArticleDOI
16 Mar 2000-Nature
TL;DR: In information processing, as in physics, the classical world view provides an incomplete approximation to an underlying quantum reality that can be harnessed to break codes, create unbreakable codes, and speed up otherwise intractable computations.
Abstract: In information processing, as in physics, our classical world view provides an incomplete approximation to an underlying quantum reality. Quantum effects like interference and entanglement play no direct role in conventional information processing, but they can--in principle now, but probably eventually in practice--be harnessed to break codes, create unbreakable codes, and speed up otherwise intractable computations.

3,080 citations

Journal ArticleDOI
TL;DR: Quantum metrology uses quantum techniques such as entanglement to achieve higher statistical precision than purely classical approaches, for which the central limit theorem implies that error decreases only in proportion to the square root of the number of repetitions.
Abstract: The statistical error in any estimation can be reduced by repeating the measurement and averaging the results. The central limit theorem implies that the reduction is proportional to the square root of the number of repetitions. Quantum metrology is the use of quantum techniques such as entanglement to yield higher statistical precision than purely classical approaches. In this Review, we analyse some of the most promising recent developments of this research field and point out some of the new experiments. We then look at one of the major new trends of the field: analyses of the effects of noise and experimental imperfections.
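The 1/√N scaling of classical averaging invoked in this abstract is easy to check empirically. A short simulation (the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0       # standard deviation of a single measurement
trials = 5000     # independent repetitions of the whole experiment

for n in (10, 100, 1000):
    # Each row is one experiment: average n noisy measurements
    means = rng.normal(0.0, sigma, size=(trials, n)).mean(axis=1)
    # Empirical spread of the averaged estimate vs the 1/sqrt(n) prediction
    print(n, means.std(), sigma / np.sqrt(n))
```

The empirical standard error tracks σ/√n, the shot-noise limit that quantum strategies such as entangled probes can beat.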

2,977 citations

Journal ArticleDOI
TL;DR: Essential theoretical tools that have been developed to assess the security of the main experimental platforms are presented (discrete-variable, continuous-variable, and distributed-phase-reference protocols).
Abstract: Quantum key distribution (QKD) is the first quantum information task to reach the level of mature technology, already fit for commercialization. It aims at the creation of a secret key between authorized partners connected by a quantum channel and a classical authenticated channel. The security of the key can in principle be guaranteed without putting any restriction on an eavesdropper's power. This article provides a concise up-to-date review of QKD, biased toward the practical side. Essential theoretical tools that have been developed to assess the security of the main experimental platforms are presented (discrete-variable, continuous-variable, and distributed-phase-reference protocols).

2,926 citations


Cites background from "Quantum detection and estimation th..."

  • ...the mutual information depends both on Eve’s states and on the best measurement that Eve can perform to discriminate them, which can be constructed only for very specific examples of the set of states (Helstrom, 1976). c. General (or coherent) attacks. Eve’s most general strategy includes so many possible variations (she may entangle several systems flying from Alice to Bob, she may modify her attack according to the r...

    [...]

Journal ArticleDOI
TL;DR: This review focuses on continuous-variable quantum information processes that rely on any combination of Gaussian states, Gaussian operations, and Gaussian measurements, including quantum communication, quantum cryptography, quantum computation, quantum teleportation, and quantum state and channel discrimination.
Abstract: The science of quantum information has arisen over the last two decades centered on the manipulation of individual quanta of information, known as quantum bits or qubits. Quantum computers, quantum cryptography, and quantum teleportation are among the most celebrated ideas that have emerged from this new field. It was realized later on that using continuous-variable quantum information carriers, instead of qubits, constitutes an extremely powerful alternative approach to quantum information processing. This review focuses on continuous-variable quantum information processes that rely on any combination of Gaussian states, Gaussian operations, and Gaussian measurements. Interestingly, such a restriction to the Gaussian realm comes with various benefits, since on the theoretical side, simple analytical tools are available and, on the experimental side, optical components effecting Gaussian processes are readily available in the laboratory. Yet, Gaussian quantum information processing opens the way to a wide variety of tasks and applications, including quantum communication, quantum cryptography, quantum computation, quantum teleportation, and quantum state and channel discrimination. This review reports on the state of the art in this field, ranging from the basic theoretical tools and landmark experimental realizations to the most recent successful developments.
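One of the simple analytical tools available in the Gaussian realm is that a Gaussian state's von Neumann entropy follows directly from the symplectic eigenvalues of its covariance matrix. A minimal single-mode sketch (using the convention where the vacuum covariance matrix is the identity; the function name is illustrative):

```python
import numpy as np

def thermal_entropy(nbar):
    """Von Neumann entropy (in bits) of a single-mode thermal Gaussian
    state with mean photon number nbar, computed from its symplectic
    eigenvalue. Convention: vacuum covariance matrix = identity."""
    V = (2.0 * nbar + 1.0) * np.eye(2)   # thermal-state covariance matrix
    nu = np.sqrt(np.linalg.det(V))       # symplectic eigenvalue (>= 1)
    if nu <= 1.0:
        return 0.0                       # pure (vacuum) state
    def g(x):
        return ((x + 1) / 2) * np.log2((x + 1) / 2) \
             - ((x - 1) / 2) * np.log2((x - 1) / 2)
    return g(nu)

S = thermal_entropy(1.0)
# Matches the photon-number form (nbar+1)log2(nbar+1) - nbar*log2(nbar)
```

For a mean photon number of 1, both forms give an entropy of 2 bits; the vacuum gives 0.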

2,781 citations

Journal ArticleDOI
TL;DR: Any pure or mixed entangled state of two systems can be produced by two classically communicating separated observers, drawing on a supply of singlets as their sole source of entanglement.
Abstract: If two separated observers are supplied with entanglement, in the form of n pairs of particles in identical partly entangled pure states, one member of each pair being given to each observer, they can, by local actions of each observer, concentrate this entanglement into a smaller number of maximally entangled pairs of particles, for example, Einstein-Podolsky-Rosen singlets, similarly shared between the two observers. The concentration process asymptotically conserves entropy of entanglement---the von Neumann entropy of the partial density matrix seen by either observer---with the yield of singlets approaching, for large n, the base-2 entropy of entanglement of the initial partly entangled pure state. Conversely, any pure or mixed entangled state of two systems can be produced by two classically communicating separated observers, drawing on a supply of singlets as their sole source of entanglement. \textcopyright{} 1996 The American Physical Society.
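The entropy of entanglement used as the currency in this paper (the von Neumann entropy of either observer's partial density matrix) can be computed from the Schmidt coefficients of the pure state. A small sketch (the function and the two-qubit example are illustrative):

```python
import numpy as np

def entanglement_entropy(psi, dims):
    """Base-2 von Neumann entropy of the reduced state of subsystem A
    for a bipartite pure state psi with dimensions dims = (dA, dB)."""
    dA, dB = dims
    m = psi.reshape(dA, dB)
    s = np.linalg.svd(m, compute_uv=False)   # Schmidt coefficients
    p = s**2                                 # eigenvalues of reduced state
    p = p[p > 1e-12]                         # drop zero modes before log
    return float(-(p * np.log2(p)).sum())

# Partly entangled state sqrt(p)|00> + sqrt(1-p)|11>
p = 0.25
psi = np.zeros(4)
psi[0] = np.sqrt(p)        # amplitude on |00>
psi[3] = np.sqrt(1.0 - p)  # amplitude on |11>
E = entanglement_entropy(psi, (2, 2))
# E equals the binary entropy -p*log2(p) - (1-p)*log2(1-p)
```

This E is the asymptotic yield of singlets per copy that the concentration process described above achieves.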

2,633 citations


Cites background from "Quantum detection and estimation th..."

  • ...Ψ) is the original pure state’s entanglement and H = −Σ_j p_j log₂ p_j is the Shannon entropy of the measurement outcomes. All local treatments (e.g. generalized or positive-operator-valued measurements [6]) that Alice might apply can be cast in this form, if necessary by considering her operations to be performed in an appropriately enlarged Hilbert space. In particular, unitary transformations by Alic...

    [...]

  • ...eatment does not correspond to any von Neumann measurement in the original 2-dimensional spin space, but rather to a two-outcome generalized measurement or POVM (positive-operator-valued measurement) [6,7]. If the particle is not absorbed or deflected, its residual state after this treatment will be a maximally mixed state of spin up and spin down. Now suppose Alice tells Bob the result of her generaliz...

    [...]

References
Journal ArticleDOI
TL;DR: Develops methods for treating the photon statistics of arbitrary fields in fully quantum-mechanical terms, presents a general method of representing the density operator for the field, and gives a simple formulation of a superposition law for photon fields.
Abstract: Methods are developed for discussing the photon statistics of arbitrary fields in fully quantum-mechanical terms. In order to keep the classical limit of quantum electrodynamics plainly in view, extensive use is made of the coherent states of the field. These states, which reduce the field correlation functions to factorized forms, are shown to offer a convenient basis for the description of fields of all types. Although they are not orthogonal to one another, the coherent states form a complete set. It is shown that any quantum state of the field may be expanded in terms of them in a unique way. Expansions are also developed for arbitrary operators in terms of products of the coherent state vectors. These expansions are discussed as a general method of representing the density operator for the field. A particular form is exhibited for the density operator which makes it possible to carry out many quantum-mechanical calculations by methods resembling those of classical theory. This representation permits clear insights into the essential distinction between the quantum and classical descriptions of the field. It leads, in addition, to a simple formulation of a superposition law for photon fields. Detailed discussions are given of the incoherent fields which are generated by superposing the outputs of many stationary sources. These fields are all shown to have intimately related properties, some of which have been known for the particular case of blackbody radiation.
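A standard consequence of the coherent-state description reviewed here is that a coherent state |α⟩ has Poissonian photon statistics with mean |α|². A minimal numerical check (function name illustrative):

```python
import numpy as np
from math import factorial

def coherent_photon_probs(alpha, nmax):
    """Photon-number distribution |<n|alpha>|^2 of a coherent state
    |alpha>: Poissonian with mean |alpha|^2, truncated at nmax."""
    mean = abs(alpha) ** 2
    return np.array([np.exp(-mean) * mean**n / factorial(n)
                     for n in range(nmax + 1)])

probs = coherent_photon_probs(2.0, 60)
mean_n = (np.arange(61) * probs).sum()   # mean photon number, ~ |alpha|^2
```

With the truncation at n = 60 the distribution sums to 1 to numerical precision and the mean photon number reproduces |α|² = 4.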

5,372 citations

Book
01 Jan 1968
TL;DR: Detection, estimation, and modulation theory.
Abstract: Detection, estimation, and modulation theory.

3,908 citations

Book
01 Jan 1955

2,083 citations

Journal ArticleDOI
TL;DR: The problem of testing statistical hypotheses is an old one; its origin is usually connected with the name of Thomas Bayes, who gave the well-known theorem on the probabilities a posteriori of the possible causes of a given event.
Abstract: The problem of testing statistical hypotheses is an old one. Its origin is usually connected with the name of Thomas Bayes, who gave the well-known theorem on the probabilities a posteriori of the possible “causes” of a given event. Since then it has been discussed by many writers of whom we shall here mention two only, Bertrand and Borel, whose differing views serve well to illustrate the point from which we shall approach the subject. Bertrand put into statistical form a variety of hypotheses, as for example the hypothesis that a given group of stars with relatively small angular distances between them as seen from the earth, form a “system” or group in space. His method of attack, which is that in common use, consisted essentially in calculating the probability, P, that a certain character, x, of the observed facts would arise if the hypothesis tested were true. If P were very small, this would generally be considered as an indication that the hypothesis, H, was probably false, and vice versa. Bertrand expressed the pessimistic view that no test of this kind could give reliable results. Borel, however, in a later discussion, considered that the method described could be applied with success provided that the character, x, of the observed facts were properly chosen—were, in fact, a character which he terms “en quelque sorte remarquable” (in some way remarkable).
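The testing framework this paper introduces, the likelihood-ratio (Neyman-Pearson) test, fixes a false-alarm probability under H0 and maximizes detection probability under H1. For two unit-variance Gaussians it reduces to thresholding the observation; a minimal sketch (the numbers are illustrative):

```python
from math import erf, sqrt

def Q(x):
    """Gaussian tail probability P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * (1.0 - erf(x / sqrt(2.0)))

# Likelihood-ratio test between H0: x ~ N(0,1) and H1: x ~ N(mu,1).
# Monotone likelihood ratio => the test is "reject H0 if x > t".
mu = 2.0
t = 1.645                  # threshold chosen so Q(1.645) ~ 0.05
p_false_alarm = Q(t)       # P(reject H0 | H0 true), Bertrand's small P
p_detect = Q(t - mu)       # P(reject H0 | H1 true), the power of the test
```

Here the false-alarm rate is about 5% while the detection probability is about 64%, illustrating the asymmetric treatment of the two error types.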

1,552 citations