Journal ArticleDOI

Designing quantum experiments with a genetic algorithm

29 Oct 2019-Vol. 4, Iss: 4, pp 045012
TL;DR: In this paper, a genetic algorithm was proposed to find quantum states with a large quantum Fisher information (QFI), which can be seen as Schrödinger-cat-like states.
Abstract: We introduce a genetic algorithm that designs quantum optics experiments for engineering quantum states with specific properties. Our algorithm is powerful and flexible, and can easily be modified to find methods of engineering states for a range of applications. Here we focus on quantum metrology. First, we consider the noise-free case, and use the algorithm to find quantum states with a large quantum Fisher information (QFI). We find methods, which only involve experimental elements that are available with current or near-future technology, for engineering quantum states with up to a 100-fold improvement over the best classical state, and a 20-fold improvement over the optimal Gaussian state. Such states are a superposition of the vacuum with a large number of photons (around 80), and can hence be seen as Schrödinger-cat-like states. We then apply the two most dominant noise sources in our setting -- photon loss and imperfect heralding -- and use the algorithm to find quantum states that still improve over the optimal Gaussian state with realistic levels of noise. This will open up experimental and technological work in using exotic non-Gaussian states for quantum-enhanced phase measurements. Finally, we use the Bayesian mean square error to look beyond the regime of validity of the QFI, finding quantum states with precision enhancements over the alternatives even when the experiment operates in the regime of limited data.
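The abstract only names the two main ingredients of the method: a genetic algorithm and the quantum Fisher information as the figure of merit. The sketch below is a minimal, illustrative Python toy, not the paper's actual algorithm: it evolves superpositions of the vacuum with a single Fock state |N>, scored by the pure-state phase-sensing QFI F_Q = 4 Var(n), whereas the paper evolves sequences of experimental elements (squeezers, beam splitters, heralding measurements). The genome encoding, mutation step sizes, and population parameters here are invented purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

def qfi(genome):
    # Pure-state QFI for phase sensing: F_Q = 4 Var(n),
    # evaluated for the toy state sqrt(1 - p)|0> + sqrt(p)|N>.
    N, p = genome
    mean_n = p * N
    mean_n2 = p * N ** 2
    return 4.0 * (mean_n2 - mean_n ** 2)

def random_genome():
    # Genome = (Fock number N, weight p of the |N> component); ranges are arbitrary.
    return [int(rng.integers(1, 81)), float(rng.uniform(0.0, 1.0))]

def mutate(genome):
    N, p = genome
    if rng.random() < 0.5:
        N = int(np.clip(N + rng.integers(-5, 6), 1, 80))      # cap photon number at 80
    else:
        p = float(np.clip(p + rng.normal(0.0, 0.05), 0.0, 1.0))
    return [N, p]

def pick(pool):
    return pool[rng.integers(len(pool))]

population = [random_genome() for _ in range(40)]
for generation in range(200):
    population.sort(key=qfi, reverse=True)
    parents = population[:10]                                  # truncation selection
    children = [mutate([pick(parents)[0], pick(parents)[1]])   # crossover: N from one parent, p from another
                for _ in range(30)]
    population = parents + children

best = max(population, key=qfi)
N_best, p_best = best
coherent_qfi = 4.0 * p_best * N_best   # coherent state with the same mean photon number: Var(n) = <n>
print("best (N, p):", best, "QFI:", qfi(best), "coherent-state QFI:", coherent_qfi)

For this toy fitness function the optimum is N = 80 and p = 1/2, giving F_Q = N^2 = 6400 versus 4 n_mean = 160 for a coherent state with the same mean photon number; the improvement factors quoted in the abstract come from the paper's full simulation, not from this sketch.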


Citations
Journal ArticleDOI
TL;DR: A novel quantum autoencoder is developed that successfully denoises Greenberger-Horne-Zeilinger, W, Dicke, and cluster states subject to spin-flip errors and random unitary noise.
Abstract: Entangled states are an important resource for quantum computation, communication, metrology, and the simulation of many-body systems. However, noise limits the experimental preparation of such states. Classical data can be efficiently denoised by autoencoders, neural networks trained in an unsupervised manner. We develop a novel quantum autoencoder that successfully denoises Greenberger-Horne-Zeilinger, W, Dicke, and cluster states subject to spin-flip errors and random unitary noise. Various emergent quantum technologies could benefit from the proposed unsupervised quantum neural networks.

90 citations

Journal ArticleDOI
TL;DR: A Bayesian multi-parameter quantum bound is derived, the optimal measurement is constructed for the cases where the bound can be saturated in a single shot, and experiments involving a repeated sequence of these measurements are considered.
Abstract: A longstanding problem in quantum metrology is how to extract as much information as possible in realistic scenarios with not only multiple unknown parameters, but also limited measurement data and some degree of prior information. Here we present a practical solution to this: We derive a Bayesian multi-parameter quantum bound, construct the optimal measurement when our bound can be saturated for a single shot, and consider experiments involving a repeated sequence of these measurements. Our method properly accounts for the number of measurements and the degree of prior information, and we illustrate our ideas with a qubit sensing network and a model for phase imaging, clarifying the nonasymptotic role of local and global schemes. Crucially, our technique is a powerful way of implementing quantum protocols in a wide range of practical scenarios that tools such as the Helstrom and Holevo Cramér-Rao bounds cannot normally access.
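For context, the figure of merit behind phrases such as "Bayesian mean square error" and "Bayesian quantum bound" is, in the single-parameter case and in notation assumed here rather than taken from the paper,

\[
\epsilon_{\mathrm{bmse}}
  = \int \mathrm{d}\theta\, p(\theta) \sum_{m} \mathrm{Tr}\!\left[\rho_{\theta}\,\Pi_{m}\right]
    \left[\hat{\theta}(m) - \theta\right]^{2},
\]

where p(θ) is the prior, ρ_θ the parameter-dependent state, {Π_m} the POVM describing the measurement, and θ̂(m) the estimator. Unlike the QFI-based Cramér-Rao bound, which is an asymptotic statement, this quantity is well defined for a single shot and explicitly carries the prior information, which is why limited data and prior knowledge can be treated on the same footing.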

57 citations

Journal ArticleDOI
TL;DR: A scheme of quantum reservoir state preparation, based on a quantum neural network framework, takes classical optical excitation as input and provides desired quantum states as output, and can be used as a compact quantum state preparation device for emerging quantum technologies.
Abstract: We develop a scheme of quantum reservoir state preparation, based on a quantum neural network framework, which takes classical optical excitation as input and provides desired quantum states as output. We theoretically demonstrate the broad potential of our scheme by explicitly preparing a range of intriguing quantum states, including single-photon states, Schrödinger's cat states, and two-mode entangled states. This scheme can be used as a compact quantum state preparation device for emerging quantum technologies.

55 citations

Journal ArticleDOI
23 Feb 2020
TL;DR: Computer-inspired designs in quantum physics that led to laboratory experiments and inspired new scientific insights are examined.
Abstract: The design of new devices and experiments has historically relied on the intuition of human experts. Now, design inspirations from computers are increasingly augmenting the capability of scientists. We briefly overview different fields of physics that rely on computer-inspired designs using a variety of computational approaches based on topological optimization, evolutionary strategies, deep learning, reinforcement learning or automated reasoning. Then we focus specifically on quantum physics. When designing new quantum experiments, there are two challenges: quantum phenomena are unintuitive, and the number of possible configurations of quantum experiments explodes exponentially. These challenges can be overcome by using computer-designed quantum experiments. We focus on the most mature and practical approaches to find new complex quantum experiments, which have subsequently been realized in the lab. These methods rely on a highly efficient topological search, which can inspire new scientific ideas. We review several extensions and alternatives based on various optimization and machine learning techniques. Finally, we discuss what can be learned from the different approaches and outline several future directions. Designing new experiments in physics is a challenge for humans; therefore, computers have become a tool to expand scientists’ capabilities and to provide creative solutions. This Perspective article examines computer-inspired designs in quantum physics that led to laboratory experiments and inspired new scientific insights.

43 citations

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the most mature and practical approaches that scientists have used to find new complex quantum experiments, which experimentalists have subsequently realized in the laboratory, examine what can be learned from the different approaches across fields of physics, and raise several possibilities for future research.
Abstract: The design of new devices and experiments in science and engineering has historically relied on the intuition of human experts. This is changing: in many disciplines, computer-inspired design processes, also known as inverse design, have augmented the capabilities of scientists. Here we visit different fields of physics in which computer-inspired designs are applied, covering vastly diverse computational approaches based on topological optimization, evolutionary strategies, deep learning, reinforcement learning, or automated reasoning. We then turn our attention specifically to quantum physics. In the quest to design new quantum experiments, we face two challenges: first, quantum phenomena are unintuitive; second, the number of possible configurations of quantum experiments explodes combinatorially. To overcome these challenges, physicists have begun to use algorithms for computer-designed quantum experiments. We focus on the most mature and practical approaches that scientists have used to find new complex quantum experiments, which experimentalists have subsequently realized in the laboratory. The underlying idea is a highly efficient topological search, which allows for scientific interpretability. In this way, some of the computer designs have led to the discovery of new scientific concepts and ideas, demonstrating how computer algorithms can genuinely contribute to science by providing unexpected inspirations. We discuss several extensions and alternatives based on optimization and machine learning techniques, with the potential to accelerate the discovery of practical computer-inspired experiments or concepts in the future. Finally, we discuss what can be learned from the different approaches across fields of physics and raise several fascinating possibilities for future research.

43 citations

References
BookDOI
TL;DR: This book surveys the principles and applications of probability theory, ranging from plausible reasoning, elementary sampling theory, hypothesis testing, and parameter estimation to decision theory, paradoxes of probability theory, and the principles and pathology of orthodox statistical methods.
Abstract: Foreword Preface Part I. Principles and Elementary Applications: 1. Plausible reasoning 2. The quantitative rules 3. Elementary sampling theory 4. Elementary hypothesis testing 5. Queer uses for probability theory 6. Elementary parameter estimation 7. The central, Gaussian or normal distribution 8. Sufficiency, ancillarity, and all that 9. Repetitive experiments, probability and frequency 10. Physics of 'random experiments' Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle 12. Ignorance priors and transformation groups 13. Decision theory: historical background 14. Simple applications of decision theory 15. Paradoxes of probability theory 16. Orthodox methods: historical background 17. Principles and pathology of orthodox statistics 18. The Ap distribution and rule of succession 19. Physical measurements 20. Model comparison 21. Outliers and robustness 22. Introduction to communication theory References Appendix A. Other approaches to probability theory Appendix B. Mathematical formalities and style Appendix C. Convolutions and cumulants.

4,641 citations

Book
01 Jan 1969
TL;DR: The optimum procedure for choosing between two hypotheses, and an approximate procedure valid at small signal-to-noise ratios known as threshold detection, are presented; a quantum counterpart of the Cramér-Rao inequality of conventional statistics sets a lower bound on the mean-square errors of estimates of the parameters of a density operator.
Abstract: A review. Quantum detection theory is a reformulation, in quantum-mechanical terms, of statistical decision theory as applied to the detection of signals in random noise. Density operators take the place of the probability density functions of conventional statistics. The optimum procedure for choosing between two hypotheses, and an approximate procedure valid at small signal-to-noise ratios and called threshold detection, are presented. Quantum estimation theory seeks best estimators of parameters of a density operator. A quantum counterpart of the Cramer-Rao inequality of conventional statistics sets a lower bound to the mean-square errors of such estimates. Applications at present are primarily to the detection and estimation of signals of optical frequencies in the presence of thermal radiation.
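The "quantum counterpart of the Cramér-Rao inequality" mentioned above is usually written, in modern notation (assumed here, not necessarily Helstrom's original), as

\[
\left\langle \left(\hat{\theta} - \theta\right)^{2} \right\rangle \;\ge\; \frac{1}{\nu\, F_{Q}(\theta)},
\qquad
F_{Q}(\theta) = \mathrm{Tr}\!\left[\rho_{\theta} L_{\theta}^{2}\right],
\qquad
\partial_{\theta}\rho_{\theta} = \tfrac{1}{2}\left(L_{\theta}\rho_{\theta} + \rho_{\theta} L_{\theta}\right),
\]

for an unbiased estimator, where ν is the number of independent repetitions and L_θ is the symmetric logarithmic derivative. The quantity F_Q is the quantum Fisher information that the genetic algorithm in the main paper maximizes.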

3,931 citations

Journal ArticleDOI
TL;DR: Type-II noncollinear phase matching in parametric down conversion produces true entanglement: No part of the wave function must be discarded, in contrast to previous schemes.
Abstract: We report on a high-intensity source of polarization-entangled photon pairs with high momentum definition. Type-II noncollinear phase matching in parametric down conversion produces true entanglement: No part of the wave function must be discarded, in contrast to previous schemes. With two-photon fringe visibilities in excess of 97%, we demonstrated a violation of Bell's inequality by over 100 standard deviations in less than 5 min. The new source allowed ready preparation of all four of the EPR-Bell states.
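For reference, the four EPR-Bell states mentioned at the end of the abstract are, in the polarization basis and in standard notation (assumed here),

\[
|\psi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|H\rangle_{1}|V\rangle_{2} \pm |V\rangle_{1}|H\rangle_{2}\right),
\qquad
|\phi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|H\rangle_{1}|H\rangle_{2} \pm |V\rangle_{1}|V\rangle_{2}\right).
\]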

2,639 citations

Journal ArticleDOI
TL;DR: In this article, the authors propose a new technique, the squeezed-state technique, that allows one to decrease the photon-counting error while increasing the radiation-pressure error, or vice versa.
Abstract: The interferometers now being developed to detect gravitational waves work by measuring the relative positions of widely separated masses. Two fundamental sources of quantum-mechanical noise determine the sensitivity of such an interferometer: (i) fluctuations in number of output photons (photon-counting error) and (ii) fluctuations in radiation pressure on the masses (radiation-pressure error). Because of the low power of available continuous-wave lasers, the sensitivity of currently planned interferometers will be limited by photon-counting error. This paper presents an analysis of the two types of quantum-mechanical noise, and it proposes a new technique---the "squeezed-state" technique---that allows one to decrease the photon-counting error while increasing the radiation-pressure error, or vice versa. The key requirement of the squeezed-state technique is that the state of the light entering the interferometer's normally unused input port must be not the vacuum, as in a standard interferometer, but rather a "squeezed state"---a state whose uncertainties in the two quadrature phases are unequal. Squeezed states can be generated by a variety of nonlinear optical processes, including degenerate parametric amplification.
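To make the trade-off concrete: in the convention where the quadratures obey [X_1, X_2] = i/2 and the vacuum has uncertainties ΔX_1 = ΔX_2 = 1/2 (conventions differ between papers; this one is assumed here), a squeezed state with squeeze parameter r has

\[
\Delta X_{1} = \tfrac{1}{2} e^{-r}, \qquad
\Delta X_{2} = \tfrac{1}{2} e^{+r}, \qquad
\Delta\phi \sim \frac{e^{-r}}{\sqrt{\bar{n}}},
\]

so injecting squeezed vacuum into the normally unused port reduces the photon-counting contribution to the phase error below the shot-noise level \(1/\sqrt{\bar{n}}\), at the price of a correspondingly larger radiation-pressure disturbance, which is exactly the trade-off described above.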

2,582 citations

Journal ArticleDOI
TL;DR: In this article, the authors review the original theory and its improvements, give a few examples of experimental two-qubit gates, and discuss the use of realistic components, the errors they induce in the computation, and how these errors can be corrected.
Abstract: Linear optics with photon counting is a prominent candidate for practical quantum computing. The protocol by Knill, Laflamme, and Milburn [2001, Nature (London) 409, 46] explicitly demonstrates that efficient scalable quantum computing with single photons, linear optical elements, and projective measurements is possible. Subsequently, several improvements on this protocol have started to bridge the gap between theoretical scalability and practical implementation. The original theory and its improvements are reviewed, and a few examples of experimental two-qubit gates are given. The use of realistic components, the errors they induce in the computation, and how these errors can be corrected is discussed.
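As a reminder of what "linear optical elements" means operationally (standard conventions assumed here, not taken from the review), a lossless beam splitter and a phase shifter act on the mode creation operators as

\[
a^{\dagger} \to \cos\theta\, a^{\dagger} + \sin\theta\, b^{\dagger}, \qquad
b^{\dagger} \to -\sin\theta\, a^{\dagger} + \cos\theta\, b^{\dagger}, \qquad
a^{\dagger} \to e^{i\phi} a^{\dagger},
\]

i.e. they only mix modes linearly. The point of the Knill-Laflamme-Milburn protocol reviewed here is that adding single-photon sources and projective photon-counting measurements to such passive elements is already sufficient, in principle, for scalable quantum computing.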

2,483 citations