Author

Marius Krumm

Bio: Marius Krumm is an academic researcher from the Austrian Academy of Sciences. The author has contributed to research in the topics Quantum & Quantum process. The author has an h-index of 4 and has co-authored 12 publications receiving 142 citations. Previous affiliations of Marius Krumm include the University of Western Ontario and Heidelberg University.

Papers
Journal ArticleDOI
TL;DR: This work studies how compatibility with thermodynamics constrains the structure of quantum theory, showing that two thermodynamically motivated postulates already imply important aspects of quantum theory such as self-duality and analogues of projective measurements, subspaces and eigenvalues.
Abstract: Despite its enormous empirical success, the formalism of quantum theory still raises fundamental questions: why is nature described in terms of complex Hilbert spaces, and what modifications of it could we reasonably expect to find in some regimes of physics? Here we address these questions by studying how compatibility with thermodynamics constrains the structure of quantum theory. We employ two postulates that any probabilistic theory with reasonable thermodynamic behavior should arguably satisfy. In the framework of generalized probabilistic theories, we show that these postulates already imply important aspects of quantum theory, like self-duality and analogues of projective measurements, subspaces and eigenvalues. However, they may still admit a class of theories beyond quantum mechanics. Using a thought experiment by von Neumann, we show that these theories admit a consistent thermodynamic notion of entropy, and prove that the second law holds for projective measurements and mixing procedures. Furthermore, we study additional entropy-like quantities based on measurement probabilities and convex decomposition probabilities, and uncover a relation between one of these quantities and Sorkin's notion of higher-order interference.

49 citations
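A familiar quantum special case of the second-law statement above is that a projective measurement whose outcome is not recorded (Lüders dephasing) can never decrease the von Neumann entropy. The sketch below checks this numerically for a qubit; the state and the measurement basis are arbitrary illustrative choices, not taken from the paper, whose actual result is proved in the more general GPT setting.

```python
# Minimal numerical check (standard quantum case, not the paper's GPT setting):
# dephasing a state in a projective measurement basis never decreases entropy.
import numpy as np

def von_neumann_entropy(rho):
    """Shannon entropy of the eigenvalues of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Illustrative qubit state: partly mixed, with coherences in the measurement basis.
rho = np.array([[0.7, 0.3], [0.3, 0.3]], dtype=complex)

# Projective measurement in the computational basis, outcome not recorded:
# rho -> sum_i P_i rho P_i, which here just removes the off-diagonal terms.
projectors = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
rho_after = sum(P @ rho @ P for P in projectors)

print(von_neumann_entropy(rho), von_neumann_entropy(rho_after))
```

For this state the entropy rises from roughly 0.58 to roughly 0.88 bits; it stays constant exactly when the state is already diagonal in the measurement basis.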

Journal ArticleDOI
TL;DR: In this article, the spectral entropy is defined as the entropy of the unique probabilities in a state's decomposition into perfectly distinguishable pure states, a convex abstraction of the spectral decomposition of density matrices.
Abstract: In this note we lay some groundwork for the resource theory of thermodynamics in general probabilistic theories (GPTs). We consider theories satisfying a purely convex abstraction of the spectral decomposition of density matrices: that every state has a decomposition, with unique probabilities, into perfectly distinguishable pure states. The spectral entropy, and analogues using other Schur-concave functions, can be defined as the entropy of these probabilities. We describe additional conditions under which the outcome probabilities of a fine-grained measurement are majorized by those for a spectral measurement, and therefore the "spectral entropy" is the measurement entropy (and therefore concave). These conditions are (1) projectivity, which abstracts aspects of the Lueders-von Neumann projection postulate in quantum theory, in particular that every face of the state space is the positive part of the image of a certain kind of projection operator called a filter; and (2) symmetry of transition probabilities. The conjunction of these, as shown earlier by Araki, is equivalent to a strong geometric property of the unnormalized state cone known as perfection: that there is an inner product according to which every face of the cone, including the cone itself, is self-dual. Using some assumptions about the thermodynamic cost of certain processes that are partially motivated by our postulates, especially projectivity, we extend von Neumann's argument that the thermodynamic entropy of a quantum system is its spectral entropy to generalized probabilistic systems satisfying spectrality.

44 citations
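In the quantum special case, the spectral entropy described above is simply the Shannon entropy of a density matrix's eigenvalues, and the majorization statement says that the outcome distribution of any fine-grained measurement is at least as mixed as the spectrum, so the spectral measurement has the smallest measurement entropy. The snippet below illustrates both facts; the qubit state and the second measurement basis are arbitrary choices made for the example.

```python
# Quantum illustration of the abstract's two objects: the spectral entropy
# (entropy of the eigenvalue distribution) and a fine-grained measurement
# whose outcome probabilities are majorized by the spectrum.
import numpy as np

def shannon(p):
    """Shannon entropy, in bits, of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

rho = np.array([[0.75, 0.2], [0.2, 0.25]], dtype=complex)  # illustrative state

# Spectral measurement: measure in the eigenbasis of rho.
spectrum = np.sort(np.linalg.eigvalsh(rho))[::-1]

# A fine-grained (rank-one projective) measurement in a different basis.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
outcome_probs = sorted(
    (float(np.real(v.conj() @ rho @ v)) for v in (plus, minus)), reverse=True
)

# Majorization for two outcomes reduces to comparing the largest probabilities,
# and Schur concavity of the Shannon entropy then orders the two entropies.
print(spectrum[0] >= outcome_probs[0])              # True
print(shannon(spectrum) <= shannon(outcome_probs))  # True: spectral entropy is minimal
```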

Journal ArticleDOI
TL;DR: Only the author and affiliation front matter was captured for this entry.
Abstract: Marius Krumm (1, 2), Howard Barnum (3, 4), Jonathan Barrett (5), and Markus P. Müller (1, 6, 7, 2); (1) Department of Applied Mathematics, University of Western Ontario, London, ON N6A 5BY, Canada; (2) Department of Theoretical Physics, University of Heidelberg, Heidelberg, Germany; (3) Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM, USA; (4) Institute for Theoretical Physics and Riemann Center for Geometry and Physics, Leibniz University Hannover, Appelstrasse 2, 30163 Hannover, Germany; (5) Department of Computer Science, University of Oxford, Oxford, UK; (6) Department of Philosophy, University of Western Ontario, London, ON N6A 5BY, Canada; (7) Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5, Canada. (Dated: August 15, 2016)

43 citations

Posted Content
TL;DR: In this thesis, the connection between quantum theory, thermodynamics and information theory is investigated within the framework of generalized probabilistic theories, and a thought experiment by von Neumann is adapted to obtain a natural definition of thermodynamic entropy.
Abstract: This thesis investigates the connection between quantum theory, thermodynamics and information theory. Theories with structure similar to that of quantum theory are considered, mathematically described by the framework of "Generalized Probabilistic Theories". For these theories, a thought experiment by von Neumann is adapted to obtain a natural thermodynamic entropy definition, following a proposal by J. Barrett. Mathematical properties of this entropy are compared to physical consequences of the thought experiment. The validity of the second law of thermodynamics is investigated. In that context, observables and projective measurements are generalized to prove an entropy increase for projective measurements of ensembles. Information-theoretically motivated definitions of the entropy are compared to the entropy from the thermodynamic thought experiment. The conditions for the thermodynamic entropy to be well-defined are considered in greater detail. Several further properties of the theories under consideration (e.g. whether there is higher order interference, Pfister's state discrimination principle) and their relation to entropy are investigated.

11 citations

Journal ArticleDOI
TL;DR: A "Bell witness" is constructed that certifies nonlocality with fewer measurements than possible without such spatiotemporal symmetries, suggesting a new class of semi-device-independent protocols for quantum technologies.
Abstract: Nonlocality, as demonstrated by the violation of Bell inequalities, enables device-independent cryptographic tasks that do not require users to trust their apparatus. In this article, we consider devices whose inputs are spatiotemporal degrees of freedom, e.g. orientations or time durations. Without assuming the validity of quantum theory, we prove that the devices' statistical response must respect their input's symmetries, with profound foundational and technological implications. We exactly characterize the bipartite binary quantum correlations in terms of local symmetries, indicating a fundamental relation between spacetime and quantum theory. For Bell experiments characterized by two input angles, we show that the correlations are accounted for by a local hidden variable model if they contain enough noise, but conversely must be nonlocal if they are pure enough. This allows us to construct a "Bell witness" that certifies nonlocality with fewer measurements than possible without such spatiotemporal symmetries, suggesting a new class of semi-device-independent protocols for quantum technologies.

6 citations
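For orientation, the snippet below runs the textbook CHSH computation with angle-valued inputs and singlet correlations E(a, b) = -cos(a - b); any value with |S| > 2 certifies nonlocality. This is a generic Bell-type check, not the specific symmetry-based witness constructed in the paper.

```python
# Textbook CHSH value for angle inputs, assuming singlet correlations.
# |S| > 2 rules out local hidden variable models.
import numpy as np

def E(a, b):
    """Correlator of the two +/-1 outcomes for measurement angles a and b."""
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2          # Alice's measurement angles
b1, b2 = np.pi / 4, -np.pi / 4   # Bob's measurement angles

S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.83 > 2: nonlocal correlations
```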


Cited by
Journal Article
TL;DR: In this article, general quantum information processing (QIP) in quantum wires is explained, the development of nonlinear and quantum massive optics is introduced, and a model of electronic beats and quantum wires is presented.
Abstract: General quantum information processing (QIP) in quantum wires is explained. The development of nonlinear and quantum massive optics is introduced. A model of electronic beats and quantum wires is presented. The design of quantum bits and entangled states, the related theory, and an experiment to test Bell's inequality are described in detail. A method to study QIP in quantum wires is proposed.

107 citations

Journal ArticleDOI
TL;DR: This work proposes a resource theory of quantum thermodynamics without a background temperature, so that no states at all come for free, and applies this resource theory to the case of many non-interacting systems, and shows that states are classified by their entropy and average energy.
Abstract: Several recent results on thermodynamics have been obtained using the tools of quantum information theory and resource theories. So far, the resource theories utilized to describe thermodynamics have assumed the existence of an infinite thermal reservoir, by declaring that thermal states at some background temperature come for free. Here, we propose a resource theory of quantum thermodynamics without a background temperature, so that no states at all come for free. We apply this resource theory to the case of many noninteracting systems and show that all quantum states are classified by their entropy and average energy, even arbitrarily far away from equilibrium. This implies that thermodynamics takes place in a two-dimensional convex set that we call the energy-entropy diagram. The answers to many resource-theoretic questions about thermodynamics can be read off from this diagram, such as the efficiency of a heat engine consisting of finite reservoirs, or the rate of conversion between two states. This allows us to consider a resource theory which puts work and heat on an equal footing, and serves as a model for other resource theories.

87 citations
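The energy-entropy diagram can already be pictured for a single qubit: every state maps to a point (average energy, von Neumann entropy), and thermal states trace out the diagram's upper boundary. The sketch below computes a few such points; the two-level Hamiltonian and the inverse temperatures are illustrative choices, not taken from the paper.

```python
# Points on the energy-entropy diagram for one qubit: each thermal state at
# inverse temperature beta gives a pair (average energy, entropy in bits).
import numpy as np

energies = np.array([0.0, 1.0])  # illustrative two-level Hamiltonian spectrum

def thermal_point(beta):
    """Return (average energy, von Neumann entropy in bits) of the Gibbs state."""
    w = np.exp(-beta * energies)
    p = w / w.sum()                       # Gibbs populations
    return float(p @ energies), float(-np.sum(p * np.log2(p)))

for beta in (0.1, 1.0, 10.0):
    print(beta, thermal_point(beta))
# As beta grows the point moves toward (0, 0); as beta -> 0 it approaches (0.5, 1),
# the maximally mixed state, which sits at the top of the diagram's boundary.
```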

Journal ArticleDOI
TL;DR: In this paper, the authors show how a given set of observations can be manifested in an operational theory, and characterize consistency conditions limiting the range of possible extensions of the operational theory.
Abstract: The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper presents the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach: starting from a few basic, physically motivated assumptions, we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework, classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.

72 citations
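A minimal concrete instance of the framework reviewed above is the "gbit", a standard textbook example of a generalized probabilistic theory whose normalized state space is a square: states are vectors, and effects are linear functionals that must assign a probability between 0 and 1 to every state. The toy check below only illustrates this state/effect structure and is not a construction taken from the review.

```python
# Toy GPT consistency check for the "gbit" (square state space), a standard
# textbook example; states are embedded as vectors (1, x, y) with |x|, |y| <= 1.
import numpy as np
from itertools import product

# Pure states: the four corners of the square.
pure_states = [np.array([1.0, x, y]) for x, y in product((-1.0, 1.0), repeat=2)]

def is_valid_effect(e):
    """An effect is valid if 0 <= e . omega <= 1 on all pure states;
    by linearity this extends to the whole convex state space."""
    return all(-1e-12 <= float(e @ w) <= 1.0 + 1e-12 for w in pure_states)

unit_effect = np.array([1.0, 0.0, 0.0])   # answers "yes" on every normalized state
x_effect = np.array([0.5, 0.5, 0.0])      # fiducial "is x = +1?" measurement outcome
too_strong = np.array([0.5, 0.5, 0.5])    # assigns probability 1.5 at one corner

print(is_valid_effect(unit_effect), is_valid_effect(x_effect), is_valid_effect(too_strong))
# True True False
```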

Journal ArticleDOI
TL;DR: In this paper, it was shown that some of the essential machinery of quantum computation, such as reversible controlled transformations and the phase kick-back mechanism, can be found in any operationally defined theory with a consistent notion of information.
Abstract: The advent of quantum computing has challenged classical conceptions of which problems are efficiently solvable in our physical world. This motivates the general study of how physical principles bound computational power. In this paper we show that some of the essential machinery of quantum computation—namely reversible controlled transformations and the phase kick-back mechanism—exists in any operationally defined theory with a consistent notion of information. These results provide the tools for an exploration of the physics underpinning the structure of computational algorithms. We investigate the relationship between interference behaviour and computational power, demonstrating that non-trivial interference behaviour is a general resource for post-classical computation. In proving the above, we connect higher-order interference to the existence of post-quantum particle types, potentially providing a novel experimental test for higher-order interference. Finally, we conjecture that theories with post-quantum interference—the higher-order interference of Sorkin—can solve problems intractable even on a quantum computer.

62 citations
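The phase kick-back mechanism mentioned above has a compact quantum illustration: applying a controlled-U to an eigenstate of U leaves the target untouched and writes U's eigenphase onto the control. The snippet below verifies this for a small diagonal unitary; the particular U, eigenstate and phase are arbitrary illustrative choices, while the paper formulates the mechanism for general operationally defined theories.

```python
# Phase kick-back in ordinary quantum theory: a controlled-U acting on an
# eigenstate of U transfers the eigenphase to the control qubit.
import numpy as np

phi = 0.3 * np.pi
U = np.diag([1.0, np.exp(1j * phi)])   # target unitary with eigenphase phi on |1>
eigenstate = np.array([0.0, 1.0])      # eigenstate of U with eigenvalue exp(i*phi)

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # control prepared in |+>
controlled_U = np.block([
    [np.eye(2), np.zeros((2, 2))],
    [np.zeros((2, 2)), U],
])

state_in = np.kron(plus, eigenstate)       # control (tensor) target
state_out = controlled_U @ state_in

# Project out the (unchanged) target factor and inspect the control qubit:
control_out = state_out.reshape(2, 2) @ eigenstate.conj()
expected = np.array([1.0, np.exp(1j * phi)]) / np.sqrt(2)
print(np.allclose(control_out, expected))  # True: the phase was "kicked back"
```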
