Showing papers in "Foundations of Physics in 2014"


Journal ArticleDOI
TL;DR: In this article, the authors argue that the case for "gravitizing" quantum theory is at least as strong as that for quantizing gravity, and that the principles of general relativity must influence, and actually change, the very formalism of quantum mechanics.
Abstract: This paper argues that the case for “gravitizing” quantum theory is at least as strong as that for quantizing gravity. Accordingly, the principles of general relativity must influence, and actually change, the very formalism of quantum mechanics. Most particularly, an “Einsteinian”, rather than a “Newtonian” treatment of the gravitational field should be adopted, in a quantum system, in order that the principle of equivalence be fully respected. This leads to an expectation that quantum superpositions of states involving a significant mass displacement should have a finite lifetime, in accordance with a proposal previously put forward by Diosi and the author.
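
A one-line quantitative version of the proposal may help orient the reader; the following is the standard Diosi–Penrose estimate as it appears in the wider literature (the symbol \(E_\Delta\) is ours): a superposition of two states whose mass distributions differ is expected to decay on a timescale set by the gravitational self-energy \(E_\Delta\) of the difference between the two mass distributions,

\[
\tau \;\approx\; \frac{\hbar}{E_\Delta}.
\]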

254 citations


Journal ArticleDOI
TL;DR: In this paper, a class of nonlocally modified gravity models proposed to explain the current phase of cosmic acceleration without dark energy is reviewed, covering the derivation of causal and conserved field equations and the adjustment of the model to support a given expansion history.
Abstract: I review a class of nonlocally modified gravity models which were proposed to explain the current phase of cosmic acceleration without dark energy. Among the topics considered are deriving causal and conserved field equations, adjusting the model to make it support a given expansion history, why these models do not require an elaborate screening mechanism to evade solar system tests, degrees of freedom and kinetic stability, and the negative verdict of structure formation. Although these simple models are not consistent with data on the growth of cosmic structures, many of their features are likely to carry over to more complicated models which are in better agreement with the data.
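
For orientation, the models reviewed here are typically written in the Deser–Woodard form, in which the Einstein–Hilbert Lagrangian is supplemented by a nonlocal distortion of the Ricci scalar; the free function \(f\) is what gets adjusted to support a given expansion history (this is a standard representative of the class, quoted from the general literature rather than from this abstract):

\[
\Delta\mathcal{L} \;=\; \frac{1}{16\pi G}\,\sqrt{-g}\;R\,f\!\left(\Box^{-1}R\right).
\]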

150 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that the outcome of a measurement depends deterministically on the ontic state of the system being measured if and only if the measurement is sharp.
Abstract: In order to claim that one has experimentally tested whether a noncontextual ontological model could underlie certain measurement statistics in quantum theory, it is necessary to have a notion of noncontextuality that applies to unsharp measurements, i.e., those that can only be represented by positive operator-valued measures rather than projection-valued measures. This is because any realistic measurement necessarily has some nonvanishing amount of noise and therefore never achieves the ideal of sharpness. Assuming a generalized notion of noncontextuality that applies to arbitrary experimental procedures, it is shown that the outcome of a measurement depends deterministically on the ontic state of the system being measured if and only if the measurement is sharp. Hence for every unsharp measurement, its outcome necessarily has an indeterministic dependence on the ontic state. We defend this proposal against alternatives. In particular, we demonstrate why considerations parallel to Fine’s theorem do not challenge this conclusion.
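
A minimal illustration of the sharp/unsharp divide (our example, not taken from the paper): mixing a projective measurement \(\{\Pi_k\}\) on a \(d\)-dimensional system with white noise of strength \(\eta\) yields the POVM

\[
E_k \;=\; (1-\eta)\,\Pi_k \;+\; \eta\,\frac{I}{d}, \qquad 0 < \eta \le 1,
\]

which is unsharp for every \(\eta > 0\); by the result described above, a noncontextual ontological model must then assign its outcomes only indeterministically, even given the exact ontic state.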

75 citations


Journal ArticleDOI
TL;DR: In this article, it is argued that the essential physics of the Hawking process for black holes can be modelled in other physical systems, and that white hole horizons are the time inverse of black hole horizons, so the physics of both is the same.
Abstract: It is argued that Hawking radiation has indeed been measured and shown to possess a thermal spectrum, as predicted. This contention is based on three separate legs. The first is that the essential physics of the Hawking process for black holes can be modelled in other physical systems. The second is that white hole horizons are the time inverse of black hole horizons, and thus the physics of both is the same. The third is that the quantum emission, which is the Hawking process, is completely determined by measurements of the classical parameters of a linear physical system. The experiment conducted in 2010 fulfils all of these requirements, and is thus a true measurement of Hawking radiation.
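
For context, the analogue-gravity prediction being tested is Unruh's formula (quoted here from the general literature), in which the surface gravity of the horizon is replaced by the gradient of the flow and wave speeds where they cross:

\[
k_B T_H \;=\; \frac{\hbar}{2\pi}\,\left|\frac{\partial (v - c)}{\partial x}\right|_{\text{horizon}},
\]

where \(v\) is the flow velocity and \(c\) the local wave speed; these are exactly the classical parameters of a linear system referred to in the third leg of the argument.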

44 citations


Journal ArticleDOI
TL;DR: The 2nd Law of thermodynamics was driven by the Big Bang being extraordinarily special, with hugely suppressed gravitational degrees of freedom, as mentioned in this paper; this cannot have been simply the result of a conventional quantum gravity.
Abstract: The 2nd Law of thermodynamics was driven by the Big Bang being extraordinarily special, with hugely suppressed gravitational degrees of freedom. This cannot have been simply the result of a conventional quantum gravity. Conformal cyclic cosmology proposes a different picture, of a classical evolution from an aeon preceding our own. The ultimate Hawking evaporation of black holes is key to the 2nd Law and requires information loss, violating unitarity in a strongly gravitational context.

42 citations


Journal ArticleDOI
TL;DR: In this article, a number of advantages of objective collapse theories for the resolution of long-standing problems in cosmology and quantum gravity are discussed, and the authors show how reduction models contain the necessary tools to provide solutions for these issues.
Abstract: We display a number of advantages of objective collapse theories for the resolution of long-standing problems in cosmology and quantum gravity. In particular, we examine applications of objective reduction models to three important issues: the origin of the seeds of cosmic structure, the problem of time in quantum gravity and the information loss paradox; we show how reduction models contain the necessary tools to provide solutions for these issues. We wrap up with an adventurous proposal, which relates the spontaneous collapse events of objective collapse models to microscopic virtual black holes.

41 citations


Journal ArticleDOI
TL;DR: In this paper, it was shown that orthodox finite-dimensional complex quantum mechanics with superselection rules is the only non-signaling probabilistic theory in which individual systems are Jordan algebras (equivalently, their cones of unnormalized states are homogeneous and self-dual), composites are locally tomographic (meaning that states are determined by the joint probabilities they assign to measurement outcomes on the component systems), and at least one system has the structure of a qubit.
Abstract: Using a result of H. Hanche-Olsen, we show that (subject to fairly natural constraints on what constitutes a system, and on what constitutes a composite system), orthodox finite-dimensional complex quantum mechanics with superselection rules is the only non-signaling probabilistic theory in which (i) individual systems are Jordan algebras (equivalently, their cones of unnormalized states are homogeneous and self-dual), (ii) composites are locally tomographic (meaning that states are determined by the joint probabilities they assign to measurement outcomes on the component systems) and (iii) at least one system has the structure of a qubit. Using this result, we also characterize finite dimensional quantum theory among probabilistic theories having the structure of a dagger-monoidal category.

41 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space), and treating quantum time development as a stochastic process.
Abstract: It is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space), and treating quantum time development as a stochastic process. The histories approach in turn gives rise to some conceptual difficulties, in particular the correct choice of a framework (probabilistic sample space) or family of histories, and these are discussed. The central issue is that the principle of unicity, the idea that there is a unique single true description of the world, is incompatible with our current understanding of quantum mechanics.

37 citations


Journal ArticleDOI
TL;DR: In this paper, the concept of joint measurability was extended to all kinds of possible measurement devices under the name of compatibility, and a more stringent notion, strong incompatibility, was introduced; two devices are incompatible if they cannot be implemented as parts of a single measurement setup.
Abstract: The fact that there are quantum observables that admit no simultaneous measurement is one of the fundamental characteristics of quantum mechanics. In this work we expand the concept of joint measurability to all kinds of possible measurement devices, and we call this relation compatibility. Two devices are incompatible if they cannot be implemented as parts of a single measurement setup. We also introduce a more stringent notion of incompatibility, strong incompatibility. Both incompatibility and strong incompatibility are rigorously characterized and their difference is demonstrated by examples.
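
A standard concrete case from the joint-measurability literature (added for illustration; it is not spelled out in this abstract): the unsharp qubit observables with effects

\[
E_\pm = \tfrac{1}{2}\!\left(I \pm \lambda\,\sigma_x\right), \qquad F_\pm = \tfrac{1}{2}\!\left(I \pm \mu\,\sigma_z\right)
\]

are compatible, i.e. implementable within a single measurement setup, if and only if \(\lambda^2 + \mu^2 \le 1\); in particular the sharp versions (\(\lambda = \mu = 1\)) are incompatible.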

33 citations


Journal ArticleDOI
TL;DR: In this article, the first experimental realization of this paradox was reported, involving the dissociation of low-pressure hydrogen gas on high-temperature refractory metals (tungsten and rhenium) under blackbody cavity conditions.
Abstract: In 2000, a simple, foundational thermodynamic paradox was proposed: a sealed blackbody cavity contains a diatomic gas and a radiometer whose apposing vane surfaces dissociate and recombine the gas to different degrees (\(\mathrm{A}_2 \rightleftharpoons 2\mathrm{A}\)). As a result of differing desorption rates for A and \(\mathrm{A}_2\), there arise between the vane faces permanent pressure and temperature differences, either of which can be harnessed to perform work, in apparent conflict with the second law of thermodynamics. Here we report on the first experimental realization of this paradox, involving the dissociation of low-pressure hydrogen gas on high-temperature refractory metals (tungsten and rhenium) under blackbody cavity conditions. The results, corroborated by other laboratory studies and supported by theory, confirm the paradoxical temperature difference and point to physics beyond the traditional understanding of the second law.

32 citations


Journal ArticleDOI
TL;DR: The present work questions the validity of claims offered by CH proponents asserting that it solves many interpretational problems in quantum mechanics, and critically examines the proposed notion of a realm-dependent reality.
Abstract: The Consistent Histories (CH) formalism aims at a quantum mechanical framework which could be applied even to the universe as a whole. CH stresses the importance of histories for quantum mechanics, as opposed to measurements, and maintains that a satisfactory formulation of quantum mechanics allows one to assign probabilities to alternative histories of a quantum system. It further proposes that each realm, that is, each set of histories to which probabilities can be assigned, provides a valid quantum-mechanical account, but that different realms can be mutually incompatible. Finally, some of its proponents offer an “evolutionary” explanation of our existence in the universe and of our preference for quasiclassical descriptions of nature. The present work questions the validity of claims offered by CH proponents asserting that it solves many interpretational problems in quantum mechanics. In particular, we point out that the interpretation of the framework leaves vague two crucial points, namely, whether realms are fixed or chosen and the link between measurements and histories. Our claim is that by doing so, CH overlooks the main interpretational problems of quantum mechanics. Furthermore, we challenge the evolutionary explanation offered and we critically examine the proposed notion of a realm-dependent reality.

Journal ArticleDOI
TL;DR: In this paper, the authors show that if a set of probabilistic connections is not compatible with correlations violating QM, then it is compatible only with the classical-mechanical correlations, and that there are no subsets of the spin variables whose distributions can be fixed to be compatible with and only with QM-compliant correlations.
Abstract: Correlations of spins in a system of entangled particles are inconsistent with Kolmogorov’s probability theory (KPT), provided the system is assumed to be non-contextual. In the Alice–Bob EPR paradigm, non-contextuality means that the identity of Alice’s spin (i.e., the probability space on which it is defined as a random variable) is determined only by the axis \(\alpha_i\) chosen by Alice, irrespective of Bob’s axis \(\beta_j\) (and vice versa). Here, we study contextual KPT models, with two properties: (1) Alice’s and Bob’s spins are identified as \(A_{ij}\) and \(B_{ij}\), even though their distributions are determined by, respectively, \(\alpha_i\) alone and \(\beta_j\) alone, in accordance with the no-signaling requirement; and (2) the joint distributions of the spins \(A_{ij}, B_{ij}\) across all values of \(\alpha_i, \beta_j\) are constrained by fixing distributions of some subsets thereof. Of special interest among these subsets is the set of probabilistic connections, defined as the pairs \(\left(A_{ij}, A_{ij'}\right)\) and \(\left(B_{ij}, B_{i'j}\right)\) with \(\alpha_i \neq \alpha_{i'}\) and \(\beta_j \neq \beta_{j'}\) (the non-contextuality assumption is obtained as a special case of connections, with zero probabilities of \(A_{ij} \neq A_{ij'}\) and \(B_{ij} \neq B_{i'j}\)). Thus, one can achieve a complete KPT characterization of the Bell-type inequalities, or Tsirelson’s inequalities, by specifying the distributions of probabilistic connections compatible with those and only those spin pairs \(\left(A_{ij}, B_{ij}\right)\) that are subject to these inequalities. We show, however, that quantum-mechanical (QM) constraints are special. No-forcing theorem says that if a set of probabilistic connections is not compatible with correlations violating QM, then it is compatible only with the classical-mechanical correlations. No-matching theorem says that there are no subsets of the spin variables \(A_{ij}, B_{ij}\) whose distributions can be fixed to be compatible with and only with QM-compliant correlations.
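
In the notation of this abstract, the inequalities being characterized read, for \(\pm 1\)-valued spins,

\[
\left|\,\langle A_{11}B_{11}\rangle + \langle A_{12}B_{12}\rangle + \langle A_{21}B_{21}\rangle - \langle A_{22}B_{22}\rangle\,\right| \;\le\;
\begin{cases}
2 & \text{(Bell-type / CHSH, classical bound)},\\[2pt]
2\sqrt{2} & \text{(Tsirelson, quantum bound)}.
\end{cases}
\]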

Journal ArticleDOI
TL;DR: In this paper, the authors argue that reduced states are not the quantum states of subsystems in the same sense as quantum states are states of the whole composite system, and for this reason they cancel the correlations of the subsystem with other subsystems with which it interacts or is entangled.
Abstract: The interpretation of the concept of reduced state is a subtle issue that has relevant consequences when the task is the interpretation of quantum mechanics itself. The aim of this paper is to argue that reduced states are not the quantum states of subsystems in the same sense as quantum states are states of the whole composite system. After clearly stating the problem, our argument is developed in three stages. First, we consider the phenomenon of environment-induced decoherence as an example of the case in which the subsystems interact with each other; we show that decoherence does not solve the measurement problem precisely because the reduced state of the measuring apparatus is not its quantum state. Second, the non-interacting case is illustrated in the context of no-collapse interpretations, in which we show that certain well-known experimental results cannot be accounted for due to the fact that the reduced states of the measured system and the measuring apparatus are conceived as their quantum states. Finally, we prove that reduced states are a kind of coarse-grained states, and for this reason they cancel the correlations of the subsystem with other subsystems with which it interacts or is entangled.
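
The coarse-graining point can be made vivid with a textbook example (ours, not the authors’): the entangled state \(|\Phi^+\rangle = \tfrac{1}{\sqrt{2}}(|00\rangle + |11\rangle)\) and the merely classically correlated mixture \(\tfrac{1}{2}(|00\rangle\langle 00| + |11\rangle\langle 11|)\) yield one and the same reduced state,

\[
\rho_A \;=\; \mathrm{Tr}_B\,\rho_{AB} \;=\; \frac{I}{2},
\]

so the reduced state alone cannot even distinguish entanglement from classical correlation, let alone carry it.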

Journal ArticleDOI
TL;DR: As discussed by the authors, the gravity-related model of spontaneous wave function collapse, a longtime hypothesis, damps the massive Schrodinger Cat states in quantum theory; the hypothesis is extended by assuming that spontaneous wave function collapses are responsible for the emergence of the Newton interaction.
Abstract: The gravity-related model of spontaneous wave function collapse, a longtime hypothesis, damps the massive Schrodinger Cat states in quantum theory. We extend the hypothesis and assume that spontaneous wave function collapses are responsible for the emergence of the Newton interaction. Superfluid helium would then show significant and testable gravitational anomalies.

Journal ArticleDOI
TL;DR: In this article, the authors show that the principle of maximum caliber (Jaynes, Phys Rev 106:620-630, 1957) leads to Newton's second law under two quite intuitive assumptions (both the expected square displacement in one step and the spatial probability distribution of the particle are known at all times).
Abstract: The foundations of Statistical Mechanics can be recovered almost in their entirety from the principle of maximum entropy. In this work we show that its non-equilibrium generalization, the principle of maximum caliber (Jaynes, Phys Rev 106:620–630, 1957), when applied to the unknown trajectory followed by a particle, leads to Newton’s second law under two quite intuitive assumptions (both the expected square displacement in one step and the spatial probability distribution of the particle are known at all times). Our derivation explicitly highlights the role of mass as an emergent measure of the fluctuations in velocity (inertia) and the origin of potential energy as a manifestation of spatial correlations. According to our findings, the application of Newton’s equations is not limited to mechanical systems, and therefore could be used in modelling ecological, financial and biological systems, among others.
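
A compressed sketch of the starting point, under our reading of the abstract (the notation is ours): one maximizes the caliber, i.e. the entropy over trajectories \(\Gamma\),

\[
\mathcal{C}[P] \;=\; -\sum_{\Gamma} P_\Gamma \ln P_\Gamma,
\]

subject to the two stated constraints, namely the expected square displacement per step and the spatial distribution \(\rho(x,t)\) at all times; the authors' claim is that the trajectory statistics so obtained reproduce Newton's second law, with mass emerging as the scale of the velocity fluctuations.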

Journal ArticleDOI
TL;DR: In this paper, the authors show how to transform systems with pairs of integer-valued, commuting operators to systems with real-valued canonical coordinates and their associated momentum operators, where the discrete system could be entirely deterministic while the corresponding (p, q) system could still be typically quantum mechanical.
Abstract: Standard canonical quantum mechanics makes much use of operators whose spectra cover the set of real numbers, such as the coordinates of space, or the values of the momenta. Discrete quantum mechanics uses only strictly discrete operators. We show how one can transform systems with pairs of integer-valued, commuting operators \(P_i\) and \(Q_i\), to systems with real-valued canonical coordinates \(q_i\) and their associated momentum operators \(p_i\). The discrete system could be entirely deterministic while the corresponding (p, q) system could still be typically quantum mechanical.
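
In symbols, the transformation described takes mutually commuting integer-valued pairs \((P_i, Q_i)\), with \([P_i, Q_j] = 0\), to genuine canonical pairs obeying

\[
[\hat q_i, \hat p_j] \;=\; i\hbar\,\delta_{ij},
\]

which is the precise sense in which a fully deterministic discrete system can carry the algebra of an ordinary quantum system (the commutators, though not the construction itself, are standard and added here for orientation).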

Journal ArticleDOI
TL;DR: In this paper, an operator constraint equation for the wave function of the universe that admits genuine evolution is proposed, where the initial data in the action principle of General Relativity is given by a conformal geometry and the spatial average of the York time on the spacelike hypersurfaces that bound the variation.
Abstract: We propose an operator constraint equation for the wavefunction of the Universe that admits genuine evolution. While the corresponding classical theory is equivalent to the canonical decomposition of General Relativity, the quantum theory contains an evolution equation distinct from standard Wheeler–DeWitt cosmology. Furthermore, the local symmetry principle—and corresponding observables—of the theory have a direct interpretation in terms of a conventional gauge theory, where the gauge symmetry group is that of spatial conformal diffeomorphisms (that preserve the spatial volume of the Universe). The global evolution is in terms of an arbitrary parameter that serves only as an unobservable label for successive states of the Universe. Our proposal follows unambiguously from a suggestion of York whereby the independently specifiable initial data in the action principle of General Relativity is given by a conformal geometry and the spatial average of the York time on the spacelike hypersurfaces that bound the variation. Remarkably, such a variational principle uniquely selects the form of the constraints of the theory so that we can establish a precise notion of both symmetry and evolution in quantum gravity.

Journal ArticleDOI
TL;DR: In this paper, a simple, causally deterministic model of quantum measurement based on an amplitude threshold detection scheme was proposed, which is found to reproduce many phenomena normally thought to be uniquely quantum in nature.
Abstract: This paper describes a simple, causally deterministic model of quantum measurement based on an amplitude threshold detection scheme. Surprisingly, it is found to reproduce many phenomena normally thought to be uniquely quantum in nature. To model an \(N\)-dimensional pure state, the model uses \(N\) complex random variables given by a scaled version of the wave vector with additive complex noise. Measurements are defined by threshold crossings of the individual components, conditioned on single-component threshold crossings. The resulting detection probabilities match or approximate those predicted by quantum mechanics according to the Born rule. Nevertheless, quantum phenomena such as entanglement, contextuality, and violations of Bell’s inequality under local measurements are all shown to be exhibited by the model, thereby demonstrating that such phenomena are not without classical analogs.
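
The detection scheme is concrete enough to simulate. Below is a minimal sketch in Python of our reading of it: scaled state vector plus additive complex Gaussian noise, with a trial accepted only when exactly one component crosses the threshold. All parameter values, and the quantitative quality of the Born-rule agreement, are our own guesses rather than the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_measure(psi, scale=1.0, sigma=0.6, theta=1.5, n_trials=200_000):
    """Toy amplitude-threshold detection: add complex Gaussian noise to the
    scaled state vector; keep trials where exactly one component crosses
    the threshold and record that component as the outcome."""
    n = len(psi)
    noise = (rng.normal(0.0, sigma, (n_trials, n))
             + 1j * rng.normal(0.0, sigma, (n_trials, n))) / np.sqrt(2)
    a = scale * np.asarray(psi) + noise
    crossed = np.abs(a) > theta
    single = crossed.sum(axis=1) == 1           # condition on single crossings
    outcomes = crossed[single].argmax(axis=1)   # index of the crossing component
    counts = np.bincount(outcomes, minlength=n)
    return counts / counts.sum()

psi = np.array([np.sqrt(0.7), np.sqrt(0.3)], dtype=complex)
print("empirical :", threshold_measure(psi))
print("Born rule :", np.abs(psi) ** 2)
```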

Journal ArticleDOI
TL;DR: In this paper, it was shown that Schrodinger's wave mechanics cannot be equivalent to Heisenberg's more physically motivated matrix mechanics unless its observables are quantized using this rule, and not the more symmetric prescription proposed by Weyl in 1926.
Abstract: The aim of the famous Born and Jordan 1925 paper was to put Heisenberg’s matrix mechanics on a firm mathematical basis. Born and Jordan showed that if one wants to ensure energy conservation in Heisenberg’s theory it is necessary and sufficient to quantize observables following a certain ordering rule. One apparently unnoticed consequence of this fact is that Schrodinger’s wave mechanics cannot be equivalent to Heisenberg’s more physically motivated matrix mechanics unless its observables are quantized using this rule, and not the more symmetric prescription proposed by Weyl in 1926, which has become the standard procedure in quantum mechanics. This observation confirms the superiority of Born–Jordan quantization, as already suggested by Kauffmann. We also show how to explicitly determine the Born–Jordan quantization of arbitrary classical variables, and discuss the conceptual advantages in using this quantization scheme. We finally suggest that it might be possible to determine the correct quantization scheme by using the results of weak measurement experiments.
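
For monomials the two competing rules can be written side by side (standard formulas from the quantization literature, not reproduced from the paper itself):

\[
(p^m q^n)_{\mathrm{BJ}} \;=\; \frac{1}{m+1}\sum_{k=0}^{m} \hat p^{\,m-k}\,\hat q^{\,n}\,\hat p^{\,k},
\qquad
(p^m q^n)_{\mathrm{Weyl}} \;=\; \frac{1}{2^m}\sum_{k=0}^{m} \binom{m}{k}\,\hat p^{\,m-k}\,\hat q^{\,n}\,\hat p^{\,k}.
\]

Both give the symmetrization \(\tfrac{1}{2}(\hat p\hat q + \hat q\hat p)\) for \(pq\); they first differ for monomials such as \(p^2 q^2\), which is where the inequivalence discussed above becomes operational.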

Journal ArticleDOI
TL;DR: In this paper, the Hartman effect was shown to be universal for elastic fields as well as for electromagnetic fields, and it was shown that the tunneling time is the same for the reflected and the transmitted waves in the case of symmetric barriers.
Abstract: Fifty years ago Hartman studied the barrier transmission time of wave packets (J Appl Phys 33:3427–3433, 1962). He was inspired by the tunneling experiments across thin insulating layers at that time. For opaque barriers he calculated faster than light propagation and a transmission time independent of barrier length, which is called the Hartman effect. A faster than light (FTL or superluminal) wave packet velocity was deduced in analog tunneling experiments with microwaves and with infrared light thirty years later. Recently, the conjectured zero time of electron tunneling was claimed to have been observed in ionizing helium inside the barrier. The calculated and measured short tunneling time arises at the barrier front. This tunneling time was found to be universal for elastic fields as well as for electromagnetic fields. Remarkably, the delay time is the same for the reflected and the transmitted waves in the case of symmetric barriers. Several theoretical physicists predicted this strange nature of the tunneling process. However, even with this background, many members of the physics community do not accept a FTL signal velocity interpretation of the experimental tunneling results. Instead, a luminal front velocity has frequently been invoked to explain the FTL experimental results. However, Brillouin stated in his book on wave propagation and group velocity that the front velocity is given by the group velocity of wave packets in the case of physical signals, which have only finite frequency bandwidths. Some studies assumed barriers to be cavities, and the observed tunneling time would then represent the cavity lifetime. We are going to discuss these continuing misleading interpretations, which are found in journals and in textbooks to this day.
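
The quantitative core of the effect, in its standard textbook form (quoted for orientation, not from this paper): for a rectangular barrier of height \(V_0\) and width \(d\), with \(\kappa = \sqrt{2m(V_0 - E)}/\hbar\) and incident wavenumber \(k\), the stationary-phase transmission time saturates in the opaque limit \(\kappa d \gg 1\),

\[
\tau_\phi \;\longrightarrow\; \frac{2m}{\hbar\,k\,\kappa} \;=\; \frac{2}{v\,\kappa},
\]

independently of \(d\), so the nominal traversal speed \(d/\tau_\phi\) grows without bound as the barrier is lengthened.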

Journal ArticleDOI
TL;DR: In this article, a massive particle is modelled as a standing wave in three dimensions, and as the particle moves, the standing wave becomes a travelling wave having two factors: one is a carrier wave displaying the dilated frequency and contracted ellipsoidal form described by the Lorentz transformation, while the other (identified as the de Broglie wave) is a modulation defining the dephasing of the carrier wave in the direction of travel.
Abstract: The Lorentz transformation (LT) is explained by changes occurring in the wave characteristics of matter as it changes inertial frame. This explanation is akin to that favoured by Lorentz, but informed by later insights, due primarily to de Broglie, regarding the underlying unity of matter and radiation. To show the nature of these changes, a massive particle is modelled as a standing wave in three dimensions. As the particle moves, the standing wave becomes a travelling wave having two factors. One is a carrier wave displaying the dilated frequency and contracted ellipsoidal form described by the LT, while the other (identified as the de Broglie wave) is a modulation defining the dephasing of the carrier wave (and thus the failure of simultaneity) in the direction of travel. The superluminality of the de Broglie wave is thus explained, as are several other mysterious features of the optical behaviour of matter, including the physical meaning of the Schrodinger equation and the relevance to scattering processes of the de Broglie wave vector. Consideration is given to what this Lorentzian approach to relativity might mean for the possible existence of a preferred frame and the origin of the observed Minkowski metric.
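
The modulation factor can be exhibited in one line (our rendering of the standard calculation): a rest-frame internal oscillation \(\cos(\omega_0 t_0)\), viewed from a frame in which the particle moves at speed \(v\), becomes via \(t_0 = \gamma\,(t - vx/c^2)\)

\[
\cos\!\big(\gamma\,\omega_0\,(t - vx/c^2)\big),
\]

a wave whose planes of constant phase advance at \(c^2/v > c\); with \(\hbar\omega_0 = mc^2\) its wavelength is \(h/\gamma m v\), i.e. the de Broglie wavelength, which is the sense in which the superluminal de Broglie wave is a dephasing of the carrier.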

Journal ArticleDOI
TL;DR: In this article, it is shown that whenever the calculation of probabilities calls for the addition of amplitudes, the distinctions we make between the alternatives lack objective reality, which implies that, owing to the indefiniteness of positions, the existence of a real-valued spatio-temporal background is an unrealistic idealization.
Abstract: In resisting attempts to explain the unity of a whole in terms of a multiplicity of interacting parts, quantum mechanics calls for an explanatory concept that proceeds in the opposite direction: from unity to multiplicity. Being part of the Scientific Image of the world, the theory concerns the process by which (the physical aspect of) what Sellars called the Manifest Image of the world comes into being. This process consists in the progressive differentiation of an intrinsically undifferentiated entity. By entering into reflexive spatial relations, this entity gives rise to (i) what looks like a multiplicity of relata if the reflexive quality of the relations is not taken into account, and (ii) what looks like a substantial expanse if the spatial quality of the relations is reified. If there is a distinctly quantum domain, it is a non-spatial and non-temporal dimension across which the transition from the unity of this entity to the multiplicity of the world takes place. Instead of being constituents of the physical world, subatomic particles, atoms, and molecules are instrumental in its manifestation. These conclusions are based on the following interpretive principle and its more direct consequences: whenever the calculation of probabilities calls for the addition of amplitudes, the distinctions we make between the alternatives lack objective reality. Applied to alternatives involving distinctions between regions of space, this principle implies that, owing to the indefiniteness of positions, the spatiotemporal differentiation of the physical world is incomplete: the existence of a real-valued spatiotemporal background is an unrealistic idealization. This guarantees the existence of observables whose values are real per se, as against “real by virtue of being indicated by the values of observables that are real per se.” Applied to alternatives involving distinctions between things, it implies that, intrinsically, all fundamental particles are numerically identical and thus identifiable with the aforementioned undifferentiated entity.

Journal ArticleDOI
TL;DR: In this article, Bancal et al. discuss models that attempt to provide an explanation for the violation of Bell inequalities at a distance in terms of hidden influences, which can thus be proved to allow for faster-than-light communication.
Abstract: We discuss models that attempt to provide an explanation for the violation of Bell inequalities at a distance in terms of hidden influences. These models reproduce the quantum correlations in most situations, but are restricted to produce local correlations in some configurations. The argument presented in (Bancal et al. Nat Phys 8:867, 2012) applies to all of these models, which can thus be proved to allow for faster-than-light communication. In other words, the signalling character of these models cannot remain hidden.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the non-locality of any bipartite pure entangled state with Schmidt rank two follows from preparation contextuality and steerability under a certain condition on the epistemicity of the underlying ontological model, and that for Schmidt rank greater than two it follows without any further condition.
Abstract: The ontological model framework for an operational theory has generated much interest in recent years. The debate concerning reality of quantum states has been made more precise in this framework. With the introduction of the generalized notion of contextuality in this framework, it has been shown that the completely mixed state of a qubit is preparation contextual. Interestingly, this new idea of preparation contextuality has been used to demonstrate nonlocality of some \(\psi\)-epistemic models without any use of Bell’s inequality. In particular, nonlocality of a non maximally \(\psi\)-epistemic model has been demonstrated from preparation contextuality of a maximally mixed qubit and Schrodinger’s steerability of the maximally entangled state of two qubits (Leifer and Maroney, Phys Rev Lett 110:120401, 2013). In this paper, we show that any mixed state is preparation contextual. We then show that nonlocality of any bipartite pure entangled state, with Schmidt rank two, follows from preparation contextuality and steerability provided we impose a certain condition on the epistemicity of the underlying ontological model. More interestingly, if the pure entangled state is of Schmidt rank greater than two, its nonlocality follows without any further condition on the epistemicity. Thus our result establishes a stronger connection between nonlocality and preparation contextuality by revealing nonlocality of any bipartite pure entangled state without any use of Bell-type inequality.
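
For reference, the Schmidt rank invoked above is the number of nonzero coefficients in the Schmidt decomposition of a bipartite pure state (a standard definition, added for the reader's convenience),

\[
|\psi\rangle \;=\; \sum_{i=1}^{r} \sqrt{\lambda_i}\,|i\rangle_A |i\rangle_B, \qquad \lambda_i > 0,\ \ \sum_i \lambda_i = 1,
\]

so the dichotomy in the result is between \(r = 2\) (a condition on epistemicity is needed) and \(r > 2\) (none is).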

Journal ArticleDOI
TL;DR: In this article, the Bohmian approach to quantum physics is applied to develop a clear and coherent ontology of non-perturbative quantum gravity, where atoms of space, represented in terms of nodes linked by edges in a graph, are considered as the primitive ontology.
Abstract: The paper shows how the Bohmian approach to quantum physics can be applied to develop a clear and coherent ontology of non-perturbative quantum gravity. We suggest retaining discrete objects as the primitive ontology also when it comes to a quantum theory of space-time and therefore focus on loop quantum gravity. We conceive atoms of space, represented in terms of nodes linked by edges in a graph, as the primitive ontology of the theory and show how a non-local law in which a universal and stationary wave-function figures can provide an order of configurations of such atoms of space such that the classical space-time of general relativity is approximated. Although there is as yet no fully worked out physical theory of quantum gravity, we regard the Bohmian approach as setting up a standard that proposals for a serious ontology in this field should meet and as opening up a route for fruitful physical and mathematical investigations.

Journal ArticleDOI
TL;DR: In this paper, the authors show that the CHSH inequality holds even for completely general state variables in the measure-theoretic setting, and demonstrate how to drop the assumption of independence of subsequent trials while still being able to perform a hypothesis test that will distinguish Quantum Mechanics from local theories.
Abstract: The Clauser–Horne–Shimony–Holt (CHSH) inequality is a constraint that local hidden variable theories must obey. Quantum Mechanics predicts a violation of this inequality in certain experimental settings. Treatments of this subject frequently make simplifying assumptions about the probability spaces available to a local hidden variable theory, such as assuming the state of the system is a discrete or absolutely continuous random variable, or assuming that repeated experimental trials are independent and identically distributed. In this paper, we do two things: first, show that the CHSH inequality holds even for completely general state variables in the measure-theoretic setting, and second, demonstrate how to drop the assumption of independence of subsequent trials while still being able to perform a hypothesis test that will distinguish Quantum Mechanics from local theories. The statistical strength of such a test is computed.
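
As a concrete companion to the inequality under test, here is a short sketch (ours; the simulated singlet statistics and variable names are illustrative, not the paper's) that estimates the CHSH statistic from finite trial data and compares it with the local bound:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_pairs(alpha, beta, n):
    """Sample n pairs of +/-1 outcomes with the singlet correlation
    E(alpha, beta) = -cos(alpha - beta)."""
    e = -np.cos(alpha - beta)
    same = rng.random(n) < (1 + e) / 2      # P(outcomes agree)
    a = rng.choice([-1, 1], size=n)
    b = np.where(same, a, -a)
    return a, b

# Standard CHSH-optimal settings for the singlet state
settings = {"a": 0.0, "a'": np.pi / 2, "b": np.pi / 4, "b'": -np.pi / 4}
n = 100_000

def E(x, y):
    a, b = sample_pairs(settings[x], settings[y], n)
    return np.mean(a * b)

S = abs(E("a", "b") + E("a", "b'") + E("a'", "b") - E("a'", "b'"))
print(f"|S| = {S:.3f}  (local bound 2, Tsirelson bound {2 * np.sqrt(2):.3f})")
```

Note this sketch assumes i.i.d. trials, which is precisely the assumption the paper shows how to drop.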

Journal ArticleDOI
TL;DR: In this article, the authors argue that the Unruh effect does not represent any novel physics and that, by its very nature, the effect is fundamentally unmeasurable in all experiments of the kind that have been contemplated until now.
Abstract: There is a persistent state of confusion regarding the nature of the Unruh effect. We will argue that, in contrast to some interpretations thereof, the effect does not represent any novel physics and that, by its very nature, the effect is fundamentally unmeasurable in all experiments of the kind that have been contemplated until now. Also, we discuss what aspects connected with this effect one might consider as possibilities to be explored empirically and what their precise meaning may be regarding the issue at hand.
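
For reference, the effect under discussion is the thermal response of a uniformly accelerated detector at the Unruh temperature (standard expression, not taken from the paper):

\[
T_U \;=\; \frac{\hbar\,a}{2\pi\,c\,k_B} \;\approx\; 4\times 10^{-21}\,\mathrm{K}\ \ \text{for}\ a = 1\ \mathrm{m/s^2},
\]

whose extreme smallness at laboratory accelerations is the usual starting point for any discussion of measurability.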

Journal ArticleDOI
TL;DR: In this article, a new class of experiments designed to probe the foundations of quantum mechanics is described, where quantum controlling devices are used to attain a freedom in temporal ordering of the control and detection of various phenomena.
Abstract: We describe a new class of experiments designed to probe the foundations of quantum mechanics. Using quantum controlling devices, we show how to attain a freedom in temporal ordering of the control and detection of various phenomena. We consider wave–particle duality in the context of quantum-controlled and entanglement-assisted delayed-choice experiments. Then we discuss a quantum-controlled CHSH experiment and the measurement of a photon’s transverse position and momentum in a single set-up.

Journal ArticleDOI
TL;DR: In this paper, a reformulation of Bell's Theorem from physics 36:1-28 (1964) and the Strong Free Will Theorem of Conway and Kochen from Notices AMS 56:226-232 (2009) is presented, in which the ontological state initially determines both the settings and the outcome of the experiment.
Abstract: Bell’s Theorem from Physics 36:1–28 (1964) and the (Strong) Free Will Theorem of Conway and Kochen from Notices AMS 56:226–232 (2009) both exclude deterministic hidden variable theories (or, in modern parlance, ‘ontological models’) that are compatible with some small fragment of quantum mechanics, admit ‘free’ settings of the archetypal Alice and Bob experiment, and satisfy a locality condition akin to parameter independence. We clarify the relationship between these theorems by giving reformulations of both that exactly pinpoint their resemblance and their differences. Our reformulation imposes determinism in what we see as the only consistent way, in which the ‘ontological state’ initially determines both the settings and the outcome of the experiment. The usual status of the settings as ‘free’ parameters is subsequently recovered from independence assumptions on the pertinent (random) variables. Our reformulation also clarifies the role of the settings in Bell’s later generalization of his theorem to stochastic hidden variable theories.