
Showing papers on "Consistent histories" published in 2003


Journal ArticleDOI
TL;DR: In this article, it is argued that the macroworld is instead to be understood in terms of certain structures and patterns which emerge from quantum theory given appropriate dynamics, in particular decoherence.
Abstract: I address the problem of indefiniteness in quantum mechanics: the problem that the theory, without changes to its formalism, seems to predict that macroscopic quantities have no definite values. The Everett interpretation is often criticised along these lines and I shall argue that much of this criticism rests on a false dichotomy: that the macroworld must either be written directly into the formalism or be regarded as somehow illusory. By means of analogy with other areas of physics, I develop the view that the macroworld is instead to be understood in terms of certain structures and patterns which emerge from quantum theory (given appropriate dynamics, in particular decoherence). I extend this view to the observer, and in doing so make contact with functionalist theories of mind.

191 citations


Journal ArticleDOI
TL;DR: In this article, the classical and quantum features of Nambu mechanics are analyzed and fundamental issues are resolved, and the quantum theory is discussed in a parallel presentation and illustrated with detailed specific cases.
Abstract: The classical and quantum features of Nambu mechanics are analyzed and fundamental issues are resolved. The classical theory is reviewed and developed utilizing varied examples. The quantum theory is discussed in a parallel presentation and illustrated with detailed specific cases. Quantization is carried out with standard Hilbert space methods. With the proper physical interpretation, obtained by allowing for different time scales on different invariant sectors of a theory, the resulting non-Abelian approach to quantum Nambu mechanics is shown to be fully consistent.
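For orientation, the classical structure at issue can be stated concretely. In Nambu mechanics (as introduced by Nambu in 1973, summarized here in standard notation rather than quoted from this paper), time evolution of an observable $F(x_1,x_2,x_3)$ is generated by two "Hamiltonians" through a ternary bracket given by a Jacobian determinant:

```latex
\frac{dF}{dt} \;=\; \{F, H_1, H_2\}
\;=\; \frac{\partial(F, H_1, H_2)}{\partial(x_1, x_2, x_3)}
\;=\; \epsilon_{ijk}\,\frac{\partial F}{\partial x_i}\,
      \frac{\partial H_1}{\partial x_j}\,
      \frac{\partial H_2}{\partial x_k}.
```

Because the bracket is totally antisymmetric, both $H_1$ and $H_2$ are automatically conserved, which is the feature whose quantization the paper addresses.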

164 citations


Journal ArticleDOI
TL;DR: The aim is to provide an account of the peculiarities of quantum probability in this framework, intimately related to and inspired by the foundational work on quantum information of Fuchs.
Abstract: The Bayesian approach takes probability to be a measure of ignorance, reflecting our state of knowledge and not merely the state of the world. It follows Ramsey’s contention that “we have the authority both of ordinary language and of many great thinkers for discussing under the heading of probability… the logic of partial belief” (Ramsey 1926. Truth and probability. Cambridge: Cambridge University Press, p. 55). Here we shall assume, furthermore, that probabilities are revealed in rational betting behavior: “The old-established way of measuring a person’s belief … by proposing a bet, and see what are the lowest odds which he will accept, is fundamentally sound.” My aim is to provide an account of the peculiarities of quantum probability in this framework. The approach is intimately related to and inspired by the foundational work on quantum information of Fuchs (2001, Quantum mechanics as quantum information (and only a little more). Quant-ph 0205039), Schack et al. (2001, Physical Review A64 014305: 1–4) and Caves et al. (2002, Physical Review A 65(2305): 1–6).
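The operational reading of probability as betting quotients can be illustrated with a toy Dutch-book computation (an illustrative sketch, not taken from the paper): if an agent's betting quotients on exclusive and exhaustive outcomes do not sum to 1, a bookie can guarantee the agent a sure loss, which is why rational betting behavior is taken to reveal probabilities.

```python
def sure_loss(quotients):
    """Given betting quotients (price paid per unit-stake bet that pays 1
    if its outcome occurs) on exclusive, exhaustive outcomes, return the
    agent's guaranteed net result from buying every bet.  Exactly one bet
    pays out 1 whatever happens, so the net is 1 minus the total cost."""
    return 1.0 - sum(quotients.values())

# An agent whose quotients sum to 1.1 loses 0.1 on every outcome,
# while coherent quotients (summing to 1) admit no sure loss:
incoherent = sure_loss({"spin up": 0.6, "spin down": 0.5})
coherent = sure_loss({"spin up": 0.5, "spin down": 0.5})
```

The outcome labels here are purely illustrative; the point is only that coherence of the quotients, not their particular values, blocks the Dutch book.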

100 citations


Journal ArticleDOI
Tian Yu Cao
01 Jul 2003-Synthese
TL;DR: To address Kuhn’s claim, or more generally to develop a realist conception of science and a cognitively progressive conception of the history of science, a mere appeal to formal logic or empiricism is of little help.
Abstract: When I started working on the history and philosophy of science at Cambridge, under the guidance of Mary Hesse and Michael Redhead, my major concern was with Kuhn’s revolutionary view of the history of science. In particular, I tried to address Kuhn’s claim that “I can see no coherent direction of ontological development” in the history of science (Kuhn, 1970). If Kuhn was right, then in our conception of the history of science, there would be no room for cognitive progress, in the sense of the accumulation of our objective knowledge of what exists and happens in the world, although a kind of instrumental progress, in terms of our ability to solve puzzles, would still be imaginable. The anti-realist implication of Kuhn’s view can be best seen through Hilary Putnam’s meta-induction thesis: if no theory in the history can be taken as true from the viewpoint of later theories, then there is no reason to believe that our present theories would enjoy any privilege over their predecessors.2 In order to address Kuhn’s claim, or more generally to develop a realist conception of science and a cognitively progressive conception of history of science, a mere appeal to formal logic or empiricism is not of great help. According to Carnap (1950, 1956), formal logic is unable to address what he calls the external questions that are related to radical changes of conceptual framework. And it is a truism that empiricism has no theoretical resource to deal with the underdetermination thesis, which challenges the status of empirical evidence as a bridge connecting theoretical entities and physical reality. Taking the history of 20th century physics as an example, what was required, it seemed to me, was a conceptual analysis of its theoretical structures and their evolution, which aimed at a clarification of what the basic ontology is for the discipline and its replacement. 
Then, with a structural understanding of ontology and a realist understanding of structural knowledge, we would be able to, first, make a realist claim that …

68 citations


Journal ArticleDOI
TL;DR: In this article, the role of context (a complex of physical conditions) in quantum as well as classical experiments is studied, and it is shown that by taking the contextual dependence of experimental probabilities into account one can derive the quantum rule for the addition of probabilities of alternatives.
Abstract: We study the role of context, i.e., a complex of physical conditions, in quantum as well as classical experiments. It is shown that by taking into account the contextual dependence of experimental probabilities we can derive the quantum rule for the addition of probabilities of alternatives. Thus we obtain quantum interference without applying the wave or Hilbert space approach. The Hilbert space representation of contextual probabilities is obtained as a consequence of an elementary geometric fact, the cosine theorem. By using another fact from elementary algebra we obtain the complex-amplitude representation of probabilities. Finally, we find the contextual origin of the noncommutativity of incompatible observables.
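The "quantum rule for the addition of probabilities" referred to is the interference formula. In the contextual-probability notation commonly used in this approach (a standard statement of the result, given here for orientation), decomposing the probability of an outcome $A$ over two alternatives $B_1, B_2$ gives

```latex
P(A) \;=\; P(B_1)\,P(A \mid B_1) \;+\; P(B_2)\,P(A \mid B_2)
\;+\; 2\sqrt{P(B_1)P(A \mid B_1)\,P(B_2)P(A \mid B_2)}\,\cos\theta ,
```

where $\theta$ is a context-dependent phase; when the interference term vanishes one recovers the classical formula of total probability.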

60 citations


Journal ArticleDOI
TL;DR: In this paper, a deck of playing cards is used to exemplify phenomena generally believed characteristic of quantum mechanics: the incompatibility and value-indeterminateness of variables, the non-existence of dispersion-free states, the failure of the standard marginal-probability formula, the failure of the distributive law of disjunction, and interference.
Abstract: A number of phenomena generally believed characteristic of quantum mechanics and seen as interpretively problematic—the incompatibility and value-indeterminateness of variables, the non-existence of dispersion-free states, the failure of the standard marginal-probability formula, the failure of the distributive law of disjunction and interference—are exemplified in an emphatically non-quantal system: a deck of playing cards. Thus the appearance in quantum mechanics of incompatibility and these associated phenomena requires neither explanation nor interpretation.

47 citations


Journal ArticleDOI
Mario Bunge
TL;DR: In this article, it is shown that the idea of a quantum or minimal unit is not peculiar to quantum theory, since it already occurs in the classical theories of elasticity and electrolysis, and that the orthodox or Copenhagen interpretation of the theory is false and may conveniently be replaced with a realist (though not classicist) interpretation.
Abstract: Three main theses are proposed. The first is that the idea of a quantum or minimal unit is not peculiar to quantum theory, since it already occurs in the classical theories of elasticity and electrolysis. Second, the peculiarities of the objects described by quantum theory are the following: their basic laws are probabilistic; some of their properties, such as position and energy, are blunt rather than sharp; two particles that were once together continue to be associated even after becoming spatially separated; and the vacuum has physical properties, so that it is a kind of matter. Third, the orthodox or Copenhagen interpretation of the theory is false, and may conveniently be replaced with a realist (though not classicist) interpretation. Heisenberg's inequality, Schrodinger's cat and Zeno's quantum paradox are discussed in the light of the two rival interpretations. It is also shown that the experiments that falsified Bell's inequality do not refute realism but the classicism inherent in hidden variables theories.

43 citations


Journal ArticleDOI
TL;DR: In this paper, a deformation of the quantum-mechanical density matrix, named the density pro-matrix, is introduced and used to describe quantum mechanics at Planck's scale, including the dynamics and entropy of black holes.
Abstract: In this paper Quantum Mechanics with Fundamental Length is chosen as Quantum Mechanics at Planck's scale. This is possible due to the presence in the theory of General Uncertainty Relations. Here Quantum Mechanics with Fundamental Length is obtained as a deformation of Quantum Mechanics. The distinguishing feature of the proposed approach, in comparison with previous ones, is that here the density matrix is subjected to deformation, whereas until now the commutators have been deformed. The density matrix obtained by deformation of the quantum-mechanical one is named throughout this paper the density pro-matrix. Within our approach two main features of Quantum Mechanics are conserved: the probabilistic interpretation of the theory and the well-known measuring procedure corresponding to that interpretation. The proposed approach also allows the dynamics to be described. In particular, the explicit forms of the deformed Liouville equation and the deformed Schrödinger picture are given. Some implications of the results are discussed, in particular the problem of singularity, the hypothesis of cosmic censorship, a possible improvement of the definition of statistical entropy, and the problem of information loss in black holes. It is shown that the results allow the Bekenstein-Hawking formula for black-hole entropy to be deduced in a simple and natural way in the semiclassical approximation.
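For reference, the semiclassical Bekenstein-Hawking formula that the deformed approach is said to recover relates a black hole's entropy to its horizon area $A$:

```latex
S_{BH} \;=\; \frac{k_B\, c^3 A}{4\, G\, \hbar} \;=\; \frac{k_B\, A}{4\, \ell_p^{\,2}},
\qquad \ell_p = \sqrt{\frac{G\hbar}{c^{3}}},
```

where $\ell_p$ is the Planck length, the natural scale for a theory with a fundamental minimal length.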

40 citations


Journal ArticleDOI
TL;DR: In this article, a unified description of both probabilities and phases comes through a generalisation of the notion of a density matrix for histories; this object is the decoherence functional introduced by the consistent histories approach.

33 citations


Journal ArticleDOI
TL;DR: In this article, the existence of probability in the sense of the frequency interpretation, i.e., probability as "long term relative frequency", is shown from the dynamics and the interpretational rules of Everett quantum mechanics in the Heisenberg picture.
Abstract: The existence of probability in the sense of the frequency interpretation, i.e., probability as “long term relative frequency,” is shown to follow from the dynamics and the interpretational rules of Everett quantum mechanics in the Heisenberg picture. This proof is free of the difficulties encountered in applying to the Everett interpretation previous results regarding relative frequency and probability in quantum mechanics. The ontology of the Everett interpretation in the Heisenberg picture is also discussed.
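The relative-frequency results alluded to descend from the frequency-operator theorems of Finkelstein, Hartle and Graham (summarized here in standard form for orientation, not quoted from the paper). On $N$ copies of a state $|\psi\rangle$ with Born weight $p = |\langle k|\psi\rangle|^2$ for outcome $k$, define the frequency operator as the average of the projectors onto $|k\rangle$ in each factor; then

```latex
F_N \;=\; \frac{1}{N}\sum_{n=1}^{N} \bigl(|k\rangle\langle k|\bigr)^{(n)},
\qquad
\lim_{N\to\infty}\,\Bigl\| \bigl(F_N - p\bigr)\, |\psi\rangle^{\otimes N} \Bigr\| \;=\; 0 ,
```

so the relative frequency converges (in norm) to the Born probability in the infinite-ensemble limit, which is the result the paper adapts to the Heisenberg-picture Everett interpretation.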

32 citations


Posted Content
TL;DR: In this paper, the density matrix is subjected to deformation, whereas in previous approaches the commutators had been deformed; the resulting deformed density matrix, named the density pro-matrix, is applied to the description of the early Universe.
Abstract: In this paper Quantum Mechanics with Fundamental Length is chosen as the theory for describing the early Universe. This is possible due to the presence in the theory of General Uncertainty Relations, from which it unavoidably follows that a fundamental length exists in nature. Here Quantum Mechanics with Fundamental Length is obtained as a deformation of Quantum Mechanics. The distinguishing feature of the approach proposed in this paper, in comparison with previous ones, is that here the density matrix is subjected to deformation, whereas until now the commutators had been deformed. The deformed density matrix mentioned above is named throughout this paper the density pro-matrix. Within our approach two main features of Quantum Mechanics are conserved: the probabilistic interpretation of the theory and the exact, predefined measuring procedure corresponding to that interpretation. The approach proposed here allows the dynamics to be described. In particular, the explicit forms of the deformed Liouville equation and the deformed Schrödinger picture are given. Some implications of the results are discussed, in particular the problem of singularity, the hypothesis of cosmic censorship, a possible improvement of the definition of statistical entropy, and the problem of information loss in black holes.

Posted Content
TL;DR: In this paper, a natural generalization of Zurek's auxiliary assumption is presented that is strong enough to yield the Born rule itself; alternatively, Zurek's argument and protocol can be adapted to do without this assumption, at the cost of using envariance of probability in both directions.
Abstract: Zurek has derived the quantum probabilities for Schmidt basis states of bipartite quantum systems in pure joint states, from the assumption that they should not be affected by one party's action if the action can be undone by the other party ("envariance of probability") and an auxiliary assumption. We argue that a natural generalization of the auxiliary assumption is actually strong enough to yield the Born rule itself, but that Zurek's argument and protocol can be adapted to do without this assumption, at the cost of using envariance of probability in both directions. We consider alternative motivations for envariance, one based on the no-signalling constraint that actions on one subsystem of a quantum system not allow signalling to another subsystem entirely distinct from the first, and another which is perhaps strongest in the context of a relative-state interpretation of quantum mechanics. In part because of this, we argue that the relative appeal of our version and the original version of Zurek's argument depends in part upon whether one interprets the quantum formalism in terms of relative states or definite measurement outcomes.
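The core of Zurek's envariance argument, sketched in standard notation: for a bipartite Schmidt state with equal coefficients, a swap on the system $S$ can be undone by a counterswap on the environment $E$,

```latex
|\psi_{SE}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|s_1\rangle|e_1\rangle + |s_2\rangle|e_2\rangle\bigr),
\qquad
\bigl(\mathbb{1}_S \otimes u_E\bigr)\,\bigl(u_S \otimes \mathbb{1}_E\bigr)\,|\psi_{SE}\rangle \;=\; |\psi_{SE}\rangle ,
```

where $u_S$ exchanges $|s_1\rangle \leftrightarrow |s_2\rangle$ and $u_E$ exchanges $|e_1\rangle \leftrightarrow |e_2\rangle$. Since the swap on $S$ alone leaves the global state recoverable by an action on $E$, it cannot change the probabilities of $s_1$ and $s_2$, which must therefore be equal (here $1/2$); the paper's discussion concerns what extra assumptions are needed to extend this to unequal coefficients.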

01 Jan 2003
TL;DR: In this paper, the Dirac equation's hidden geometric structure is made manifest by reformulating it in terms of a real spacetime algebra, revealing an essential connection between spin and complex numbers with profound implications for the interpretation of quantum mechanics.
Abstract: The Dirac equation has a hidden geometric structure that is made manifest by reformulating it in terms of a real spacetime algebra. This reveals an essential connection between spin and complex numbers with profound implications for the interpretation of quantum mechanics. Among other things, it suggests that to achieve a complete interpretation of quantum mechanics, spin should be identified with an intrinsic zitterbewegung.
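The real spacetime-algebra form of the Dirac equation referred to is usually written, in Hestenes' notation (stated here from the standard literature, not extracted from this abstract), as

```latex
\nabla \psi \, I\sigma_3\, \hbar \;=\; m c\, \psi\, \gamma_0 ,
\qquad I\sigma_3 = \gamma_2 \gamma_1 ,
```

where $\psi$ is an even multivector of the spacetime algebra and the spatial bivector $\gamma_2\gamma_1$ plays the role of the unit imaginary of the conventional formulation; this is the claimed link between spin and complex numbers.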

Journal ArticleDOI
TL;DR: In this article, an elementary model is given which shows how an objective (hence local and noncontextual) picture of the microworld can be constructed without conflicting with quantum mechanics (QM).
Abstract: An elementary model is given which shows how an objective (hence local and noncontextual) picture of the microworld can be constructed without conflicting with quantum mechanics (QM). This contradicts known no-go theorems, which however do not hold in the model, and supplies some suggestions for a broader theory in which QM can be embedded.

Book ChapterDOI
01 Jan 2003
TL;DR: In this paper, the Copenhagen interpretation is reviewed and the importance of an unbiased attitude on the interpretational side for future progress in physics is emphasized. But it is also pointed out that the emergence of classical properties can be understood within the framework of quantum theory itself, through the process of decoherence.
Abstract: A central feature in the Copenhagen interpretation is the use of classical concepts from the outset. Modern developments show, however, that the emergence of classical properties can be understood within the framework of quantum theory itself, through the process of decoherence. This fact becomes most crucial for the interpretability of quantum cosmology — the application of quantum theory to the Universe as a whole. I briefly review these developments and emphasize the importance of an unbiased attitude on the interpretational side for future progress in physics.

Journal ArticleDOI
TL;DR: Self-organization is clearly relevant to biology, chemistry, Earth science, economics and other sciences that have to deal with big and complicated issues; this paper shows that it also bears on fundamental physics and discusses how self-organization relates to other modes of explanation such as reductionism.
Abstract: Self-organization is clearly relevant to biology, chemistry, Earth science, economics and other sciences that have to deal with big and complicated issues. This paper shows that self-organization also has a great deal to do with fundamental physics, including quantum mechanics, relativity, quantum gravity and cosmology. This paper also aims to give some insight into what self-organization means and discusses questions such as the kinds of methods that can be used to understand self-organization and how self-organization relates to other modes of explanation such as reductionism.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that a certain type of many minds interpretation of quantum mechanics does not provide a coherent interpretation of the quantum mechanical probabilistic algorithm, and they consider Albert and Loewer's probability interpretation in the context of Bell-type and GHZ-type states and argue that it implies a weak form of nonlocality.
Abstract: We argue that a certain type of many minds (and many worlds) interpretation of quantum mechanics, e.g. Lockwood ([1996a]) and Deutsch ([1985]), does not provide a coherent interpretation of the quantum mechanical probabilistic algorithm. By contrast, in Albert and Loewer's ([1988]) version of the many minds interpretation there is a coherent interpretation of the quantum mechanical probabilities. We consider Albert and Loewer's probability interpretation in the context of Bell-type and GHZ-type states and argue that it implies a certain (weak) form of nonlocality.
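The GHZ-type state mentioned is, in its standard three-particle form,

```latex
|\mathrm{GHZ}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|000\rangle + |111\rangle\bigr),
```

for which quantum mechanics predicts perfect correlations among certain joint spin measurements that no local deterministic assignment of outcomes can reproduce, even in a single run; this is why GHZ states are a sharper testbed for locality claims than Bell-type states, which require statistical inequalities.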

Posted Content
TL;DR: In this paper, the physical content of the PT-symmetric complex extension of quantum mechanics is studied and it is shown that as a fundamental probabilistic physical theory it is neither an alternative to nor an extension of ordinary quantum mechanics.
Abstract: We study the physical content of the PT-symmetric complex extension of quantum mechanics as proposed in Bender et al, Phys. Rev. Lett. 80, 5243 (1998) and 89, 270401 (2002), and show that as a fundamental probabilistic physical theory it is neither an alternative to nor an extension of ordinary quantum mechanics. We demonstrate that the definition of a physical observable given in the above papers is inconsistent with the dynamical aspect of the theory and offer a consistent notion of an observable.

Journal ArticleDOI
TL;DR: Weisskopf had a rare and harmonious blend of sentiment and intellectual rigor; he liked to say that his favorite occupations were Mozart and quantum mechanics, and he was an excellent teacher.
Abstract: Weisskopf had a rare and harmonious blend of sentiment and intellectual rigor. He liked to say that his favorite occupations were Mozart and quantum mechanics.

Posted Content
TL;DR: In this paper, it is shown that a classical description of the pair production effect is possible, i.e. one can describe pair production without any reference to quantum principles.
Abstract: It is shown that a classical description of the pair production effect is possible, i.e. one can describe pair production without reference to quantum principles. Pair production appears in the statistical description of stochastic relativistic particles. There is a special force field which is responsible for pair production; this field is a reasonable consequence of quantum stochasticity. Considering quantum systems as stochastic systems and describing them statistically generates a hydrodynamic interpretation of quantum phenomena. In the collision problem the hydrodynamic interpretation appears as an alternative to the conventional interpretation, which is based on quantum principles and on consideration of the wave function as the principal object of dynamics. The hydrodynamic interpretation leads to a statement of the collision problem that is an alternative to conventional S-matrix theory.
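The hydrodynamic interpretation invoked here goes back to Madelung's transformation (standard form, not specific to this paper): writing the wave function as $\psi = \sqrt{\rho}\, e^{iS/\hbar}$ turns the Schrödinger equation into a continuity equation for the density $\rho$ and a Hamilton-Jacobi equation with an extra "quantum potential" term:

```latex
\partial_t \rho + \nabla\!\cdot\!\Bigl(\rho\,\frac{\nabla S}{m}\Bigr) = 0,
\qquad
\partial_t S + \frac{(\nabla S)^2}{2m} + V
- \frac{\hbar^2}{2m}\,\frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} = 0 .
```

The last term is the only place $\hbar$ enters, which is why such fluid-like reformulations invite statistical or stochastic readings of quantum dynamics of the kind the paper pursues.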

Journal ArticleDOI
TL;DR: In this article, the (consistent or decoherent) histories approach is compared with Mermin's Ithaca interpretation, which defines probabilities that make no reference to a sample space or event algebra ("correlations without correlata"); this is argued to lead to severe conceptual difficulties that almost inevitably couple quantum theory to unresolved problems of human consciousness.
Abstract: Any attempt to introduce probabilities into quantum mechanics faces difficulties due to the mathematical structure of Hilbert space, as reflected in Birkhoff and von Neumann's proposal for a quantum logic. The (consistent or decoherent) histories solution is provided by its single framework rule, an approach that includes conventional (Copenhagen) quantum theory as a special case. Mermin's Ithaca interpretation addresses the same problem by defining probabilities which make no reference to a sample space or event algebra (“correlations without correlata”). But this leads to severe conceptual difficulties, which almost inevitably couple quantum theory to unresolved problems of human consciousness. Using histories allows a sharper quantum description than is possible with a density matrix, suggesting that the latter provides an ensemble rather than an irreducible single-system description as claimed by Mermin. The histories approach satisfies the first five of Mermin's desiderata for a good interpretation of quantum mechanics, including Einstein locality, but the Ithaca interpretation seems to have difficulty with the first (independence of observers) and the third (describing individual systems).

Journal ArticleDOI
TL;DR: In this article, the status of the arrival time distributions of Allcock and Kijowski and their generalizations, which are all obtained using conventional quantum mechanics, is discussed in the light of some paradoxes pointed out by Leavens and some other recent results on a fluorescence based operational time-of-arrival model.

Journal ArticleDOI
TL;DR: In this paper, the authors show that the composition principle fails in all realist collapse interpretations of quantum mechanics and show that what lies at the heart of the counting anomaly is the failure to appreciate the peculiarities of the property structure of such interpretations.
Abstract: The aim of this article is twofold. Recently, Lewis has presented an argument, now known as the "counting anomaly", that the spontaneous localization approach to quantum mechanics, suggested by Ghirardi, Rimini, and Weber, implies that arithmetic does not apply to ordinary macroscopic objects. I will take this argument as the starting point for a discussion of the property structure of realist collapse interpretations of quantum mechanics in general. At the end of this I present a proof of the fact that the composition principle, which holds true in standard quantum mechanics, fails in all realist collapse interpretations. On the basis of this result I reconsider the counting anomaly and show that what lies at the heart of the anomaly is the failure to appreciate the peculiarities of the property structure of such interpretations. Once this flaw is uncovered, the anomaly vanishes.

Journal ArticleDOI
TL;DR: The Pondicherry interpretation of quantum mechanics proceeds from the assumption that quantum mechanics is fundamentally a probability algorithm, and it determines the nature of a world that is irreducibly described by this probability algorithm.
Abstract: Marchildon's (favorable) assessment (quant-ph/0303170, to appear in Found. Phys.) of the Pondicherry interpretation of quantum mechanics raises several issues, which are addressed. Proceeding from the assumption that quantum mechanics is fundamentally a probability algorithm, this interpretation determines the nature of a world that is irreducibly described by this probability algorithm. Such a world features an objective fuzziness, which implies that its spatiotemporal differentiation does not "go all the way down". This result is inconsistent with the existence of an evolving instantaneous state, quantum or otherwise.

Book ChapterDOI
TL;DR: In this chapter, it is argued that the concept of quantum state has acquired a direct physical meaning, in terms of properties of a physical system that is fully represented by a linear superposition of eigenstates and able to propagate as such in space and time.
Abstract: Recent developments in the area of quantum systems have led to the acceptance of statements that originally appeared to be mere interpretations with free options as representing physical facts. Of such a nature are statements relating to the quantum behavior of individual particles (diffraction, etc.), neutrino oscillations, distant quantum correlations (local non-separability), Bose-Einstein condensation, the cooling and isolation of atoms and, recently, the decoherence of quantum superposition states interacting with the environment or measurement apparatus, which allows a better understanding of the transition from the quantum domain to the classical-macroscopic one. The debate on the interpretation of quantum mechanics has imperceptibly changed its nature through these developments, giving higher weight to a "physical interpretation" more clearly distinct from the philosophical one than in the old days of quantum mechanics. In particular, the concept of quantum state has undoubtedly acquired a direct physical meaning, in terms of properties of a physical system that is fully represented by a linear superposition of eigenstates and able to propagate as such in space and time. The price for this new situation is an extension of the meaning of the concepts of physical magnitude and physical state towards ones that do not correspond directly to numerical values.

Journal ArticleDOI
TL;DR: The physical concept of quantum dynamical internal measuremental robustness is discussed, along with the significance of introducing affine molecular Hilbert spaces, the original (primordial) internal quantum measurement, and the globally constraining nature of time-inversion symmetry restoring as a special restoration force.
Abstract: In the present paper, some physical considerations of the biological symbol-matter problem are presented. First of all, the physical concept of quantum dynamical internal measuremental robustness is discussed. In this context, the significance of introducing affine molecular Hilbert spaces, the original (primordial) internal quantum measurement, and the globally constraining nature of time-inversion symmetry restoring as a special restoration force is discussed at some length. It is pointed out, as a summary, that the global robustness of the internal dynamics of quantum measurements is due to two basic factors: on one hand, the globally constraining nature of the chosen specific (symmetry-) restoring force, and on the other, the individual robustness of the discrete local internal measuremental interactions. The second condition is supposed to follow from a system-internalised ("objective") Bohr-type Copenhagen interpretation of quantum mechanics, corresponding, in an external context, to the Generalized Complementarity Principle of Bohr and Elsasser. It is not claimed, however, that this latter problem has yet been satisfactorily settled physically. In fact, if it were, it would amount to a specifically biological quantum theory of internal measurement, which would have to be rooted in the original primordial global internal measurement, amounting to the origin of the genetic code.

Book ChapterDOI
01 Jan 2003
TL;DR: In this chapter, decoherence is presented via the separation of a quantum system into a subsystem (the relevant part) and its environment (the irrelevant part); provided certain initial conditions hold, coarse-graining over the irrelevant degrees of freedom may lead to the emergence of classical behaviour in the relevant part.
Abstract: In our presentation of decoherence in the previous chapters, an essential ingredient was the separation of a quantum system into a subsystem (called the “relevant” or “distinguished” part) and its environment (called the “irrelevant” or “ignored” part). Provided certain initial conditions hold, “coarse-graining” with respect to the irrelevant degrees of freedom may lead to the emergence of classical behaviour in the relevant part, since (almost) all information about quantum phases has migrated into correlations with the environment and is thus no longer accessible in observations of the subsystem alone.

Journal ArticleDOI
TL;DR: In this paper, a connection between the probability space measurability requirement and the complementarity principle in quantum mechanics is established, and it is shown that the results of quantum measurement depend not only on properties of the quantum object under consideration but also on classical characteristics of the measuring instruments used.
Abstract: A connection between the probability space measurability requirement and the complementarity principle in quantum mechanics is established. It is shown that measurability of the probability space implies that the results of the quantum measurement depend not only on properties of the quantum object under consideration but also on classical characteristics of the measuring instruments used. It is also shown that if the measurability requirement is taken into account, then the hypothesis that the objective reality exists does not lead to the Bell inequality.
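The Bell inequality at issue is, in its standard CHSH form (quoted here from the standard literature for orientation),

```latex
\bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\le\; 2 ,
```

where $E(a,b)$ is the correlation of outcomes for setting choices $a$ and $b$ on the two sides. Any local hidden-variable model in which outcomes are independent of the distant setting satisfies this bound, while quantum mechanics violates it up to $2\sqrt{2}$; the paper's claim is that once measurability of the probability space is required, the objective-reality hypothesis alone no longer entails the inequality.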

01 Jan 2003
TL;DR: In this paper, it is argued that Deutsch's derivation of the Born rule from the non-probabilistic part of quantum mechanics and classical decision theory begs the question.
Abstract: A major problem facing no-collapse interpretations of quantum mechanics in the tradition of Everett is how to understand the probabilistic axiom of quantum mechanics (the Born rule) in the context of a deterministic theory in which every outcome of a measurement occurs. Deutsch claims to derive a decision-theoretic analogue of the Born rule from the non-probabilistic part of quantum mechanics and some non-probabilistic axioms of classical decision theory, and hence concludes that no probabilistic axiom is needed. I argue that Deutsch’s derivation begs the question.

Journal ArticleDOI
TL;DR: In this article, it is argued that the experimental verification of Newtonian mechanics and of non-relativistic quantum mechanics does not imply that space is continuous, which provides evidence against the realist interpretation of the most mathematical parts of physics.
Abstract: We argue that the experimental verification of Newtonian mechanics and of non-relativistic quantum mechanics does not imply that space is continuous. This provides evidence against the realist interpretation of the most mathematical parts of physics.