
Showing papers in "Foundations of probability and physics in 2007"


Proceedings ArticleDOI
TL;DR: In this paper, a review of the problem of mutually unbiased bases in finite dimensional Hilbert spaces, real and complex, is presented, and a geometric measure of "mubness" is introduced and applied to some explicit calculations in six dimensions.
Abstract: This is a review of the problem of Mutually Unbiased Bases in finite dimensional Hilbert spaces, real and complex. Also a geometric measure of “mubness” is introduced, and applied to some explicit calculations in six dimensions (partly done by Bjorck and by Grassl). Although this does not yet solve any problem, some appealing structures emerge.

83 citations
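
A minimal numerical sketch of the property under review (my own illustration, not code from the paper): two orthonormal bases are mutually unbiased when every overlap satisfies |⟨e_i|f_j⟩|² = 1/d. The computational and Fourier bases below are the standard example; in d = 6 only three pairwise unbiased bases are known, and whether a fourth exists remains open.

```python
import numpy as np

def are_mub(B1, B2, tol=1e-12):
    """True if the columns of B1 and B2 satisfy |<e_i|f_j>|^2 = 1/d for all i, j."""
    d = B1.shape[0]
    overlaps = np.abs(B1.conj().T @ B2) ** 2
    return np.allclose(overlaps, 1.0 / d, atol=tol)

d = 6
computational = np.eye(d)
fourier = np.array([[np.exp(2j * np.pi * j * k / d) / np.sqrt(d)
                     for k in range(d)] for j in range(d)])

print(are_mub(computational, fourier))  # True: a mutually unbiased pair in d = 6
```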


Proceedings ArticleDOI
TL;DR: In this article, quantum mechanical uncertainty relations for the position and the momentum and for the angle and the angular momentum are expressed in the form of inequalities involving the Renyi entropies.
Abstract: Quantum mechanical uncertainty relations for the position and the momentum and for the angle and the angular momentum are expressed in the form of inequalities involving the Renyi entropies. These uncertainty relations hold not only for pure but also for mixed states. Analogous uncertainty relations are valid also for a pair of complementary observables (the analogs of x and p) in N‐level systems. All these uncertainty relations become more attractive when expressed in terms of the symmetrized Renyi entropies. The mathematical proofs of all the inequalities discussed in this paper can be found in Phys. Rev. A 74, No. 5 (2006); arXiv:quant‐ph/0608116.

49 citations
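
For orientation, a sketch of the quantities involved (the exact statements and proofs are in the cited Phys. Rev. A paper; this schematic form is my paraphrase): the Rényi entropy of a distribution {p_k} and the conjugacy condition on the pair of indices are

$$H_\alpha = \frac{1}{1-\alpha}\,\ln\sum_k p_k^{\alpha}, \qquad \frac{1}{\alpha}+\frac{1}{\beta}=2,$$

and for a pair of complementary observables in an N-level system the uncertainty relation takes the form

$$H_\alpha(A) + H_\beta(B) \;\ge\; \ln N .$$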


Proceedings ArticleDOI
TL;DR: In this paper, the mathematical formulation of quantum mechanics in terms of complex Hilbert space is derived for finite dimensions, starting from a general definition of physical experiment and from five simple Postulates concerning experimental accessibility and simplicity.
Abstract: The mathematical formulation of Quantum Mechanics in terms of complex Hilbert space is derived for finite dimensions, starting from a general definition of physical experiment and from five simple Postulates concerning experimental accessibility and simplicity. For the infinite dimensional case, on the other hand, a C*‐algebra representation of physical transformations is derived, starting from just four of the five Postulates via a Gelfand‐Naimark‐Segal (GNS) construction. The present paper simplifies and sharpens the previous derivation in Ref. [1]. The main ingredient of the axiomatization is the postulated existence of faithful states that allows one to calibrate the experimental apparatus. Such a notion is at the basis of the operational definitions of the scalar product and of the transpose of a physical transformation. What is new in the present paper with respect to Ref. [1] is the operational deduction of an involution corresponding to the complex conjugation for effects, whose extension to transformations allows one to define the adjoint of a transformation when the extension is composition‐preserving. The existence of such a composition‐preserving extension among possible extensions is analyzed.

40 citations


Proceedings ArticleDOI
TL;DR: In this paper, it is argued that all trace preserving completely positive maps, including unitary operators, should be regarded as subjective, in the same sense as a classical conditional probability distribution.
Abstract: Assuming that quantum states, including pure states, represent subjective degrees of belief rather than objective properties of systems, the question of what other elements of the quantum formalism must also be taken as subjective is addressed. In particular, we ask this of the dynamical aspects of the formalism, such as Hamiltonians and unitary operators. Whilst some operations, such as the update maps corresponding to a complete projective measurement, must be subjective, the situation is not so clear in other cases. Here, it is argued that all trace preserving completely positive maps, including unitary operators, should be regarded as subjective, in the same sense as a classical conditional probability distribution. The argument is based on a reworking of the Choi‐Jamiolkowski isomorphism in terms of “conditional” density operators and trace preserving completely positive maps, which mimics the relationship between conditional probabilities and stochastic maps in classical probability.

35 citations
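
A small numerical sketch of the standard Choi-Jamiolkowski construction the argument reworks (my illustration; the helper names are mine): the Choi matrix of a trace-preserving completely positive map partial-traces to the identity, just as the columns of a classical stochastic matrix sum to 1.

```python
import numpy as np

def choi_matrix(kraus_ops, d):
    """Choi matrix J = sum_ij |i><j| (x) E(|i><j|) for the channel E given by Kraus operators."""
    J = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            Eij = np.zeros((d, d), dtype=complex)
            Eij[i, j] = 1.0
            out = sum(K @ Eij @ K.conj().T for K in kraus_ops)
            J += np.kron(Eij, out)
    return J

# Depolarizing channel on a qubit: E(rho) = (1 - p) rho + p I/2
p = 0.3
I = np.eye(2); X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1, -1])
kraus = [np.sqrt(1 - 3 * p / 4) * I] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]

J = choi_matrix(kraus, 2)
# Trace preservation: Tr_out J = I, the analogue of a stochastic matrix's
# columns each summing to 1.
tr_out = np.einsum('ikjk->ij', J.reshape(2, 2, 2, 2))
print(np.allclose(tr_out, np.eye(2)))  # True
```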


Proceedings ArticleDOI
TL;DR: In this article, a picture of a physical world whose essence is "Darwinism all the way down" is drawn, and quantum theory should be viewed in light of that, i.e., as being an expression of probabilism (in Bruno de Finetti or Richard Jeffrey's sense) all the way back up.
Abstract: Once again, I take advantage of the wonderfully liberal and tolerant mood Andrei Khrennikov sets at his yearly conferences by submitting a nonstandard paper for the proceedings. This pseudo‐paper consists of excerpts drawn from two of my samizdats [Quantum States: What the Hell Are They? and Darwinism All the Way Down (and Probabilism All the Way Back Up)] that I think best summarize what I am aiming for on the broadest scale with my quantum foundations program. Section 1 tries to draw a picture of a physical world whose essence is “Darwinism all the way down.” Section 2 outlines how quantum theory should be viewed in light of that, i.e., as being an expression of probabilism (in Bruno de Finetti or Richard Jeffrey’s sense) all the way back up. Section 3 describes how the idea of “identical” quantum measurement outcomes, though sounding atomistic in character, nonetheless meshes well with a William Jamesian style “radical pluralism.” Sections 4 and 5 further detail how quantum theory should not be viewed so much as a “theory of the world,” but rather as a theory of decision‐making for agents immersed within a quantum world—that is, a world in continual creation. Finally, Sections 6 and 7 attempt to sketch once again the very positive sense in which quantum theory is incomplete, but still just as complete as it can be. In total, I hope these heady speculations convey some of the excitement and potential I see for the malleable world quantum mechanics hints of.

23 citations


Proceedings ArticleDOI
TL;DR: In this paper, the authors describe the experimental apparatus and methods used for a long-distance test of Bell's inequality, whose main results were published by G. Weihs, T. Jennewein, C. Simon, H. Weinfurter, and A. Zeilinger, Phys. Rev. Lett. 81, 5039 (1998).
Abstract: This is a description of the experimental apparatus and methods used for a long‐distance test of Bell’s inequality, whose main results were published in G. Weihs, T. Jennewein, C. Simon, H. Weinfurter, and A. Zeilinger, Phys. Rev. Lett. 81, 5039 (1998). A brief discussion of recent attempts at analyzing our measurement data is included.

22 citations
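
For context (my illustration, not the paper's analysis code): the experiment tests the CHSH form of Bell's inequality, |S| ≤ 2 for local realism, against the quantum singlet-state prediction |S| = 2√2 at the standard analyzer settings.

```python
import numpy as np

def E(a, b):
    """Singlet-state polarization correlation for analyzer angles a, b (radians)."""
    return -np.cos(2 * (a - b))

a, a2 = 0.0, np.pi / 4            # Alice's settings
b, b2 = np.pi / 8, 3 * np.pi / 8  # Bob's settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))     # 2.828..., exceeding the local-realist bound 2
```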


Proceedings ArticleDOI
TL;DR: In this paper, a subjective survey of stochastic models of quantum mechanics is given along with a discussion of some key radiative processes, the clues they offer, and the difficulties they pose for this program.
Abstract: A subjective survey of stochastic models of quantum mechanics is given along with a discussion of some key radiative processes, the clues they offer, and the difficulties they pose for this program. An electromagnetic basis for deriving quantum mechanics is advocated, and various possibilities are considered. It is argued that only non‐local or non‐causal theories are likely to be a successful basis for such a derivation.

21 citations


Proceedings ArticleDOI
TL;DR: The rationale why sub(super)-additive probabilities in a psychological setting could be explained via the use of quantum probability interference, and why this complementarity cannot be established using the Heisenberg Uncertainty Principle, is discussed.
Abstract: In this paper we discuss the rationale why sub(super)‐additive probabilities in a psychological setting could be explained via the use of quantum probability interference. We propose to measure the complementarity of two variables: i) the time of processing (by experiment participants) of (non‐moving) images and ii) the ability (of experiment participants) to recognize deformations of (non‐moving) pictures. We argue in the paper why we cannot find this complementarity using the Heisenberg Uncertainty Principle. The paper provides the details of the experimental setup to test this complementarity.

13 citations
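
The quantum-like formula of total probability behind this proposal (in the notation of Khrennikov's quantum-like program, as I recall it) adds an interference term to the classical law; a nonzero cos θ makes p(A) sub- or super-additive:

$$p(A) \;=\; p(B)\,p(A|B) \;+\; p(\bar B)\,p(A|\bar B) \;+\; 2\cos\theta\,\sqrt{p(B)\,p(A|B)\,p(\bar B)\,p(A|\bar B)} .$$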


Proceedings ArticleDOI
TL;DR: In this article, the authors argue that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation, and that the quantum state consequently has an essentially logical significance.
Abstract: Einstein initially objected to the probabilistic aspect of quantum mechanics—the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein’s initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein’s belief, that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished, positivistic view of physical theory. It appears to us, however, that the truth is just the opposite. The Einsteinian vision is much less attractive than it seems at first sight. In particular, it is closely connected with philosophical reductionism.

10 citations


Proceedings ArticleDOI
TL;DR: In this paper, the authors combine four papers by Dirac to make the uncertainty relation consistent with the principle of Lorentz covariance, and show that the mathematics of two coupled oscillators enables them to carry out this job.
Abstract: The present form of quantum mechanics is based on the Copenhagen school of interpretation. Einstein did not belong to the Copenhagen school, because he did not believe in the probabilistic interpretation of fundamental physical laws. This is the reason why we are still debating whether there is a more deterministic theory. One cause of this separation between Einstein and the Copenhagen school could have been that the Copenhagen physicists thoroughly ignored Einstein’s main concern: the principle of relativity. Paul A. M. Dirac was the first one to realize this problem. Indeed, from 1927 to 1963, Paul A. M. Dirac published at least four papers to study the problem of making the uncertainty relation consistent with Einstein’s Lorentz covariance. It is interesting to combine those papers by Dirac to make the uncertainty relation consistent with relativity. It is shown that the mathematics of two coupled oscillators enables us to carry out this job. We are then led to the question of whether the concept of localized probability distribution is consistent with Lorentz covariance.

7 citations
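
The two-coupled-oscillator mathematics invoked here is standard (the sketch below is mine, not Dirac's specific construction): the coupling term is removed by rotating to normal coordinates,

$$H=\frac{1}{2m}\left(p_1^2+p_2^2\right)+\frac{K}{2}\left(x_1^2+x_2^2\right)+C\,x_1 x_2, \qquad x_{\pm}=\frac{x_1 \pm x_2}{\sqrt{2}},$$

which decouples H into two independent oscillators with spring constants K + C and K - C.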


Proceedings ArticleDOI
TL;DR: In this paper, the Consistent Amplitude approach to quantum theory is reviewed and it is argued that quantum probabilities are explicitly Bayesian, and that amplitudes are tools for inference.
Abstract: We review the Consistent Amplitude approach to Quantum Theory and argue that quantum probabilities are explicitly Bayesian. In this approach amplitudes are tools for inference. They codify objective information about how complicated experimental setups are put together from simpler ones. Thus, probabilities may be partially subjective but the amplitudes are not.

Proceedings ArticleDOI
TL;DR: In this paper, it has been suggested that negative probabilities require the existence of unsuspected correlations between detection events; the authors evaluate this claim in light of several representative experiments and find that some of its implications are in good agreement with the data.
Abstract: Negative probabilities emerged at intermediate steps in various attempts to predict the distributions of quantum interference. There is no consensus on their meaning yet. It has been suggested (Khrennikov, 1998) that negative probabilities require the existence of unsuspected correlations between detection events. We evaluate this claim in light of several representative experiments. In our assessment, some of its implications are in good agreement with the data.
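
A toy illustration of the point (mine, not from the paper): a signed joint quasi-distribution over two unobserved binary variables can hide a negative cell behind perfectly legitimate marginals,

$$p(0,0)=0.4,\quad p(0,1)=0.3,\quad p(1,0)=0.4,\quad p(1,1)=-0.1,$$

which sums to 1 and gives marginals (0.7, 0.3) and (0.8, 0.2), all non-negative; the negative cell shows up only in the correlations between the two variables, which is the kind of "unsuspected correlation" at issue.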

Proceedings ArticleDOI
TL;DR: In this paper, the infinite Haar measure on a current group is considered as an invariant distribution associated to the Wiener process on that group.
Abstract: We consider the infinite Haar measure on a current group as an invariant distribution associated to the Wiener process on the current group.


Proceedings ArticleDOI
TL;DR: In this paper, the authors formalized the rules of classical→ quantum correspondence and performed a rigorous mathematical analysis of the assumptions in Bell's NOGO arguments, dedicated to the memory of Walter Philipp.
Abstract: In this paper dedicated to the memory of Walter Philipp, we formalize the rules of classical→ quantum correspondence and perform a rigorous mathematical analysis of the assumptions in Bell’s NO‐GO arguments.

Proceedings ArticleDOI
TL;DR: In this article, a detailed analysis of assumptions that J. Bell used to show that local realism contradicts QM is presented, and it is shown that Bell's viewpoint on realism is nonphysical, because it implicitly assumes that observed physical variables coincide with ontic variables (i.e., these variables before measurement).
Abstract: We present a detailed analysis of the assumptions that J. Bell used to show that local realism contradicts QM. We find that Bell’s viewpoint on realism is nonphysical, because it implicitly assumes that observed physical variables coincide with ontic variables (i.e., these variables before measurement). The real physical process of measurement is a process of dynamical interaction between a system and a measurement device. Therefore one should check the adequacy of QM not to “Bell’s realism,” but to adaptive realism (chameleon realism). Dropping Bell’s assumption, we are able to construct a natural representation of the EPR-Bohm correlations in the local (adaptive) realistic approach.

Proceedings ArticleDOI
TL;DR: In this paper, Detrended Fluctuation Analysis of the frequency series of Bach's pitches yields Hurst exponents in the range (0.7 – 0.8) for his Inventions and Sinfonias.
Abstract: Detrended Fluctuation Analysis (DFA), suitable for the analysis of nonstationary time series, is used to investigate power-law behavior in some of Bach’s pitch series. Using the DFA method, which is also a well‐established method for the detection of long‐range correlations, frequency series of Bach’s pitches have been analyzed. In this view we find Hurst exponents in the range (0.7 – 0.8) in his Inventions and Sinfonias.
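
A minimal DFA implementation in the spirit of the analysis (my sketch; the paper's windowing and fitting choices may differ): integrate the demeaned series, detrend it linearly in windows of size n, and read the scaling exponent α off the log-log slope of the fluctuation function F(n).

```python
import numpy as np

def dfa_exponent(x):
    """Detrended Fluctuation Analysis: return the scaling exponent alpha."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                    # integrated profile
    scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 12).astype(int))
    F = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        # Linear detrending in each window, then the mean squared residual
        sq = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2) for s in segments]
        F.append(np.sqrt(np.mean(sq)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

# alpha ~ 0.5 for uncorrelated noise; values around 0.7-0.8, as reported for
# the pitch series, indicate persistent long-range correlations.
print(dfa_exponent(np.random.randn(4096)))
```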

Proceedings ArticleDOI
TL;DR: In this paper, the consequences of the thermal and quantum fluctuations of classically non-ideal measurement apparatuses are described explicitly; an ideal classical measurement, in contrast to an ideal quantum measurement, does not affect the results of later measurements.
Abstract: A quantum field model for an experiment describes thermal fluctuations explicitly and quantum fluctuations implicitly, whereas a comparable continuous random field model would describe both thermal and quantum fluctuations explicitly. An ideal classical measurement does not affect the results of later measurements, in contrast to ideal quantum measurements, but we can describe the consequences of the thermal and quantum fluctuations of classically non‐ideal measurement apparatuses explicitly. Some details of continuous random fields and of Bell inequalities for random fields will be discussed.

Proceedings ArticleDOI
TL;DR: In this paper, a trivial probabilistic illustration of the representation of quantum mechanics as an algorithm for approximative calculation of averages is presented.
Abstract: We present a trivial probabilistic illustration for representation of quantum mechanics as an algorithm for approximative calculation of averages.
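
A one-qubit illustration of the idea (mine, not the paper's example): quantum theory returns the average ⟨A⟩ = Tr(ρA), and an "algorithmic" approximation reproduces it by sampling outcomes with Born-rule probabilities.

```python
import numpy as np

rng = np.random.default_rng(1)

A = np.diag([1.0, -1.0])                      # observable: sigma_z
psi = np.array([np.cos(0.3), np.sin(0.3)])    # a pure qubit state
rho = np.outer(psi, psi)

exact = np.trace(rho @ A).real                # quantum average <A> = Tr(rho A)

# Approximate calculation: sample outcomes with Born-rule probabilities
probs = np.diag(rho).real                     # probabilities of eigenvalues +1, -1
samples = rng.choice([1.0, -1.0], size=100_000, p=probs)
print(exact, samples.mean())                  # the sample mean approximates Tr(rho A)
```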

Proceedings ArticleDOI
TL;DR: In this paper, concrete examples for frame functions and their associated density operators, as well as for non-Gleason-type probability measures, are discussed.
Abstract: We discuss concrete examples for frame functions and their associated density operators, as well as for non‐Gleason type probability measures.
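
A concrete numerical example in the Gleason spirit (my sketch, not one of the paper's examples): the frame function f(ψ) = ⟨ψ|ρ|ψ⟩ induced by a density operator ρ sums to 1 over every orthonormal basis.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# A density operator: positive semidefinite with unit trace
M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = M @ M.conj().T
rho /= np.trace(rho).real

def frame_function(psi):
    """Gleason-type frame function f(psi) = <psi|rho|psi>."""
    return (psi.conj() @ rho @ psi).real

# A random orthonormal basis from the QR decomposition of a random matrix
Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
print(sum(frame_function(Q[:, k]) for k in range(d)))  # ~1 for every basis
```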

Proceedings ArticleDOI
TL;DR: In this article, it is shown that such a set P preserves many natural probabilistic properties, and some nontrivial examples of such sets can be found in the literature.
Abstract: probabilities as elements of some probabilistic set P are reduced. The set P is defined by some axioms, which are analogy to the properties of the real numbers [0, 1]. There is shown that such set P preserves many natural probabilistic properties. There are some nontrivial examples of such probabilistic sets.

Proceedings ArticleDOI
TL;DR: In this paper, the difference between the Bayesian and frequentist approaches to hypothesis testing is reviewed, and a simple example in quantum tomography for which Bayesian and frequentist predictions differ is presented.
Abstract: This short paper is a reply to Andrei Khrennikov’s challenge to give a concrete quantum‐mechanical example where the Bayesian and frequentist approaches to probability lead to different predictions. The paper first reviews the difference between the Bayesian and frequentist approaches to hypothesis testing and then presents a simple example in quantum tomography for which Bayesian and frequentist predictions differ.
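
The flavor of such a difference shows up already in a binomial toy model (my illustration, not the paper's tomographic example): estimating the probability p of one measurement outcome from n trials, the frequentist maximum-likelihood estimate and the Bayesian posterior mean under a uniform prior disagree at finite n.

```python
n, k = 10, 7                       # n trials, k successes

p_freq = k / n                     # frequentist maximum-likelihood estimate

# Bayesian: a uniform prior over p gives a Beta(k+1, n-k+1) posterior,
# whose mean is Laplace's rule of succession
p_bayes = (k + 1) / (n + 2)

print(p_freq, p_bayes)             # 0.7 vs 0.666...: the predictions differ
```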

Proceedings ArticleDOI
TL;DR: This paper is a presentation of how to implement quantum algorithms (namely, Shor's algorithm) on a classical computer by using the well-known Mathematica package, giving a lucid connection between the mathematical formulation of quantum mechanics and computational methods.
Abstract: This paper is a presentation of how to implement quantum algorithms (namely, Shor’s algorithm) on a classical computer by using the well‐known Mathematica package. It will give us a lucid connection between the mathematical formulation of quantum mechanics and computational methods.
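
For orientation (my sketch in Python rather than the paper's Mathematica): the number-theoretic skeleton of Shor's algorithm. The quantum computer only accelerates the order-finding step, done here by brute force.

```python
from math import gcd

def shor_classical(N, a):
    """Classically find the order r of a mod N, then read off factors,
    mirroring the classical post-processing in Shor's algorithm."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g           # lucky guess: a shares a factor with N
    r, x = 1, a % N
    while x != 1:                  # brute-force period finding (this is the
        x = (x * a) % N            # step a quantum computer does fast)
        r += 1
    if r % 2:
        return None                # odd order: pick another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                # trivial square root: pick another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))       # (3, 5)
```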

Proceedings ArticleDOI
TL;DR: In this paper, the authors considered the radiative transfer equation as the potential equation of a classical Markov process, and the transition probability of this process was analyzed by considering the scattering of a photon by an atom and then taking the limits of geometrical optics and of singular coupling.
Abstract: The radiative transfer equation can be regarded as the potential equation of a classical Markov process. The transition probability of this Markov process can be understood by considering the scattering of a photon by an atom and then taking the limits of geometrical optics and of singular coupling.

Proceedings ArticleDOI
TL;DR: This work performs a geometrization of genetics by representing genetic information by points of the 4-adic information space, which, by a well-known theorem of number theory, can also be represented as the 2-adic space.
Abstract: We perform a geometrization of genetics by representing genetic information by points of the 4‐adic information space. By a well-known theorem of number theory this space can also be represented as the 2‐adic space. The process of DNA‐reproduction is described by the action of a 4‐adic (or equivalently 2‐adic) dynamical system. As we know, the genes contain information for the production of proteins. The genetic code is a degenerate map of codons to proteins. We model this map as the functioning of a polynomial dynamical system. The purely mathematical problem under consideration is to find a dynamical system reproducing the degenerate structure of the genetic code. We present one possible solution of this problem.
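
A minimal sketch of the encoding (mine; the paper's particular assignment of nucleotides to digits may differ): each nucleotide is a base-4 digit, a codon is a 3-digit 4-adic integer, and each base-4 digit expands into two binary digits, giving the 2-adic representation.

```python
# Encode a codon as a 4-adic integer: each nucleotide is a base-4 digit,
# and each base-4 digit expands into two binary (2-adic) digits.
DIGIT = {'A': 0, 'C': 1, 'G': 2, 'T': 3}

def codon_to_4adic(codon):
    """Codon -> integer, with base-4 digits read left to right."""
    return sum(DIGIT[nt] * 4 ** i for i, nt in enumerate(codon))

codon = "GTA"
n = codon_to_4adic(codon)
print(n)                       # the codon as a point of the 4-adic space
print(format(n, '06b'))        # the same point seen in the 2-adic space
```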

Proceedings ArticleDOI
TL;DR: In this article, the authors investigated general construction methods of OVMs in terms of geometric positive trace-increasing maps (PTI), for general 1D domains as well as 2D shapes, e.g., circles and disks.
Abstract: Probability measures (quasi-probability mass), given in the form of integrals of the Wigner function over areas of the underlying phase space, give rise to operator valued probability measures (OVMs). General construction methods of OVMs are investigated in terms of geometric positive trace-increasing maps (PTI), for general 1D domains as well as 2D shapes, e.g., circles and disks. Spectral properties of OVMs and operational implementations of their constructing PTIs are discussed.
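
The defining relation, schematically (notation mine): for a phase-space region Ω, the OVM element M(Ω) is fixed by requiring its operator average to reproduce the quasi-probability mass of Ω,

$$\operatorname{Tr}\!\left[\rho\, M(\Omega)\right] \;=\; \int_{\Omega} W_{\rho}(q,p)\, dq\, dp \quad \text{for all states } \rho .$$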

Proceedings ArticleDOI
TL;DR: In this article, a non-commutative construction for the two-parameter case (integrals in the plane) is recalled, resulting in type one and type two stochastic integrals which are orthogonal, centred L2-martingales obeying isometry properties, and the construction is developed to obtain an Ito-Clifford Wong-Zakai martingale representation.
Abstract: Classical stochastic integration is based upon a probability space involving a filtration of sigma‐algebras. This construction lends itself to non‐commutative quantum analogues based, for example, on a Hilbert space, a filtration of von Neumann algebras and a gage. We recall a non‐commutative construction for the two-parameter case, these being integrals in the plane, resulting in type one and type two stochastic integrals which are orthogonal, centred L2-martingales obeying isometry properties, and we develop the construction to obtain an Ito‐Clifford Wong‐Zakai martingale representation.

Proceedings ArticleDOI
TL;DR: For the Kerr-Newman black hole, it was pointed out in this article that the active gravitational mass equals twice the inertial mass, and this remains valid when the solution is applied as a model for the electron.
Abstract: For the Kerr‐Newman black hole, it is pointed out that the active gravitational mass equals twice the inertial mass. This remains valid when the solution is applied as a model for the electron. The factor 2 also occurs in a classical electrodynamics model for the electron proposed recently, where gravity is negligible. On the basis of this, one may assume that for all matter the active gravitational mass equals twice the inertial mass. Newtonian physics will remain unchanged provided the gravitational constant is reduced by a factor 2. In cosmology, effects will appear.
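
The compensation works as simple arithmetic: if every active gravitational mass is doubled, halving the gravitational constant leaves the Newtonian force between two bodies unchanged,

$$F \;=\; \frac{G}{2}\,\frac{(2m_1)\,m_2}{r^2} \;=\; G\,\frac{m_1 m_2}{r^2},$$

where m₁ and m₂ are inertial masses; consistent with the abstract, differences would surface only where the active mass enters on its own, as in cosmology.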

Proceedings ArticleDOI
TL;DR: In this article, a special stochastic realization of the wave function in quantum mechanics, with the inclusion of a soliton representation of extended particles within the scope of a nonlinear spinor field model, is considered.
Abstract: A special stochastic realization of the wave function in quantum mechanics (QM), with the inclusion of a soliton representation of extended particles within the scope of a nonlinear spinor field model, is considered. Two-soliton configurations are used for constructing entangled states in a generalized QM dealing with extended 1/2-spin particles. The entangled-solitons construction is used for calculating the Einstein-Podolsky-Rosen (EPR) spin correlation, which is shown to coincide with the quantum mechanical one for two 1/2-spin particles in the singlet state.
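
The benchmark being matched is the standard quantum singlet correlation (quoted here from ordinary QM, not from the soliton construction itself):

$$E(\mathbf{a},\mathbf{b}) \;=\; \langle \psi^{-}|\,(\hat{\boldsymbol{\sigma}}\cdot\mathbf{a})\otimes(\hat{\boldsymbol{\sigma}}\cdot\mathbf{b})\,|\psi^{-}\rangle \;=\; -\,\mathbf{a}\cdot\mathbf{b},$$

for spin analyzers along unit vectors a and b.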

Proceedings ArticleDOI
TL;DR: A theory where (b) is relaxed, although in principle allowing for measurements of a more general type, cannot be experimentally falsified within the current experimental paradigm.
Abstract: In order to prove equivalence of quantum mechanics with nonlocal hidden‐variable theories of a Bohm type, one assumes that all the possible measurements belong to a restricted class: (a) we measure only positions of particles and (b) have no access to exact values of initial conditions for Bohm’s trajectories. However, in any computer simulation based on Bohm’s equations one relaxes the assumption (b) and yet obtains agreement with quantum predictions concerning the results of positional measurements. Therefore a theory where (b) is relaxed, although in principle allowing for measurements of a more general type, cannot be experimentally falsified within the current experimental paradigm. Such generalized measurements have not been invented, or have been invented but the information is classified, but we cannot exclude their possibility on the basis of known experimental data. Since the measurements would simultaneously allow for eavesdropping in standard quantum cryptosystems, the arguments for security of quantum cryptography become logically circular: Bohm‐type theories do not allow for eavesdropping because they are fully equivalent to quantum mechanics, but the equivalence follows from the assumption that we cannot measure hidden variables, which would be equivalent to the possibility of eavesdropping… Here we break the vicious circle by a simple modification of entangled‐state protocols that makes them secure even if our enemies have more imagination and know how to measure hidden‐variable initial conditions with arbitrary precision.
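
A minimal simulation of the kind the abstract alludes to (my sketch, with ħ = m = 1 and an analytic free Gaussian packet of initial width σ₀ = 1): integrating the guidance equation dx/dt = Im(∂ₓ ln ψ) for an ensemble whose initial positions are |ψ(x,0)|²-distributed reproduces the quantum position statistics at later times, which is why positional measurements alone cannot falsify the theory.

```python
import numpy as np

sigma0 = 1.0  # initial packet width; hbar = m = 1

def velocity(x, t):
    """Bohmian guidance velocity Im(d/dx ln psi) for a free Gaussian packet."""
    c2 = 1.0 + (t / (2 * sigma0 ** 2)) ** 2
    return x * t / (4 * sigma0 ** 4 * c2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, sigma0, size=50_000)   # initial positions ~ |psi(x, 0)|^2
dt, T = 0.01, 5.0
for step in range(int(T / dt)):            # Euler integration of dx/dt = v(x, t)
    x += velocity(x, step * dt) * dt

sigma_T = sigma0 * np.sqrt(1 + (T / (2 * sigma0 ** 2)) ** 2)
print(x.std(), sigma_T)                    # ensemble spread matches |psi(x, T)|^2
```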