
Showing papers in "Foundations of probability and physics in 2009"


Proceedings ArticleDOI
TL;DR: In this paper, the interference patterns produced by three slits and all the possible combinations of those slits being open or closed were measured using attenuated laser light combined with single photon counting.
Abstract: In Mod. Phys. Lett. A 9, 3119 (1994), one of us (RDS) investigated a formulation of quantum mechanics as a generalized measure theory. Quantum mechanics computes probabilities from the absolute squares of complex amplitudes, and the resulting interference violates the (Kolmogorov) sum rule expressing the additivity of probabilities of mutually exclusive events. However, there is a higher-order sum rule that quantum mechanics does obey, involving the probabilities of three mutually exclusive possibilities. We could imagine a yet more general theory by assuming that it violates the next higher sum rule. In this paper, we report results from an ongoing experiment which sets out to test the validity of this second sum rule by measuring the interference patterns produced by three slits and all the possible combinations of those slits being open or closed. We use attenuated laser light combined with single photon counting to confirm the particle character of the measured light.
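
For orientation, the two sum rules at stake can be written out explicitly; with P_A the detection probability at a given point when only slit A is open, P_AB with slits A and B open, and so on (the notation here is ours):

```latex
% Second-order interference term (nonzero in quantum mechanics,
% violating the Kolmogorov additivity rule):
I_{AB} = P_{AB} - P_A - P_B \neq 0
% Third-order (Sorkin) term, which quantum mechanics predicts to vanish;
% bounding it experimentally is the goal of the three-slit measurement:
I_{ABC} = P_{ABC} - P_{AB} - P_{AC} - P_{BC} + P_A + P_B + P_C = 0
```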

53 citations


Proceedings ArticleDOI
TL;DR: In this article, it is explained on a physical basis how the absence of contextuality allows Bell inequalities to be violated without implying anything about locality or realism, and it is shown that even if Bell inequality violation is demonstrated beyond reasonable doubt, it will have no say on local realism.
Abstract: It is explained on a physical basis how absence of contextuality allows Bell inequalities to be violated, without bringing an implication on locality or realism. Hereto we connect first to the local realistic theory Stochastic Electrodynamics, and then put the argument more broadly. Thus even if Bell Inequality Violation is demonstrated beyond reasonable doubt, it will have no say on local realism.

52 citations


Proceedings ArticleDOI
TL;DR: In this article, it is shown that, when represented in generalized Bloch space, a Weyl-Heisenberg covariant SIC-POVM forms a (d²−1)-dimensional regular simplex, d being the Hilbert space dimension, and that simple geometrical relationships connect it to the d+1 simplices formed by a full set of MUBs in prime dimension.
Abstract: The paper concerns Weyl‐Heisenberg covariant SIC‐POVMs (symmetric informationally complete positive operator valued measures) and full sets of MUBs (mutually unbiased bases) in prime dimension. When represented as vectors in generalized Bloch space, a SIC‐POVM forms a (d²−1)-dimensional regular simplex (d being the Hilbert space dimension). By contrast, the generalized Bloch vectors representing a full set of MUBs form d+1 mutually orthogonal (d−1)-dimensional regular simplices. In this paper we show that, in the Weyl‐Heisenberg case, there are some simple geometrical relationships between the single SIC‐POVM simplex and the d+1 MUB simplices. We go on to give geometrical interpretations of the minimum uncertainty states introduced by Wootters and Sussman, and by Appleby, Dang and Fuchs, and of the fiduciality condition given by Appleby, Dang and Fuchs.
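
A sketch of the underlying algebraic facts (standard definitions, not notation taken from the paper): a SIC-POVM consists of d² rank-one projectors with constant pairwise overlap, and two bases are mutually unbiased when all cross-overlaps equal 1/d.

```latex
% SIC-POVM: d^2 projectors \Pi_i with constant pairwise overlap, so their
% traceless (Bloch) parts are equiangular: a regular simplex in dimension d^2 - 1.
\operatorname{Tr}(\Pi_i \Pi_j) = \frac{d\,\delta_{ij} + 1}{d + 1}
% Mutual unbiasedness of bases \{|e_i\rangle\} and \{|f_j\rangle\}:
|\langle e_i | f_j \rangle|^2 = \frac{1}{d}
% which makes the Bloch parts of different bases mutually orthogonal,
% each basis contributing a (d-1)-dimensional regular simplex.
```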

51 citations


Proceedings ArticleDOI
TL;DR: In this paper, the authors show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment.
Abstract: We show that paradoxical consequences of violations of Bell’s inequality are induced by the use of an unsuitable probabilistic description for the EPR‐Bohm‐Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
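
The repair sketched in the abstract amounts to treating the observed statistics as conditional probabilities and recovering an unconditional Kolmogorov model via the law of total probability (the notation below is ours, not necessarily the authors'):

```latex
% Each experimental run conditions on the PBS settings (\theta_i, \varphi_j):
P(a, b \mid \theta_i, \varphi_j)
% A single Kolmogorov space is obtained once the probabilities of
% selecting the settings are included:
P(a, b) = \sum_{i, j} p(\theta_i, \varphi_j)\, P(a, b \mid \theta_i, \varphi_j)
```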

31 citations


Proceedings ArticleDOI
TL;DR: In this article, Gerd Niestegge's approach to quantum mechanics via conditional probabilities is applied to the treatment of concepts, using a geometrical model of meaning in which instances are treated as vectors of a Hilbert space H; in this model there are at least two possibilities to form categories.
Abstract: Recently, Gerd Niestegge developed a new approach to quantum mechanics via conditional probabilities, developing the well‐known proposal to consider the Lüders‐von Neumann measurement as a non‐classical extension of probability conditionalization. I will apply his powerful and rigorous approach to the treatment of concepts, using a geometrical model of meaning. In this model, instances are treated as vectors of a Hilbert space H. In the present approach there are at least two possibilities to form categories. The first treats a category as a mixture of its instances (described by a density matrix). In the simplest case we get classical probability theory, including the Bayesian formula. The second treats a category as formed by a distinctive prototype which is the superposition of the (weighted) instances. The construction of prototypes can be seen as transferring a mixed quantum state into a pure quantum state, freezing the probabilistic characteristics of the superposed instances into t...
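
A minimal sketch of the two category constructions, assuming unit-norm instance vectors |v_i⟩ with weights p_i and writing N for the normalization (our notation, not the paper's):

```latex
% First possibility: a category as a mixture of its instances (density matrix):
\rho_{\text{mix}} = \sum_i p_i\, |v_i\rangle\langle v_i|
% Second possibility: a prototype as the weighted superposition of the
% instances, a pure state obtained by "freezing" the mixture:
|p\rangle = \frac{1}{N} \sum_i \sqrt{p_i}\, |v_i\rangle, \qquad
\rho_{\text{proto}} = |p\rangle\langle p|
```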

31 citations


Proceedings ArticleDOI
TL;DR: In this article, a condition of probabilistic incompatibility of random variables, i.e., inability to realize them on a single probability space, is introduced, referred to as non-Kolmogorovness.
Abstract: In this paper we would like to stress that, besides two commonly discussed conditions inducing violation of Bell’s inequality—nonlocality and death of realism—there is a third condition with the same consequence. This is the condition of probabilistic incompatibility (PI) of random variables—impossibility to realize them on a single probability space—“non‐Kolmogorovness.” This additional source of violation of Bell’s inequality should be taken into account. We remark that PI can be a consequence of nonlocality or of the impossibility to use a realistic model. However, PI is an essentially more general condition. Random variables can be of the PI type even in the absence of nonlocal effects.
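
The role of the single probability space is easiest to state in the CHSH form: if the four random variables exist jointly, the inequality is a theorem, so its violation by itself signals PI, whatever one goes on to say about locality or realism.

```latex
% If A, A', B, B' are \pm 1-valued random variables on ONE Kolmogorov space,
% then pointwise A(B + B') + A'(B - B') = \pm 2, and taking expectations gives
\left| E(AB) + E(AB') + E(A'B) - E(A'B') \right| \le 2
% No such joint space need exist for quantum correlations that violate this bound.
```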

28 citations


Proceedings ArticleDOI
TL;DR: In this article, the authors show that the combined effect of these thresholds can lead to a significant sampling selection bias in the detection of pairs of pulses, resulting in an apparent violation of Bell inequalities.
Abstract: Photomultiplier tubes and avalanche photodiodes, which are commonly used in quantum optics experiments, are sometimes referred to as threshold detectors because, in photon counting mode, they cannot discriminate the number of photoelectrons initially extracted from the absorber in the detector. We argue that they can be called threshold detectors on more account than that. We point out that their functioning principle relies on two thresholds that are usually thought unimportant individually in the context of the EPR‐Bell discussion. We show how the combined effect of these thresholds can lead to a significant sampling selection bias in the detection of pairs of pulses, resulting in an apparent violation of Bell inequalities.
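
A toy Monte Carlo illustrates how conditioning pair detection on a threshold biases the retained subsample; the Gaussian pulse model and the threshold value below are our assumptions for illustration only, not the authors' detector model.

```python
"""Toy Monte Carlo: how detection thresholds bias the retained subsample."""
import random

random.seed(1)
THRESHOLD = 1.0   # hypothetical detection threshold (arbitrary units)
N = 100_000

all_pairs, detected_pairs = [], []
for _ in range(N):
    common = random.gauss(0.0, 1.0)          # shared component of the two pulses
    a = common + random.gauss(0.0, 0.5)      # pulse amplitude at detector A
    b = common + random.gauss(0.0, 0.5)      # pulse amplitude at detector B
    all_pairs.append((a, b))
    if a > THRESHOLD and b > THRESHOLD:      # a pair event is recorded only
        detected_pairs.append((a, b))        # when BOTH pulses cross threshold

def corr(pairs):
    """Pearson correlation of a list of (a, b) samples."""
    n = len(pairs)
    ma = sum(a for a, _ in pairs) / n
    mb = sum(b for _, b in pairs) / n
    cov = sum((a - ma) * (b - mb) for a, b in pairs) / n
    va = sum((a - ma) ** 2 for a, _ in pairs) / n
    vb = sum((b - mb) ** 2 for _, b in pairs) / n
    return cov / (va * vb) ** 0.5

# The detected subsample shows a markedly different correlation than the
# full ensemble: a pure sampling-selection effect of the thresholds.
print("correlation, all pairs:     ", round(corr(all_pairs), 3))
print("correlation, detected pairs:", round(corr(detected_pairs), 3))
```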

26 citations


Proceedings ArticleDOI
TL;DR: In this paper, the authors show that the prior remains important even in the limit of an infinite number of measurements, and illustrate this point with several examples where two priors lead to different conclusions given the same measurement data.
Abstract: In quantum Bayesian inference problems, any conclusions drawn from a finite number of measurements depend not only on the outcomes of the measurements but also on a prior. Here we show that, in general, the prior remains important even in the limit of an infinite number of measurements. We illustrate this point with several examples where two priors lead to very different conclusions given the same measurement data.
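
A schematic way to see how a prior can survive infinitely many measurements (our caricature, consistent with but not taken from the paper's examples): if the data only ever inform one parameter, beliefs about the others never update.

```latex
% Repeated \sigma_z measurements on copies of a qubit pin down z = \langle\sigma_z\rangle
% in the limit, but carry no information about x = \langle\sigma_x\rangle. For two priors
% \pi_1, \pi_2 with the same marginal in z but different conditionals \pi_k(x \mid z),
\pi_k(x \mid \text{data}) \;\xrightarrow{\; n \to \infty \;}\; \pi_k(x \mid z^{*}), \qquad k = 1, 2,
% and the two limits differ whenever \pi_1(x \mid z^{*}) \neq \pi_2(x \mid z^{*}):
% the prior never washes out.
```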

19 citations


Proceedings ArticleDOI
TL;DR: In this article, the authors recall several contradictions between the assumption of a joint distribution for incompatible observables and the probability structure of quantum mechanics, and conclude that Bell's theorem is not expected to be relevant to quantum phenomena described by noncommuting observables, irrespective of the issue of locality.
Abstract: Bell’s theorem is a statement according to which averages obtained from specific types of statistical distributions must conform to a family of inequalities. These models, in accordance with the EPR argument, provide for the simultaneous existence of quantum mechanically incompatible quantities. We first recall several contradictions arising between the assumption of a joint distribution for incompatible observables and the probability structure of quantum mechanics, and conclude that Bell’s theorem is not expected to be relevant to quantum phenomena described by non‐commuting observables, irrespective of the issue of locality. Then, we try to disentangle the locality issue from the existence of joint distributions by introducing two models accounting for the EPR correlations but denying the existence of joint distributions. We will see that these models do not need to resort explicitly to non‐locality: the first model relies on conservation laws for ensembles, and the second model on an equivalence class by which different configurations lead to the same physical predictions.

9 citations


Proceedings ArticleDOI
TL;DR: In this paper, the Born Rule is viewed as a normative rule beyond Dutch-book coherence, which takes into account how one should assign probabilities to the consequences of various intended interactions with a physical system, explicitly in terms of prior probabilities for, and conditional probabilities consequent upon, the imagined outcomes of a special counterfactual reference interaction.
Abstract: In the quantum Bayesian understanding of quantum states being developed by the authors and collaborators, the Born Rule cannot be interpreted as a rule for setting measurement‐outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, we argue that it should be seen as an empirical addition to Bayesian reasoning itself. Particularly, we show how to view the Born Rule as a normative rule beyond Dutch‐book coherence, which takes into account how one should assign probabilities to the consequences of various intended interactions with a physical system, but explicitly in terms of prior probabilities for and conditional probabilities consequent upon the imagined outcomes of a special counterfactual reference interaction. This interpretation is seen particularly clearly by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete (SIC) measurement. We further explore the extent to which the...
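
The rule alluded to can be written explicitly in the SIC representation; the following "urgleichung" is the form given by Fuchs and Schack, with p(i) the probabilities assigned to the d² outcomes of the counterfactual SIC reference measurement and r(j|i) the conditionals for the measurement actually performed:

```latex
% Classical law of total probability (what Dutch-book coherence alone suggests):
q(j) = \sum_{i=1}^{d^2} p(i)\, r(j \mid i)
% Born Rule in SIC terms: the empirical, affine modification of that law,
% quantum theory's normative addition to Bayesian reasoning:
q(j) = \sum_{i=1}^{d^2} \left[ (d+1)\, p(i) - \frac{1}{d} \right] r(j \mid i)
```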

8 citations


Proceedings ArticleDOI
TL;DR: In this article, the authors use the following epistemology to explore the roots of statistical nature of the real world, including classical physics, quantum physics and even our mental constructs, and show that gaps in the information gathered about any phenomenon is inevitable.
Abstract: We use the following epistemology—understanding and visualizing the invisible processes behind all natural phenomena, through iterative reconstruction and/or refinement of current working theories towards their limits, constitutes our best approach towards discovering the actual realities of nature, followed by new breakthrough theories. We use this epistemology to explore the roots of the statistical nature of the real world—classical physics, quantum physics and even our mental constructs. Diversity is a natural and healthy outcome of this statistical nature. First, we use a two‐beam superposition experiment as an illustrative example of the quantum world to visualize the root of fluctuations (or randomness) in the photoelectron counting statistics. We recognize that the fluctuating weak background fields make the quantum world inherently random, but the fluctuations are still statistically bounded, indicating that the fundamental laws of nature are still causal. Theoreticians will forever be challenged to construct a causal and closed-form theory free of statistical randomness out of incomplete information. We show, by analyzing the essential steps behind any experiment, that gaps in the information gathered about any phenomenon are inevitable. This lack of information also causes our personal epistemologies to have a “statistical spread” due to their molecular origin, albeit bounded and constrained by the causally driven atomic and molecular interactions across the board. While there are clear differences in the root and manifestation of classical and quantum statistical behavior, on a fundamental level they originate in our theories from a lack of complete information about everything that is involved in every interaction in our experiments. The statistical nature of our theories is a product of incomplete information, and we should take it as an inevitable paradigm.

Proceedings ArticleDOI
TL;DR: In this paper, a review of photon-number tomography and symplectic tomography as examples of star-product quantization is presented, and classical statistical mechanics is considered within the framework of the tomographic representation.
Abstract: A review of photon‐number tomography and symplectic tomography as examples of star‐product quantization is presented. Classical statistical mechanics is considered within the framework of the tomographic representation.
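
For reference, the two tomograms in question are standard objects; the formulas below follow the usual conventions and are not quoted from the paper.

```latex
% Symplectic tomogram: the distribution of the scaled/rotated quadrature
% X = \mu q + \nu p in the state \hat{\rho}:
w(X \mid \mu, \nu) = \operatorname{Tr}\!\left[ \hat{\rho}\;
  \delta\!\left( X - \mu \hat{q} - \nu \hat{p} \right) \right]
% Photon-number tomogram: the number statistics of the displaced state:
w(n \mid \alpha) = \operatorname{Tr}\!\left[ \hat{\rho}\,
  \hat{D}(\alpha)\, |n\rangle\langle n|\, \hat{D}^{\dagger}(\alpha) \right]
```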

Proceedings ArticleDOI
TL;DR: In this paper, a comparison of the Aharonov-Bohm effects in the two-slit interference experiment and in a superconducting ring reveals a fundamental difference between the Schrodinger wave function and the wave function describing macroscopic quantum phenomena.
Abstract: The Bohm quantum potential, introduced in 1952, and the quantum force in superconductors, introduced in 2001, make it possible to describe the non‐local force‐free momentum transfer observed in the Aharonov‐Bohm effects. Comparison of the Aharonov‐Bohm effects in the two‐slit interference experiment and in a superconducting ring reveals a fundamental difference between the Schrodinger wave function and the wave function describing macroscopic quantum phenomena. The Ginzburg‐Landau wave function describing the superconductivity phenomenon cannot collapse, and an additional postulate, first implied by L.D. Landau, must be used for the description of macroscopic quantum phenomena. It is noted that quantum principles and postulates cannot be universal as long as the quantum formalism is only a phenomenological theory rather than a description of a unique reality.

Proceedings ArticleDOI
TL;DR: The main claim of this article is that physical reality may be regarded as non-well-founded in the framework of probabilities distributed on non-Archimedean ordering structures, in particular on p-adic numbers.
Abstract: Classical mechanics implicitly uses Archimedes’ axiom, according to which everything may be measured by a rigid scale. Uncertainty in quantum mechanics shows the limits of applying Archimedes’ axiom. The negation of Archimedes’ axiom derives from the negation of the set‐theoretic axiom of foundation. The latter postulates that the set‐membership relation is well‐founded: no set admits an infinitely descending membership chain. Denying the foundation axiom in number systems implies a non‐Archimedean ordering structure. The main claim of our paper is that physical reality may be regarded as non‐well‐founded in the framework of probabilities distributed on non‐Archimedean ordering structures, in particular on p‐adic numbers.
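
The non-Archimedean structure invoked here is the familiar one from p-adic analysis (standard definitions, not the paper's notation):

```latex
% p-adic absolute value: writing a nonzero rational as x = p^{k} a/b with p \nmid ab,
|x|_p = p^{-k}, \qquad |0|_p = 0
% Strong triangle inequality, the non-Archimedean property:
|x + y|_p \le \max\!\left( |x|_p, |y|_p \right)
% In particular |n \cdot x|_p \le |x|_p for every integer n: repeated addition
% never makes a quantity "large," which is exactly the failure of Archimedes' axiom.
```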

Proceedings ArticleDOI
TL;DR: In this article, the authors examine the view of quantum mechanics that emerged shortly after the introduction of Quantum mechanics and that has been widespread ever since, and offer an alternative, non-causal, view of the quantum-mechanical situation and consider the differences between the ensemble and the Bayesian approaches to quantum mechanics from this perspective.
Abstract: This paper critically examines the view of quantum mechanics that emerged shortly after the introduction of quantum mechanics and that has been widespread ever since. Although N. Bohr, P. A. M. Dirac, and W. Heisenberg advanced this view earlier, it is best exemplified by J. von Neumann’s argument in Mathematical Foundations of Quantum Mechanics (1932) that the transformation of “a [quantum] state … under the action of an energy operator … is purely causal,” while, “on the other hand, the state … which may measure a [given] quantity … undergoes in a measurement a non‐causal change.” Accordingly, while the paper discusses all four of these arguments, it will especially focus on that of von Neumann. The paper also offers an alternative, noncausal, view of the quantum‐mechanical situation and considers the differences between the ensemble and the Bayesian approaches to quantum mechanics from this perspective.

Proceedings ArticleDOI
TL;DR: The extended semantic realism (ESR) model, recently worked out by the author, embodies the mathematical formalism of standard quantum mechanics in a noncontextual framework and reinterprets quantum probabilities as conditional instead of absolute, which makes the model falsifiable.
Abstract: The extended semantic realism (ESR) model, recently worked out by the author, embodies the mathematical formalism of standard (Hilbert space) quantum mechanics (QM) in a noncontextual framework and reinterprets quantum probabilities as conditional instead of absolute. The predictions of the ESR model do not coincide with the predictions of QM, which makes the model falsifiable. In particular, the ESR model predicts that, whenever idealized measurements are performed, modified Bell–Clauser–Horne–Shimony–Holt (BCHSH) inequalities hold if one takes into account all individual systems that are prepared, standard quantum inequalities hold if one considers only the individual systems that are detected, and standard BCHSH inequalities hold at a microscopic (purely theoretical) level. These results constitute a first example of the unified perspective that can be attained by adopting the ESR model.

Proceedings ArticleDOI
TL;DR: In this paper, the Bell-Kochen-Specker theorem is criticised from an epistemological point of view, showing that its proofs rest on an implicit assumption which does not fit in with the operational and antimetaphysical attitude of orthodox quantum mechanics.
Abstract: The Bell–Kochen–Specker theorem is criticized from an epistemological point of view, showing that its proofs rest on an implicit epistemological assumption which does not fit in with the operational and antimetaphysical attitude of orthodox quantum mechanics.

Proceedings ArticleDOI
TL;DR: In this paper, a new trigonometry for twistors is presented and the operator-theoretic maximum twistor turning angle is related to the space-time geometric angle within the light cone.
Abstract: A new trigonometry for twistors is presented. The operator‐theoretic maximum twistor turning angle is shown to be related to the space‐time geometric angle within the light cone. The corresponding maximally turned twistor antieigenvectors are calculated and interpreted. The two weak-interaction CP eigenvectors of neutral kaons are shown to be exactly the two strong-interaction strangeness antieigenvectors. Quark mixing is seen trigonometrically. ’t Hooft’s microcosmos model is connected to the theories of normal degree and complex dynamics.
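
For readers unfamiliar with antieigenvalue theory, the maximum turning angle φ(A) of a positive operator A is defined through Gustafson's first antieigenvalue; the definitions below are the standard ones, while the twistor application is the paper's.

```latex
% First antieigenvalue = cosine of the maximum turning angle:
\cos \phi(A) = \mu_1(A) = \min_{x \neq 0} \frac{\langle Ax, x \rangle}{\|Ax\|\,\|x\|}
% For a symmetric positive definite A with extreme eigenvalues \lambda_{\min}, \lambda_{\max}:
\cos \phi(A) = \frac{2 \sqrt{\lambda_{\min} \lambda_{\max}}}{\lambda_{\min} + \lambda_{\max}}
% The minimizing vectors x are the antieigenvectors, those "most turned" by A.
```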

Proceedings ArticleDOI
TL;DR: The presence of Introns in the RNA‐Crypto System output is used as a strong method to add chaotic non‐coding information and an unnecessary behavior in the access to the secret key used to code the messages.
Abstract: The RNA‐Crypto System (shortly RCS) is a symmetric key algorithm to cipher data. The idea for this new algorithm starts from the observation of nature, in particular from the observation of RNA behavior and some of its properties. RNA sequences have some sections called Introns. Introns, derived from the term “intragenic regions,” are non‐coding sections of precursor mRNA (pre‐mRNA) or other RNAs, that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre‐mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are known as introns as well. The nature and the role of Introns in the pre‐mRNA are not yet clear and are under intensive research by biologists but, in our case, we will use the presence of Introns in the RNA‐Crypto System output as a strong method to add chaotic non‐coding information and an unnecessary behavior in the access to the secret key to code the messages. In the RNA‐Crypt...
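
To make the intron idea concrete, here is a hypothetical sketch: a key-seeded PRNG decides where junk ("intron") bytes are spliced into an already-encrypted stream, and the receiver, sharing the key, regenerates the same positions and strips them. The splicing rule, rate, and function names are our illustration, not the published RCS specification.

```python
"""Hypothetical illustration of splicing 'introns' into a cipher stream."""
import random

def insert_introns(ciphertext: bytes, key: int, rate: float = 0.5) -> bytes:
    rng = random.Random(key)          # key-seeded PRNG shared by both parties
    out = bytearray()
    for byte in ciphertext:
        while rng.random() < rate:    # splice a junk ("intron") byte in
            out.append(rng.randrange(256))
        out.append(byte)              # the real ("exon") byte
    return bytes(out)

def remove_introns(stream: bytes, key: int, rate: float = 0.5) -> bytes:
    rng = random.Random(key)          # same seed => same intron positions
    out, i = bytearray(), 0
    while i < len(stream):
        while rng.random() < rate:    # this position holds an intron:
            rng.randrange(256)        # consume the sender's PRNG call
            i += 1                    # and skip the junk byte
        out.append(stream[i])
        i += 1
    return bytes(out)

payload = b"already-encrypted payload"
assert remove_introns(insert_introns(payload, key=42), key=42) == payload
```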

Proceedings ArticleDOI
TL;DR: In this paper, a new definition of entanglement based upon the split of an observable is introduced, and the measure of entanglement connected with this definition is discussed.
Abstract: A new definition of entanglement based upon the split of an observable is introduced. The measure of entanglement connected with this definition is discussed.

Proceedings ArticleDOI
TL;DR: The author reviews a paper by Klyachko, Can, Binicioglu, and Shumovsky, and explains a little of the background as he sees it.
Abstract: I review a paper by Klyachko, Can, Binicioglu, and Shumovsky, and explain a little of the background as I see it.

Proceedings ArticleDOI
TL;DR: In this article, the author analyzes the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, deterministic and exact theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
Abstract: This paper is devoted to an analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, “deterministic” and “exact” theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR‐chameleon experiment.

Proceedings ArticleDOI
TL;DR: This article shows that Newton's physics is not a starting point in Feynman's derivation, nor is quantum physics involved in it, but only the foundations of relativity, and applies a new method of analysis and interpretation of physics, named observational realism.
Abstract: A paper by Dyson, published nearly two decades ago, describing Feynman’s proof of the Maxwell equations, has generated many comments, analyses, discussions and generalizations of the proof. Feynman’s derivation is assumed to be based on two main sets of equations. One is supposed to be the second law of Newton and the other a set of basic commutation relations from quantum physics. Here we present a new comment on this paper, focusing mainly on the initial arguments and applying a new method of analysis and interpretation of physics, named observational realism. The present discussion does not alter the technical steps of Feynman, but does clarify their basis. We show that Newton’s physics is not a starting point in Feynman’s derivation, nor is quantum physics involved in it; only the foundations of relativity are.
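
For context, the two sets of equations Dyson attributes to Feynman (Am. J. Phys. 58, 209 (1990)), whose interpretation is what this comment re-examines, are:

```latex
% Feynman's assumptions, as stated by Dyson: a Newton-type equation of motion
m \ddot{x}_j = F_j(x, \dot{x}, t)
% together with the commutation relations
[x_j, x_k] = 0, \qquad m\, [x_j, \dot{x}_k] = i \hbar\, \delta_{jk}
% from which Dyson derives the Lorentz force and the homogeneous Maxwell equations:
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} + \frac{\partial \mathbf{B}}{\partial t} = 0
```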

Proceedings ArticleDOI
TL;DR: In this paper, it is shown that the sum of the geometric and dynamical phases obtained in separate measurements is not equal to the associated total phase as obtained from a third measurement unless the system is in a pure state.
Abstract: In a neutron polarimetry experiment mixed neutron spin phases are determined. We consider evolutions leading to purely geometric, purely dynamical and combined phases. It is experimentally demonstrated that the sum of the geometric and dynamical phases—both obtained in separate measurements—is not equal to the associated total phase as obtained from a third measurement, unless the system is in a pure state. In this sense, mixed state phases are not additive.
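
One standard definition of a mixed-state interferometric phase, quoted here as background rather than as the authors' formula, already makes the non-additivity plausible:

```latex
% Interferometric phase acquired by \rho = \sum_k w_k |\psi_k\rangle\langle\psi_k| under U:
\Phi = \arg \operatorname{Tr}(\rho\, U) = \arg \sum_k w_k \langle \psi_k | U | \psi_k \rangle
% The argument of a weighted sum of complex numbers is not the sum of their
% arguments, so in general
\Phi_{\text{tot}} \neq \Phi_{\text{geom}} + \Phi_{\text{dyn}}
% unless the state is pure (a single term in the sum).
```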

Proceedings ArticleDOI
TL;DR: In this paper, it is argued that the assumption of noncontextuality is, in some scenarios, not only physically plausible but unavoidable for any realistic theory without instantaneous actions, and that any such noncontextual theory must satisfy inequalities for the correlations of compatible measurements which are violated by any quantum state.
Abstract: The assumption of non‐contextuality is, in some scenarios, not only physically “plausible,” but unavoidable for any realistic theory without instantaneous actions. Any realistic non‐contextual theory must satisfy some inequalities for the correlations of compatible (commeasurable) measurements, which are violated by any quantum state. Actual experiments can reveal this state‐independent violation. We discuss the requirements and the advantages of these experiments.
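
One example of such a state-independent inequality, shown here as background, is built on the Peres-Mermin square of two-qubit observables; the noncontextual bound is 4 while quantum mechanics gives 6 for every state.

```latex
% Peres-Mermin square: rows and columns are triples of compatible observables.
A = \sigma_z \otimes I, \quad B = I \otimes \sigma_z, \quad C = \sigma_z \otimes \sigma_z
a = I \otimes \sigma_x, \quad b = \sigma_x \otimes I, \quad c = \sigma_x \otimes \sigma_x
\alpha = \sigma_z \otimes \sigma_x, \quad \beta = \sigma_x \otimes \sigma_z, \quad
\gamma = \sigma_y \otimes \sigma_y
% Noncontextual hidden-variable bound vs. the quantum value for EVERY two-qubit state:
\langle ABC \rangle + \langle abc \rangle + \langle \alpha\beta\gamma \rangle
+ \langle Aa\alpha \rangle + \langle Bb\beta \rangle - \langle Cc\gamma \rangle
\;\le\; 4 \;\;(\text{NCHV}), \qquad = 6 \;\;(\text{QM})
```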

Proceedings ArticleDOI
TL;DR: In this paper, it is shown that if one adds to the assumptions the principle of rotational symmetry of physical laws, a stronger version of Bell's theorem emerges, and a new sufficient and necessary criterion for entanglement of general (mixed) states is presented.
Abstract: (A) Bell’s theorem rests on a conjunction of three assumptions: realism, locality and “free will.” A discussion of these assumptions will be presented. It will also be shown that, if one adds to the assumptions the principle of rotational symmetry of physical laws, a stronger version of the theorem emerges. (B) A link between Bell’s theorem and communication complexity problems will be presented. This also includes experimental realizations, which surprisingly do not involve entanglement. (C) A new sufficient and necessary criterion for entanglement of general (mixed) states will be presented. It is derived using the same geometric starting point as the inclusion of the symmetry in (A). The set of entanglement identifiers (EIs) emerging via this method contains entanglement witnesses (EWs), but they form only a subset of all EIs. Thus the method is more powerful than the one based on EWs.

Proceedings ArticleDOI
TL;DR: In this article, the authors consider the Lorentz transform and the Newton, Schrodinger and geodesic equations in the setting of arithmetic provided by Observer's Mathematics, and provide results and communications pertaining to the solution of these problems.
Abstract: This work considers the Lorentz transform and the Newton, Schrodinger and geodesic equations in the setting of arithmetic provided by Observer’s Mathematics. Certain results and communications pertaining to the solution of these problems are provided.

Proceedings ArticleDOI
TL;DR: In this paper, the problem of projection in EPR experiments is considered from the point of view of a quantum mechanical theory of measurement, allowing a treatment of projection as a special case of conditional preparation.
Abstract: Although it is well known that von Neumann’s projection postulate is inapplicable to most realistic measurement procedures, it keeps haunting the foundations of quantum mechanics. In particular, its applicability to EPR experiments is often assumed. In the present contribution this problem is considered from the point of view of a quantum mechanical theory of measurement, allowing a treatment of projection in EPR experiments as a special case of conditional preparation. The conditions are spelled out under which the postulate may be applicable.
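
A sketch of the conditional-preparation reading, in our notation and assuming a generalized measurement with Kraus operators K_k acting on particle 1 of a bipartite state ρ:

```latex
% Outcome k on particle 1 conditionally prepares particle 2 in
\rho_2^{(k)} = \frac{\operatorname{Tr}_1\!\left[ (K_k \otimes I)\, \rho\,
  (K_k^{\dagger} \otimes I) \right]}{p_k},
\qquad p_k = \operatorname{Tr}\!\left[ (K_k^{\dagger} K_k \otimes I)\, \rho \right]
% The von Neumann / Luders projection postulate is the special, idealized case
% K_k = P_k with orthogonal projectors P_k.
```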

Proceedings ArticleDOI
TL;DR: The extended semantic realism (ESR) model as discussed by the authors is a non-contextual framework for quantum mechanics that reinterprets quantum probabilities as conditional instead of absolute, and provides a Hilbert space representation of the discrete generalized observables introduced by the ESR model.
Abstract: The extended semantic realism (ESR) model embodies the mathematical formalism of standard (Hilbert space) quantum mechanics (QM) in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here a Hilbert space representation of the discrete generalized observables introduced by the ESR model that satisfy a simple physical condition, propose a generalization of the projection postulate and suggest a possible mathematical description of an ideal measurement process in terms of evolution of the compound system made up of the measured system and the measuring apparatus.

Proceedings ArticleDOI
TL;DR: In this article, the authors use spontaneous downconversion, a 2-m-long cavity, and a fast optical switch to answer the question: if the cavity mirror were replaced by a detector, could photons be counted immediately, or only after some time delay?
Abstract: In the process of parametric downconversion, a high‐energy photon is annihilated to create a pair of lower‐energy photons within a nonlinear dielectric material. It was shown some years ago that this process can be modified by quantum interference, by reflecting the high‐energy and low‐energy light back into the dielectric. The suppression or enhancement of the downconversion rate is analogous to the phenomenon of inhibited spontaneous emission that occurs when an excited atom is placed in a cavity whose allowed modes cannot support the spontaneous emission. In this case, the atom remains excited—but then, how does it interact with the cavity? If the cavity mirror were replaced by a detector, could photons be counted immediately, or only after some time delay? Our experiment uses spontaneous downconversion, a 2‐m long cavity, and a fast optical switch to obtain the answer.