
Showing papers in "Mathematical Structures in Computer Science in 2006"


Journal ArticleDOI
TL;DR: A functional programming language for quantum computers is developed by extending the simply-typed lambda calculus with quantum types and operations, and a type system using affine intuitionistic linear logic is given.
Abstract: In this paper we develop a functional programming language for quantum computers by extending the simply-typed lambda calculus with quantum types and operations. The design of this language adheres to the ‘quantum data, classical control’ paradigm, following the first author's work on quantum flow-charts. We define a call-by-value operational semantics, and give a type system using affine intuitionistic linear logic. The main results of this paper are the safety properties of the language and the development of a type inference algorithm.

175 citations


Journal ArticleDOI
TL;DR: The field of quantum programming languages is developing rapidly and there is a surprisingly large literature on the design of programming languages for quantum computations.
Abstract: The field of quantum programming languages is developing rapidly and there is a surprisingly large literature. Research in this area includes the design of programming languages for quantum computation.

170 citations


Journal ArticleDOI
TL;DR: A notion of predicate transformer and, in particular, the weakest precondition, appropriate for quantum computation, is developed and it is shown that there is a Stone-type duality between the usual state-transformer semantics and the weakest Precondition semantics.
Abstract: We develop a notion of predicate transformer and, in particular, the weakest precondition, appropriate for quantum computation. We show that there is a Stone-type duality between the usual state-transformer semantics and the weakest precondition semantics. Rather than trying to reduce quantum computation to probabilistic programming, we develop a notion that is directly taken from concepts used in quantum computation. The proof that weakest preconditions exist for completely positive maps follows immediately from the Kraus representation theorem. As an example, we give the semantics of Selinger's language in terms of our weakest preconditions. We also cover some specific situations and exhibit an interesting link with stabilisers.

149 citations


Journal ArticleDOI
TL;DR: The notion of indexed valuations is used to define a new monad that can be combined with the usual non-deterministic monad via a categorical distributive law; an equational characterisation of the construction is given.
Abstract: We study the combination of probability and non-determinism from a categorical point of view. In category theory, non-determinism and probability are represented by suitable monads. However, these two monads do not combine well as they are. To overcome this problem, we introduce the notion of indexed valuations. This notion is used to define a new monad that can be combined with the usual non-deterministic monad via a categorical distributive law. We give an equational characterisation of our construction. We discuss the computational meaning of indexed valuations, and we show how they can be used by giving a denotational semantics of a simple imperative language.

130 citations
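The monads in question can be made concrete. Below is a minimal Python sketch (not from the paper) of the finite probability-distribution monad as dictionaries from outcomes to weights; the `choose` helper hints at the non-determinism layer — a *set* of distributions — whose naive combination with the probability layer is exactly what fails without the paper's indexed valuations. The names `unit`, `bind` and `choose` are illustrative assumptions.

```python
def unit(x):
    """Dirac distribution: all mass on x."""
    return {x: 1.0}

def bind(dist, f):
    """Sequence a distribution with a probabilistic function f: a -> dist."""
    out = {}
    for x, p in dist.items():
        for y, q in f(x).items():
            out[y] = out.get(y, 0.0) + p * q
    return out

def choose(*dists):
    """Non-deterministic choice: a *set* of distributions, not a distribution.
    Layering this set construction over the probability monad is where the
    two monads fail to combine via a distributive law."""
    return list(dists)

coin = {0: 0.5, 1: 0.5}
# sum of two fair coin flips
two_flips = bind(coin, lambda a: bind(coin, lambda b: unit(a + b)))
```

Here `two_flips` assigns probability 1/4, 1/2, 1/4 to the sums 0, 1, 2, while `choose(coin, unit(0))` is merely a two-element list of distributions — the two effects live at different levels.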


Journal ArticleDOI
TL;DR: A dynamic logic formalism for reasoning about information flow in composite quantum systems, capable of expressing important features of quantum measurements and unitary evolutions of multi-partite states, as well as giving logical characterisations to various forms of entanglement.
Abstract: The main contribution of this paper is the introduction of a dynamic logic formalism for reasoning about information flow in composite quantum systems. This builds on our previous work on a complete quantum dynamic logic for single systems. Here we extend that work to a sound (but not necessarily complete) logic for composite systems, which brings together ideas from the quantum logic tradition with concepts from (dynamic) modal logic and from quantum computation. This Logic of Quantum Programs (LQP) is capable of expressing important features of quantum measurements and unitary evolutions of multi-partite states, as well as giving logical characterisations to various forms of entanglement (for example, the Bell states, the GHZ states etc.). We present a finitary syntax, a relational semantics and a sound proof system for this logic. As applications, we use our system to give formal correctness proofs for the Teleportation protocol and for a standard Quantum Secret Sharing protocol; a whole range of other quantum circuits and programs, including other well-known protocols (for example, superdense coding, entanglement swapping, logic-gate teleportation etc.), can be similarly verified using our logic.

107 citations


Journal ArticleDOI
TL;DR: This paper shows that by starting with ‘iterative algebras’, that is, algebras admitting a unique solution of all systems of flat recursive equations, a free iterative theory is obtained as the theory of free iterative algebras.
Abstract: Iterative theories, which were introduced by Calvin Elgot, formalise potentially infinite computations as unique solutions of recursive equations. One of the main results of Elgot and his coauthors is a description of a free iterative theory as the theory of all rational trees. Their algebraic proof of this fact is extremely complicated. In our paper we show that by starting with ‘iterative algebras’, that is, algebras admitting a unique solution of all systems of flat recursive equations, a free iterative theory is obtained as the theory of free iterative algebras. The (coalgebraic) proof we present is dramatically simpler than the original algebraic one. Despite this, our result is much more general: we describe a free iterative theory on any finitary endofunctor of every locally presentable category $\cal{A}$.

87 citations


Journal ArticleDOI
TL;DR: The core of the paper is a proof that the bisimilarity based on graph rewriting with borrowed contexts is a congruence relation; the approach is also compared with the derivation of labelled transitions via relative pushouts.
Abstract: Motivated by recent work on the derivation of labelled transitions and bisimulation congruences from unlabelled reaction rules, we show how to address this problem in the DPO (double-pushout) approach to graph rewriting. Unlike the case with previous approaches, we consider graphs as objects, rather than arrows, of the category under consideration. This allows us to present a very simple way of deriving labelled transitions (called rewriting steps with borrowed context), which integrates smoothly with the DPO approach, has a very constructive nature and requires only a minimum of category theory. The core part of this paper is the proof that the bisimilarity based on graph rewriting with borrowed contexts is a congruence relation. We will also introduce some proof techniques and compare our approach with the derivation of labelled transitions via relative pushouts.

81 citations


Journal ArticleDOI
TL;DR: In this paper, the authors define a strongly normalising proof-net calculus corresponding to the logic of strongly compact closed categories with biproducts, which can be used to represent and reason about quantum processes.
Abstract: We define a strongly normalising proof-net calculus corresponding to the logic of strongly compact closed categories with biproducts. The calculus is a full and faithful representation of the free strongly compact closed category with biproducts on a given category with an involution. This syntax can be used to represent and reason about quantum processes.

72 citations


Journal ArticleDOI
TL;DR: This article generalises the notion of multiset used by Gamma to include rewrite rules, which become first-class citizens and builds a higher-order chemical programming language called HOCL.
Abstract: Gamma is a programming model in which computation can be seen as chemical reactions between data represented as molecules floating in a chemical solution. This model can be formalised as associative, commutative, conditional rewritings of multisets where rewrite rules and multisets represent chemical reactions and solutions, respectively. In this article we generalise the notion of multiset used by Gamma and present applications through various programming examples. First, multisets are generalised to include rewrite rules, which become first-class citizens. This extension is formalised by the $\gamma$-calculus, which is a chemical model that summarises in a few rules the essence of higher-order chemical programming. By extending the $\gamma$-calculus with constants, operators, types and expressive patterns, we build a higher-order chemical programming language called HOCL. Finally, multisets are further generalised by allowing elements to have infinite and negative multiplicities. Semantics, implementation and applications of this extension are considered.

68 citations
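The chemical metaphor is easy to demonstrate. The following Python sketch (not from the paper, which works with the $\gamma$-calculus and HOCL) models a Gamma-style computation: a reaction rule fires on pairs of molecules in a multiset until the solution is inert. The classic example computes the maximum — two numbers react, leaving only the larger.

```python
def gamma(multiset, reaction):
    """Apply a binary reaction until no pair of molecules can react.
    reaction(a, b) returns the replacement molecules, or None if inert."""
    pool = list(multiset)
    changed = True
    while changed:
        changed = False
        for i in range(len(pool)):
            for j in range(len(pool)):
                if i == j:
                    continue
                products = reaction(pool[i], pool[j])
                if products is not None:
                    # remove the two reactants, add the products
                    a, b = pool[i], pool[j]
                    pool.remove(a)
                    pool.remove(b)
                    pool.extend(products)
                    changed = True
                    break
            if changed:
                break
    return pool

# two numbers react and only the larger survives,
# so the stable solution is the maximum of the multiset
result = gamma([3, 1, 4, 1, 5], lambda a, b: [a] if a >= b else None)
```

Each reaction shrinks the multiset by one molecule, so the solution stabilises at a single element, the maximum. Rewrite order is deliberately unspecified, matching the associative, commutative reading of the chemical model.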


Journal ArticleDOI
TL;DR: A labelled transition system is derived for condition-event nets, corresponding to a natural notion of observable actions in Petri-net theory, and yields a congruential bisimilarity coinciding with one derived directly from the observable actions.
Abstract: A framework is defined within which reactive systems can be studied formally. The framework is based on s-categories, which are a new variety of categories within which reactive systems can be set up in such a way that labelled transition systems can be uniformly extracted. These lead in turn to behavioural preorders and equivalences, such as the failures preorder (treated elsewhere) and bisimilarity, which are guaranteed to be congruential. The theory rests on the notion of relative pushout, which was previously introduced by the authors. The framework is applied to a particular graphical model, known as link graphs, which encompasses a variety of calculi for mobile distributed processes. The specific theory of link graphs is developed. It is then applied to an established calculus, namely condition-event Petri nets. In particular, a labelled transition system is derived for condition-event nets, corresponding to a natural notion of observable actions in Petri-net theory. The transition system yields a congruential bisimilarity coinciding with one derived directly from the observable actions. This yields a calibration of the general theory of reactive systems and link graphs against known specific theories.

56 citations


Journal ArticleDOI
TL;DR: All continuous posets that are partially metrisable in their Scott topology are characterised: conditions in terms of measurements, domain-theoretic bases and radially convex metrics are shown to be both necessary and sufficient for partial metrisability, and they set a natural hierarchy on the class of partially metrised posets.
Abstract: In this article, we characterise all continuous posets that are partially metrisable in their Scott topology. We present conditions for pmetrisability, which are both necessary and sufficient, in terms of measurements, domain-theoretic bases and, in a more general setting, in terms of radially convex metrics. These conditions, together with their refinements and generalisations, set a natural hierarchy on the class of partially metrised posets. We locate the class of countably-based continuous dcpos within this hierarchy.

Journal ArticleDOI
TL;DR: In this paper, a process algebraic notation for describing concurrent and distributed quantum computations and quantum communication protocols is defined. The resulting equivalence, however, is not a congruence, since it is not preserved by parallel composition.
Abstract: Full formal descriptions of algorithms making use of quantum principles must take into account both quantum and classical computing components, as well as communications between these components. Moreover, to model concurrent and distributed quantum computations and quantum communication protocols, communications over quantum channels that move qubits physically from one place to another must also be taken into account. Inspired by classical process algebras, which provide a framework for modelling cooperating computations, a process algebraic notation is defined. This notation provides a homogeneous style for formal descriptions of concurrent and distributed computations comprising both quantum and classical parts. Based upon an operational semantics that makes sure that quantum objects, operations and communications operate according to the postulates of quantum mechanics, an equivalence is defined among process states considered as having the same behaviour. This equivalence is a probabilistic branching bisimulation. From this relation, an equivalence on processes is defined. However, it is not a congruence because it is not preserved by parallel composition.

Journal ArticleDOI
TL;DR: This work ensures termination of a non-trivial subset of the $\pi$-calculus by a combination of conditions on types and on the syntax.
Abstract: A process $M$ terminates if it cannot produce an infinite sequence of reductions $M \mathop{\rightarrow}^{\tau} M_1\mathop{\rightarrow}^{\tau} M_2 \ldots$. Termination is a useful property in concurrency. For instance, a terminating applet, when loaded on a machine, will not run for ever, possibly absorbing all computing resources (a ‘denial of service’ attack). Similarly, termination guarantees that queries to a given service originate only finite computations. We ensure termination of a non-trivial subset of the $\pi$-calculus by a combination of conditions on types and on the syntax. The proof of termination is in two parts. The first uses the technique of logical relations – a well-known technique of $\lambda$-calculi – on a small set of non-deterministic ‘functional’ processes. The second part of the proof uses techniques of process calculi, in particular, techniques of behavioural preorders.

Journal ArticleDOI
TL;DR: This work uses a general method to transform some logically complex first-order formulae into a geometrical form and presents an example where the simplification was significant enough to suggest an improved version of a classical theorem.
Abstract: Recent work in constructive mathematics shows that Hilbert's program works for a large part of abstract algebra. Using in an essential way the ideas contained in the classical arguments, we can transform most of the highly abstract proofs of ‘concrete’ statements into elementary proofs. Surprisingly, the arguments we produce are not only elementary but also mathematically clearer, and not necessarily longer. We present an example where the simplification was significant enough to suggest an improved version of a classical theorem. For this we use a general method to transform some logically complex first-order formulae into a geometrical form, which may be interesting in itself.

Journal ArticleDOI
TL;DR: The syntax, operational semantics and type system of CQP are formally defined, and it is proved that the semantics preserves typing, and that typing guarantees that each qubit is owned by a unique process within a system.
Abstract: We define a language CQP (Communicating Quantum Processes) for modelling systems that combine quantum and classical communication and computation. CQP combines the communication primitives of the pi-calculus with primitives for measurement and transformation of the quantum state; in particular, quantum bits (qubits) can be transmitted from process to process along communication channels. CQP has a static type system, which classifies channels, distinguishes between quantum and classical data, and controls the use of quantum states. We formally define the syntax, operational semantics and type system of CQP, prove that the semantics preserves typing, and prove that typing guarantees that each qubit is owned by a unique process within a system. We also define a typechecking algorithm and prove that it is sound and complete with respect to the type system. We illustrate CQP by defining models of several quantum communication systems, and outline our plans for using CQP as the foundation for formal analysis and verification of combined quantum and classical systems.

Journal ArticleDOI
TL;DR: The Borel hierarchy of the class of context free $\omega$-languages, or even of those accepted by Buchi 1-counter automata, is shown to be the same as the Borel hierarchy of $\omega$-languages accepted by Turing machines with a Buchi acceptance condition, and the analogous result holds for the Wadge hierarchy.
Abstract: We show that the Borel hierarchy of the class of context free $\omega$-languages, or even of the class of $\omega$-languages accepted by Buchi 1-counter automata, is the same as the Borel hierarchy of the class of $\omega$-languages accepted by Turing machines with a Buchi acceptance condition. In particular, for each recursive non-null ordinal $\alpha$, there exist some ${\bf \Sigma}^0_\alpha$-complete and some ${\bf \Pi}^0_\alpha$-complete $\omega$-languages accepted by Buchi 1-counter automata. And the supremum of the set of Borel ranks of context free $\omega$-languages is an ordinal $\gamma_2^1$ that is strictly greater than the first non-recursive ordinal $\omega_1^{\mathrm{CK}}$. We then extend this result, proving that the Wadge hierarchy of context free $\omega$-languages, or even of $\omega$-languages accepted by Buchi 1-counter automata, is the same as the Wadge hierarchy of $\omega$-languages accepted by Turing machines with a Buchi or a Muller acceptance condition.

Journal ArticleDOI
TL;DR: The model of quantum computation based on density matrices and superoperators can be decomposed into a pure classical (functional) part and an effectful part modelling probabilities and measurement, and the resulting executable model can be expressed in the Haskell programming language using its special syntax for arrow computations.
Abstract: We show that the model of quantum computation based on density matrices and superoperators can be decomposed into a pure classical (functional) part and an effectful part modelling probabilities and measurement. The effectful part can be modelled using a generalisation of monads called arrows. We express the resulting executable model of quantum computing in the Haskell programming language using its special syntax for arrow computations. However, the embedding in Haskell is not perfect: a faithful model of quantum computing requires type capabilities that are not directly expressible in Haskell.
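The decomposition the abstract describes — pure unitary evolution versus effectful measurement — can be illustrated without arrows. Below is a minimal Python sketch (an assumption-laden toy, not the paper's Haskell model) of a single qubit as a density matrix: `apply_unitary` is the pure part, `measure` is the effectful part that collapses the state and yields outcome probabilities.

```python
def mat_mul(A, B):
    """Naive matrix product for small square-ish matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def dagger(A):
    """Conjugate transpose."""
    return [[A[j][i].conjugate() for j in range(len(A))]
            for i in range(len(A[0]))]

def apply_unitary(rho, U):
    """Pure part: rho -> U rho U-dagger."""
    return mat_mul(mat_mul(U, rho), dagger(U))

def measure(rho):
    """Effectful part: measurement in the computational basis collapses the
    density matrix to its diagonal and yields outcome probabilities."""
    probs = [rho[i][i].real for i in range(len(rho))]
    collapsed = [[rho[i][j] if i == j else 0 for j in range(len(rho))]
                 for i in range(len(rho))]
    return probs, collapsed

h = 1 / 2 ** 0.5
H = [[h, h], [h, -h]]            # Hadamard gate
ket0 = [[1, 0], [0, 0]]          # density matrix |0><0|

probs, _ = measure(apply_unitary(ket0, H))
```

A Hadamard on |0> gives outcome probabilities 1/2 and 1/2; in the paper's terms, `apply_unitary` lives in the functional fragment while `measure` is where the arrow-style effects enter.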

Journal ArticleDOI
TL;DR: Modified bar recursion, a higher type recursion scheme, which has been used in Berardi et al. (1998) and Berger and Oliva (2005) for a realisability interpretation of classical analysis is studied.
Abstract: This paper studies modified bar recursion, a higher type recursion scheme, which has been used in Berardi et al. (1998) and Berger and Oliva (2005) for a realisability interpretation of classical analysis. A complete clarification of its relation to Spector's and Kohlenbach's bar recursion, the fan functional, Gandy's functional $\Gamma$ and Kleene's notion of S1–S9 computability is given.

Journal ArticleDOI
TL;DR: This work introduces a Classically controlled Quantum Turing Machine (CQTM), which is a Turing machine with a quantum tape for acting on quantum data, and a classical transition function for formalised classical control, and proves that any classical Turing machine can be simulated by a CQTM without loss of efficiency.
Abstract: It is reasonable to assume that quantum computations take place under the control of the classical world. For modelling this standard situation, we introduce a Classically controlled Quantum Turing Machine (CQTM), which is a Turing machine with a quantum tape for acting on quantum data, and a classical transition function for formalised classical control. In a CQTM, unitary transformations and quantum measurements are allowed. We show that any classical Turing machine can be simulated by a CQTM without loss of efficiency. Furthermore, we show that any $k$-tape CQTM can be simulated by a 2-tape CQTM with a quadratic loss of efficiency. In order to compare CQTMs with existing models of quantum computation, we prove that any uniform family of quantum circuits (Yao 1993) is efficiently approximated by a CQTM. Moreover, we prove that any semi-uniform family of quantum circuits (Nishimura and Ozawa 2002), and any measurement calculus pattern (Danos et al. 2004) are efficiently simulated by a CQTM. Finally, we introduce a Measurement-based Quantum Turing Machine (MQTM), which is a restriction of CQTMs in which only projective measurements are allowed. We prove that any CQTM is efficiently simulated by a MQTM. In order to appreciate the similarity between programming classical Turing machines and programming CQTMs, some examples of CQTMs are given.

Journal ArticleDOI
TL;DR: It is shown that second-order quantification leads to polytime unsoundness; simple constraints on second-order quantification and fixpoints are then introduced, and the fragments obtained are proved to be polytime sound and complete.
Abstract: Light affine logic is a variant of linear logic with a polynomial cut-elimination procedure. We study the extensional expressive power of light affine logic with respect to a general notion of encoding of functions in the setting of the Curry–Howard correspondence. We consider light affine logic with both fixpoints of formulae and second-order quantifiers, and analyse the properties of polytime soundness and polytime completeness for various fragments of this system. In particular, we show that the implicative propositional fragment is not polytime complete if we place some reasonable conditions on the encodings. Following previous work, we show that second order leads to polytime unsoundness. We then introduce simple constraints on second-order quantification and fixpoints, and prove that the fragments obtained are polytime sound and complete.

Journal ArticleDOI
TL;DR: Compactly generated monotone convergence spaces are proposed as a well-behaved topological generalisation of directed-complete partial orders (dcpos) and standard domain-theoretic constructions of products and function spaces on dcpos are compared, showing that these agree in important cases, though not in general.
Abstract: We propose compactly generated monotone convergence spaces as a well-behaved topological generalisation of directed-complete partial orders (dcpos). The category of such spaces enjoys the usual properties of categories of ‘predomains’ in denotational semantics. Moreover, such properties are retained if one restricts to spaces with a countable pseudobase in the sense of E. Michael, a fact that permits connections to be made with computability theory, realizability semantics and recent work on the closure properties of topological quotients of countably based spaces (qcb spaces). We compare the standard domain-theoretic constructions of products and function spaces on dcpos with their compactly generated counterparts, showing that these agree in important cases, though not in general.

Journal ArticleDOI
TL;DR: An approach to point-free geometry based on the notion of a quasi-metric is proposed in which the primitives are the regions and a non-symmetric distance between regions.
Abstract: An approach to point-free geometry based on the notion of a quasi-metric is proposed in which the primitives are the regions and a non-symmetric distance between regions. The intended models are the bounded regular closed subsets of a metric space together with the Hausdorff excess measure.

Journal ArticleDOI
TL;DR: A domain-theoretic analogue of the classical Banach–Alaoglu theorem, showing that the patch topology on the weak$*$ topology is compact, is given, in particular, that the ‘sandwich set’ of linear functionals is compact.
Abstract: We give a domain-theoretic analogue of the classical Banach–Alaoglu theorem, showing that the patch topology on the weak$*$ topology is compact. Various theorems follow concerning the stable compactness of spaces of valuations on a topological space. We conclude with reformulations of the patch topology in terms of polar sets or Minkowski functionals, showing, in particular, that the ‘sandwich set’ of linear functionals is compact.

Journal ArticleDOI
TL;DR: Some evidence is given that, for all uniform classes of proper $\lambda$-models living in functional semantics, $\lambda\mathcal{C} - \lambda\mathcal{C}^{\prime}$ should have cardinality $2^{\omega}$.
Abstract: This paper surveys what we have learned during the last ten years about the lattice $\lambda \mathcal{T}$ of all $\lambda$-theories (= equational extensions of untyped $\lambda$-calculus), via the sets $\lambda \mathcal{C}$ consisting of the $\lambda$-theories that are representable in a uniform class $\mathcal{C}$ of $\lambda$-models. This includes positive answers to several questions raised in Berline (2000), as well as several independent results, the state of the art on the long-standing open questions concerning the representability of $\lambda _{\beta},\lambda _{\beta\eta}$, $H$ as theories of models, and 22 open problems. We will focus on the class $\mathcal{G}$ of graph models, since almost all the existing semantic proofs on $\lambda \mathcal{T}$ have been, or could be, more easily, obtained via graph models, or slight variations of them. But in this paper we will also give some evidence that, for all uniform classes $\mathcal{C},\mathcal{C}^{\prime}$ of proper $\lambda$-models living in functional semantics, $\lambda \mathcal{C}-\lambda \mathcal{C}^{\prime}$ should have cardinality $2^{\omega}$, provided $\mathcal{C}$ is not included in $\mathcal{C}^{\prime}$.

Journal ArticleDOI
Dag Normann1
TL;DR: It is shown that the extensional ordering of the sequential functionals of pure type 3, for example, as defined via game semantics, is not cpo-enriched, and that this model does not equal Milner's fully abstract model for PCF.
Abstract: We show that the extensional ordering of the sequential functionals of pure type 3, for example, as defined via game semantics (Abramsky et al. 1994; Hyland and Ong 2000), is not cpo-enriched. This shows that this model does not equal Milner's (Milner 1977) fully abstract model for PCF.

Journal ArticleDOI
TL;DR: Two solutions to polytime computation are found: the first, obtained by a simple extension of Danos and Joinet's condition, closely resembles Asperti's Light Affine Logic and enjoys polystep strong normalisation (the polynomial bound does not depend on the reduction strategy); the second corresponds exactly to Girard's Light Linear Logic.
Abstract: Light and Elementary Linear Logic, which form key components of the interface between logic and implicit computational complexity, were originally introduced by Girard as ‘stand-alone’ logical systems with a (somewhat awkward) sequent calculus of their own. The latter was later reformulated by Danos and Joinet as a proper subsystem of linear logic, whose proofs satisfy a certain structural condition. We extend this approach to polytime computation, finding two solutions: the first is obtained by a simple extension of Danos and Joinet's condition, closely resembles Asperti's Light Affine Logic and enjoys polystep strong normalisation (the polynomial bound does not depend on the reduction strategy); the second, which needs more complex conditions, exactly corresponds to Girard's Light Linear Logic.

Journal ArticleDOI
TL;DR: This work describes an implementation of classical combinatory logic in a reversible calculus for which it presents an algebraic model based on a generalisation of the notion of a group.
Abstract: The $\lambda$-calculus is destructive: its main computational mechanism, beta reduction, destroys the redex, which makes replaying the computational steps impossible. Combinatory logic is a variant of the $\lambda$-calculus that maintains irreversibility. Recently, reversible computational models have been studied mainly in the context of quantum computation, as (without measurements) quantum physics is inherently reversible. However, reversibility also fundamentally changes the semantical framework in which classical computation has to be investigated. We describe an implementation of classical combinatory logic in a reversible calculus for which we present an algebraic model based on a generalisation of the notion of a group.
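The irreversibility the abstract starts from is easy to see in code. The sketch below (an illustrative assumption, not the paper's calculus or its group-based model) reduces S/K combinator terms one step at a time and records the whole trace, so that, unlike bare beta reduction, every destroyed redex can be recovered by walking the trace backwards.

```python
# Terms: 'S', 'K', or a pair (f, x) representing the application f x.

def step(term):
    """Return the term after one leftmost reduction, or None if normal."""
    if isinstance(term, tuple):
        f, x = term
        if isinstance(f, tuple):
            g, y = f
            if g == 'K':
                return y                      # K y x -> y  (x is discarded!)
            if isinstance(g, tuple) and g[0] == 'S':
                z = g[1]
                return ((z, x), (y, x))       # S z y x -> z x (y x)
        r = step(f)
        if r is not None:
            return (r, x)
        r = step(x)
        if r is not None:
            return (f, r)
    return None

def run(term):
    """Reduce to normal form, recording the trace that makes the run replayable."""
    trace = [term]
    while (nxt := step(term)) is not None:
        term = nxt
        trace.append(term)
    return term, trace

# I = S K K:  (S K K) a  reduces to  a
identity = (('S', 'K'), 'K')
normal, trace = run((identity, 'a'))
```

The K rule deletes its second argument, so the forward map alone is not injective; keeping the trace is the brute-force way to restore reversibility, where the paper instead builds reversibility into the calculus itself.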


Journal ArticleDOI
TL;DR: A simpler proof of one of the major results in this area – the theorem of Yu and Ding, which states that there exists no cl-complete c.e. real – is presented, and the global theory of the cl degrees is then considered.
Abstract: Lipschitz continuity is used as a tool for analysing the relationship between incomputability and randomness. We present a simpler proof of one of the major results in this area – the theorem of Yu and Ding, which states that there exists no cl-complete c.e. real – and go on to consider the global theory. The existential theory of the cl degrees is decidable, but this does not follow immediately by the standard proof for classical structures, such as the Turing degrees, since the cl degrees are a structure without join. We go on to show that strictly below every random cl degree there is another random cl degree. Results regarding the phenomenon of quasi-maximality in the cl degrees are also presented.

Journal ArticleDOI
TL;DR: A formalism called addressed term rewriting systems is presented, which can be used to model implementations of theorem proving, symbolic computation and programming languages, especially aspects of sharing, recursive computations and cyclic data structures.
Abstract: We present a formalism called addressed term rewriting systems, which can be used to model implementations of theorem proving, symbolic computation and programming languages, especially aspects of sharing, recursive computations and cyclic data structures. Addressed Term Rewriting Systems are therefore well suited to describing object-based languages, and as an example we present a language called $\lambda{\cal O}bj^{a}$, incorporating both functional and object-based features. As a case study in how reasoning about languages is supported in the ATRS formalism, we define a type system for $\lambda{\cal O}bj^{a}$ and prove a type soundness result.