
Showing papers on "Linear logic published in 2011"


Journal ArticleDOI
TL;DR: A probabilistic version of coherence spaces is studied and it is shown that these objects provide a model of linear logic and the semantics of Probabilistic PCF closed terms of ground type is given.
Abstract: We study a probabilistic version of coherence spaces and show that these objects provide a model of linear logic. We build a model of the pure lambda-calculus in this setting and show how to interpret a probabilistic version of the functional language PCF. We give a probabilistic interpretation of the semantics of probabilistic PCF closed terms of ground type. Last we suggest a generalization of this approach, using Banach spaces.
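To make the ground-type semantics concrete, here is a small illustrative interpreter (a sketch with a hypothetical mini-syntax of our own, not the paper's actual probabilistic PCF) mapping closed terms with probabilistic choice to distributions over values:

```python
from fractions import Fraction

# Hypothetical mini-syntax, not the paper's actual calculus:
#   ('num', n)                 -- a numeral
#   ('coin', p, left, right)   -- choose left with probability p, else right

def denote(term):
    """Map a closed ground-type term to a distribution {value: probability}."""
    tag = term[0]
    if tag == 'num':
        return {term[1]: Fraction(1)}
    if tag == 'coin':
        _, p, left, right = term
        dist = {}
        for v, q in denote(left).items():
            dist[v] = dist.get(v, Fraction(0)) + p * q
        for v, q in denote(right).items():
            dist[v] = dist.get(v, Fraction(0)) + (1 - p) * q
        return dist
    raise ValueError(f"unknown term: {tag}")

# A fair coin whose right branch flips again: denotes {0: 3/4, 1: 1/4}
t = ('coin', Fraction(1, 2), ('num', 0),
     ('coin', Fraction(1, 2), ('num', 0), ('num', 1)))
print(denote(t))
```

Probabilities here always sum to 1; in the actual model, possibly diverging terms denote subdistributions whose total mass may be strictly less than 1.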

110 citations


Book
15 Sep 2011
TL;DR: This course on logic by one of the world's leading proof theorists challenges mathematicians, computer scientists, physicists and philosophers to rethink their views and concepts on the nature of mathematical knowledge in an exceptionally profound way.
Abstract: These lectures on logic, more specifically proof theory, are basically intended for postgraduate students and researchers in logic. The question at stake is the nature of mathematical knowledge and the difference between a question and an answer, i.e., the implicit and the explicit. The problem is delicate mathematically and philosophically as well: the relation between a question and its answer is a sort of equality where one side is “more equal than the other”: one thus discovers essentialist blind spots. Starting with Gödel’s paradox (1931) – so to speak, the incompleteness of answers with respect to questions – the book proceeds with paradigms inherited from Gentzen’s cut-elimination (1935). Various settings are studied: sequent calculus, natural deduction, lambda calculi, category-theoretic composition, up to geometry of interaction (GoI), all devoted to explicitation, which eventually amounts to inverting an operator in a von Neumann algebra. Mathematical language is usually described as referring to a preexisting reality. Logical operations can be given an alternative procedural meaning: typically, the operators involved in GoI are invertible, not because they are constructed according to the book, but because logical rules are those ensuring invertibility. Similarly, the durability of truth should not be taken for granted: one should distinguish between imperfect (perennial) and perfect modes. The procedural explanation of the infinite thus identifies it with the unfinished, i.e., the perennial. But is perenniality perennial? This questioning yields a possible logical explanation for algorithmic complexity. This highly original course on logic by one of the world’s leading proof theorists challenges mathematicians, computer scientists, physicists and philosophers to rethink their views and concepts on the nature of mathematical knowledge in an exceptionally profound way.

84 citations


Journal ArticleDOI
26 Jan 2011
TL;DR: This work introduces a prototype implementation of Alms, establishes the soundness of the core language, and uses the core model to prove a principal kinding theorem.
Abstract: Alms is a general-purpose programming language that supports practical affine types. To offer the expressiveness of Girard's linear logic while keeping the type system light and convenient, Alms uses expressive kinds that minimize notation while maximizing polymorphism between affine and unlimited types. A key feature of Alms is the ability to introduce abstract affine types via ML-style signature ascription. In Alms, an interface can impose stiffer resource usage restrictions than the principal usage restrictions of its implementation. This form of sealing allows the type system to naturally and directly express a variety of resource management protocols from special-purpose type systems. We present two pieces of evidence to demonstrate the validity of our design goals. First, we introduce a prototype implementation of Alms and discuss our experience programming in the language. Second, we establish the soundness of the core language. We also use the core model to prove a principal kinding theorem.

80 citations


Journal ArticleDOI
TL;DR: dlPCF is not only able to precisely capture the functional behaviour of PCF programs but also some of their intensional properties, namely the complexity of evaluating them with Krivine's Machine.
Abstract: A system of linear dependent types for the lambda calculus with full higher-order recursion, called dlPCF, is introduced and proved sound and relatively complete. Completeness holds in a strong sense: dlPCF is not only able to precisely capture the functional behaviour of PCF programs (i.e. how the output relates to the input) but also some of their intensional properties, namely the complexity of evaluating them with Krivine's Machine. dlPCF is designed around dependent types and linear logic and is parametrized on the underlying language of index terms, which can be tuned so as to sacrifice completeness for tractability.

55 citations


Journal ArticleDOI
19 Sep 2011
TL;DR: A denotational model for graphical user interface (GUI) programming using the Cartesian closed category of ultrametric spaces is given; the resulting DSL is implemented as an extension to OCaml, with examples demonstrating that programs in this style can be short and readable.
Abstract: We give a denotational model for graphical user interface (GUI) programming using the Cartesian closed category of ultrametric spaces. The ultrametric structure enforces causality restrictions on reactive systems and allows well-founded recursive definitions by a generalization of guardedness. We capture the arbitrariness of user input (e.g., a user gets to decide the stream of clicks she sends to a program) by making use of the fact that the closed subsets of an ultrametric space themselves form an ultrametric space, allowing us to interpret nondeterminism with a "powerspace" monad. Algebras for the powerspace monad yield a model of intuitionistic linear logic, which we exploit in the definition of a mixed linear/non-linear domain-specific language for writing GUI programs. The non-linear part of the language is used for writing reactive stream-processing functions whilst the linear sublanguage naturally captures the generativity and usage constraints on the various linear objects in GUIs, such as the elements of a DOM or scene graph. We have implemented this DSL as an extension to OCaml, and give examples demonstrating that programs in this style can be short and readable.
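The causality idea can be glimpsed with the standard 2^(-n) metric on stream prefixes. The sketch below uses finite list prefixes rather than the paper's actual spaces, and is only illustrative:

```python
# Illustrative sketch: the standard 2^(-n) metric on finite stream prefixes,
# not the paper's actual construction.

def dist(s, t, horizon=64):
    """2^(-n), where n is the first index at which the prefixes differ."""
    for n in range(horizon):
        a = s[n] if n < len(s) else None
        b = t[n] if n < len(t) else None
        if a != b:
            return 2.0 ** (-n)
    return 0.0

x, y, z = [0, 1, 1], [0, 1, 0], [0, 0, 0]

# Strong (ultrametric) triangle inequality: d(x,z) <= max(d(x,y), d(y,z)).
assert dist(x, z) <= max(dist(x, y), dist(y, z))

# Causality as contractiveness: prefixing a stream (a guarded operation)
# halves distances, which is what licenses guarded recursive definitions.
cons = lambda v, s: [v] + s
assert dist(cons(9, x), cons(9, y)) == dist(x, y) / 2
```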

47 citations


Journal ArticleDOI
TL;DR: Normalization of the exponential reduction and confluence of the full one are proved for a translation of Boudol's untyped λ-calculus with resources, extended with a linear/nonlinear reduction à la Ehrhard and Regnier's differential λ-calculus.

44 citations


Journal ArticleDOI
TL;DR: The splitting theorem shows how and to what extent the authors can recover a sequent-like structure in NEL proofs and yields a cut-elimination procedure for NEL.
Abstract: System NEL is the mixed commutative/non-commutative linear logic BV augmented with linear logic's exponentials, or, equivalently, it is MELL augmented with the non-commutative self-dual connective seq. NEL is presented in deep inference, because no Gentzen formalism can express it in such a way that the cut rule is admissible. Other recent work shows that system NEL is Turing-complete, and is able to express process algebra sequential composition directly and model causal quantum evolution faithfully. In this paper, we show cut elimination for NEL, based on a technique that we call splitting. The splitting theorem shows how and to what extent we can recover a sequent-like structure in NEL proofs. When combined with a 'decomposition' theorem, proved in the previous paper of this series, splitting yields a cut-elimination procedure for NEL.

31 citations


Journal ArticleDOI
TL;DR: A version of constructive linear-time temporal logic (LTL) with the "next" temporal operator is studied: natural deduction, sequent calculus and Hilbert-style proof systems are given for constructive LTL with conjunction, disjunction and falsehood, the sequent calculus is shown to enjoy cut elimination, and soundness and completeness with respect to Kripke semantics are proved.
Abstract: In this paper we study a version of constructive linear-time temporal logic (LTL) with the "next" temporal operator. The logic is originally due to Davies, who has shown that the proof system of the logic corresponds to a type system for binding-time analysis via the Curry-Howard isomorphism. However, he did not investigate the logic itself in detail; he has proved only that the logic augmented with negation and classical reasoning is equivalent to (the "next" fragment of) the standard formulation of classical linear-time temporal logic. We give natural deduction, sequent calculus and Hilbert-style proof systems for constructive LTL with conjunction, disjunction and falsehood, and show that the sequent calculus enjoys cut elimination. Moreover, we also consider Kripke semantics and prove soundness and completeness. One distinguishing feature of this logic is that distributivity of the "next" operator over disjunction, ○(A ∨ B) ⊃ ○A ∨ ○B, is rejected in view of a type-theoretic interpretation.
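The rejected distributivity has an intuitive staging reading. In the rough sketch below (our encoding, not the paper's), a next-stage value of type ○A is a thunk run at the next time step, and A ∨ B is a tagged pair:

```python
# Rough staging sketch (our encoding, not the paper's): ○A is a thunk
# evaluated at the next time step; A ∨ B is a tagged pair ('inl'/'inr', _).

def merge(choice):
    """○A ∨ ○B  ⊃  ○(A ∨ B): constructively fine, since the tag is
    already known at the current stage."""
    tag, thunk = choice
    return lambda: (tag, thunk())

# The converse ○(A ∨ B) ⊃ ○A ∨ ○B is the rejected direction: producing
# the tag *now* would force a next-stage computation early, which is
# exactly what a binding-time discipline forbids.

later = ('inl', lambda: 42)   # a value of ○int ∨ ○str, left injection
staged = merge(later)         # now a value of ○(int ∨ str)
assert staged() == ('inl', 42)
```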

30 citations


Proceedings ArticleDOI
14 Sep 2011
TL;DR: This work exploits the proofs-as-processes paradigm to compose multiple Web Services specified in Classical Linear Logic, while using the expressive nature of the theorem-proving framework to provide a systematic and rigorous treatment of properties such as exceptions.
Abstract: We give an overview of a rigorous approach to Web Services composition based on theorem proving in the proof assistant HOL Light. In this, we exploit the proofs-as-processes paradigm to compose multiple Web Services specified in Classical Linear Logic, while using the expressive nature of our theorem-proving framework to provide a systematic and rigorous treatment of properties such as exceptions. The end result is not only a formally verified proof of the composition, with an associated guarantee of correctness, but also an 'executable' pi-calculus statement describing the composition in process-algebraic terms. We illustrate our approach by analyzing a non-trivial example involving numerous Web Services in a real-estate domain.

28 citations


Journal ArticleDOI
TL;DR: A basic compositionality property of NEL is shown, which leads to a cut-elimination theorem, which is proved in the next article of this series.
Abstract: We study a system, called NEL, which is the mixed commutative/noncommutative linear logic BV augmented with linear logic's exponentials. Equivalently, NEL is MELL augmented with the noncommutative self-dual connective seq. In this article, we show a basic compositionality property of NEL, which we call decomposition. This result leads to a cut-elimination theorem, which is proved in the next article of this series. To control the induction measure for the theorem, we rely on a novel technique that extracts from NEL proofs the structure of exponentials, into what we call e-q-Flow-Graphs.

25 citations


Journal ArticleDOI
TL;DR: New, simple proofs of soundness (every representable function lies in a given complexity class) for Elementary Affine Logic, LFPL and Soft Affine Logic are presented, by instantiating a semantic framework previously introduced by the authors and based on an innovative modification of realizability.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: This paper introduces a two-stage Linear Logic based program synthesis approach to automatic RESTful web service composition that greatly improves the searching efficiency and guarantees the correctness and completeness of the service composition.
Abstract: This paper introduces a two-stage Linear Logic based program synthesis approach to automatic RESTful web service composition. Linear Logic theorem proving is applied at both the resource and the service invocation method levels, which greatly improves the searching efficiency and guarantees the correctness and completeness of the service composition. Furthermore, the process calculus is used as formalism for the composition process, which enables the approach to be executable at the business management level. The process calculus is attached to the Linear Logic inference rules in the style of type theory, so the process model is extracted directly from the complete proof. An example is given to show the extraction of a process model from a Linear Logic proof search.

Journal ArticleDOI
TL;DR: This work shows that one can encode a wider range of proof systems by using focused linear logic with subexponentials, and identifies general conditions for determining whether a linear logic formula corresponds to an object-logic rule and whether this rule is invertible.

Book ChapterDOI
01 Jan 2011
TL;DR: It is shown that the membership problem of minimalist grammars without the shortest move constraint is as difficult as provability in Multiplicative Exponential Linear Logic and this result gives a new representation of those derivations with linear λ-terms.
Abstract: In this paper, we aim at understanding the derivations of minimalist grammars without the shortest move constraint. This leads us to study the relationship of those derivations with logic. In particular we show that the membership problem of minimalist grammars without the shortest move constraint is as difficult as provability in Multiplicative Exponential Linear Logic. As a byproduct, this result gives us a new representation of those derivations with linear λ-terms. We show how to interpret those terms in a homomorphic way so as to recover the sentence they analyse. As the homomorphisms we describe are rather evolved, we turn to a proof-net representation and explain how Monadic Second Order Logic and related techniques allow us both to define those proof-nets and to retrieve the sentence they analyse.

Journal ArticleDOI
TL;DR: The main technical result of this paper is constructing a sound and complete axiomatization for the propositional fragment of computability logic whose vocabulary includes all four kinds of conjunction and disjunction: parallel, toggling, sequential and choice, together with negation.

Journal ArticleDOI
TL;DR: In this paper, the notion of bond algebras is used to define a unified algebraic theory of duality in quantum models. The duality transformations can be implemented as unitary transformations, or partial isometries if gauge symmetries are involved.
Abstract: An algebraic theory of dualities is developed based on the notion of bond algebras. It deals with classical and quantum dualities in a unified fashion, explaining the precise connection between quantum dualities and the low temperature (strong-coupling)/high temperature (weak-coupling) dualities of classical statistical mechanics (or (Euclidean) path integrals). Its range of applications includes discrete lattice, continuum field, and gauge theories. Dualities are revealed to be local, structure-preserving mappings between model-specific bond algebras that can be implemented as unitary transformations, or partial isometries if gauge symmetries are involved. This characterization permits a systematic search for dualities and self-dualities in quantum models of arbitrary system size, dimensionality and complexity, and in any classical model admitting a transfer matrix representation. Dualities like exact dimensional reduction, emergent, and gauge-reducing dualities that solve gauge constraints can be easily understood in terms of mappings of bond algebras. As a new example, we show that the Z_2 Higgs model is dual to the extended toric code model in any number of dimensions. Non-local dual variables and Jordan-Wigner dictionaries are derived from the local mappings of bond algebras. Our bond-algebraic approach goes beyond the standard approach to classical dualities, and could help resolve the long-standing problem of obtaining duality transformations for lattice non-Abelian models. As an illustration, we present new dualities in any spatial dimension for the quantum Heisenberg model. Finally, we discuss various applications including location of phase boundaries, spectral behavior and, notably, we show how bond-algebraic dualities help constrain and realize fermionization in an arbitrary number of spatial dimensions.

Journal ArticleDOI
TL;DR: First-order logic is a good starting point, both from the representation and inference point of view; but even if one makes the choice of first-order logic as representation language, this is not enough: the computational semanticist needs to make further decisions on how to model events, tense, modal contexts, anaphora and plural entities.
Abstract: The aim of computational semantics is to capture the meaning of natural language expressions in representations suitable for performing inferences, in the service of understanding human language in written or spoken form. First-order logic is a good starting point, both from the representation and inference point of view. But even if one makes the choice of first-order logic as representation language, this is not enough: the computational semanticist needs to make further decisions on how to model events, tense, modal contexts, anaphora and plural entities. Semantic representations are usually built on top of a syntactic analysis, using unification, techniques from the lambda-calculus or linear logic, to do the book-keeping of variable naming. Inference has many potential applications in computational semantics. One way to implement inference is using algorithms from automated deduction dedicated to first-order logic, such as theorem proving and model building. Theorem proving can help in finding contradictions or checking for new information. Finite model building can be seen as a complementary inference task to theorem proving, and it often makes sense to use both procedures in parallel. The models produced by model generators for texts not only show that the text is contradiction-free; they also can be used for disambiguation tasks and linking interpretation with the real world. To make interesting inferences, often additional background knowledge is required (not expressed in the analysed text or speech parts). This can be derived (and turned into first-order logic) from raw text, semistructured databases or large-scale lexical databases such as WordNet. Promising future research directions of computational semantics are investigating alternative representation and inference methods (using weaker variants of first-order logic, reasoning with defaults), and developing evaluation methods measuring the semantic adequacy of systems and formalisms.
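The complementary roles of theorem proving and model building can be sketched with a brute-force propositional model builder. This toy (our own, standing in for the first-order provers and model generators the survey has in mind) either returns a model, witnessing that the analysed "text" is contradiction-free, or reports that none exists:

```python
from itertools import product

# Toy formulas as nested tuples over propositional atoms:
# ('atom', 'p'), ('not', f), ('and', f, g), ('or', f, g), ('imp', f, g)

def atoms(f):
    if f[0] == 'atom':
        return {f[1]}
    return set().union(*(atoms(g) for g in f[1:]))

def holds(f, v):
    op = f[0]
    if op == 'atom': return v[f[1]]
    if op == 'not':  return not holds(f[1], v)
    if op == 'and':  return holds(f[1], v) and holds(f[2], v)
    if op == 'or':   return holds(f[1], v) or holds(f[2], v)
    if op == 'imp':  return (not holds(f[1], v)) or holds(f[2], v)

def build_model(f):
    """Return a satisfying valuation (the text is contradiction-free),
    or None (a theorem prover could derive a contradiction)."""
    names = sorted(atoms(f))
    for bits in product([False, True], repeat=len(names)):
        v = dict(zip(names, bits))
        if holds(f, v):
            return v
    return None

p, q = ('atom', 'p'), ('atom', 'q')
text = ('and', ('imp', p, q), p)            # "if p then q; and p"
assert build_model(text) == {'p': True, 'q': True}
assert build_model(('and', p, ('not', p))) is None
```

Real systems run first-order provers and finite model builders in parallel, since each procedure terminates on a complementary class of inputs.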

Journal ArticleDOI
TL;DR: The paper presents in full detail the first linear algorithm given in the literature (Guerrini, 1999) for checking proof structure correctness for multiplicative linear logic without units.

Journal ArticleDOI
TL;DR: Three sequent calculi for bi-intuitionistic propositional logic are compared, among them a basic standard-style sequent calculus that restricts the premises of implication-right and exclusion-left inferences to be single-conclusion resp. single-assumption.
Abstract: Bi-intuitionistic logic is the conservative extension of intuitionistic logic with a connective dual to implication. It is sometimes presented as a symmetric constructive subsystem of classical logic. In this paper, we compare three sequent calculi for bi-intuitionistic propositional logic: (1) a basic standard-style sequent calculus that restricts the premises of implication-right and exclusion-left inferences to be single-conclusion resp. single-assumption and is incomplete without the cut rule, (2) the calculus with nested sequents by Goré et al., where a complete class of cuts is encapsulated into special "unnest" rules, and (3) a cut-free labelled sequent calculus derived from the Kripke semantics of the logic. We show that these calculi can be translated into each other and discuss the ineliminable cuts of the standard-style sequent calculus.

Journal ArticleDOI
TL;DR: New correctness criteria for all fragments (multiplicative, exponential, additive) of linear logic are provided, proving that deciding the correctness of a linear logic proof structure is NL-complete.

Journal ArticleDOI
TL;DR: Topological perspectives following from the dualities provide compactness theorems for the logics and the effective classification of categories of algebras involved, which tells us that Stone-type duality makes it possible to use topology for logic and algebra in significant ways.
Abstract: Stone-type duality connects logic, algebra, and topology in both conceptual and technical senses. This paper is intended to be a demonstration of this slogan. In this paper we focus on some versions of Fitting's L-valued logic and L-valued modal logic for a finite distributive lattice L. Building upon the theory of natural dualities, which is a universal algebraic theory of categorical dualities, we establish a Jonsson-Tarski-style duality for algebras of L-valued modal logic, which encompasses Jonsson-Tarski duality for modal algebras as the case L = 2. We also discuss how the dualities change when the algebras are enriched by truth constants. Topological perspectives following from the dualities provide compactness theorems for the logics and the effective classification of categories of algebras involved, which tells us that Stone-type duality makes it possible to use topology for logic and algebra in significant ways. The author is grateful to Professor Susumu Hayashi for his encouragement, to Shohei Izawa for his comments and discussions, and to Kentaro Sato for his suggesting a similar result to Theorem 2.5 for the category of algebras of Lukasiewicz n-valued logic.

Book ChapterDOI
05 Dec 2011
TL;DR: It is shown that despite its simplicity, elementary linear logic can nevertheless be used as a common framework to characterize the different levels of a hierarchy of deterministic time complexity classes, within elementary time.
Abstract: Elementary linear logic is a simple variant of linear logic, introduced by Girard, which characterizes in the proofs-as-programs approach the class of elementary functions, that is to say functions computable in time bounded by a tower of exponentials of fixed height. Our goal here is to show that despite its simplicity, elementary linear logic can nevertheless be used as a common framework to characterize the different levels of a hierarchy of deterministic time complexity classes, within elementary time. We consider a variant of this logic with type fixpoints and weakening. By selecting specific types we then characterize the class P of polynomial time predicates and more generally the hierarchy of classes k-EXP, for k ≥ 0, where k-EXP is the union of DTIME$(2_k^{n^i})$, for i ≥ 1.
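The bound in the definition of k-EXP uses the iterated exponential 2_k. As a quick sanity check on the notation (our helper, not from the paper):

```python
def tower(k, x):
    """Iterated exponential 2_k(x): tower(0, x) = x, tower(k+1, x) = 2**tower(k, x)."""
    for _ in range(k):
        x = 2 ** x
    return x

# k-EXP collects predicates decidable in DTIME(2_k^(n^i)) for some i >= 1:
assert tower(0, 5) == 5      # k = 0: polynomial regime (class P)
assert tower(1, 5) == 32     # k = 1: single exponential (EXP)
assert tower(2, 3) == 256    # k = 2: double exponential (2-EXP)
```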

Journal ArticleDOI
TL;DR: A proof-theoretical treatment of collectively accepted group beliefs is presented through a multi-agent sequent system for an axiomatization of the logic of acceptance, and completeness with respect to the underlying Kripke semantics is proved.
Abstract: A proof-theoretical treatment of collectively accepted group beliefs is presented through a multi-agent sequent system for an axiomatization of the logic of acceptance. The system is based on a labelled sequent calculus for propositional multi-agent epistemic logic with labels that correspond to possible worlds and a notation for internalized accessibility relations between worlds. The system is contraction- and cut-free. Extensions of the basic system are considered, in particular with rules that allow the possibility of operative members or legislators. Completeness with respect to the underlying Kripke semantics follows from a general direct and uniform argument for labelled sequent calculi extended with mathematical rules for frame properties. As an example of the use of the calculus we present an analysis of the discursive dilemma.

Book ChapterDOI
01 Jun 2011
TL;DR: This work introduces a modal call-by-value λ-calculus with multithreading and side effects and provides a combinatorial proof of termination in elementary time for the language.
Abstract: Linear logic provides a framework to control the complexity of higher-order functional programs. We present an extension of this framework to programs with multithreading and side effects focusing on the case of elementary time. Our main contributions are as follows. First, we introduce a modal call-by-value λ-calculus with multithreading and side effects. Second, we provide a combinatorial proof of termination in elementary time for the language. Third, we introduce an elementary affine type system that guarantees the standard subject reduction and progress properties. Finally, we illustrate the programming of iterative functions with side effects in the presented formalism.

Journal ArticleDOI
TL;DR: A framework called the prismoid of resources is presented, where each vertex is a language which refines the λ-calculus by using a different choice to make explicit or implicit the definition of the contraction, weakening, and substitution operations.

Journal ArticleDOI
10 Aug 2011
TL;DR: This work presents a rigorous framework for the composition of Web Services within a higher order logic theorem prover based on the proofs-as-processes paradigm that enables inference rules of Classical Linear Logic to be translated into pi-calculus processes.
Abstract: We present a rigorous framework for the composition of Web Services within a higher order logic theorem prover. Our approach is based on the proofs-as-processes paradigm that enables inference rules of Classical Linear Logic (CLL) to be translated into π-calculus processes. In this setting, composition is achieved by representing available web services as CLL sentences, proving the requested composite service as a conjecture, and then extracting the constructed π-calculus term from the proof. Our framework, implemented in HOL Light, not only uses an expressive logic that allows us to incorporate multiple Web Services properties in the composition process, but also provides guarantees of soundness and correctness for the composition.

23 Feb 2011
TL;DR: In this article, an extension of linear logic to programs with multithreading and side effects focusing on the case of elementary time has been presented, and a new combinatorial proof of termination in elementary time for the functional case has been provided.
Abstract: Linear logic provides a framework to control the complexity of higher-order functional programs. We present an extension of this framework to programs with multithreading and side effects focusing on the case of elementary time. Our main contributions are as follows. First, we provide a new combinatorial proof of termination in elementary time for the functional case. Second, we develop an extension of the approach to a call-by-value λ-calculus with multithreading and side effects. Third, we introduce an elementary affine type system that guarantees the standard subject reduction and progress properties. Finally, we illustrate the programming of iterative functions with side effects in the presented formalism.

Journal ArticleDOI
TL;DR: In this article, a structural operational semantics for concurrent constraint programming (LCC) is presented based on a label transition system and different notions of observational equivalences inspired by the state of the art of process algebras are investigated.
Abstract: Linear logic Concurrent Constraint programming (LCC) is an extension of concurrent constraint programming (CC), where the constraint system is based on Girard's linear logic instead of classical logic. In this paper, we address the problem of program equivalence for this programming framework. For this purpose, we present a structural operational semantics for LCC based on a label transition system and investigate different notions of observational equivalences inspired by the state of the art of process algebras. Then, we demonstrate that the asynchronous π-calculus can be viewed as simple syntactical restrictions of LCC. Finally, we show that LCC observational equivalences can be transposed straightforwardly to classical Concurrent Constraint languages and Constraint Handling Rules, and investigate the resulting equivalences.

Proceedings ArticleDOI
21 Jun 2011
TL;DR: A graph rewriting algorithm computing canonical representations of proofs in additive linear logic without units is presented; as a decision procedure for term equality it matches the known complexity of the problem.
Abstract: Additive linear logic, the fragment of linear logic concerning linear implication between strictly additive formulae, coincides with sum-product logic, the internal language of categories with free finite products and coproducts. Deciding equality of its proof terms, as imposed by the categorical laws, is complicated by the presence of the units (the initial and terminal objects of the category) and the fact that in a free setting products and coproducts do not distribute. The best known decision algorithm, due to Cockett and Santocanale (CSL 2009), is highly involved, requiring an intricate case analysis on the syntax of terms. This paper provides canonical, graphical representations of the categorical morphisms, yielding a novel solution to this decision problem. Starting with (a modification of) existing proof nets, due to Hughes and Van Glabbeek, for additive linear logic without units, canonical forms are obtained by graph rewriting. The rewriting algorithm is remarkably simple. As a decision procedure for term equality it matches the known complexity of the problem. A main technical contribution of the paper is the substantial correctness proof of the algorithm.

Journal ArticleDOI
11 Feb 2011
TL;DR: In this article, a simple graphical representation for proofs of intuitionistic logic is proposed, which is inspired by proof nets and interaction nets (two formalisms originating in linear logic) and inherits good features from each, but is not constrained by them.
Abstract: We offer a simple graphical representation for proofs of intuitionistic logic, which is inspired by proof nets and interaction nets (two formalisms originating in linear logic). This graphical calculus of proofs inherits good features from each, but is not constrained by them. By the Curry-Howard isomorphism, the representation applies equally to the lambda calculus, offering an alternative diagrammatic representation of functional computations. Keywords: Intuitionistic logic, lambda-calculus, visual representation, proof theory