Author

Stéphane Lengrand

Other affiliations: University of Paris
Bio: Stéphane Lengrand is an academic researcher from the University of St Andrews. The author has contributed to research in topics: Cut-elimination theorem & Intuitionistic logic. The author has an h-index of 3 and has co-authored 5 publications receiving 95 citations. Previous affiliations of Stéphane Lengrand include the University of Paris.

Papers
Journal Article (DOI)
TL;DR: The operational behaviour of the calculus and some of its fundamental properties, such as confluence, preservation of strong normalisation, strong normalisation of simply typed terms, step-by-step simulation of β-reduction and full composition, are shown.
Abstract: We present a simple term calculus with an explicit control of erasure and duplication of substitutions, enjoying a sound and complete correspondence with the intuitionistic fragment of Linear Logic's proof-nets. We show the operational behaviour of the calculus and some of its fundamental properties such as confluence, preservation of strong normalisation, strong normalisation of simply typed terms, step-by-step simulation of β-reduction and full composition.
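To give a concrete feel for a calculus with explicit control of erasure and duplication, here is a minimal Haskell sketch; the constructor names and the choice of rules are illustrative assumptions, not the paper's exact system:

```haskell
-- Sketch of a term syntax with explicit substitution, erasure
-- (weakening) and duplication (contraction), in the spirit of term
-- calculi matching intuitionistic proof-nets. Constructor names and
-- rules are illustrative assumptions, not the paper's exact system.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  | Sub Term String Term            -- t[x/u]: explicit substitution
  | Weak String Term                -- weakening: x explicitly erased
  | Cont String String String Term  -- contraction: x duplicated as y, z

-- A few representative reduction rules.
step :: Term -> Maybe Term
step (App (Lam x t) u) = Just (Sub t x u)   -- Beta creates a substitution
step (Sub (Var x) y u)
  | x == y             = Just u             -- substitution reaches its variable
step (Sub (Weak x t) y u)
  | x == y             = Just t             -- erased variable: u is discarded
                                            -- (a linear system would weaken fv(u))
step _                 = Nothing
```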

51 citations

Book Chapter (DOI)
19 Apr 2005
TL;DR: A simple term language with explicit operators for erasure, duplication and substitution, enjoying a sound and complete correspondence with the intuitionistic fragment of Linear Logic's Proof Nets, is presented.
Abstract: We present a simple term language with explicit operators for erasure, duplication and substitution, enjoying a sound and complete correspondence with the intuitionistic fragment of Linear Logic's Proof Nets. We establish the good operational behaviour of the language by means of some fundamental properties such as confluence, preservation of strong normalisation, strong normalisation of well-typed terms and step-by-step simulation. This formalism is the first term calculus with explicit substitutions having full composition and preserving strong normalisation.
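For reference, "full composition" is the property that an explicit substitution can always be fully executed, yielding the result of ordinary meta-level substitution; schematically (a generic statement of the property, not a quotation from the paper):

```latex
% Full composition: an explicit substitution can always be fully
% executed, yielding the meta-level (implicit) substitution.
t[x/u] \;\longrightarrow^{*}\; t\{x := u\}
```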

31 citations

Journal Article (DOI)
TL;DR: It is proved that orthogonality does not capture the fixpoint construction of symmetric candidates and that the whole calculus is strongly normalising; the calculus is related to the traditional System Fω, also when the latter is extended with axioms for classical logic.

12 citations

Journal Article
TL;DR: In this paper, the Curry-Howard correspondence is used to study normalisation procedures in the depth-bounded intuitionistic sequent calculus of Hudelmaier for the implicational case.
Abstract: Inspired by the Curry-Howard correspondence, we study normalisation procedures in the depth-bounded intuitionistic sequent calculus of Hudelmaier (1988) for the implicational case, thus strengthening existing approaches to Cut-admissibility. We decorate proofs with terms and introduce various term-reduction systems representing proof transformations. In contrast to previous papers which gave different arguments for Cut-admissibility suggesting weakly normalising procedures for Cut-elimination, our main reduction system and all its variations are strongly normalising, with the variations corresponding to different optimisations, some of them with good properties such as confluence.

2 citations

Book Chapter (DOI)
17 Aug 2006
TL;DR: In this paper, the Curry-Howard correspondence was used to study normalisation procedures in the depth-bounded intuitionistic sequent calculus of Hudelmaier (1988) for the implicational case.
Abstract: Inspired by the Curry-Howard correspondence, we study normalisation procedures in the depth-bounded intuitionistic sequent calculus of Hudelmaier (1988) for the implicational case, thus strengthening existing approaches to Cut-admissibility. We decorate proofs with terms and introduce various term-reduction systems representing proof transformations. In contrast to previous papers which gave different arguments for Cut-admissibility suggesting weakly normalising procedures for Cut-elimination, our main reduction system and all its variations are strongly normalising, with the variations corresponding to different optimisations, some of them with good properties such as confluence.
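As a reminder of how such term decorations typically work, under the Curry-Howard reading a cut is decorated by an explicit substitution, so cut-elimination steps become term reductions; a generic illustration (not the paper's exact rules):

```latex
% A cut decorated with an explicit substitution (generic illustration):
\frac{\Gamma \vdash u : A \qquad \Gamma,\, x : A \vdash t : B}
     {\Gamma \vdash t[x/u] : B}\;(\mathit{Cut})
```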

1 citation


Cited by
Journal Article (DOI)
TL;DR: Term Rewriting and All That is a self-contained introduction to the field of term rewriting and covers all the basic material including abstract reduction systems, termination, confluence, completion, and combination problems.
Abstract: Term Rewriting and All That is a self-contained introduction to the field of term rewriting. The book starts with a simple motivating example and covers all the basic material, including abstract reduction systems, termination, confluence, completion, and combination problems. Some closely connected subjects, such as universal algebra, unification theory, Gröbner bases, and Buchberger's algorithm, are also covered.
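The book's starting notion of an abstract reduction system is easy to make concrete; a minimal Haskell sketch (the function names are my own, not the book's):

```haskell
-- An abstract reduction system (ARS) as a function from an element
-- to its one-step successors.
type ARS a = a -> [a]

-- An element is a normal form when it has no successors.
isNormalForm :: ARS a -> a -> Bool
isNormalForm r x = null (r x)

-- Naive search for the normal forms reachable from x (terminates
-- only when the ARS is terminating from x; may list duplicates).
normalForms :: ARS a -> a -> [a]
normalForms r x
  | isNormalForm r x = [x]
  | otherwise        = concatMap (normalForms r) (r x)

-- Example: a tiny terminating, confluent ARS on Ints:
-- every n > 0 steps to n - 1, so 0 is the unique normal form.
countdown :: ARS Int
countdown n = [n - 1 | n > 0]
-- normalForms countdown 3 == [0]
```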

99 citations

Proceedings Article (DOI)
19 Aug 2014
TL;DR: The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic, and shows that the LSC is a complexity-preserving abstraction of abstract machines.
Abstract: It is well known that many environment-based abstract machines can be seen as strategies in lambda calculi with explicit substitutions (ES). Recently, graphical syntaxes and linear logic led to the linear substitution calculus (LSC), a new approach to ES that is halfway between small-step calculi and traditional calculi with ES. This paper studies the relationship between the LSC and environment-based abstract machines. While traditional calculi with ES simulate abstract machines, the LSC rather distills them: some transitions are simulated while others vanish, as they map to a notion of structural congruence. The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic. We show that such a pattern applies uniformly in call-by-name, call-by-value, and call-by-need, catching many machines in the literature. We start by distilling the KAM, the CEK, and a sketch of the ZINC, and then provide simplified versions of the SECD, the lazy KAM, and Sestoft's machine. Along the way we also introduce some new machines with global environments. Moreover, we show that distillation preserves the time complexity of the executions, i.e. the LSC is a complexity-preserving abstraction of abstract machines.
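For concreteness, the KAM mentioned above fits in a few lines; this is the textbook call-by-name presentation with de Bruijn indices, not the distilled version studied in the paper:

```haskell
-- The Krivine Abstract Machine: call-by-name, de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term

data Closure = Closure Term Env
type Env   = [Closure]   -- one closure per de Bruijn index in scope
type Stack = [Closure]   -- pending arguments

type State = (Term, Env, Stack)

step :: State -> Maybe State
step (App t u, e, s)   = Just (t, e, Closure u e : s)  -- push argument
step (Lam t, e, c : s) = Just (t, c : e, s)            -- pop into environment
step (Var n, e, s)     = case drop n e of              -- look up, jump to closure
  Closure t e' : _ -> Just (t, e', s)
  []               -> Nothing
step _                 = Nothing                       -- final state

-- Iterate to a final state.
run :: State -> State
run st = maybe st run (step st)
-- run (App (Lam (Var 0)) (Lam (Var 0)), [], [])
--   ==> (Lam (Var 0), [], [])
```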

77 citations

Proceedings Article (DOI)
08 Jan 2014
TL;DR: This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets, and relies on Gonthier, Lévy, and Melliès' axiomatic theory for standardization.
Abstract: Standardization is a fundamental notion for connecting programming languages and rewriting calculi. Since both programming languages and calculi rely on substitution for defining their dynamics, explicit substitutions (ES) help further close the gap between theory and practice. This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets. For the latter, proof-nets can be formalized by means of a simple equational theory over the linear substitution calculus. Contrary to other extant calculi with ES, our system can be equipped with a residual theory in the sense of Lévy, which is used to prove a left-to-right standardization theorem for the calculus with ES but without the equational theory. Such a theorem, however, does not lift from the calculus with ES to proof-nets, because the notion of left-to-right derivation is not preserved by the equational theory. We then relax the notion of left-to-right standard derivation, based on a total order on redexes, to a more liberal notion of standard derivation based on partial orders. Our proofs rely on Gonthier, Lévy, and Melliès' axiomatic theory for standardization. However, we go beyond merely applying their framework, revisiting some of its key concepts: we obtain uniqueness (modulo) of standard derivations in an abstract way and we provide a coinductive characterization of their key abstract notion of external redex. This last point is then used to give a simple proof that linear head reduction, a nondeterministic strategy having a central role in the theory of linear logic, is standard.
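The left-to-right strategy that the standardization theorem privileges can be sketched for the pure lambda-calculus (a textbook leftmost-outermost reducer with de Bruijn indices; the paper's setting, with ES and an equational theory, is richer):

```haskell
-- Leftmost-outermost (left-to-right standard) beta reduction for
-- the pure lambda-calculus, with de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term deriving Show

-- Shift free indices >= c by d.
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = Var (if k < c then k else k + d)
shift d c (Lam t)   = Lam (shift d (c + 1) t)
shift d c (App t u) = App (shift d c t) (shift d c u)

-- Capture-avoiding substitution of s for index j.
subst :: Int -> Term -> Term -> Term
subst j s (Var k)   = if k == j then s else Var k
subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
subst j s (App t u) = App (subst j s t) (subst j s u)

-- Contract the leftmost-outermost redex, if any.
step :: Term -> Maybe Term
step (App (Lam t) u) = Just (shift (-1) 0 (subst 0 (shift 1 0 u) t))
step (App t u)       = case step t of
                         Just t' -> Just (App t' u)
                         Nothing -> App t <$> step u
step (Lam t)         = Lam <$> step t
step (Var _)         = Nothing
```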

61 citations

Book Chapter (DOI)
Delia Kesner
11 Sep 2007
TL;DR: Very simple technology is used to establish a general theory of explicit substitutions for the lambda-calculus which enjoys fundamental properties such as simulation of one-step beta-reduction, confluence on metaterms, preservation of beta-strong normalisation, strong normalisation of typed terms and full composition.
Abstract: Calculi with explicit substitutions (ES) are widely used in different areas of computer science. Complex systems with ES have been developed over the last 15 years to capture the good computational behaviour of the original systems (with meta-level substitutions) they were implementing. In this paper we first survey previous work in the domain, pointing out the motivations and challenges that guided the development of such calculi. Then we use very simple technology to establish a general theory of explicit substitutions for the lambda-calculus which enjoys fundamental properties such as simulation of one-step beta-reduction, confluence on metaterms, preservation of beta-strong normalisation, strong normalisation of typed terms and full composition. The calculus also admits a natural translation into Linear Logic's proof-nets.
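The flavour of such a calculus can be conveyed by the usual explicit-substitution rule set; the following is a generic shape, not Kesner's exact presentation, which also involves equations and finer side conditions:

```latex
% Typical rule set of a lambda-calculus with explicit substitutions
% (generic shape; not Kesner's exact presentation):
\begin{align*}
(\lambda x.\,t)\,u   &\to t[x/u]             &&\text{(Beta)}\\
x[x/u]               &\to u                   &&\text{(Var)}\\
t[x/u]               &\to t                   &&\text{(GC, } x \notin \mathit{fv}(t)\text{)}\\
(t\,v)[x/u]          &\to t[x/u]\;v[x/u]      &&\text{(App)}\\
(\lambda y.\,t)[x/u] &\to \lambda y.\,t[x/u]  &&\text{(Lam, } y \notin \mathit{fv}(u)\text{)}\\
t[y/v][x/u]          &\to t[y/v[x/u]]         &&\text{(Comp, } x \in \mathit{fv}(v)\text{)}
\end{align*}
```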

53 citations

Book Chapter (DOI)
07 Sep 2009
TL;DR: A polarised variant of Curien and Herbelin's λ̄μμ̃-calculus suitable for sequent calculi that admit a focalising cut elimination is presented, giving a setting in which Krivine's classical realisability extends naturally, with a presentation in terms of orthogonality.
Abstract: We develop a polarised variant of Curien and Herbelin's λ̄μμ̃-calculus suitable for sequent calculi that admit a focalising cut elimination (i.e. whose proofs are focalised when cut-free), such as Girard's classical logic LC or linear logic. This gives a setting in which Krivine's classical realisability extends naturally (in particular to call-by-value), with a presentation in terms of orthogonality. We give examples of applications to the theory of programming languages.
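The orthogonality at work in Krivine-style classical realisability can be stated in one line: fixing a pole of accepted term-stack interactions, each set of stacks determines the set of terms that pass all of its tests (the standard definition, independent of the paper's polarised refinement):

```latex
% Orthogonality in Krivine-style classical realisability: fix a pole
% \Bot \subseteq \Lambda \times \Pi of accepted processes t \star \pi;
% a set of stacks S determines the terms passing every test in S:
S^{\perp} \;=\; \{\, t \in \Lambda \mid \forall \pi \in S,\ t \star \pi \in \Bot \,\}
```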

52 citations