Proceedings ArticleDOI

A nonstandard standardization theorem

TL;DR: This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets, and relies on Gonthier, Lévy, and Melliès' axiomatic theory for standardization.
Abstract: Standardization is a fundamental notion for connecting programming languages and rewriting calculi. Since both programming languages and calculi rely on substitution for defining their dynamics, explicit substitutions (ES) help further close the gap between theory and practice. This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets. For the latter, proof-nets can be formalized by means of a simple equational theory over the linear substitution calculus. Contrary to other extant calculi with ES, our system can be equipped with a residual theory in the sense of Lévy, which is used to prove a left-to-right standardization theorem for the calculus with ES but without the equational theory. Such a theorem, however, does not lift from the calculus with ES to proof-nets, because the notion of left-to-right derivation is not preserved by the equational theory. We then relax the notion of left-to-right standard derivation, based on a total order on redexes, to a more liberal notion of standard derivation based on partial orders. Our proofs rely on Gonthier, Lévy, and Melliès' axiomatic theory for standardization. However, we go beyond merely applying their framework, revisiting some of its key concepts: we obtain uniqueness (modulo) of standard derivations in an abstract way and we provide a coinductive characterization of their key abstract notion of external redex. This last point is then used to give a simple proof that linear head reduction, a nondeterministic strategy having a central role in the theory of linear logic, is standard.
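To make the object of study concrete, here is a sketch of the rewriting rules of the linear substitution calculus as they are commonly presented in the cited works; the rule names, notation, and side conditions below are illustrative and may differ from the paper.

    % t[x/u] is an explicit substitution, L a (possibly empty) list of explicit
    % substitutions, and C<x> a context C whose hole is filled by an occurrence of x.
    \begin{align*}
      (\lambda x.\, t)L\; u \;&\to_{\mathsf{dB}}\; t[x/u]L
        && \text{($\beta$ at a distance)}\\
      C\langle x \rangle[x/u] \;&\to_{\mathsf{ls}}\; C\langle u \rangle[x/u]
        && \text{(linear substitution of a single occurrence)}\\
      t[x/u] \;&\to_{\mathsf{gc}}\; t \quad \text{if } x \notin \mathsf{fv}(t)
        && \text{(garbage collection)}
    \end{align*}

Unlike traditional calculi with ES, substitutions are not propagated through the term structure: they act "at a distance", replacing one variable occurrence at a time, which is what underlies the behavioural correspondence with proof-nets.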
Citations
Book ChapterDOI
01 Jan 2002
TL;DR: This chapter presents the basic concepts of term rewriting that are needed in this book and suggests several survey articles that can be consulted.
Abstract: In this chapter we will present the basic concepts of term rewriting that are needed in this book. More details on term rewriting, its applications, and related subjects can be found in the textbook of Baader and Nipkow [BN98]. Readers versed in German are also referred to the textbooks of Avenhaus [Ave95], Bündgen [Bun98], and Drosten [Dro89]. Moreover, there are several survey articles [HO80, DJ90, Klo92, Pla93] that can also be consulted.

501 citations

Proceedings ArticleDOI
19 Aug 2014
TL;DR: The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic, and shows that the LSC is a complexity-preserving abstraction of abstract machines.
Abstract: It is well-known that many environment-based abstract machines can be seen as strategies in lambda calculi with explicit substitutions (ES). Recently, graphical syntaxes and linear logic led to the linear substitution calculus (LSC), a new approach to ES that is halfway between small-step calculi and traditional calculi with ES. This paper studies the relationship between the LSC and environment-based abstract machines. While traditional calculi with ES simulate abstract machines, the LSC rather distills them: some transitions are simulated while others vanish, as they map to a notion of structural congruence. The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic. We show that such a pattern applies uniformly in call-by-name, call-by-value, and call-by-need, catching many machines in the literature. We start by distilling the KAM, the CEK, and a sketch of the ZINC, and then provide simplified versions of the SECD, the lazy KAM, and Sestoft's machine. Along the way we also introduce some new machines with global environments. Moreover, we show that distillation preserves the time complexity of the executions, i.e. the LSC is a complexity-preserving abstraction of abstract machines.
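As a concrete reminder of what an environment-based machine looks like, below is a minimal sketch of the KAM (one of the machines distilled in the paper) in OCaml, using de Bruijn indices; the representation is illustrative and is not the paper's distilled formulation.

    (* A minimal call-by-name Krivine Abstract Machine (KAM) sketch.
       Terms use de Bruijn indices; a closure pairs a term with its environment;
       a state is code * environment * argument stack. *)
    type term =
      | Var of int
      | Lam of term
      | App of term * term

    type closure = Clo of term * closure list

    type state = term * closure list * closure list

    let rec run ((t, env, stack) : state) : state =
      match t, stack with
      | App (u, v), _ ->
          (* search: push the argument as a closure, keep evaluating the head *)
          run (u, env, Clo (v, env) :: stack)
      | Lam b, c :: rest ->
          (* beta: pop one argument and bind it in the environment *)
          run (b, c :: env, rest)
      | Var n, _ ->
          (* lookup: jump to the closure bound to the variable *)
          (match List.nth_opt env n with
           | Some (Clo (u, env')) -> run (u, env', stack)
           | None -> (t, env, stack))   (* free variable: stop *)
      | Lam _, [] -> (t, env, stack)     (* weak head normal form: stop *)

    (* Example: (\x. x) (\y. y) ends in state (\y. y, [], []). *)
    let _ = run (App (Lam (Var 0), Lam (Var 0)), [], [])

Roughly, in the distillation reading the Lam transition maps to β at a distance, the Var lookup maps to a linear substitution step, and the App (search) transition vanishes into the structural congruence.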

77 citations


Cites background or methods from "A nonstandard standardization theor..."

  • ...Explicit substitutions (ES) have been connected to linear logic by Kesner and co-authors in a sequence of works [26, 32, 33], culminating in the linear substitution calculus (LSC), a new formalism with ES behaviorally isomorphic to proof nets (introduced in [6], developed in [1, 3, 4, 7, 10], and bearing similarities with calculi by De Bruijn [25], Nederpelt [42], and Milner [41])....


  • ...Plotkin’s Approach: our study complements the recent [10], where it is shown that WLHR is a standard strategy of the LSC....


  • ...The presentation in use here has already appeared in [3, 10] (see also [1, 4]) as the weak head strategy of the linear substitution calculus (which is obtained by considering all contexts as evaluation contexts), and it avoids many technicalities of the original one....


Journal ArticleDOI
TL;DR: This article explores the use of non-idempotent intersection types in the framework of the λ-calculus by replacing the reducibility technique with trivial combinatorial arguments.
Abstract: This article explores the use of non-idempotent intersection types in the framework of the λ-calculus. Different topics are presented in a uniform framework: head normalization, weak normalization, weak head normalization, strong normalization, inhabitation, exact bounds and principal typings. The reducibility technique, traditionally used when working with idempotent types, is replaced in this framework by trivial combinatorial arguments.
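For readers unfamiliar with non-idempotent (multiset) intersection types, the following sketch of the application rule illustrates the idea; the notation is illustrative and may differ from the article.

    % Types: A ::= a | M -> A, where M = [A_1, ..., A_k] is a finite multiset of
    % types; typing environments assign multisets to variables and are summed
    % pointwise (multiset union) in the conclusion.
    \[
      \frac{\Gamma \vdash t : [A_1, \dots, A_k] \to B
            \qquad
            \Delta_i \vdash u : A_i \quad (1 \le i \le k)}
           {\Gamma + \Delta_1 + \dots + \Delta_k \vdash t\, u : B}
    \]

Because the multiset records one premise per use of the argument, the size of a typing derivation shrinks along the relevant reduction steps, and this purely combinatorial measure is what replaces the reducibility technique and yields the exact bounds mentioned above.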

53 citations


Cites background from "A nonstandard standardization theor..."

  • ...For example, linear-head, head, weak and strong normalization are characterized in [40] by means of appropriate non-idempotent types in the framework of the linear substitution calculus [1], a calculus with explicit substitution at a distance....


Proceedings ArticleDOI
14 Jul 2014
TL;DR: The paper presents the first complete positive answer to this long-standing problem of λ-calculus; its main technical contribution is the definition of useful reductions and the thorough analysis of their properties.
Abstract: Slot and van Emde Boas' weak invariance thesis states that reasonable machines can simulate each other within a polynomial overhead in time. Is λ-calculus a reasonable machine? Is there a way to measure the computational complexity of a λ-term? This paper presents the first complete positive answer to this long-standing problem. Moreover, our answer is completely machine-independent and based on a standard notion in the theory of λ-calculus: the length of a leftmost-outermost derivation to normal form is an invariant cost model. Such a theorem cannot be proved by directly relating λ-calculus with Turing machines or random access machines, because of the size explosion problem: there are terms that in a linear number of steps produce an exponentially long output. The first step towards the solution is to shift to a notion of evaluation for which the length and the size of the output are linearly related. This is done by adopting the linear substitution calculus (LSC), a calculus of explicit substitutions modelled after linear logic proof nets and admitting a decomposition of leftmost-outermost derivations with the desired property. Thus, the LSC is invariant with respect to, say, random access machines. The second step is to show that the LSC is invariant with respect to the λ-calculus. The size explosion problem seems to imply that this is not possible: having the same notions of normal form, evaluation in the LSC is exponentially longer than in the λ-calculus. We solve such an impasse by introducing a new form of shared normal form and shared reduction, deemed useful. Useful evaluation avoids those steps that only unshare the output without contributing to β-redexes, i.e. the steps that cause the blow-up in size. The main technical contribution of the paper is indeed the definition of useful reductions and the thorough analysis of their properties.
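The size explosion problem mentioned in the abstract can be illustrated with a standard family of terms (a generic textbook example, not necessarily the family used in the paper):

    \[
      t_n \;:=\; (\lambda x.\, x\, x)\,\big((\lambda x.\, x\, x)\,(\cdots((\lambda x.\, x\, x)\, y)\cdots)\big)
      \qquad (n \text{ nested redexes, size } O(n))
    \]
    % Reducing innermost-first, each beta-step duplicates an argument that is
    % already a normal form:
    \[
      (\lambda x.\, x\, x)\, y \to y\, y,
      \qquad
      (\lambda x.\, x\, x)\, (y\, y) \to (y\, y)\,(y\, y),
      \qquad \dots
    \]
    % so after exactly n beta-steps the normal form is a complete binary tree of
    % applications of y, of size Theta(2^n).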

48 citations


Cites background or methods or result from "A nonstandard standardization theor..."

  • ...While the study of standardisation for the LSC [6] uses the string approach (and thus only talks about the left-to-right order and the leftmost redex), here some of the proofs (see the long version [2]) require a delicate analysis of the relative positions of redexes and so we prefer the more informative tree approach and define the order formally....


  • ...Using the lemma above and a technical property of standard derivations (the enclave axiom, see [6]) we obtain:...


  • ...In [6] it is shown that in the full LSC standard derivations are complete, i....


  • ...As in our previous work [3], we prove our result by means of the linear substitution calculus (see also [1, 6]), a simple calculus of explicit substitutions (ES, for short) arising from linear logic and graphical syntaxes and similar to calculi studied by De Bruijn [16], Nederpelt [22], and Milner [21]....


  • ...Unfortunately, the linear LO strategy, noted →LO and first defined in [6], is mechanisable but the pair (→LOβ, →LO) is not a high-level implementation system....


References
Journal ArticleDOI
30 Jan 1987

3,947 citations

Journal ArticleDOI
TL;DR: This column presents an intuitive overview of linear logic, some recent theoretical results, and summarizes several applications of linear logic to computer science.
Abstract: Linear logic was introduced by Girard in 1987 [11]. Since then many results have supported Girard's statement, "Linear logic is a resource conscious logic," and related slogans. Increasingly, computer scientists have come to recognize linear logic as an expressive and powerful logic with connections to a variety of topics in computer science. This column presents an intuitive overview of linear logic, some recent theoretical results, and summarizes several applications of linear logic to computer science. Other introductions to linear logic may be found in [12, 36].

2,304 citations


"A nonstandard standardization theor..." refers methods in this paper

  • ...The linear substitution calculus has been designed to mimic the representation of λ-calculus in linear logic proofnets [18], where β-reduction is decomposed into small steps....


Journal ArticleDOI
TL;DR: This paper examines the old question of the relationship between ISWIM and the λ-calculus, using the distinction between call-by-value and call-by-name, and finds that operational equality is not preserved by either of the simulations.
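A standard example of the call-by-value/call-by-name distinction at stake here (an illustration, not taken from the paper): writing Ω := (λx. x x)(λx. x x),

    \[
      (\lambda x.\, z)\, \Omega \;\to_{\mathrm{name}}\; z
      \qquad\text{whereas}\qquad
      (\lambda x.\, z)\, \Omega \;\to_{\mathrm{value}}\; (\lambda x.\, z)\, \Omega \;\to_{\mathrm{value}}\; \cdots
    \]

Call-by-name discards the divergent argument unevaluated and terminates in one step, while call-by-value insists on evaluating the argument first and therefore diverges; operational equality is thus sensitive to the chosen strategy.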

1,240 citations

Book
01 Jan 1980
TL;DR: This chapter introduces the class of orthogonal CRSs and gives an outline of a short proof of confluence, a direct generalization of Aczel's original proof, which is close to the well-known confluence proof for λ-calculus by Tait and Martin-Löf.
Abstract: Combinatory reduction systems, or CRSs for short, were designed to combine the usual first-order format of term rewriting with the presence of bound variables as in pure λ-calculus and various typed λ-calculi. Bound variables are also present in many other rewrite systems, such as systems with simplification rules for proof normalization. The original idea of CRSs is due to Aczel, who introduced a restricted class of CRSs and, under the assumption of orthogonality, proved confluence. Orthogonality means that the rules are nonambiguous (no overlap leading to a critical pair) and left-linear (no global comparison of terms necessary). We introduce the class of orthogonal CRSs, illustrated with many examples, discuss its expressive power and give an outline of a short proof of confluence. This proof is a direct generalization of Aczel's original proof, which is close to the well-known confluence proof for λ-calculus by Tait and Martin-Löf. There is a well-known connection between the parallel reduction featuring in the latter proof and the concept of “developments”, and a classical lemma in the theory of λ-calculus is that of “finite developments”, a strong normalization result. It turns out that the notion of “parallel reduction” used in Aczel's proof gives rise to a generalized form of developments which we call “superdevelopments” and on which we will briefly comment.
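For reference, the parallel reduction of Tait and Martin-Löf mentioned above can be sketched for the plain λ-calculus by the following rules (an illustration; the CRS notion generalizes it):

    \[
      \frac{}{x \Rightarrow x}
      \qquad
      \frac{t \Rightarrow t'}{\lambda x.\, t \Rightarrow \lambda x.\, t'}
      \qquad
      \frac{t \Rightarrow t' \quad u \Rightarrow u'}{t\, u \Rightarrow t'\, u'}
      \qquad
      \frac{t \Rightarrow t' \quad u \Rightarrow u'}{(\lambda x.\, t)\, u \Rightarrow t'\{x := u'\}}
    \]

One parallel step contracts, simultaneously, any set of redexes already present in the term but none of the redexes created by the contraction itself, which is precisely its connection with developments.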

662 citations

Proceedings ArticleDOI
01 Dec 1989
TL;DR: The λσ-calculus is a refinement of the λ-calculus where substitutions are manipulated explicitly, and provides a setting for studying the theory of substitutions, with pleasant mathematical properties.
Abstract: The λσ-calculus is a refinement of the λ-calculus where substitutions are manipulated explicitly. The λσ-calculus provides a setting for studying the theory of substitutions, with pleasant mathematical properties. It is also a useful bridge between the classical λ-calculus and concrete implementations.
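A few of the λσ rules, in de Bruijn notation, give the flavour of the calculus; the list is partial and the notation may differ from the paper.

    \[
      (\lambda a)\, b \;\to_{\mathrm{Beta}}\; a[b \cdot \mathrm{id}]
      \qquad
      \mathbf{1}[a \cdot s] \;\to\; a
      \qquad
      (a\, b)[s] \;\to\; (a[s])\,(b[s])
      \qquad
      (\lambda a)[s] \;\to\; \lambda\,(a[\mathbf{1} \cdot (s \circ \uparrow)])
    \]
    % Beta creates an explicit substitution; the remaining (sigma) rules push
    % substitutions through applications and abstractions, in contrast with the
    % "at a distance" rules of the linear substitution calculus.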

577 citations


"A nonstandard standardization theor..." refers background in this paper

  • ...In order to give an intuition on such a phenomenon let us consider a calculus with ES such as λσ [1] or λx [9] containing at least the following reduction rules:...
