Author

Carlos Lombardi

Bio: Carlos Lombardi is an academic researcher from the National University of Quilmes. The author has contributed to research in the topics of Rewriting and Pattern calculus. The author has an h-index of 4 and has co-authored 15 publications receiving 91 citations.

Papers
Proceedings ArticleDOI
08 Jan 2014
TL;DR: This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets, and relies on Gonthier, Lévy, and Melliès' axiomatic theory for standardization.
Abstract: Standardization is a fundamental notion for connecting programming languages and rewriting calculi. Since both programming languages and calculi rely on substitution for defining their dynamics, explicit substitutions (ES) help further close the gap between theory and practice. This paper focuses on standardization for the linear substitution calculus, a calculus with ES capable of mimicking reduction in lambda-calculus and linear logic proof-nets. For the latter, proof-nets can be formalized by means of a simple equational theory over the linear substitution calculus. Contrary to other extant calculi with ES, our system can be equipped with a residual theory in the sense of Lévy, which is used to prove a left-to-right standardization theorem for the calculus with ES but without the equational theory. Such a theorem, however, does not lift from the calculus with ES to proof-nets, because the notion of left-to-right derivation is not preserved by the equational theory. We then relax the notion of left-to-right standard derivation, based on a total order on redexes, to a more liberal notion of standard derivation based on partial orders. Our proofs rely on Gonthier, Lévy, and Melliès' axiomatic theory for standardization. However, we go beyond merely applying their framework, revisiting some of its key concepts: we obtain uniqueness (modulo) of standard derivations in an abstract way and we provide a coinductive characterization of their key abstract notion of external redex. This last point is then used to give a simple proof that linear head reduction (a nondeterministic strategy having a central role in the theory of linear logic) is standard.
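For orientation, the linear substitution calculus is usually presented with the following grammar and rewrite rules; this is a sketch in our own notation, following common presentations of the LSC, rather than a quotation from the paper (which additionally works modulo an equational theory for proof-nets):

\[
\begin{array}{rcll}
t, u & ::= & x \;\mid\; \lambda x.t \;\mid\; t\,u \;\mid\; t[x\backslash u] & \text{(terms with explicit substitutions)}\\[2pt]
(\lambda x.t)L\,u & \to_{\mathsf{dB}} & t[x\backslash u]L & \text{(beta at a distance; $L$ a possibly empty list of substitutions)}\\
C\langle\!\langle x\rangle\!\rangle[x\backslash u] & \to_{\mathsf{ls}} & C\langle\!\langle u\rangle\!\rangle[x\backslash u] & \text{(linear substitution of one occurrence)}\\
t[x\backslash u] & \to_{\mathsf{gc}} & t \ \text{ if } x\notin\mathsf{fv}(t) & \text{(garbage collection)}
\end{array}
\]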

61 citations

Proceedings ArticleDOI
01 Jan 2012
TL;DR: This paper defines a (multistep) strategy for PPC and shows that it is normalising; the strategy generalises the leftmost-outermost strategy for the λ-calculus and is strictly finer than parallel-outermost.
Abstract: The Pure Pattern Calculus (PPC) [10, 11] extends the λ-calculus, as well as the family of algebraic pattern calculi [20, 6, 12], with first-class patterns, i.e. patterns can be passed as arguments, evaluated and returned as results. The notion of matching failure of PPC in [11] not only provides a mechanism to define functions by pattern matching on cases but also supplies PPC with parallel-or-like, non-sequential behaviour. Therefore, devising normalising strategies for PPC to obtain well-behaved implementations turns out to be challenging. This paper focuses on normalising reduction strategies for PPC. We define a (multistep) strategy and show that it is normalising. The strategy generalises the leftmost-outermost strategy for the λ-calculus and is strictly finer than parallel-outermost. The normalisation proof is based on the notion of necessary set of redexes, a generalisation of the notion of needed redex encompassing non-sequential reduction systems.
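The non-sequential behaviour mentioned above is the same phenomenon exhibited by parallel-or; the following toy rules (a standard textbook example, not taken from the paper) show why a term may have no needed redex:

por(T, x) -> T
por(x, T) -> T
por(F, F) -> F

In a term por(t1, t2) where both t1 and t2 reduce to T, any single redex can be avoided by normalising the other argument instead, so no individual redex is needed; a normalising strategy must therefore contract a set of redexes at a time, which is what motivates the multistep strategy defined in the paper.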

9 citations

Posted Content
TL;DR: A normalisation theorem is proved: multistep strategies reducing so-called necessary and never-gripping sets of redexes at a time are normalising in any abstract rewriting system (ARS), a general rewriting framework of P.-A. Melliès that encompasses many rewriting systems.
Abstract: We study normalisation of multistep strategies, strategies that reduce a set of redexes at a time, focussing on the notion of necessary sets, those which contain at least one redex that cannot be avoided in order to reach a normal form. This is particularly appealing in the setting of non-sequential rewrite systems, in which terms that are not in normal form may not have any needed redex. We first prove a normalisation theorem for abstract rewrite systems (ARS), a general rewriting framework encompassing many rewriting systems, developed by P.-A. Melliès in his PhD thesis. The theorem states that multistep strategies reducing so-called necessary and never-gripping sets of redexes at a time are normalising in any ARS. Gripping refers to an abstract property reflecting the behaviour of higher-order substitution. We then apply this result to the particular case of PPC, a calculus of patterns, and to the lambda-calculus with parallel-or.
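As a concrete illustration of a multistep strategy that contracts a necessary set of redexes at each step, here is a small Haskell sketch of parallel-outermost reduction on a toy parallel-or system; all names (Term, rootStep, parallelOutermost, normalise) are ours and purely illustrative, not the paper's ARS formalisation:

-- Toy multistep strategy: contract all outermost redexes at once.
-- The set of outermost redexes is a necessary set for this little system.
data Term = T | F | Por Term Term | Omega   -- Omega stands in for a term with no normal form
  deriving (Eq, Show)

-- One root-level contraction, if the root is a redex.
rootStep :: Term -> Maybe Term
rootStep (Por T _) = Just T
rootStep (Por _ T) = Just T
rootStep (Por F F) = Just F
rootStep Omega     = Just (Por Omega Omega)  -- Omega keeps unfolding, never reaching a normal form
rootStep _         = Nothing

-- Multistep: contract all outermost redexes simultaneously.
parallelOutermost :: Term -> Maybe Term
parallelOutermost t =
  case rootStep t of
    Just t' -> Just t'                       -- a root redex is the unique outermost redex
    Nothing -> case t of
      Por a b ->
        let a' = maybe a id (parallelOutermost a)
            b' = maybe b id (parallelOutermost b)
        in if a' == a && b' == b then Nothing else Just (Por a' b')
      _ -> Nothing                           -- T and F are normal forms

-- Iterate the multistep strategy; for this system it reaches the normal form whenever one exists.
normalise :: Term -> Term
normalise t = maybe t normalise (parallelOutermost t)

-- Example: Por Omega (Por F T) normalises to T, although the leftmost-outermost
-- redex (inside Omega) can be contracted forever without ever reaching a normal form.
main :: IO ()
main = print (normalise (Por Omega (Por F T)))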

5 citations

Posted Content
TL;DR: A normalisation theorem is proved: multistep strategies reducing so-called necessary and non-gripping sets of redexes at a time are normalising in any ARS, including the Pure Pattern Calculus and the lambda-calculus with parallel-or.
Abstract: We study normalisation of multistep strategies, strategies that reduce a set of redexes at a time, focussing on the notion of necessary sets, those which contain at least one redex that cannot be avoided in order to reach a normal form. This is particularly appealing in the setting of non-sequential rewrite systems, in which terms that are not in normal form may not have any needed redex. We first prove a normalisation theorem for abstract rewrite systems (ARS), a general rewriting framework encompassing many rewriting systems, developed by P.-A. Melliès. The theorem states that multistep strategies reducing so-called necessary and non-gripping sets of redexes at a time are normalising in any ARS. Gripping refers to an abstract property reflecting the behaviour of higher-order substitution. We then apply this result to the particular case of the Pure Pattern Calculus, a calculus of patterns, and to the lambda-calculus with parallel-or.

4 citations

Journal ArticleDOI
TL;DR: In this article, the authors study normalisation of multistep strategies, strategies that reduce a set of redexes at a time, focusing on the notion of necessary sets, those which contain at least one redex that cannot be avoided in order to reach a normal form.

4 citations


Cited by
Journal ArticleDOI
01 Apr 1988-Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits fit the submarine fan systems better. Calciclastic submarine fans are consequently rarely described and are poorly understood, and very little is known especially about mud-dominated calciclastic submarine fan systems. Presented in this study is a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of the Bowland Basin (Northwest England) that reveals a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin section characterisation and are grouped into three carbonate turbidite sequences. They include: 1) calciturbidites, comprising mostly high- to low-density, wavy-laminated bioclast-rich facies; 2) low-density densite mudstones, characterised by planar-laminated and unlaminated mud-dominated facies; and 3) calcidebrites, which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones.

9,929 citations

Book ChapterDOI
01 Jan 2002
TL;DR: This chapter presents the basic concepts of term rewriting that are needed in this book and suggests several survey articles that can be consulted.
Abstract: In this chapter we will present the basic concepts of term rewriting that are needed in this book. More details on term rewriting, its applications, and related subjects can be found in the textbook of Baader and Nipkow [BN98]. Readers versed in German are also referred to the textbooks of Avenhaus [Ave95], Bündgen [Bun98], and Drosten [Dro89]. Moreover, there are several survey articles [HO80, DJ90, Klo92, Pla93] that can also be consulted.
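As a concrete instance of the notions such a chapter introduces, a standard textbook term rewriting system (our example, not one quoted from the chapter) is addition on Peano numerals:

add(0, y)    -> y
add(s(x), y) -> s(add(x, y))

A reduction to normal form then looks like: add(s(s(0)), s(0)) -> s(add(s(0), s(0))) -> s(s(add(0, s(0)))) -> s(s(s(0))), i.e. 2 + 1 = 3.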

501 citations

Proceedings ArticleDOI
19 Aug 2014
TL;DR: The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic, and shows that the LSC is a complexity-preserving abstraction of abstract machines.
Abstract: It is well-known that many environment-based abstract machines can be seen as strategies in lambda calculi with explicit substitutions (ES). Recently, graphical syntaxes and linear logic led to the linear substitution calculus (LSC), a new approach to ES that is halfway between small-step calculi and traditional calculi with ES. This paper studies the relationship between the LSC and environment-based abstract machines. While traditional calculi with ES simulate abstract machines, the LSC rather distills them: some transitions are simulated while others vanish, as they map to a notion of structural congruence. The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic. We show that such a pattern applies uniformly in call-by-name, call-by-value, and call-by-need, catching many machines in the literature. We start by distilling the KAM, the CEK, and a sketch of the ZINC, and then provide simplified versions of the SECD, the lazy KAM, and Sestoft's machine. Along the way we also introduce some new machines with global environments. Moreover, we show that distillation preserves the time complexity of the executions, i.e. the LSC is a complexity-preserving abstraction of abstract machines.
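For readers unfamiliar with the machines being distilled, here is a minimal Krivine Abstract Machine (KAM) over de Bruijn terms, written in Haskell; this is our own toy sketch of an environment-based machine performing weak head reduction, not the formulation used in the paper:

-- Pure lambda-terms with de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term
  deriving Show

-- A closure pairs a term with the environment it was created in.
data Closure = Closure Term Env
type Env   = [Closure]   -- environments map de Bruijn indices to closures
type Stack = [Closure]   -- stack of pending arguments

-- One machine transition; Nothing when the machine has reached a final (or stuck) state.
step :: (Term, Env, Stack) -> Maybe (Term, Env, Stack)
step (App t u, e, s)   = Just (t, e, Closure u e : s)    -- push the argument as a closure
step (Lam t, e, c : s) = Just (t, c : e, s)              -- pop an argument and bind it
step (Var n, e, s)
  | n < length e       = let Closure t' e' = e !! n
                         in Just (t', e', s)             -- jump to the closure bound to the variable
step _                 = Nothing                         -- Lam with an empty stack, or a free variable

-- Run the machine from an empty environment and stack, returning the final head term.
run :: Term -> Term
run t0 = go (t0, [], [])
  where go st@(t, _, _) = maybe t go (step st)

-- Example: (\x. \y. x) applied to the identity and to a self-application;
-- the machine returns the first argument, i.e. the identity \z. z.
main :: IO ()
main = print (run (App (App (Lam (Lam (Var 1))) (Lam (Var 0))) (Lam (App (Var 0) (Var 0)))))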

77 citations

Journal ArticleDOI
TL;DR: This article explores the use of non-idempotent intersection types in the framework of the λ-calculus by replacing the reducibility technique with trivial combinatorial arguments.
Abstract: This article explores the use of non-idempotent intersection types in the framework of the λ-calculus. Different topics are presented in a uniform framework: head normalization, weak normalization, weak head normalization, strong normalization, inhabitation, exact bounds and principal typings. The reducibility technique, traditionally used when working with idempotent types, is replaced in this framework by trivial combinatorial arguments.
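One common presentation of non-idempotent intersection types (a sketch in our notation, assumed here rather than quoted from the article) assigns finite multisets of types to arguments:

\[
\begin{array}{c}
A ::= a \mid M \to A \qquad\quad M ::= [A_1,\ldots,A_n] \ (n \ge 0)\\[8pt]
\dfrac{}{x:[A] \vdash x:A}\qquad
\dfrac{\Gamma, x:M \vdash t:A}{\Gamma \vdash \lambda x.t : M \to A}\qquad
\dfrac{\Gamma \vdash t : [A_1,\ldots,A_n] \to A \quad (\Delta_i \vdash u : A_i)_{i=1,\ldots,n}}{\Gamma + \Delta_1 + \cdots + \Delta_n \vdash t\,u : A}
\end{array}
\]

Because multisets are not idempotent ($[A,A] \neq [A]$), each use of an argument must be typed separately, so the size of a typing derivation shrinks along reduction; this is roughly what replaces reducibility arguments by combinatorial ones and yields the exact bounds mentioned in the abstract.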

53 citations