
Showing papers on "Denotational semantics published in 1991"


Book
01 Jun 1991
TL;DR: This book covers basic concepts (a simple imperative language, a simple applicative language, recursion), an Algol-like language, and advanced techniques (an introduction to category theory, possible worlds, recursively-defined domains).
Abstract: Part 1, Introduction: semantics; mathematical preliminaries. Part 2, Basic concepts: a simple imperative language; a simple applicative language; recursion. Part 3, An Algol-like language. Part 4, Advanced techniques: an introduction to category theory; possible worlds; recursively-defined domains.
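The standard treatment sketched in Part 2, in which each command of a simple imperative language denotes a state-to-state function, can be illustrated in a few lines of Python (a hedged sketch of the general technique, not the book's own notation):

```python
# Denotational semantics of a tiny imperative language: expressions denote
# state-to-value functions, commands denote state-to-state functions.
# States are dicts mapping variable names to integers.

def num(n):            # numeral
    return lambda s: n

def var(x):            # variable lookup
    return lambda s: s[x]

def plus(e1, e2):      # e1 + e2
    return lambda s: e1(s) + e2(s)

def assign(x, e):      # x := e
    return lambda s: {**s, x: e(s)}

def seq(c1, c2):       # c1 ; c2  (function composition of state transformers)
    return lambda s: c2(c1(s))

def cond(b, c1, c2):   # if b then c1 else c2
    return lambda s: c1(s) if b(s) else c2(s)

# Example: y := x + 1 ; x := y + y
prog = seq(assign("y", plus(var("x"), num(1))),
           assign("x", plus(var("y"), var("y"))))
print(prog({"x": 3}))  # {'x': 8, 'y': 4}
```

The payoff of this style is compositionality: the meaning of `seq(c1, c2)` is built only from the meanings of `c1` and `c2`, never from their syntax.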

421 citations


Book
02 Jan 1991
TL;DR: The chapter illustrates the major standard techniques used in denotational descriptions of programming languages, such as environments, stores, and continuations, and explains the relation between these techniques and some fundamental concepts of programming languages.

385 citations


01 May 1991
TL;DR: In this article, the authors present a technique for recovering the control-flow graph of a Scheme program at compile time, which can be used to perform several data-flow analysis optimisations, including copy propagation, induction-variable elimination, useless variable elimination, and type recovery.
Abstract: Programs written in powerful, higher-order languages like Scheme, ML, and Common Lisp should run as fast as their FORTRAN and C counterparts. They should, but they don't. A major reason is the level of optimisation applied to these two classes of languages. Many FORTRAN and C compilers employ an arsenal of sophisticated global optimisations that depend upon data-flow analysis: common-subexpression elimination, loop-invariant detection, induction-variable elimination, and many, many more. Compilers for higher-order languages do not provide these optimisations. Without them, Scheme, LISP and ML compilers are doomed to produce code that runs slower than their FORTRAN and C counterparts. The problem is the lack of an explicit control-flow graph at compile time, something which traditional data-flow analysis techniques require. In this dissertation, I present a technique for recovering the control-flow graph of a Scheme program at compile time. I give examples of how this information can be used to perform several data-flow analysis optimisations, including copy propagation, induction-variable elimination, useless-variable elimination, and type recovery. The analysis is defined in terms of a non-standard semantic interpretation. The denotational semantics is carefully developed, and several theorems establishing the correctness of the semantics and the implementing algorithms are proven.
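The core problem the dissertation identifies, that the callee of a higher-order call is a run-time value, can be illustrated with a crude set-based approximation (a toy of my own, far simpler than the dissertation's CPS-based analysis):

```python
# Toy illustration (mine, not the dissertation's algorithm): in higher-order
# code the target of a call is a run-time value, so a compiler must
# approximate it. Here each variable maps to the *set* of functions it may
# denote.

def inc(n): return n + 1
def dbl(n): return n * 2

flows = {"g": {"inc"}, "h": {"dbl"}}

# After `f = g if p else h`, with p unknown at compile time,
# f may denote anything g or h may denote:
flows["f"] = flows["g"] | flows["h"]

# A call f(2) therefore has two possible callees in the control-flow graph.
print(sorted(flows["f"]))  # ['dbl', 'inc']
```

Real control-flow analyses iterate this kind of set propagation to a fixpoint over the whole program; the point here is only that the recovered "call graph" is a conservative approximation.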

374 citations


Journal ArticleDOI
TL;DR: The syntax, operational semantics, and denotational semantics of a simple language that includes the type Dynamic are explored, and examples of how dynamically typed values can be used in programming are given.
Abstract: Statically typed programming languages allow earlier error checking, better enforcement of disciplined programming styles, and the generation of more efficient object code than languages where all type consistency checks are performed at run time. However, even in statically typed languages, there is often the need to deal with data whose type cannot be determined at compile time. To handle such situations safely, we propose to add a type Dynamic whose values are pairs of a value v and a type tag T where v has the type denoted by T. Instances of Dynamic are built with an explicit tagging construct and inspected with a type safe typecase construct. This paper explores the syntax, operational semantics, and denotational semantics of a simple language that includes the type Dynamic. We give examples of how dynamically typed values can be used in programming. Then we discuss an operational semantics for our language and obtain a soundness theorem. We present two formulations of the denotational semantics of this language and relate them to the operational semantics. Finally, we consider the implications of polymorphism and some implementation issues.
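The tagging and typecase constructs described above can be mimicked in a dynamically checked sketch (the helper names `dynamic` and `typecase` follow the paper's constructs loosely; the encoding is mine):

```python
# Values of type Dynamic are pairs of a value and a type tag; `dynamic` is the
# explicit tagging construct, `typecase` the type-safe inspection construct.

def dynamic(value, tag):
    return (value, tag)            # pair (v, T) where v has the type denoted by T

def typecase(dyn, branches, default):
    value, tag = dyn
    # run the branch whose pattern matches the tag, else the default branch
    return branches[tag](value) if tag in branches else default()

d = dynamic(21, "int")
r = typecase(d,
             {"int": lambda v: v * 2,
              "str": lambda v: len(v)},
             default=lambda: None)
print(r)  # 42
```

In the paper's statically typed setting the typecase branches are checked at compile time; this sketch only conveys the run-time behaviour.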

274 citations


Book
01 Jan 1991
TL;DR: This book introduces category theory at a level appropriate for computer scientists and provides practical examples in the context of programming language design and pursues the more complex mathematical semantics of data types and programs as objects and morphisms of categories.
Abstract: Category theory is a mathematical subject whose importance in several areas of computer science, most notably the semantics of programming languages and the design of programs using abstract data types, is widely acknowledged. This book introduces category theory at a level appropriate for computer scientists and provides practical examples in the context of programming language design. "Categories, Types and Structures" provides a self-contained introduction to general category theory and explains the mathematical structures that have been the foundation of language design for the past two decades. The authors observe that the language of categories could provide a powerful means of standardizing methods and language, and offer examples ranging from the early dialects of LISP, to Edinburgh ML, to work in polymorphism and modularity. The book familiarizes readers with categorical concepts through examples based on elementary mathematical notions such as monoids, groups and topological spaces, as well as elementary notions from programming-language semantics such as partial orders and categories of domains in denotational semantics. It then pursues the more complex mathematical semantics of data types and programs as objects and morphisms of categories.

263 citations


Journal ArticleDOI
TL;DR: A denotational semantics for SCCS based on the domain of synchronization trees is given, and proved fully abstract with respect to bisimulation.
Abstract: Some basic topics in the theory of concurrency are studied from the point of view of denotational semantics, and particularly the "domain theory in logical form" developed by the author. A domain of synchronization trees is defined by means of a recursive domain equation involving the Plotkin powerdomain. The logical counterpart of this domain is described, and shown to be related to it by Stone duality. The relationship of this domain logic to the standard Hennessy-Milner logic for transition systems is studied; the domain logic can be seen as a rational reconstruction of Hennessy-Milner logic from the standpoint of a very general and systematic theory. Finally, a denotational semantics for SCCS based on the domain of synchronization trees is given, and proved fully abstract with respect to bisimulation.
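As a rough companion to the Hennessy-Milner logic mentioned above, here is a minimal satisfaction checker over a finite labelled transition system (the encoding and state names are my own): a diamond formula `<a>phi` holds at a state iff some a-successor satisfies phi, and a box formula `[a]phi` holds iff all a-successors do.

```python
# A finite labelled transition system: (state, action) -> set of successor states.
trans = {("p", "a"): {"p1", "p2"}, ("p1", "b"): {"p3"}}

def sat(state, phi):
    kind = phi[0]
    if kind == "tt":                       # truth holds everywhere
        return True
    if kind == "and":
        return sat(state, phi[1]) and sat(state, phi[2])
    if kind == "dia":                      # <a>phi: some a-successor satisfies phi
        _, act, sub = phi
        return any(sat(s, sub) for s in trans.get((state, act), set()))
    if kind == "box":                      # [a]phi: all a-successors satisfy phi
        _, act, sub = phi
        return all(sat(s, sub) for s in trans.get((state, act), set()))
    raise ValueError(kind)

# p satisfies <a><b>tt (via p1), but not [a]<b>tt (p2 has no b-move).
print(sat("p", ("dia", "a", ("dia", "b", ("tt",)))))  # True
print(sat("p", ("box", "a", ("dia", "b", ("tt",)))))  # False
```

Hennessy and Milner's classical result is that, for image-finite systems, two states satisfy the same such formulas exactly when they are bisimilar, which is the connection the paper's full-abstraction result exploits.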

188 citations


Journal ArticleDOI
TL;DR: This paper shows that adding objects with memory to the call-by-value lambda calculus results in a language with a rich equational theory, satisfying many of the usual laws, providing evidence that expressive, mathematically clean programming languages are indeed possible.
Abstract: Traditionally the view has been that direct expression of control and store mechanisms and clear mathematical semantics are incompatible requirements. This paper shows that adding objects with memory to the call-by-value lambda calculus results in a language with a rich equational theory, satisfying many of the usual laws. Combined with other recent work, this provides evidence that expressive, mathematically clean programming languages are indeed possible.

173 citations


Book
01 Jan 1991
TL;DR: This book discusses formal methods for VDM development, transformation of VDM into mural-theories, and theories for VDM in mural.
Abstract: 1 General introduction.- 1.1 Formal methods.- 1.2 VDM development.- 1.3 The IPSE 2.5 project.- 1.4 Proof assistant requirements.- 2 Introduction to mural.- 2.1 General introduction.- 2.2 The proof assistant.- 2.3 The VDM support tool.- 2.4 Reasoning about developments.- 3 Instantiation.- 3.1 Symbolic logic in mural.- 3.2 Classical first order predicate calculus.- 3.3 Some common data types.- 3.4 More complicated formulations.- 3.5 The theory of VDM.- 3.6 Some other logics.- 4 Foundation.- 4.1 Preamble.- 4.2 Syntax.- 4.3 Natural Deduction rules.- 4.4 Rule Schemas and instantiation.- 4.5 The mural store.- 4.6 Syntactic contexts and well-formedness.- 4.7 Proofs.- 4.8 Morphisms.- 4.9 Pattern matching.- 4.10 Reading the full specification.- 4.11 Limitations of the mural approach.- 5 The tactic language.- 5.1 Mechanising proof in mural.- 5.2 The language.- 5.3 The implementation of tactics.- 5.4 Examples.- 6 Implementing the mural proof assistant.- 6.1 The process of implementation.- 6.2 The implementation.- 6.3 Lessons learnt and advice to the young.- 6.4 The future.- 6.5 The final word.- 7 Supporting formal software development.- 7.1 Abstract specification.- 7.2 Relating specifications.- 7.3 Support for reasoning about formal developments.- 8 The mural VDM Support Tool.- 8.1 Specifying VDM developments in VDM.- 8.2 Theories from specifications.- 8.3 Scope for growth.- 9 Foundations of specification animation.- 9.1 Approaches to animation.- 9.2 Denotational semantics of symbolic execution.- 9.3 Operational semantics of symbolic execution.- 9.4 Theories to support symbolic execution.- 9.5 Conclusions.- 10 Case Studies.- 10.1 Specifications in VDM.- 10.2 Transformation of VDM into mural-theories.- 10.3 A watchdog for a reactor system.- 10.4 An algorithm for topological sorting.- 10.5 Theories for VDM in mural.- 11 Conclusions.- 11.1 Experimental use of mural.- 11.2 Detailed observations.- 11.3 Further developments.- 11.4 Summary.- Appendices.- A Summary of VDM Notation.- B Glossary of terms.- C The Specification of the Proof Assistant.- C.1 The Raw Syntax.- C.2 Subterm Access and Editing.- C.3 Sequents and Rules.- C.4 Instantiation and Pattern-matching.- C.5 Signatures.- C.6 Theories.- C.7 Morphisms and Theory Morphisms.- C.8 Proofs.- C.9 The Store.- D The specification of the animation tool.- D.1 Data structure and some auxiliary functions.- D.2 Operations.- E The Theorem Prover's House

148 citations



Book
01 Aug 1991
TL;DR: A textbook account of programming language semantics, covering syntax, the properties, applications, and theory of denotational semantics, algebraic semantics, and the principles and applications of action semantics.
Abstract: *Preface *Introduction *Syntax *Denotational Semantics: Properties *Denotational Semantics: Applications *Denotational Semantics: Theory *Algebraic Semantics *Action Semantics: Principles *Action Semantics: Applications

125 citations


Journal ArticleDOI
TL;DR: An extended ER model is introduced, concentrating nearly all concepts of known so-called semantic data models into a few syntactical constructs, and a well-founded calculus is developed that takes into account data operations on arbitrary user-defined data types and aggregate functions.
Abstract: Nearly all query languages discussed recently for the Entity-Relationship (ER) model do not possess a formal semantics. Languages are often defined by means of examples only. The reason for this phenomenon is the essential gap between features of query languages and theoretical foundations like algebras and calculi. Known languages offer arithmetic capabilities and allow for aggregates, but algebras and calculi defined for ER models do not. This paper introduces an extended ER model concentrating nearly all concepts of known so-called semantic data models in a few syntactical constructs. Moreover, we provide our extended ER model with a formal mathematical semantics. On this basis a well-founded calculus is developed taking into account data operations on arbitrary user-defined data types and aggregate functions. We pay special attention to arithmetic operations, as well as multivalued terms allowing nested queries, in a uniform and consistent manner. We prove that our calculus allows only the formulation of safe terms and queries, which yield a finite result, and that it is (at least) as expressive as the relational calculi.

Book
01 Jan 1991
TL;DR: Contents include P.D. Mosses: A Practical Introduction to Denotational Semantics and K.R. Apt, E.-R. Olderog: Introduction to Program Verification.
Abstract: Contents: P.D. Mosses: A Practical Introduction to Denotational Semantics.- E. Astesiano: Inductive and Operational Semantics.- D. Bjorner: Specification and Transformation: Methodology Aspects of the Vienna Development Method.- M. Wirsing: Algebraic Specification: Semantics, Parameterization and Refinement.- M. Broy: Formalization of Distributed, Concurrent, Reactive Systems.- K.R. Apt, E.-R. Olderog: Introduction to Program Verification.- L. Cardelli: Typeful Programming.

Book
01 Jan 1991


Book ChapterDOI
01 Jun 1991
TL;DR: Back-and-forth translations between the two presentations of the list, bag, and set datatypes are established, from which it follows that they are equally expressive, and results relating proofs of program properties in the two presentations are proved.
Abstract: We study issues that arise in programming with primitive recursion over non-free datatypes such as lists, bags and sets. Programs written in this style can lack a meaning in the sense that their outputs may be sensitive to the choice of input expression. We are thus naturally led to a set-theoretic denotational semantics with partial functions. We set up a logic for reasoning about the definedness of terms and a deterministic and terminating evaluator. The logic is shown to be sound in the model, and its recursion-free fragment is shown to be complete for proving definedness of recursion-free programs. The logic is then shown to be as strong as the evaluator, and this implies that the evaluator is compatible with the provable equivalence between different set (or bag, or list) expressions. Oftentimes, the same non-free datatype may have different presentations, and it is not clear a priori whether programming and reasoning with the two presentations are equivalent. We formulate these questions precisely in the context of alternative presentations of the list, bag, and set datatypes and study some aspects of these questions. In particular, we establish back-and-forth translations between the two presentations, from which it follows that they are equally expressive, and prove results relating proofs of program properties in the two presentations.
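The sensitivity the paper studies can be shown in miniature (an illustration of my own, not the paper's formalism): a fold over a set is well-defined only if the combining operation gives the same result for every expression denoting that set.

```python
# Folding over a non-free datatype: two different input expressions
# (element orderings) denote the same set, but an order-sensitive combining
# operation distinguishes them, so the "program" lacks a meaning on sets.
from functools import reduce

def fold_expr(op, unit, elements):
    # evaluate one particular *expression* denoting a set: a list of
    # elements inserted in some chosen order
    return reduce(op, elements, unit)

same_set_a = [1, 2, 3]
same_set_b = [3, 1, 2]                # a different expression for the same set

add = lambda acc, y: acc + y          # commutative/associative: order-insensitive
append = lambda acc, y: acc * 10 + y  # order-sensitive: ill-defined on sets

print(fold_expr(add, 0, same_set_a), fold_expr(add, 0, same_set_b))        # 6 6
print(fold_expr(append, 0, same_set_a), fold_expr(append, 0, same_set_b))  # 123 312
```

The paper's definedness logic is, roughly, a discipline for proving that one is in the `add`-like situation rather than the `append`-like one.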

Journal ArticleDOI
TL;DR: The main goal of this paper is to present a unified theory for the semantics of Horn and disjunctive logic programs by extending the fixpoint semantics and the operational or procedural semantics to the class of disjunctive logic programs and proving their equivalence using techniques similar to the ones used for Horn programs.

Book ChapterDOI
24 Sep 1991
TL;DR: Semantics for a pair of parallel object-oriented programming languages are presented by translation into the π-calculus, a foundation for the study of computational systems with evolving communication structure.
Abstract: The π-calculus provides a foundation for the study of computational systems with evolving communication structure. A system is viewed as a collection of agents which may share named communication links. Agents interact by passing to one another along shared links the names of other links. Semantics for a pair of parallel object-oriented programming languages are presented by translation into the π-calculus. The semantics are compared briefly with existing semantics of related languages.

Book
01 Jan 1991
TL;DR: An exploration of the categorical semantics of theories of dependent and polymorphic types, using Coquand and Huet's calculus of constructions as the central example.
Abstract: An exploration of the categorical semantics of theories of dependent and polymorphic types, using the example of Coquand and Huet's calculus of constructions. The application of constructive mathematics to the problem of defining functional computer programming languages should interest mathematicians.

Book ChapterDOI
02 Jan 1991
TL;DR: The chapter presents the meaning of a λ-term, which is given by translating it to a set [M], an element of a mathematical structure in which application and abstraction are well-defined operations.
Abstract: Publisher Summary This chapter discusses lambda calculus and explains how this system is able to capture all computable functions. Once a reduction strategy is chosen, the behavior of a term is determined. This gives a so-called operational semantics. The chapter presents the meaning of a λ-term, which is given by translating it to a set [M]. This set is an element of a mathematical structure in which application and abstraction are well-defined operations. In this way, a so-called denotational semantics is obtained.
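The translation the chapter describes can be sketched by compositionally mapping λ-terms into Python functions, so that application and abstraction become genuine operations of the target structure (the term encoding is mine):

```python
# A denotational reading of lambda-terms: the meaning of a term is computed
# compositionally from the meanings of its subterms and an environment.

def meaning(term, env):
    kind = term[0]
    if kind == "const":                    # constants denote themselves
        return term[1]
    if kind == "var":                      # variables denote their environment entry
        return env[term[1]]
    if kind == "lam":                      # \x. body denotes a function
        _, x, body = term
        return lambda v: meaning(body, {**env, x: v})
    if kind == "app":                      # M N denotes [M] applied to [N]
        _, m, n = term
        return meaning(m, env)(meaning(n, env))
    raise ValueError(kind)

# (\x. \y. x) 7 9  evaluates to 7
K = ("lam", "x", ("lam", "y", ("var", "x")))
t = ("app", ("app", K, ("const", 7)), ("const", 9))
print(meaning(t, {}))  # 7
```

The genuine mathematical construction is subtler than this sketch, since self-application forces the domain to solve a recursive equation rather than be a plain set of functions; that is exactly the problem Scott's models address.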

Proceedings ArticleDOI
01 May 1991
TL;DR: The method of abstract semantic interpretations is used to explicate the control-flow analysis technique presented in "Control-Flow Analysis in Scheme", using a denotational semantics for CPS Scheme and presenting an alternate semantics that precisely expresses the control-flow analysis problem.
Abstract: This is a follow-on to my 1988 PLDI paper, "Control-Flow Analysis in Scheme" [9]. I use the method of abstract semantic interpretations to explicate the control-flow analysis technique presented in that paper. I begin with a denotational semantics for CPS Scheme. I then present an alternate semantics that precisely expresses the control-flow analysis problem. I abstract this semantics in a natural way, arriving at two different semantic interpretations giving approximate solutions to the flow analysis problem, each computable at compile time. The development of the final abstract semantics provides a clear, formal description of the analysis technique presented in "Control-Flow Analysis in Scheme." This research was supported in part by the Office of Naval Research and in part by the Defense Advanced Research Projects Agency (DOD), monitored by the Office of Naval Research under Contract N00014-84-K-0415, ARPA Order No. 5404. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of ONR, DARPA or the U.S. government.

Journal ArticleDOI
Richard Helm1, Kim Marriott1
TL;DR: This work introduces a class of declarative, constraint-based picture specification languages that extend previous approaches to picture specification based on constraints and grammar formalisms and ensures they share a common declarative semantics and operational semantics.
Abstract: A key issue in visual languages is the specification of the relationship between pictures and their meaning. To do this, we introduce a class of declarative, constraint-based picture specification languages. These extend previous approaches to picture specification based on constraints and grammar formalisms. As the languages are defined as instances of an abstract language scheme, we ensure they share a common declarative semantics and operational semantics. The declarative semantics allows both people and machines to reason easily about a specification independently of any underlying implementation. The operational semantics permits both the generation and recognition of pictures defined by the specification.

Book
01 Mar 1991
TL;DR: A more efficient implementation model is described that can be used when information is known about how functions use their arguments; a semantically sound analysis technique called abstract interpretation is developed, which can determine this information; and it is shown how to use the information to compile more efficient code for sequential and parallel machines.
Abstract: The class of programming languages commonly known as functional includes Lisp, Scheme, ML, and Miranda. This book explores a subclass known as lazy functional languages, beginning with the theoretical issues and continuing through abstract interpretation and offering improved techniques for implementation. Now that advanced compiler technology has made it possible for lazy functional languages to compare favorably in run-time with more traditional languages such as C and Pascal, this monograph tackles problems of implementation such as time and memory overheads and restrictions on parallelism. Specifically, it describes a more efficient implementation model, the evaluation transformer model, that can be used when information is known about how functions use their arguments; develops a semantically sound analysis technique called abstract interpretation, which can determine this information; and shows how to use the information to compile more efficient code for sequential and parallel machines. Geoffrey Burn is Lecturer at Imperial College of Science, Technology, and Medicine, London. Contents: Introduction. Operational and Denotational Semantics of the Typed Lambda Calculus. A Framework for the Abstract Interpretation of Functional Languages. Some Example Abstract Interpretations. Evaluation Transformers. Implementing Functional Languages on Sequential and Parallel Machines. Relationship to Other Work. Epilogue. Appendixes: Additional Proofs. The Spineless G-Machine.
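The flavour of the strictness-style analyses the book develops can be conveyed with the classic two-point abstract domain (a textbook-style sketch of my own, not Burn's actual framework): abstract value 0 means "definitely undefined", 1 means "may be defined", and a function is strict in an argument if feeding 0 there yields 0.

```python
# Two-point strictness abstract interpretation: each operator is replaced by
# an abstract version describing how it consumes its arguments.

def abstract_plus(a, b):
    return min(a, b)          # + needs both arguments, so it is strict in each

def abstract_cond(c, t, e):
    return min(c, max(t, e))  # needs the condition, then only one branch

# f x y z = if x then y else z   -- strict in x, but not in y or z alone
f = lambda x, y, z: abstract_cond(x, y, z)

print(f(0, 1, 1))  # 0 -> f is strict in its first argument
print(f(1, 0, 1))  # 1 -> f need not evaluate its second argument
```

A compiler for a lazy language can use such facts to evaluate strict arguments eagerly (or in parallel) without changing program meaning, which is precisely the role of the book's evaluation transformers.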

Journal ArticleDOI
TL;DR: A proof system for a version of CCS with value-passing in which the reasoning about data is factored out from that about the structure of processes is shown to be sound and complete for finite terms with respect to denotational semantics based on Acceptance Trees.
Abstract: A proof system for a version of CCS with value-passing is proposed in which the reasoning about data is factored out from that about the structure of processes. The system is shown to be sound and complete for finite terms with respect to a denotational semantics based on Acceptance Trees.

Dissertation
01 Jan 1991
TL;DR: Interpretation of Functional Languages: From Theory to Practice.
Abstract: Interpretation of Functional Languages: From Theory to Practice

Book ChapterDOI
26 Aug 1991
TL;DR: This paper develops a representation of composets using a novel concept of comtrace, which is a certain equivalence class of step sequences, and shows that the composets represented by comtraces can be generated by generalising the standard construction of a process of a 1-safe Petri net.
Abstract: We here discuss an invariant semantics of concurrent systems which is a generalisation of the causal partial order (CPO) semantics. The new semantics is consistent with the full operational behaviour of inhibitor and priority nets expressed in terms of step sequences. It employs combined partial orders, or composets, where each composet is a relational structure consisting of a causal partial order and a weak causal partial order. In this paper we develop a representation of composets using a novel concept of comtrace, which is a certain equivalence class of step sequences. The whole approach resembles to a significant extent the trace semantics introduced by Mazurkiewicz. Composets correspond to posets, comtraces correspond to traces, while step sequences correspond to interleaving sequences. The independency relation is replaced by two new relations. The first is simultaneity, which is a symmetric relation comprising pairs of events which may be executed in one step. The other is serialisability, which comprises pairs of events (e,f) such that if e and f can be executed in one step then they can also be executed in the order: e followed by f. We show that the comtraces enjoy essentially the same kind of properties as Mazurkiewicz traces, e.g., each comtrace is unambiguously identified by any step sequence which belongs to it. As a system model we consider Elementary Net Systems with Inhibitor Arcs (ENI-systems). We show that the comtrace model provides an invariant semantics for such nets and is in a full agreement with their operational semantics expressed in terms of step sequences. We finally show that the composets represented by comtraces can be generated by generalising the standard construction of a process of a 1-safe Petri net.

Book ChapterDOI
03 Sep 1991
TL;DR: An incremental approach to the denotational semantics of complex programming languages based on the idea of monad transformer is proposed, by translating a programming language PL into a metalanguage ML(Σ), where some constants do not have a fixed intended interpretation in Cpo.
Abstract: We propose an incremental approach to the denotational semantics of complex programming languages based on the idea of monad transformer. The traditional way of giving denotational semantics to a programming language is to translate it into a metalanguage ML with a fixed intended interpretation in the category Cpo of cpos (or some variant of it). We depart from this approach by translating a programming language PL into a metalanguage ML(Σ), where some constants do not have a fixed intended interpretation in Cpo. These constants, specified in the signature Σ, include a unary type constructor T and a collection of (polymorphic) operations for constructing terms of type TA. A key property of the translation is that programs of type A are translated into terms of type T(A°), where A° is the translation of A. This approach does not yield any new semantics for PL, since eventually one has to interpret the constants in Σ, e.g. by translating them into ML. However, it is an integral part of the incremental approach to be able to change the interpretation of constants in Σ (without invalidating adequacy of the denotational semantics w.r.t. some given operational semantics), when extending the programming language. Suppose that PL is obtained from PL0 by a sequence of simple extensions PLi ⊆ PLi+1 and that we have a semantics for PL0, how can we build a semantics for PL? In terms of signatures for metalanguages the problem can be rephrased as follows:
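The monad-transformer idea can be made concrete with a small sketch (encoding entirely mine, not the paper's metalanguage): a transformer consumes the unit and bind of an inner monad and yields the unit and bind of an extended monad, so language features can be layered incrementally.

```python
# Monads as dicts of their two operations; a transformer maps an inner monad
# to a richer one. Here: adding exceptions on top of the identity monad.

identity = {
    "unit": lambda a: a,
    "bind": lambda m, k: k(m),
}

def exception_transformer(inner):
    # computations of the new monad are inner-monad computations producing
    # either ("ok", value) or ("err",)
    unit = lambda a: inner["unit"](("ok", a))
    def bind(m, k):
        return inner["bind"](
            m, lambda r: k(r[1]) if r[0] == "ok" else inner["unit"](r))
    return {"unit": unit, "bind": bind, "raise": lambda: inner["unit"](("err",))}

M = exception_transformer(identity)
ok = M["bind"](M["unit"](20), lambda v: M["unit"](v + 22))
bad = M["bind"](M["raise"](), lambda v: M["unit"](v + 22))
print(ok)   # ('ok', 42)
print(bad)  # ('err',)
```

The incremental point is that `exception_transformer` never inspects the inner monad's representation, so the same construction would layer exceptions over a state or environment monad unchanged.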

01 Jan 1991
TL;DR: It is shown how the well-founded approach also extends naturally to the same family of bilattice-based programming languages that the earlier fixpoint approaches extended to, and provides a natural semantics for logic programming systems that have already been proposed.
Abstract: Classical fixpoint semantics for logic programs is based on the T_P immediate consequence operator. The Kripke/Kleene, three-valued, semantics uses Φ_P, which extends T_P to Kleene's strong three-valued logic. Both these approaches generalize to cover logic programming systems based on a wide class of logics, provided only that the underlying structure be that of a bilattice. This was presented in earlier papers. Recently well-founded semantics has become influential for classical logic programs. We show how the well-founded approach also extends naturally to the same family of bilattice-based programming languages that the earlier fixpoint approaches extended to. Doing so provides a natural semantics for logic programming systems that have already been proposed, as well as for a large number that are of only theoretical interest. And finally, doing so simplifies the proofs of basic results about the well-founded semantics, by stripping away inessential details.
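For orientation, the classical two-valued construction that the paper generalises, Kleene iteration of the immediate-consequence operator T_P, can be sketched as follows (program encoding mine):

```python
# A propositional Horn program as (head, [body atoms]) pairs; T_P maps an
# interpretation to the set of heads whose bodies it satisfies. Iterating
# from the empty interpretation reaches the least fixpoint.

program = [
    ("edge_ab", []), ("edge_bc", []),
    ("path_ab", ["edge_ab"]), ("path_bc", ["edge_bc"]),
    ("path_ac", ["path_ab", "path_bc"]),
]

def T_P(interp):
    return {h for (h, body) in program if all(b in interp for b in body)}

interp = set()
while T_P(interp) != interp:       # Kleene iteration to the least fixpoint
    interp = T_P(interp)
print(sorted(interp))
```

The bilattice generalisation replaces the sets here with richer truth-value assignments while keeping the same fixpoint scheme, which is why the proofs simplify as the abstract claims.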

Book ChapterDOI
01 Jun 1991
TL;DR: A new invariant semantics of concurrent systems is introduced which is a direct generalisation of the causal partial order semantics and overcomes some of the problems encountered when one uses causal partial orders alone.
Abstract: We introduce a new invariant semantics of concurrent systems which is a direct generalisation of the causal partial order semantics. Our new semantics overcomes some of the problems encountered when one uses causal partial orders alone. We discuss various aspects of the new invariant model. In particular, we outline how the new invariants can be generated by 1-safe inhibitor Petri nets.

Book ChapterDOI
25 Mar 1991
TL;DR: It is shown how any ordering on programs for which these basic theorems hold can be easily extended to give a fully abstract cpo for the language, giving evidence that any operational semantics with these basic theorems proven is complete with respect to a denotational semantics.
Abstract: In this paper operational equivalence of simple functional programs is defined, and certain basic theorems proved thereupon. These basic theorems include congruence, least fixed-point, an analogue to continuity, and fixed-point induction. We then show how any ordering on programs for which these theorems hold can be easily extended to give a fully abstract cpo for the language, giving evidence that any operational semantics with these basic theorems proven is complete with respect to a denotational semantics. Furthermore, the mathematical tools used in the paper are minimal, the techniques should be applicable to a wide class of languages, and all proofs are constructive.

Journal ArticleDOI
TL;DR: A self-contained account of a calculus of relations from basic operations through the treatment of recursive relation equations, developed in the framework of set theory, which may be regarded as a systematic generalization of the functional style.
Abstract: The paper gives a self-contained account of a calculus of relations from basic operations through the treatment of recursive relation equations. This calculus serves as an algebraic apparatus for defining the denotational semantics of Dijkstra’s nondeterministic sequential programming language. Nondeterministic programs are modeled by binary relations, objects of an algebraic structure founded upon the operations “union”, “left restriction”, “demonic composition”, “demonic union”, and the ordering “restriction of”. Recursion and iteration are interpreted as fixed points of continuous relationals. Developed in the framework of set theory, this calculus may be regarded as a systematic generalization of the functional style.
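The contrast between ordinary and demonic composition can be illustrated on finite relations (a small sketch under my own encoding, not the paper's full calculus): demonic composition discards start states from which some nondeterministic branch gets stuck, reflecting that a demon may pick the worst outcome.

```python
# Nondeterministic programs as binary relations (sets of state pairs).

def compose(R, S):
    # ordinary ("angelic") relational composition
    return {(a, c) for (a, b) in R for (b2, c) in S if b == b2}

def demonic_compose(R, S):
    # keep (a, c) only if *every* R-successor of a can continue under S
    dom_S = {b for (b, _) in S}
    safe = {a for (a, _) in R
            if all(b in dom_S for (a2, b) in R if a2 == a)}
    return {(a, c) for (a, c) in compose(R, S) if a in safe}

R = {(1, 2), (1, 3), (4, 2)}
S = {(2, 5)}                          # state 3 is stuck under S

print(sorted(compose(R, S)))          # [(1, 5), (4, 5)]
print(sorted(demonic_compose(R, S)))  # [(4, 5)] -- from 1 a demon can reach stuck state 3
```

This is the semantic justification for total-correctness reasoning in Dijkstra's language: a program starting at 1 is not guaranteed to terminate through S, so the demonic model refuses to credit it with any result.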