
Showing papers on "Denotational semantics published in 1982"


Book ChapterDOI
12 Jul 1982
TL;DR: This paper discusses many examples in an informal way that should serve as an introduction to the theory of domains, and shows that many things that were done previously axiomatically can now be proved in a straightforward way as theorems.
Abstract: The purpose of the theory of domains is to give models for spaces on which to define computable functions. The kinds of spaces needed for denotational semantics involve not only spaces of higher type (e.g. function spaces) but also spaces defined recursively (e.g. reflexive domains). Also required are many special domain constructs (or functors) in order to create the desired structures. There are several choices of a suitable category of domains, but the basic one which has the simplest properties is the one sometimes called consistently complete algebraic cpo's. This category of domains is studied in this paper from a new, and it is to be hoped, simpler point of view incorporating the approaches of many authors into a unified presentation. Briefly, the domains of elements are represented set theoretically with the aid of structures called information systems. These systems are very familiar from mathematical logic, and their use seems to accord well with intuition. Many things that were done previously axiomatically can now be proved in a straightforward way as theorems. The present paper discusses many examples in an informal way that should serve as an introduction to the subject.
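The least-fixed-point machinery that domain theory supports can be seen in miniature. As a hedged illustration (not from the paper; the finite lattice and the reachability example are invented for exposition), Kleene iteration computes the least fixed point of a monotone map by iterating from the bottom element:

```python
# Kleene iteration: least fixed point of a monotone function on a
# finite lattice, starting from bottom. Illustrative only; the paper's
# information systems are far more general than this powerset lattice.

def lfp(f, bottom):
    """Iterate f from bottom until the ascending chain stabilizes."""
    x = bottom
    while True:
        y = f(x)
        if y == x:
            return x
        x = y

# Example: states reachable from 0 in a tiny transition relation, as the
# least fixed point of F(S) = {0} ∪ successors(S) on the powerset lattice.
edges = {0: {1}, 1: {2}, 2: {2}, 3: {0}}
step = lambda s: frozenset({0}) | frozenset(t for u in s for t in edges[u])
print(lfp(step, frozenset()))  # → frozenset({0, 1, 2})
```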

698 citations


Journal ArticleDOI
TL;DR: The purpose of the present paper is to set up a categorical framework in which the known techniques for solving equations find a natural place, generalizing from least fixed-points of continuous functions over cpos to initial ones of continuous functors over ω-categories.
Abstract: Recursive specifications of domains play a crucial role in denotational semantics as developed by Scott and Strachey and their followers. The purpose of the present paper is to set up a categorical framework in which the known techniques for solving these equations find a natural place. The idea is to follow the well-known analogy between partial orders and categories, generalizing from least fixed-points of continuous functions over cpos to initial ones of continuous functors over ω-categories. To apply these general ideas we introduce Wand's O-categories, where the morphism-sets have a partial order structure and which include almost all the categories occurring in semantics. The idea is to find solutions in a derived category of embeddings, and we give order-theoretic conditions which are easy to verify and which imply the needed categorical ones. The main tool is a very general form of the limit-colimit coincidence remarked by Scott. In the concluding section we outline how compatibilit...
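The initial-fixed-point construction has a simple set-level shadow. As an illustrative sketch (plain sets rather than the paper's O-categories, so it shows only the shape of the colimit construction), iterating the functor F(X) = 1 + A × X from the empty set builds increasing approximations to the initial F-algebra, i.e. the finite lists over A:

```python
# Iterating F(X) = {()} ∪ (A × X) from the empty set builds the carrier
# of the initial F-algebra: finite lists over A, encoded as nested pairs.
# This mirrors, in plain sets, the paper's initial-fixed-point construction.

A = {'a', 'b'}

def F(X):
    return {()} | {(a, x) for a in A for x in X}

approx = set()            # F^0(∅) = empty set
for _ in range(3):        # compute F^1(∅), F^2(∅), F^3(∅)
    approx = F(approx)

# F^3(∅) contains exactly the lists over A of length < 3:
# 1 + 2 + 4 = 7 elements.
print(len(approx))  # → 7
```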

564 citations


Journal ArticleDOI
TL;DR: This paper presents a novel account of the syntax and semantics of questions, making use of the framework for linguistic description developed by Richard Montague (1974), and discusses how the proposed description compares with recent transformational analyses.
Abstract: 0. This paper presents a novel account of the syntax and semantics of questions, making use of the framework for linguistic description developed by Richard Montague (1974). Certain features of the proposal are based on work by N. Belnap (1963), L. Aqvist (1965), C. L. Baker (1968, 1970), S. Kuno and J. Robinson (1972), C. L. Hamblin (1973), E. Keenan and R. Hull (1973), J. Hintikka (1974), Lewis (1975), and D. Wunderlich (1975), but it differs from all of its predecessors in one way or another. I will start with a number of observations which provide the basis for the treatment of questions presented in the second part of the paper and conclude with a summary and a brief discussion of how the proposed description compares with recent transformational analyses.

327 citations


Journal ArticleDOI
TL;DR: A framework allowing a unified and rigorous definition of the semantics of concurrency is proposed, which introduces processes as elements of process domains which are obtained as solutions of domain equations in the sense of Scott and Plotkin.
Abstract: A framework allowing a unified and rigorous definition of the semantics of concurrency is proposed. The mathematical model introduces processes as elements of process domains which are obtained as solutions of domain equations in the sense of Scott and Plotkin. Techniques of metric topology as proposed, e.g., by Nivat are used to solve such equations. Processes are then used as meanings of statements in languages with concurrency. Three main concepts are treated, viz. parallelism (arbitrary interleaving of sequences of elementary actions), synchronization, and communication. These notions are embedded in languages which also feature classical sequential concepts such as assignment, tests, iteration or recursion, and guarded commands. In the definitions, a sequence of process domains of increasing complexity is used. The languages discussed include Milner's calculus for communicating systems and Hoare's communicating sequential processes. The paper concludes with a section with brief remarks on miscellaneous notions in concurrency, and two appendices with mathematical details.

323 citations


Book ChapterDOI
Glynn Winskel1
12 Jul 1982
TL;DR: This work gives denotational semantics to a wide range of parallel programming languages based on the idea of Milner’s CCS, that processes communicate by events of mutual synchronization, and gets an event structure semantics for CCS.
Abstract: We give denotational semantics to a wide range of parallel programming languages based on the idea of Milner's CCS [Mil80a], that processes communicate by events of mutual synchronization. Processes are denoted by labeled event structures. Event structures represent concurrency rather directly, as in net theory [Bra80]. The semantics does not simulate concurrency by non-deterministic interleaving. We first define a category E of event structures [NPW79, NPW81, Win80] appropriate to synchronized communication. The category bears a natural relation to a subcategory of trees through an interleaving functor, so results transfer to trees neatly. Then we introduce the concept of a synchronization algebra (S.A.) on labels by adopting an idea of Milner [Mil80b]. An S.A. specifies how two processes synchronize via labels on their events. From each S.A., L, we derive a category EL of labeled event structures with natural operations for composing labeled event structures. In particular the parallel composition for L is derived from the product in E. We obtain semantics for a class of CCS-like languages by varying the S.A. Synchronization algebras are very general so the class is very broad, handling synchrony and asynchrony in a common framework. As a corollary we get an event structure semantics for CCS. When interleaved, our semantics is Milner's synchronization/communication tree semantics [Mil80a]. However our semantics distinguishes more terms as it reflects concurrency. Event structure semantics is at a rather basic level of abstraction but should support all abstract notions of equivalence (see [Mil80a] for examples), including those which take concurrency into account.
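The role of a synchronization algebra can be suggested concretely. A minimal, assumed example (the labels and the CCS-style complementation are illustrative, not the paper's formal definition): a partial composition on labels decides which pairs of events may synchronize and what label the synchronized event carries:

```python
# A CCS-like synchronization algebra (illustrative): complementary labels
# synchronize to the invisible label 'tau'; other pairs do not synchronize,
# so their composition is undefined (returned here as None).

def sync(l1, l2):
    comp = {'a': 'abar', 'abar': 'a', 'b': 'bbar', 'bbar': 'b'}
    if comp.get(l1) == l2:
        return 'tau'
    return None

# In a parallel composition, an event of the product would be kept only if
# its pair of labels has a defined synchronization (or one side idles).
print(sync('a', 'abar'), sync('a', 'b'))  # → tau None
```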

273 citations


Book
01 Oct 1982
TL;DR: This book is a collection of papers from a conference on linguistic semantics at Columbia University.
Abstract: This book is a collection of papers from a conference on linguistic semantics at Columbia University. Among the issues examined are presupposition in philosophy and linguistics by Richard Garner, the role of deduction in the study of grammar by George Lakoff, conjunction by Robin Lakoff, semantic description by Fillmore, deep structure by Annear Thompson, and others. The scope of the book is very broad, but it is limited to semantic questions as they surface in word structure, grammar, and sentences.

229 citations


Journal ArticleDOI
TL;DR: It is proved that differentiating according to the functional level of recursion leads to two infinite hierarchies of recursive languages, the IO- and OI-hierarchies, which can be characterized as canonical extensions of the regular, context-free, and IO- and OI-macro languages, respectively.

208 citations


Journal ArticleDOI
TL;DR: A sequential denotational semantics for sequential programming languages is provided, based on a new notion of sequential algorithm on the Kahn-Plotkin concrete data structures, which form a cartesian closed category with straightforward solutions to recursive domain equations.

197 citations


Book ChapterDOI
01 Jan 1982
TL;DR: These notes were originally written for lectures on the semantics of programming languages delivered at Oxford during Michaelmas Term 1980, to provide the foundations needed for the method of denotational semantics.
Abstract: These notes were originally written for lectures on the semantics of programming languages delivered at Oxford during Michaelmas Term 1980. The purpose of the course was to provide the foundations needed for the method of denotational semantics; in particular I wanted to make the connections with recursive function theory more definite and to show how to obtain explicit, effectively given solutions to domain equations. Roughly, these chapters cover the first half of the book by Stoy, and he was able to continue the lectures the next term discussing semantical concepts following his text.

176 citations


Book
01 Oct 1982
TL;DR: This book treats programming as an evolutionary process, developing routines, object structures, computational structures, transformations into repetitive form, program variables, control elements, and organized storages with linked lists.
Abstract: 0.1 On the Etymology of the Word Algorithm.- 0.2 How Algorithms are Characterized.- 0.3 Programming as an Evolutionary Process.- 0.4 How to Solve it.- 1. Routines.- 1.1 The Parameter Concept.- 1.2 Declaration of a Routine.- 1.3 Hierarchical Construction of Routines.- 1.3.1 Primitive Routines and Computational Structures.- 1.3.2 The Principle of Substitution.- 1.3.3 Alternatives.- 1.3.4 Input/Output.- 1.4 Recursive Routines and Systems.- 1.4.1 Examples.- 1.4.2 Proof of Termination.- 1.4.3 Taxonomy of Recursion.- 1.4.4 The Level of Applicative Formulation.- 1.5 Mathematical Semantics: Fixpoint Theory.- 1.5.1 Recursive Routines and Functional Equations.- 1.5.2 Fixpoint Theory.- 1.6 Proofs by Induction of Properties of Routines.- 1.6.1 Computational Induction.- 1.6.2 Structural Induction.- 1.7 Operational Semantics: Machines.- 1.7.1 Unfolding and Folding.- 1.7.2 Partial Computation.- 1.7.3 Text Substitution Machines.- 1.7.4 The Stack Machine.- 1.8 Restriction of the Parameter Domain.- 1.9 Dijkstra's Guards.- 1.10 Pre-Algorithmic Formulations by Means of Choice and Determination.- 1.10.1 The Choice Operator.- 1.10.2 The Determination Operator.- 1.11 Semantics of Non-Deterministic Constructions.- 1.11.1 Pre-Algorithms and Algorithms.- 1.11.2 Deriving Algorithms from Pre-Algorithms.- 1.11.3 Mathematical Semantics of Non-Determinate Routines.- 1.11.4 Operational Semantics of Non-Deterministic Algorithms.- 1.12 Routines with a Multiple Result.- 1.13 Structuring of Routines.- 1.13.1 Structuring by Means of Abstraction and Embedding.- 1.13.2 Segments and Suppressed Parameters.- 1.13.3 Object Declarations.- 1.13.4 Result Parameters and the Actualization Taboo.- 1.14 Routines as Parameters and Results.- 1.14.1 Routines as Results.- 1.14.2 Functional Programming.- 1.14.3 The Delay Rule.- Addendum: Notations.- 2. 
Objects and Object Structures.- 2.1 Denotations.- 2.2 Scope of a Freely Chosen Designation.- 2.3 Kinds of Objects.- 2.4 Sets of Objects, Modes.- 2.5 Composite Modes and Objects.- 2.6 Selectors, Structures with Direct (Selector) Access.- 2.6.1 Compounds.- 2.6.2 Arrays.- 2.6.3 The Selection Structure of Compound and Array.- 2.7 Mode Variants.- 2.8 Introduction of New Modes: Summary.- 2.9 Recursive Object Structures.- 2.9.1 Definition of Recursive Object Structures.- 2.9.2 Object Diagrams.- 2.9.3 Operational Detailing of Objects.- 2.10 Algorithms with Linear Object Structures.- 2.11 The Recursive Object Structure "File".- 2.11.1 "Knitting" of Sequences.- 2.11.2 Files.- 2.12 Algorithms with Cascade-Type Object Structures.- 2.13 Traversal and Scanning of Recursive Object Structures.- 2.14 Infinite Objects.- 2.14.1 Nexuses of Objects.- 2.14.2 Lazy Evaluation.- 2.15 Some Peculiarities of Arrays.- 2.15.1 Arrays with Computed Index Bounds.- 2.15.2 Induced Operations for Arrays.- 2.16 Routines with Multiple Results Revisited.- Addendum: Notations.- 3. 
Computational Structures.- 3.1 Concrete Computational Structures.- 3.1.1 Encapsulation Effect.- 3.1.2 Properties of Operations.- 3.1.3 Definition of Concrete Computational Structures.- 3.1.4 Atomic Examples.- 3.2 Abstract Computational Structures and Abstract Types.- 3.2.1 Fundamental Concepts.- 3.2.2 Semantics of Abstract Computational Structures and Abstract Types.- 3.2.3 Completeness of Properties.- 3.2.4 Concretization of an Abstract Type.- 3.2.5 Notation and First Examples.- 3.2.6 Constructors and Selectors.- 3.3 Abstract Arrays.- 3.3.1 One-Side-Flexible Arrays.- 3.3.2 Two-Side-Flexible Arrays.- 3.3.3 Aggregates.- 3.4 Sequence-Type Computational Structures.- 3.4.1 Stack, Deck and Queue.- 3.4.2 Excursus: Divisibility Theory in Semi-Groups.- 3.4.3 Sequence and Word.- 3.4.4 Forgetful Functors.- 3.4.5 Sets.- 3.5 Number-Type Computational Structures.- 3.5.1 Peano Numbers.- 3.5.2 Cycle Numbers and Natural Numbers.- 3.5.3 Excursus: Extension by Means of Formal Quotients.- 3.5.4 Integers.- 3.5.5 Rational Numbers.- 3.5.6 Positional Systems and B-al-Fractions.- 3.6 Changing Abstract Types and Object Structures.- 3.6.1 Type Change and Related Types.- 3.6.2 Concretization.- 3.6.3 Implementation of Concrete Computational Structures.- 3.6.4 Example: Binarization.- 3.6.5 Example: Packing of Objects.- Addendum: Notations.- 4. 
Transformation into Repetitive Form.- 4.1 Schemes and Transformations.- 4.2 Treatment of Linear Recursion.- 4.2.1 The Technique of Re-Bracketing.- 4.2.2 The Technique of Operand Commutation.- 4.2.3 Function Inversion.- 4.2.4 Function Inversion According to Paterson and Hewitt.- 4.2.5 Function Inversion by Introducing Stacks.- 4.3 Treatment of Non-Linear Recursions.- 4.3.1 Method of Functional Embedding.- 4.3.2 Arithmetization of the Flow of Control.- 4.3.3 Special Cases of Nested Recursion.- 4.3.4 The Technique of Range-of-Values Tabulation.- 4.4 Disentanglement of the Control.- 4.4.1 Disentangled Routines.- 4.4.2 Disentangling Recursive Routines by Means of Function Inversion.- 4.4.3 Reshaping the Type of Control Flow.- 5. Program Variables.- 5.1 The Origin of Program Variables.- 5.1.1 Specialization of the Stack Machine.- 5.1.2 Specialization of the Range-of-Values Machine.- 5.2 Formal Introduction of Program Variables.- 5.2.1 Sequentialization of Object Declarations.- 5.2.2 Program Variables as a Means for Saving Identifiers.- 5.2.3 Expressions with Side-Effects.- 5.2.4 Complete Sequentialization of Collective Assignments.- 5.3 Procedures.- 5.3.1 Program Variables as Parameters.- 5.3.2 Actualization Taboo, Alias Ban and Suppressed Variable Parameters.- 5.3.3 Sharing of Variables.- 5.3.4 Initialization.- 5.3.5 Properties of Program Variables.- 5.4 Axiomatic Description of Programming Languages.- 5.4.1 Predicate Transformers.- 5.4.2 Program Verification.- 5.5 Variables for Structured Objects.- 5.5.1 Selective Alteration.- 5.5.2 Remarks on Input/Output.- Addendum: Notations.- 6. 
Control Elements.- 6.1 Deparameterization and Formal Treatment of Repetition.- 6.1.1 Deparameterization.- 6.1.2 Semantics of Repetition.- 6.1.3 Analytical Treatment of the Protocol Stack.- 6.2 Jumps.- 6.2.1 Simple Call as a Basic Control Element.- 6.2.2 Introduction of Jumps.- 6.3 The General do-od Construction.- 6.4 Loops.- 6.4.1 Rejecting and Non-Rejecting Repetition.- 6.4.2 Counted Repetition.- 6.5 Loops and Repetitive Systems.- 6.6 Sequential Circuits.- 6.7 Flow Diagrams.- 6.7.1 Classical Flow Diagrams.- 6.7.2 Splitting and Collection.- 6.7.3 Coordinated Flow Diagrams.- 6.8 Petri Nets.- 6.8.1 Theory of Petri Nets.- 6.8.2 Construction of Petri Nets, Connection to Coordinated Flow Diagrams.- 6.9 bool Petri Nets, Signals.- 6.10 nat Petri Nets, Semaphores.- Addendum: Notations.- 7. Organized Storages and Linked Lists.- 7.1 Organized Storages.- 7.1.1 Selective Updating.- 7.1.2 Collecting and Composing Variables.- 7.1.3 Computed Variables.- 7.1.4 Constructing Organized Storages and Generating Variables.- 7.1.5 Advantages and Disadvantages of Organized Storages.- 7.2 Identity of Variables and Alias Ban Revisited.- 7.2.1 Revision of the Assignment Axiom.- 7.2.2 Checking the Actualization Taboo.- 7.3 Implementing Object Structures by Organized Storages.- 7.4 Linked-List Implementation of Organized Storages.- 7.4.1 References to Variables: Pointers.- 7.4.2 Wirth's Connection.- 7.4.3 Link Variables.- 7.4.4 Implementing Computational Structures Using Linked Lists.- 7.4.5 Properties of Pointers.- 7.5 Improvement of Algorithms Working on Linked Lists by Selective Updating.- 7.5.1 Algorithms for One-Way Linked Lists.- 7.5.2 Algorithms for Two-Way Linked Lists.- 7.6 Addressing.- 7.6.1 Addresses for Variables.- 7.6.2 Jump Addresses.- 7.6.3 Genuine Addresses.- 7.6.4 Outlook to Systems Programming.- Addendum: Notations.- Conclusion. 
Programming as an Evolutionary Process.- Program Specification and Development in a Uniform Language.- Conceptual Organization of the Algorithmic Language.- Tools to Be Used.- Methodology of Programming.

166 citations


Journal ArticleDOI
TL;DR: By introducing notions of different homomorphisms, particular models of this class can be distinguished as initial or (weakly) terminal; the concept of algebraic specification is thereby extended to specify the semantics of programming languages in a completely abstract algebraic way, as demonstrated for two toy languages.
Abstract: Hierarchical abstract types, where particular sorts, operations, and axioms are designated as primitive, are studied together with conditional equational formulas. By introducing notions of different homomorphisms, particular models of this class can be distinguished as initial or (weakly) terminal. Sufficient conditions for the existence of such models are given and their relationship to the principle of fully abstract semantics is investigated. In this way the concept of algebraic specification is extended to specify the semantics of programming languages in a completely abstract algebraic way, as demonstrated for two toy languages.

01 Jan 1982
TL;DR: A framework is created for handling the semantics of fully typed programming languages with imperative features, higher-order ALGOL-like procedures, block structure, and implicit conversions; certain functor categories are shown to be Cartesian closed, and a category Σ of store shapes is described.
Abstract: Here we create a framework for handling the semantics of fully typed programming languages with imperative features, higher-order ALGOL-like procedures, block structure, and implicit conversions. Our approach involves the introduction of a new family of programming languages, the coercive typed λ-calculi, denoted by L in the body of the dissertation. By appropriately choosing the linguistic constants (i.e. generators) of L, we can view phrases of variants of ALGOL as syntactically sugared phrases of L. This dissertation breaks into three parts. In the first part, consisting of the first chapter, we supply basic definitions and motivate the idea that functor categories arise naturally in the explanation of block structure and stack discipline. The second part, consisting of the next three chapters, is dedicated to the general theory of the semantics of the coercive typed λ-calculus; the interplay between posets, algebras, and Cartesian closed categories is particularly intense here. The remaining four chapters make up the final part, in which we apply the general theory to give both direct and continuation semantics for desugared variants of ALGOL. To do so, it is necessary to show certain functor categories are Cartesian closed and to describe a category Σ of store shapes. An interesting novelty in the presentation of continuation semantics is the view that commands form a procedural, rather than a primitive, phrase type.

Proceedings ArticleDOI
25 Jan 1982
TL;DR: This paper introduces a formal notation, the semantic grammar, for defining programming languages, which handles both static and dynamic semantics, both compile- and run-time actions.
Abstract: This paper introduces a formal notation, the semantic grammar, for defining programming languages. Semantic grammars combine denotational semantics and attribute grammars. They describe syntax and semantics together, without separate lists of formulas or rules that need to be put into correspondence. They handle both static and dynamic semantics, both compile- and run-time actions. They describe languages at a high level of abstraction.

Proceedings ArticleDOI
05 May 1982
TL;DR: A general framework for the denotational treatment of concurrency is introduced, using tools from metric topology as advocated by Nivat to solve the underlying domain equation, and illustrating the approach with the definition of a variety of concepts encountered in the study of concurrency.
Abstract: A general framework for the denotational treatment of concurrency is introduced. The key idea is the notion of process, which is an element of a domain obtained as the solution of a domain equation in the style considered previously by Plotkin. We use tools from metric topology as advocated by Nivat to solve this equation, show how operations upon processes can be defined conveniently, and illustrate the approach with the definition of a variety of concepts as encountered in the study of concurrency. Only a few proofs of the supporting mathematical theory are given; full proofs will appear in the final version of the paper.

Journal ArticleDOI
01 Jul 1982
TL;DR: A new approach to the formal description of programming language semantics is described and illustrated, where the values of abstract semantic algebras are taken as meanings of programs in Denotational (or Initial Algebra) Semantics, instead of using Scott domains.
Abstract: A new approach to the formal description of programming language semantics is described and illustrated. "Abstract semantic algebras" are just algebraically-specified abstract data types whose operations correspond to fundamental concepts of programming languages. The values of abstract semantic algebras are taken as meanings of programs in Denotational (or Initial Algebra) Semantics, instead of using Scott domains. This leads to semantic descriptions that clearly exhibit the underlying conceptual analysis, and which are rather easy to modify and extend. Some basic abstract semantic algebras corresponding to fundamental concepts of programming languages are given; they could be used in the description of many different programming languages.

Proceedings ArticleDOI
Mitchell Wand1
25 Jan 1982
TL;DR: The denotational semantics of a programming language is analyzed, by rewriting its equations with suitable combinators, to obtain a compiler and a suitable target machine that operates by simulating the reduction sequences of the combinator terms.
Abstract: We show how to analyze the denotational semantics for a programming language to obtain a compiler and a suitable target machine for the language. We do this by rewriting the equations using suitable combinators. The machine operates by simulating the reduction sequences for the combinator terms. The reduction sequences pass through certain standard forms, which become an architecture for the machine, and the combinators become machine instructions. Despite the abstract nature of its development, the machine greatly resembles a conventional one. The method is illustrated by a simple expression language with procedures and input-output.
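The flavour of this derivation can be suggested with a toy version (entirely illustrative; the expression language, instruction set, and stack machine here are assumptions, far simpler than the paper's): a semantic valuation rewritten so that the meaning of an expression is a sequence of machine instructions, executed by simulating reductions on a stack:

```python
# Sketch: compiling an expression tree to instructions for a stack
# machine, in the spirit of deriving a compiler from a semantics.
# Expressions are ints or tuples (op, left, right); ops are invented.

def compile_expr(e):
    if isinstance(e, int):
        return [('PUSH', e)]
    op, l, r = e
    return compile_expr(l) + compile_expr(r) + [(op, None)]

def run(code):
    stack = []
    for op, arg in code:
        if op == 'PUSH':
            stack.append(arg)
        elif op == 'ADD':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == 'MUL':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[0]

# (1 + 2) * 4
print(run(compile_expr(('MUL', ('ADD', 1, 2), 4))))  # → 12
```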


Proceedings ArticleDOI
05 Jul 1982
TL;DR: Test-score semantics may be viewed as a generalization of truth-conditional, possible-world and model-theoretic semantics, but its expressive power is substantially greater.
Abstract: Test-score semantics is based on the premise that almost everything that relates to natural languages is a matter of degree. Viewed from this perspective, any semantic entity in a natural language, e.g., a predicate, predicate-modifier, proposition, quantifier, command, question, etc. may be represented as a system of elastic constraints on a collection of objects or derived objects in a universe of discourse. In this sense, test-score semantics may be viewed as a generalization of truth-conditional, possible-world and model-theoretic semantics, but its expressive power is substantially greater.
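The idea of elastic constraints graded by degree can be sketched as follows (a hedged toy example; the membership functions are invented for exposition, not Zadeh's): each test returns a score in [0, 1], and compound meanings aggregate the scores, e.g. conjunction as minimum:

```python
# Elastic constraints as a matter of degree (illustrative): each test
# yields a score in [0, 1]; compound constraints aggregate the scores.
# The membership functions below are invented for the example.

def tall(cm):                       # degree to which a height is "tall"
    return min(1.0, max(0.0, (cm - 160) / 30))

def heavy(kg):                      # degree to which a weight is "heavy"
    return min(1.0, max(0.0, (kg - 60) / 40))

def tall_and_heavy(cm, kg):
    return min(tall(cm), heavy(kg))  # conjunction as minimum of scores

print(tall_and_heavy(175, 80))  # → 0.5
```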

Journal ArticleDOI
TL;DR: This note offers a constructive criticism of current work in the semantics of programming languages, a criticism directed not so much at the techniques and results obtained as at the use to which they are put.
Abstract: We would like in this note to offer a constructive criticism of current work in the semantics of programming languages, a criticism directed not so much at the techniques and results obtained as at the use to which they are put. The basic problem, in our view, is that denotational (or "mathematical") semantics plays on the whole a passive (we call it "descriptive") role, while operational semantics plays on the whole an active (we call it "prescriptive") role. Our suggestion is that these roles be reversed.

01 Jan 1982
TL;DR: The abstraction (λx. t) denotes the function that maps x to t; t is called the body of the abstraction, and the free occurrences of x in t are bound in (λx. t).
Abstract: Abstraction: (λx. t) is the function that maps x to t. The term t is called the body of the abstraction, and the free occurrences of x in t are bound in (λx. t). Currying: a two-argument function g(x, y) = x + y corresponds to fx(y) = x + y with g′(x) = fx, so that g′(x)(y) = fx(y) = x + y = g(x, y). This is the idea behind the λ-notation.
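The currying identity above, g′(x)(y) = g(x, y), can be checked directly (a minimal sketch; the function names are illustrative):

```python
# Currying: a two-argument function g corresponds to a one-argument
# function g2 returning a function, with g2(x)(y) = g(x, y).

def g(x, y):
    return x + y

def g2(x):
    return lambda y: x + y   # the lambda body plays the role of t in (λy. t)

assert g2(3)(4) == g(3, 4) == 7
print(g2(3)(4))  # → 7
```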

Book ChapterDOI
01 Jan 1982
TL;DR: One of the goals of the conference of which this paper is a part is to compare the enterprise of formal semantics with that of procedure-oriented psychological semantics, which could lead to either of two outcomes: the “Separatist” position and the "Common Goals" position.
Abstract: One of the goals of the conference of which this paper is a part is to compare the enterprise of formal semantics with that of procedure-oriented psychological semantics. The former has traditionally been the domain of logicians and philosophers, the latter the domain of psychologists and computer scientists, with some linguists on each side. The problem that is evident at the outset is that semantics is treated very differently within these two enterprises, each side seemingly committed to assumptions that lead to inadequacies by the other’s criteria. A fruitful comparison could lead to either of two outcomes, which we might characterize roughly as the “Separatist” position and the “Common Goals” position.

01 Jan 1982
TL;DR: This dissertation presents the thesis that an essential ingredient for the success of efforts to incorporate more "real world" semantics into database models is a coherent theory of the semantics of time, and defines the historical database model (HDB) as a means of incorporating a temporal semantics within the relational and entity-relationship database models.
Abstract: It is difficult to imagine a successful semantic theory in which time is not an integral component. In this dissertation we examine the connection between two areas of semantics, namely the semantics of databases and the semantics of natural language, and link them together through a common view of the semantics of time. Part One presents the thesis that an essential ingredient for the success of efforts to incorporate more "real world" semantics into database models is a coherent theory of the semantics of time. We define the historical database model (HDB) as a means of incorporating a temporal semantics within the relational and entity-relationship database models. The HDB model is a very general theory, one which ascribes only the simplest and most intuitive properties to time and defines a simple relationship between time and the other elements of the database model. Part Two presents a formally defined English database query language QE-III, whose semantic theory makes explicit reference to the notion of denotation with respect to a moment of time. This language is defined as a Montague Grammar with a somewhat simplified semantic theory; it offers a natural correspondence to the interpretation of queries in a database context. QE-III is provided with a formal syntax, semantics and pragmatics, each component designed with the database application in mind. This application has motivated several extensions to the traditional conception of a Montague Grammar, extensions that are interesting in their own right: the inclusion of a formal pragmatic component, the inclusion of time-denoting expressions and temporal operators, an analysis of verb meanings into primitive meaning units derived from the database schema, and the inclusion of certain forms of direct questions.

Proceedings ArticleDOI
01 Jun 1982
TL;DR: It is argued that SIS constitutes an important first step toward the automatic construction of provably correct, complete, and reasonably efficient compilers from formal syntactic and semantic specifications.
Abstract: Compiler generation based on formal semantics has received considerable attention in recent years from a number of semanticists. Compiler writers, on the other hand, know relatively little about these efforts. This paper tries to remedy this situation by discussing our experimentation with the Semantics Implementation System (SIS) of Peter Mosses. SIS allows the user to generate a complete compiler from a formal specification of the syntax and semantics of a programming language. In particular, the translator component of a compiler is obtained by directly implementing a denotational semantics. Consequently, a compiler is expressed as a higher order function. A program is compiled by formally applying the compiler to a tree representation of the program and simplifying this application. The experimentation with SIS indicates that compiler generation based on denotational semantics is feasible, though not yet viable. We argue that SIS constitutes an important first step toward the automatic construction of provably correct, complete, and reasonably efficient compilers from formal syntactic and semantic specifications.

Proceedings ArticleDOI
01 Jun 1982
TL;DR: This paper describes the automatic generation - from the formal denotational semantic specification - of an efficient compiler's code generation phase, producing efficient code for real machines.
Abstract: We describe the automatic generation - from the formal denotational semantic specification - of an efficient compiler's code generation phase, producing efficient code for real machines. The method has been successfully implemented and tested with languages as complex as GEDANKEN!


Proceedings ArticleDOI
01 Jun 1982
TL;DR: The development of a formal specification of the static semantics of Ada in the form of an attribute grammar is described; the specification is complete and was tested extensively with automatically generated equivalent Pascal programs.
Abstract: We describe the development of a formal specification of the static semantics of Ada in the form of an attribute grammar. This specification is complete, and was tested extensively with automatically generated equivalent Pascal programs. From this specification we systematically developed the semantic analysis part of our Ada Compiler Front End. We outline the general procedure for specifying semantic analysis with attribute grammars and then discuss in some detail examples concerning declaration elaboration and overloading resolution. CR Categories and Subject Descriptors: D.3.1 [Programming Languages]: Formal definitions and Theory - semantics; D.3.4 [Programming Languages]: Processors - Translator writing systems and compiler generators; F.3.2 [Logics and Meanings of Programs]: Semantics of Programming Languages - algebraic approaches to semantics.


Journal ArticleDOI
TL;DR: The axiomatic method, a widely accepted technique for the precise (formal) definition of programming language semantics, is used to define data model semantics, giving an annotated, precise definition of the semantics of the structural aspects of a semantic data model which is based on the relational data model.

Dissertation
01 Apr 1982
TL;DR: This thesis proves the equivalence of an operational and a denotational semantics for pure dataflow, and uses infinite games of perfect information to give for the first time a completely general definition of subnet functionality.
Abstract: In this thesis we prove the equivalence of an operational and a denotational semantics for pure dataflow. The term pure dataflow refers to dataflow nets in which the nodes are functional (i.e., the output history is a function of the input history only) and the arcs are unbounded fifo queues. Gilles Kahn gave a method for the representation of a pure dataflow net as a set of equations, one equation for each arc in the net. Kahn stated, and we prove, that the operational behaviour of a pure dataflow net is exactly described by the least fixed point solution to the net's associated set of equations. In our model we do not require that nodes be sequential or deterministic, not even the functional nodes. As a consequence our model has a claim of being completely general. In particular our nets have what we call the encapsulation property, in that any subnet can be replaced in any pure dataflow context by a node having exactly the same input/output behaviour. Our model is also complete in the sense that our nodes have what we call the universality property, that is, for any continuous history function there exists a node that will compute it. The proof of the Kahn principle given in this thesis makes use of infinite games of perfect information. Infinite games turn out to be an extremely useful tool for defining and proving results about operational semantics. We use infinite games to give for the first time a completely general definition of subnet functionality. In addition their use in certain proofs is effective in reducing notational complexity. We also look at possible ways of extending Kahn's denotational model by the introduction of pause objects called hiatons. Finally we describe interesting ways of refining our operational model.
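Kahn's principle can be illustrated in miniature (a hedged sketch; the node and the prefix-iteration scheme are invented for exposition, not the thesis's games-based construction): a net with a feedback arc is a stream equation s = f(s), and iterating f from the empty history converges to the least fixed point prefix by prefix:

```python
# Kahn's principle, in miniature: a dataflow net with a feedback arc is
# a stream equation s = f(s); its operational behaviour matches the
# least fixed point, approximated here by iterating on finite prefixes.

def node(s, n=8):
    """Functional node: outputs 1 followed by the successors of its input,
    truncated to a prefix of length n."""
    return [1] + [x + 1 for x in s][:n - 1]

prefix = []                 # bottom: the empty history
for _ in range(8):          # each iteration lengthens the known prefix
    prefix = node(prefix)

print(prefix)  # → [1, 2, 3, 4, 5, 6, 7, 8]
```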