
Showing papers on "Operational semantics published in 1996"


Journal ArticleDOI
TL;DR: The semantics of statecharts as implemented in the STATEMATE system is described, which was the first executable semantics defined for the language and has been in use for almost a decade.
Abstract: We describe the semantics of statecharts as implemented in the STATEMATE system. This was the first executable semantics defined for the language and has been in use for almost a decade. In terms of the controversy around whether changes made in a given step should take effect in the current step or in the next one, this semantics adopts the latter approach.
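The next-step principle described above can be sketched concretely. The toy step function below (our own illustration, not STATEMATE's actual algorithm) fires transitions against the events present at the start of a step, so that events emitted during a step only take effect in the following one:

```python
# Hypothetical sketch of the STATEMATE-style step rule: events generated in
# a step become visible only in the *next* step, never in the current one.

def step(active, events, transitions):
    """Compute one synchronous step.

    transitions: list of (source_state, trigger_event, target_state, emitted_event)
    Returns (new_active_states, events_for_next_step).
    """
    new_active = set(active)
    emitted = set()
    for src, trig, tgt, emit in transitions:
        # Triggers are evaluated against the events present at the START of
        # the step; events emitted here cannot fire transitions in this step.
        if src in active and trig in events:
            new_active.discard(src)
            new_active.add(tgt)
            if emit is not None:
                emitted.add(emit)
    return new_active, emitted

transitions = [
    ("A", "go", "B", "done"),   # emits "done"
    ("C", "done", "D", None),   # reacts to "done"
]

# Step 1: "done" is emitted but cannot trigger C -> D within the same step.
s1, e1 = step({"A", "C"}, {"go"}, transitions)
# Step 2: the emitted "done" now takes effect.
s2, e2 = step(s1, e1, transitions)
```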

1,139 citations


Book
01 Jan 1996
TL;DR: Introduces the model programming language PCF and develops its syntax, semantics, and programming examples, together with universal algebra, simply typed and polymorphic lambda calculi and their models, logical relations, subtyping, and type inference with polymorphic declarations.
Abstract: Part 1 Introduction: model programming languages lambda notation equations, reduction and semantics types and type systems notation and mathematical conventions set-theoretic background syntax and semantics induction. Part 2 The language PCF: syntax of PCF PCF programmes and their semantics PCF reduction and symbolic interpreters PCF programming examples, expressive power and limitations variations and extensions of PCF. Part 3 Universal algebra and algebraic data types: preview of algebraic specification algebras, signatures and terms equations, soundness and completeness homomorphisms and initiality algebraic data types rewrite systems. Part 4 Simply-typed lambda calculus: types terms proof systems Henkin models, soundness and completeness. Part 5 Models of typed lambda calculus: domain-theoretic models and fixed points fixed-point induction computational adequacy and full abstraction recursion-theoretic models partial equivalence relations and recursion. Part 6 Imperative programmes: while programmes operational semantics denotational semantics before-after assertions about while programmes semantics of additional programme constructs. Part 7 Categories and recursive types: Cartesian closed categories Kripke lambda models and functor categories domain models of recursive types. Part 8 Logical relations: introduction to logical relations logical relations over applicative structures proof-theoretic results partial surjections and specific models representation independence generalizations of logical relations. Part 9 Polymorphism and modularity: predicative polymorphic calculus impredicative polymorphism data abstraction and existential types general products, sums and programme modules. Part 10 subtyping and related concepts: simply typed lambda calculus with subtyping records, semantic models of subtyping recursive types and a record model of objects polymorphism with subtype constraints. 
Part 11 Type inference: introduction to type inference type inference for λ→ with type variables type inference with polymorphic declarations.

603 citations


Proceedings Article
03 Sep 1996
TL;DR: This work proposes a unifying model that enables a uniform description of the problem of discovering association rules, and provides an SQL-like operator, named MINE RULE, capable of expressing all the problems presented so far in the literature concerning the mining of association rules.
Abstract: Data mining evolved as a collection of applicative problems and efficient solution algorithms relative to rather peculiar problems, all focused on the discovery of relevant information hidden in databases of huge dimensions. In particular, one of the most investigated topics is the discovery of association rules. This work proposes a unifying model that enables a uniform description of the problem of discovering association rules. The model provides an SQL-like operator, named MINE RULE, which is capable of expressing all the problems presented so far in the literature concerning the mining of association rules. We demonstrate the expressive power of the new operator by means of several examples, some of which are classical, while some others are fully original and correspond to novel and unusual applications. We also present the operational semantics of the operator by means of an extended relational algebra.
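The underlying notion the operator generalizes, association rules meeting support and confidence thresholds, can be sketched in a few lines. The brute-force miner below is our own minimal illustration, restricted to single-item rules for brevity; it is not the MINE RULE operator itself:

```python
from itertools import combinations

def mine_rules(baskets, min_support, min_confidence):
    """Enumerate rules {a} -> {b} over pairs of items (1-1 rules only,
    for brevity) meeting the support and confidence thresholds."""
    n = len(baskets)
    items = sorted({i for b in baskets for i in b})
    def support(itemset):
        return sum(1 for b in baskets if itemset <= b) / n
    rules = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            supp = support({lhs, rhs})
            if supp >= min_support and support({lhs}) > 0:
                conf = supp / support({lhs})
                if conf >= min_confidence:
                    rules.append((lhs, rhs, supp, conf))
    return rules

baskets = [{"bread", "butter"}, {"bread", "butter", "milk"}, {"bread"}, {"milk"}]
rules = mine_rules(baskets, min_support=0.5, min_confidence=0.6)
```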

346 citations


Journal ArticleDOI
TL;DR: In this paper, the authors extend Milner's language of types by distinguishing between the ability to read from a channel, the ability to write to a channel, and the ability both to read and to write.
Abstract: The π-calculus is a process algebra that supports mobility by focusing on the communication of channels. Milner's presentation of the π-calculus includes a type system assigning arities to channels and enforcing a corresponding discipline in their use. We extend Milner's language of types by distinguishing between the ability to read from a channel, the ability to write to a channel, and the ability both to read and to write. This refinement gives rise to a natural subtype relation similar to those studied in typed λ-calculi. The greater precision of our type discipline yields stronger versions of standard theorems on the π-calculus. These can be used, for example, to obtain the validity of β-reduction for the more efficient of Milner's encodings of the call-by-value λ-calculus, which fails in the ordinary π-calculus. We define the syntax, typing, subtyping, and operational semantics of our calculus, prove that the typing rules are sound, apply the system to Milner's λ-calculus encodings, and sketch extensions to higher-order process calculi and polymorphic typing.
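The read/write capability refinement induces a subtype relation: a channel usable for both reading and writing may be passed where only one capability is required, with read capabilities covariant and write capabilities contravariant in the payload type. The toy checker below makes this concrete; the term encoding is our own illustration, not the paper's syntax:

```python
# Toy checker for i/o capability subtyping in a typed pi-calculus.
# A type is either the base "unit" or ("chan", cap, payload_type)
# with cap in {"r", "w", "rw"}.

def subtype(s, t):
    if s == t:
        return True
    if s == "unit" or t == "unit":
        return False
    _, cs, ps = s
    _, ct, pt = t
    # A channel offering both capabilities may be used where only one is needed.
    if ct == "r":
        return cs in ("r", "rw") and subtype(ps, pt)      # read: covariant
    if ct == "w":
        return cs in ("w", "rw") and subtype(pt, ps)      # write: contravariant
    if ct == "rw":
        return cs == "rw" and subtype(ps, pt) and subtype(pt, ps)  # invariant
    return False

rw = ("chan", "rw", "unit")
r = ("chan", "r", "unit")
w = ("chan", "w", "unit")
```

For instance, a read/write channel can be used wherever a read-only or write-only channel is expected, but not the other way round.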

316 citations


Journal ArticleDOI
Douglas J. Howe1
TL;DR: This work gives a method for proving congruence of bisimulation-like equivalences and uses it to show that some generalizations of Abramsky's applicative bisimulation are congruences whenever evaluation can be specified by a certain natural form of structured operational semantics.
Abstract: We give a method for proving congruence of bisimulation-like equivalences in functional programming languages. The method applies to languages that can be presented as a set of expressions together with an evaluation relation. We use this method to show that some generalizations of Abramsky's applicative bisimulation are congruences whenever evaluation can be specified by a certain natural form of structured operational semantics. One of the generalizations handles nondeterminism and diverging computations.

267 citations


Book
22 May 1996
TL;DR: Algebraic Semantics of Imperative Programs presents a self-contained and novel "executable" introduction to formal reasoning about imperative programs, to improve programming ability by improving intuition about what programs mean and how they run.
Abstract: From the Publisher: Algebraic Semantics of Imperative Programs presents a self-contained and novel "executable" introduction to formal reasoning about imperative programs. The authors' primary goal is to improve programming ability by improving intuition about what programs mean and how they run. The semantics of imperative programs is specified in a formal, implemented notation, the language OBJ; this makes the semantics highly rigorous yet simple, and provides support for the mechanical verification of program properties. OBJ was designed for algebraic semantics; its declarations introduce symbols for sorts and functions, its statements are equations, and its computations are equational proofs. Thus, an OBJ "program" is an equational theory, and every OBJ computation proves some theorem about such a theory. This means that an OBJ program used for defining the semantics of a program already has a precise mathematical meaning. Moreover, standard techniques for mechanizing equational reasoning can be used for verifying axioms that describe the effect of imperative programs on abstract machines. These axioms can then be used in mechanical proofs of properties of programs. Intended for advanced undergraduates or beginning graduate students, Algebraic Semantics of Imperative Programs contains many examples and exercises in program verification, all of which can be done in OBJ.
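The idea that a computation is an equational proof can be sketched with a tiny rewriter. The Peano-addition equations and the term encoding below are our own illustration, not OBJ syntax:

```python
# Equational computation by rewriting: apply the equations
#   add(0, n) = n   and   add(s(m), n) = s(add(m, n))
# until a normal form is reached. Terms are nested tuples or the atom "0".

def rewrite(t):
    """Apply one equation at the outermost-leftmost redex, or return None."""
    if not isinstance(t, tuple):
        return None
    if t[0] == "add":
        m, n = t[1], t[2]
        if m == "0":
            return n
        if isinstance(m, tuple) and m[0] == "s":
            return ("s", ("add", m[1], n))
        inner = rewrite(m)
        if inner is not None:
            return ("add", inner, n)
    if t[0] == "s":
        inner = rewrite(t[1])
        if inner is not None:
            return ("s", inner)
    return None

def normalize(t):
    """Iterate single rewrites to a normal form (the proved result)."""
    while (r := rewrite(t)) is not None:
        t = r
    return t

one = ("s", "0")
two = ("s", ("s", "0"))
three = normalize(("add", two, one))   # the proof that 2 + 1 = 3
```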

202 citations


Journal ArticleDOI
TL;DR: It is shown that partial real numbers can be considered as “continuous words” and it is proved that the operational semantics is sound and complete with respect to the denotational semantics.

182 citations


Journal ArticleDOI
TL;DR: This paper presents Forum, a logic programming presentation of all of linear logic that modularly extends λProlog, Lolli, and LO, and specifies in it a sequent calculus proof system and the operational semantics of a programming language that incorporates references and concurrency.

166 citations


Proceedings ArticleDOI
15 Jun 1996
TL;DR: It is shown that the class of static synchronous data-flow can be extended to higher-order and dynamical networks, thus giving sense to a larger class of synchronous data-flow networks.
Abstract: Synchronous data-flow is a programming paradigm which has been successfully applied in reactive systems. In this context, it can be characterized as some class of static bounded-memory data-flow networks. In particular, these networks are not recursively defined, and obey some kind of "synchronous" constraints (clock calculus). Based on Kahn's relationship between data-flow and stream functions, the synchronous constraints can be related to Wadler's listlessness, and can be seen as sufficient conditions ensuring listless evaluation. As a by-product, those networks enjoy efficient compiling techniques. In this paper, we show that it is possible to extend the class of static synchronous data-flow to higher-order and dynamical networks, thus giving sense to a larger class of synchronous data-flow networks. This is done by extending the synchronous operational semantics, the clock calculus, and the compiling technique of static data-flow networks to these more general networks.

156 citations


Proceedings ArticleDOI
27 Jul 1996
TL;DR: The linear type theory LLF is presented as the formal basis for a conservative extension of the LF logical framework and can be given an operational interpretation as a logic programming language under which the representations above can be used for type inference, evaluation and cut-elimination.
Abstract: We present the linear type theory LLF as the formal basis for a conservative extension of the LF logical framework. LLF combines the expressive power of dependent types with linear logic to permit the natural and concise representation of a whole new class of deductive systems, namely those dealing with state. As an example we encode a version of Mini-ML with references including its type system, its operational semantics, and a proof of type preservation. Another example is the encoding of a sequent calculus for classical linear logic and its cut elimination theorem. LLF can also be given an operational interpretation as a logic programming language under which the representations above can be used for type inference, evaluation and cut-elimination.

148 citations


Journal ArticleDOI
TL;DR: Presents a complete treatment of hiding along with a detailed treatment of the model, shows that the model is expressive by defining combinators from the synchronous languages, and shows that it supports the properties of multiform time, orthogonal preemption, and executable specifications.


Book ChapterDOI
27 Jul 1996
TL;DR: This paper extends the needed narrowing strategy to higher-order functions and λ-terms as data structures, yielding the first calculus for higher-order functional logic programming that provides such an optimality result.
Abstract: Functional logic languages with a sound and complete operational semantics are mainly based on narrowing. Due to the huge search space of simple narrowing, steadily improved narrowing strategies have been developed in the past. Needed narrowing is currently the best narrowing strategy for first-order functional logic programs due to its optimality properties w.r.t. the length of derivations and the number of computed solutions. In this paper, we extend the needed narrowing strategy to higher-order functions and λ-terms as data structures. By the use of definitional trees, our strategy computes only incomparable solutions. Thus, it is the first calculus for higher-order functional logic programming which provides for such an optimality result. Since we allow higher-order logical variables denoting λ-terms, applications go beyond current functional and logic programming languages.

Journal ArticleDOI
TL;DR: SL is a new programming language of the synchronous reactive family in which hypotheses about signal presence/absence are disallowed and sources of causal circularities are avoided, while only weak preemption remains.
Abstract: We present SL, a new programming language of the synchronous reactive family in which hypotheses about signal presence/absence are disallowed. One can decide that a signal is absent during an instant only at the end of this instant, and so reaction to this absence is delayed to the next instant. Sources of causal circularities are avoided, while only weak preemption remains. A structural operational semantics is provided through rewrite rules, and an implementation is described. In addition to directly executing programs, this implementation can also be used to produce automata by symbolic evaluation.

Book ChapterDOI
26 Aug 1996
TL;DR: A new model for message-passing processes is proposed which generalizes the notion of symbolic transition graph introduced in [HL95] by allowing assignments to be carried on transitions; an algorithm computes bisimulation formulae for such graphs in terms of the greatest solutions of predicate equation systems.
Abstract: A new model for message-passing processes is proposed which generalizes the notion of symbolic transition graph as introduced in [HL95], by allowing assignments to be carried in transitions. The main advantage of this generalization is that a wider class of processes can be represented as finite state graphs. Two kinds of operational semantics, ground and symbolic, are given to such graphs. On top of them both ground and symbolic bisimulations are defined and are shown to agree with each other. An algorithm is also presented which computes bisimulation formulae for finite state symbolic transition graphs with assignments, in terms of the greatest solutions of predicate equation systems.

01 Jan 1996
TL;DR: This thesis investigates the extension of programming languages to support the notion of physical dimension by presenting a type system similar to that of the programming language ML but extended with polymorphic dimension types.
Abstract: Scientists and engineers must ensure that the equations and formulae which they use are dimensionally consistent, but existing programming languages treat all numeric values as dimensionless. This thesis investigates the extension of programming languages to support the notion of physical dimension. A type system is presented similar to that of the programming language ML but extended with polymorphic dimension types. An algorithm which infers most general dimension types automatically is then described and proved correct. The semantics of the language is given by a translation into an explicitly typed language in which dimensions are passed as arguments to functions. The operational semantics of this language is specified in the usual way by an evaluation relation defined by a set of rules. This is used to show that if a program is well-typed then no dimension errors can occur during its evaluation. More abstract properties of the language are investigated using a denotational semantics: these include a notion of invariance under changes in the units of measure used, analogous to parametricity in the polymorphic lambda calculus. Finally the dissertation is summarised and many possible directions for future research in dimension types and related type systems are described.
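The invariant the thesis enforces statically, that only quantities of the same dimension may be added while multiplication sums dimension exponents, can be illustrated with a small runtime check. This is a hypothetical sketch of the invariant only, not the thesis's ML-style inference algorithm:

```python
# Dimensions are exponent maps, e.g. m/s**2 is {"m": 1, "s": -2}.

class Dim:
    def __init__(self, value, dims):
        self.value = value
        # Drop zero exponents so dimensionless factors cancel cleanly.
        self.dims = {k: v for k, v in dims.items() if v != 0}

    def __add__(self, other):
        if self.dims != other.dims:
            raise TypeError("dimension mismatch")  # e.g. metres + seconds
        return Dim(self.value + other.value, self.dims)

    def __mul__(self, other):
        dims = dict(self.dims)
        for k, v in other.dims.items():
            dims[k] = dims.get(k, 0) + v
        return Dim(self.value * other.value, dims)

metres = Dim(3.0, {"m": 1})
seconds = Dim(2.0, {"s": 1})
speed = Dim(10.0, {"m": 1, "s": -1})
distance = speed * seconds          # m/s * s = m
total = distance + metres           # both in metres: fine
```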

Journal ArticleDOI
TL;DR: The weak versions of Apt/Blair/Walker's stratified semantics M_P^supp and of Van Gelder/Ross/Schlipf's well-founded semantics WFS are investigated, and it is shown that credulous entailment for both semantics is NP-complete (consequently, sceptical entailment is co-NP-complete).
Abstract: It is well known that Minker's semantics GCWA for positive disjunctive programs P, i.e. deciding whether a literal is true in all minimal models of P, is Π_2^P-complete. This is in contrast to the same entailment problem for semantics of non-disjunctive programs such as STABLE and SUPPORTED (both are co-NP-complete) as well as M_P^supp and WFS (which are even polynomial). Recently, the idea of reducing disjunctive to non-disjunctive programs by using so-called shift-operations was introduced independently by Bonatti, Dix/Gottlob/Marek, and Schaerf. In fact, Schaerf associated to each semantics SEM for normal programs a corresponding semantics Weak-SEM for disjunctive programs and asked for the properties of these weak semantics, in particular for the complexity of their entailment relations. While Schaerf concentrated on Weak-STABLE and Weak-SUPPORTED, we investigate the weak versions of Apt/Blair/Walker's stratified semantics M_P^supp and of Van Gelder/Ross/Schlipf's well-founded semantics WFS. We show that credulous entailment for both semantics is NP-complete (consequently, sceptical entailment is co-NP-complete). Thus, unlike GCWA, the complexity of these semantics belongs to the first level of the polynomial hierarchy. Note that, unlike Weak-WFS, the semantics Weak-M_P^supp is not always defined: testing consistency of Weak-M_P^supp is also NP-complete. We also show that Weak-WFS and Weak-M_P^supp are cumulative and rational and that, in addition, Weak-WFS satisfies some of the well-behaved principles introduced by Dix.

Journal ArticleDOI
TL;DR: This work sketches the background for the development of action semantics, summarizes the main ideas of the framework, provides a simple illustrative example of an ASD, and identifies which features of ASDs are crucial for good pragmatics.
Abstract: Action Semantics is a framework for the formal description of programming languages. Its main advantage over other frameworks is pragmatic: action-semantic descriptions (ASDs) scale up smoothly to realistic programming languages. This is due to the inherent extensibility and modifiability of ASDs, ensuring that extensions and changes to the described language require only proportionate changes in its description. (In denotational or operational semantics, adding an unforeseen construct to a language may require a reformulation of the entire description.) After sketching the background for the development of action semantics, we summarize the main ideas of the framework, and provide a simple illustrative example of an ASD. We identify which features of ASDs are crucial for good pragmatics. Then we explain the foundations of action semantics, and survey recent advances in its theory and practical applications. Finally, we assess the prospects for further development and use of action semantics. The action semantics framework was initially developed at the University of Aarhus by the present author, in collaboration with David Watt (University of Glasgow). Groups and individuals scattered around five continents have since contributed to its theory and practice.

Book ChapterDOI
01 Feb 1996
TL;DR: A novel approach is proposed, based on a notion of objects characterized in terms of their observable behavior, which leads to considerable accuracy in the semantic modelling of locality and single-threadedness properties of objects.
Abstract: Semantics of imperative programming languages is traditionally described in terms of functions on global states. We propose here a novel approach based on a notion of objects and characterize them in terms of their observable behavior. States are regarded as part of the internal structure of objects and play no role in the observable behavior. It is shown that this leads to considerable accuracy in the semantic modelling of locality and single-threadedness properties of objects.

Book ChapterDOI
23 Sep 1996
TL;DR: This work presents its own state-oriented logical approach to active rules which combines the declarative semantics of deductive rules with the possibility to define updates in the style of production rules and active rules.
Abstract: After briefly reviewing the basic notions and terminology of active rules and relating them to production rules and deductive rules, respectively, we survey a number of formal approaches to active rules. Subsequently, we present our own state-oriented logical approach to active rules which combines the declarative semantics of deductive rules with the possibility to define updates in the style of production rules and active rules. The resulting language Statelog is surprisingly simple, yet captures many features of active rules including composite event detection and different coupling modes. Thus, it can be used for the formal analysis of rule properties like termination and expressive power. Finally, we show how nested transactions can be modeled in Statelog, both from the operational and the model-theoretic perspective.

Journal ArticleDOI
TL;DR: Builds domain-theoretic concepts upon an operational foundation, extending a single-step reduction theory with the notions of directed set, least upper bound, complete partial order, monotonicity, continuity, finite element, ω-algebraicity, full abstraction, and least fixed points, and uses these concepts to construct a (strongly) fully abstract continuous model for the authors' language.
Abstract: This paper builds domain theoretic concepts upon an operational foundation. The basic operational theory consists of a single step reduction system from which an operational ordering and equivalence on programs are defined. The theory is then extended to include concepts from domain theory, including the notions of directed set, least upper bound, complete partial order, monotonicity, continuity, finite element, ω-algebraicity, full abstraction, and least fixed point properties. We conclude by using these concepts to construct a (strongly) fully abstract continuous model for our language. In addition we generalize a result of Milner and prove the uniqueness of such models.
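The starting point, a single-step reduction system from which evaluation is obtained by iterating steps, can be sketched for a toy term language. The grammar below is our own invention for illustration, not the paper's language:

```python
# Terms are ints (values) or ("add", t1, t2); step performs one leftmost
# reduction; eval_term iterates steps until a value is reached.

def step(t):
    """One small-step reduction, or None if t is already a value."""
    if isinstance(t, int):
        return None
    op, l, r = t
    if not isinstance(l, int):
        return (op, step(l), r)    # reduce inside the left operand
    if not isinstance(r, int):
        return (op, l, step(r))    # then inside the right operand
    return l + r                   # both values: perform the addition

def eval_term(t):
    """Evaluation = transitive closure of single steps down to a value."""
    while not isinstance(t, int):
        t = step(t)
    return t

prog = ("add", ("add", 1, 2), ("add", 3, 4))
```

An operational ordering or equivalence can then be defined on top of this: two terms are operationally equivalent when they evaluate to the same value in all program contexts.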

Book ChapterDOI
22 Apr 1996
TL;DR: In this article, a distinction is drawn between denotational and operational semantics, and a desideratum of Intensional Semantics is proposed, interpolating between the two types of semantics as traditionally conceived.
Abstract: The “classical” paradigm for denotational semantics models data types as domains, i.e. structured sets of some kind, and programs as (suitable) functions between domains. The semantic universe in which the denotational modelling is carried out is thus a category with domains as objects, functions as morphisms, and composition of morphisms given by function composition. A sharp distinction is then drawn between denotational and operational semantics. Denotational semantics is often referred to as “mathematical semantics” because it exhibits a high degree of mathematical structure; this is in part achieved by the fact that denotational semantics abstracts away from the dynamics of computation—from time. By contrast, operational semantics is formulated in terms of the syntax of the language being modelled; it is highly intensional in character; and it is capable of expressing the dynamical aspects of computation. The classical denotational paradigm has been very successful, but has some definite limitations. Firstly, fine-structural features of computation, such as sequentiality, computational complexity, and optimality of reduction strategies, have either not been captured at all denotationally, or not in a fully satisfactory fashion. Moreover, once languages with features beyond the purely functional are considered, the appropriateness of modelling programs by functions is increasingly open to question. Neither concurrency nor “advanced” imperative features such as local references have been captured denotationally in a fully convincing fashion. This analysis suggests a desideratum of Intensional Semantics, interpolating between denotational and operational semantics as traditionally conceived. This should combine the good mathematical structural properties of denotational semantics with the ability to capture dynamical aspects and to embody computational intuitions of operational semantics. 
Thus we may think of Intensional semantics as “Denotational semantics + time (dynamics)”, or as “Syntax-free operational semantics”. A number of recent developments (and, with hindsight, some older ones) can be seen as contributing to this goal of Intensional Semantics. We will focus on the recent work on Game semantics, which has led to some striking advances in the Full Abstraction problem for PCF and other programming languages (Abramsky et al. 1995) (Abramsky and McCusker 1995) (Hyland and Ong 1995) (McCusker 1996a) (Ong 1996). Our aim is to give a genuinely elementary first introduction; we therefore present a simplified version of game semantics, which nonetheless

Journal ArticleDOI
TL;DR: An efficient implementation of goal-oriented effective query evaluation under the well-founded semantics is presented; it produces a residual program for subgoals that are relevant to a query, which contains facts for true instances and clauses with body literals for undefined instances.
Abstract: The well-founded semantics and the stable model semantics capture intuitions of the skeptical and credulous semantics in nonmonotonic reasoning, respectively. They represent the two dominant proposals for the declarative semantics of deductive databases and logic programs. However, neither semantics seems to be suitable for all applications. We have developed an efficient implementation of goal-oriented effective query evaluation under the well-founded semantics. It produces a residual program for subgoals that are relevant to a query, which contains facts for true instances and clauses with body literals for undefined instances. We present a simple method of stable model computation that can be applied to the residual program of a query to derive answers with respect to stable models. The method incorporates both forward and backward chaining to propagate the assumed truth values of ground atoms, and derives multiple stable models through backtracking. Users are able to request that only stable models satisfying certain conditions be computed. A prototype has been developed that provides integrated query evaluation under the well-founded semantics, the stable models, and ordinary Prolog execution. We describe the user interface of the prototype and present some experimental results.
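The stable models that the engine derives through propagation and backtracking can be characterized, for tiny propositional programs, by a brute-force Gelfond-Lifschitz check. The encoding below is our own illustration of that definition, not the prototype's algorithm:

```python
from itertools import chain, combinations

# A rule is (head, pos_body, neg_body) over atom strings.

def reduct_consequences(rules, candidate):
    """Minimal model of the reduct of `rules` w.r.t. `candidate`."""
    # Drop rules whose negative body intersects the candidate, then
    # forward-chain the remaining definite rules.
    definite = [(h, pos) for h, pos, neg in rules if not (set(neg) & candidate)]
    model = set()
    changed = True
    while changed:
        changed = False
        for h, pos in definite:
            if set(pos) <= model and h not in model:
                model.add(h)
                changed = True
    return model

def stable_models(rules, atoms):
    # A candidate set is stable iff it equals the minimal model of its reduct.
    subsets = chain.from_iterable(combinations(atoms, k) for k in range(len(atoms) + 1))
    return [set(s) for s in subsets if reduct_consequences(rules, set(s)) == set(s)]

# p :- not q.   q :- not p.
rules = [("p", [], ["q"]), ("q", [], ["p"])]
models = stable_models(rules, ["p", "q"])
```

This two-rule program has exactly the two stable models {p} and {q}, the classic example of credulous versus sceptical entailment.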

Journal ArticleDOI
TL;DR: A strategy language is developed whose operational semantics is also based on rewriting and is described in ELAN, a language based on computational systems that are simply rewriting theories controlled by strategies.

Book
01 Jan 1996
TL;DR: This book discusses Sequential vs Parallel Systems, Semantics of Sequential Programs, Operational Semantics and Fairness, and Some Proofs and Solutions.
Abstract: 1. Introduction. 2. Basic Mathematical Concepts. 3. Semantics of Sequential Programs. 4. Sequential vs Parallel Systems. 5. Control Programs and Petri Nets. 6. Operational Semantics and Fairness. 7. Programs with Shared Data. 8. Communicating Programs. 9. Some Proofs and Solutions.

Book ChapterDOI
25 Sep 1996
TL;DR: A new BAN-like logic and a new formal semantics for logics of authentication are presented; the logic handles most kinds of protocols used in practice, and the semantics enables the detection of flaws in previous logics.
Abstract: We present a new BAN-like logic and a new formal semantics for logics of authentication. The main focus of this paper is on the foundation of this logic by a possible-worlds semantics. The logic was designed for implementation in the tool AUTLOG and is able to handle most kinds of protocols used in practice. The underlying logic is a K45-logic, including negation. We replace the critical idealization step by changing the set of premises. The formal semantics enables us to detect flaws in previous logics. We apply the logic to a new authentication protocol designed for UMTS.

Book ChapterDOI
27 Jul 1996
TL;DR: A parametric logical relation between the phrases of an Algol-like language is presented; it yields an applicative characterisation of contextual equivalence for the language and provides a useful (and complete) method for proving equivalences.
Abstract: A parametric logical relation between the phrases of an Algol-like language is presented. Its definition involves the structural operational semantics of the language, but was inspired by recent denotationally-based work of O'Hearn and Reynolds on translating Algol into a predicatively polymorphic linear lambda calculus. The logical relation yields an applicative characterisation of contextual equivalence for the language and provides a useful (and complete) method for proving equivalences. Its utility is illustrated by giving simple and direct proofs of some contextual equivalences, including an interesting equivalence due to O'Hearn which hinges upon the undefinability of 'snapback' operations (and which goes beyond the standard suite of 'Meyer-Sieber' examples). Whilst some of the mathematical intricacies of denotational semantics are avoided, the hard work in this operational approach lies in establishing the 'fundamental property' for the logical relation-the proof of which makes use of a compactness property of fixpoint recursion with respect to evaluation of phrases. But once this property has been established, the logical relation provides a verification method with an attractively low mathematical overhead.

Book ChapterDOI
26 Aug 1996
TL;DR: This paper presents an imperative and concurrent extension of the functional object-oriented calculus, which is the first concurrent object calculus to be studied, and presents a subject reduction theorem, modified to account for imperative and concurrent features, and type and effect soundness theorems.
Abstract: This paper presents an imperative and concurrent extension of the functional object-oriented calculus described in [FHM94]. It belongs to the family of so-called prototype-based object-oriented languages, in which objects are created from existing ones via the inheritance primitives of object extension and method override. Concurrency is introduced through the identification of objects and processes. To our knowledge, the resulting calculus is the first concurrent object calculus to be studied. We define an operational semantics for the calculus via a transition relation between configurations, which represent snapshots of the run-time system. Our static analysis includes a type inference system, which statically detects message-not-understood errors, and an effect system, which guarantees that synchronization code, specified via guards, is side-effect free. We present a subject reduction theorem, modified to account for imperative and concurrent features, and type and effect soundness theorems.

Proceedings ArticleDOI
15 Jun 1996
TL;DR: This paper builds upon John Reppy's reduction semantics for CML by constructing a compositional operational semantics for a fragment of CML, based on higher-order process algebra, which is used to build a semantic theory for CML based on weak bisimulation equivalence.
Abstract: Concurrent ML (CML) is an extension of Standard ML of New Jersey with concurrent features similar to those of process algebra. In this paper we build upon John Reppy's reduction semantics for CML by constructing a compositional operational semantics for a fragment of CML, based on higher-order process algebra. We use this to build a semantic theory for CML, based on weak bisimulation equivalence. We give some small examples of proofs about CML expressions, and show that our semantics corresponds to Reppy's up to weak first-order bisimulation.

Book ChapterDOI
22 Apr 1996