scispace - formally typeset

Showing papers on "Operational semantics published in 2010"


Book
01 Jan 2010
TL;DR: Introduction 1. Historical-Philological Semantics 2. Structuralist Semantics 3. Generativist Semantics 4. Neostructuralist Semantics 5. Cognitive Semantics Conclusion References
Abstract: Introduction 1. Historical-Philological Semantics 2. Structuralist Semantics 3. Generativist Semantics 4. Neostructuralist Semantics 5. Cognitive Semantics Conclusion References

460 citations


Proceedings ArticleDOI
03 Oct 2010
TL;DR: Henshin is a new language and associated tool set for in-place transformations of EMF models using pattern-based rules on the lowest level, which can be structured into nested transformation units with well-defined operational semantics.
Abstract: The Eclipse Modeling Framework (EMF) provides modeling and code generation facilities for Java applications based on structured data models. Henshin is a new language and associated tool set for in-place transformations of EMF models. The Henshin transformation language uses pattern-based rules on the lowest level, which can be structured into nested transformation units with well-defined operational semantics. So-called amalgamation units are a special type of transformation units that provide a forall-operator for pattern replacement. For all of these concepts, Henshin offers a visual syntax, sophisticated editing functionalities, execution and analysis tools. The Henshin transformation language has its roots in attributed graph transformations, which offer a formal foundation for validation of EMF model transformations. The transformation concepts are demonstrated using two case studies: EMF model refactoring and meta-model evolution.

394 citations


Book
10 Oct 2010
TL;DR: Understanding Concurrent Systems presents a comprehensive introduction to CSP, and introduces other views of concurrency, using CSP to model and explain these, and explores the practical application of CSP.
Abstract: CSP notation has been used extensively for teaching and applying concurrency theory, ever since the publication of the text Communicating Sequential Processes by C.A.R. Hoare in 1985. Both a programming language and a specification language, the theory of CSP helps users to understand concurrent systems, and to decide whether a program meets its specification. As a member of the family of process algebras, the concepts of communication and interaction are presented in an algebraic style. An invaluable reference on the state of the art in CSP, Understanding Concurrent Systems also serves as a comprehensive introduction to the field, in addition to providing material for a number of more advanced courses. A first point of reference for anyone wanting to use CSP or learn about its theory, the book also introduces other views of concurrency, using CSP to model and explain these. The text is fully integrated with CSP-based tools such as FDR, and describes how to create new tools based on FDR. Most of the book relies on no theoretical background other than a basic knowledge of sets and sequences. Sophisticated mathematical arguments are avoided whenever possible. 
Topics and features: presents a comprehensive introduction to CSP; discusses the latest advances in CSP, covering topics of operational semantics, denotational models, finite observation models and infinite-behaviour models, and algebraic semantics; explores the practical application of CSP, including timed modelling, discrete modelling, parameterised verifications and the state explosion problem, and advanced topics in the use of FDR; examines the ability of CSP to describe and enable reasoning about parallel systems modelled in other paradigms; covers a broad variety of concurrent systems, including combinatorial, timed, priority-based, mobile, shared variable, statecharts, buffered and asynchronous systems; contains exercises and case studies to support the text; supplies further tools and information at the associated website: http://www.comlab.ox.ac.uk/ucs/. From undergraduate students of computer science in need of an introduction to the area, to researchers and practitioners desiring a more in-depth understanding of theory and practice of concurrent systems, this broad-ranging text/reference is essential reading for anyone interested in Hoare's CSP.
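The book's treatment of operational semantics for CSP can be suggested with a small sketch (my simplification, not Roscoe's formal definitions or FDR's implementation): processes are terms built from STOP, event prefixing, and external choice, and a function computes the labelled transitions of the structural operational semantics.

```python
# A minimal sketch of CSP-style structural operational semantics.
# Process terms: STOP, prefix (a -> P), and external choice (P [] Q).

STOP = ("STOP",)

def prefix(event, proc):
    # a -> P : can perform `event`, then behave as `proc`
    return ("PREFIX", event, proc)

def choice(p, q):
    # P [] Q : external choice, offers the initial events of both sides
    return ("CHOICE", p, q)

def transitions(proc):
    """Return the set of (event, successor) pairs for one SOS step."""
    tag = proc[0]
    if tag == "STOP":
        return set()                          # STOP offers nothing
    if tag == "PREFIX":
        _, ev, cont = proc
        return {(ev, cont)}                   # one step: do ev, become cont
    if tag == "CHOICE":
        _, p, q = proc
        return transitions(p) | transitions(q)  # union of both sides' steps
    raise ValueError(f"unknown process: {tag}")

# Example: (coin -> choc -> STOP) [] (token -> STOP)
vm = choice(prefix("coin", prefix("choc", STOP)),
            prefix("token", STOP))
```

Exploring `transitions` from an initial process term generates exactly the labelled transition system that tools such as FDR traverse during refinement checking.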

348 citations


Proceedings ArticleDOI
21 Jun 2010
TL;DR: This work reduces JavaScript to a core calculus structured as a small-step operational semantics, and explicates the desugaring process that turns JavaScript programs into ones in the core.
Abstract: We reduce JavaScript to a core calculus structured as a small-step operational semantics. We present several peculiarities of the language and show that our calculus models them. We explicate the desugaring process that turns JavaScript programs into ones in the core. We demonstrate faithfulness to JavaScript using real-world test suites. Finally, we illustrate utility by defining a security property, implementing it as a type system on the core, and extending it to the full language.
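The shape of a small-step operational semantics for a reduced core can be sketched as follows (an illustrative call-by-value lambda core with numbers, not the paper's actual λJS calculus; capture-avoiding substitution is simplified by assuming distinct names):

```python
# A minimal sketch of small-step operational semantics for a tiny untyped
# core calculus, in the spirit of reducing a large language to a small core.

def is_value(t):
    return t[0] in ("num", "lam")

def subst(t, x, v):
    """Substitute value v for variable x (assumes all names are distinct)."""
    tag = t[0]
    if tag == "var":
        return v if t[1] == x else t
    if tag == "num":
        return t
    if tag == "lam":
        _, y, body = t
        return t if y == x else ("lam", y, subst(body, x, v))
    if tag == "app":
        return ("app", subst(t[1], x, v), subst(t[2], x, v))
    raise ValueError(tag)

def step(t):
    """One small step; raises if `t` is a value or stuck."""
    if t[0] == "app":
        f, a = t[1], t[2]
        if not is_value(f):
            return ("app", step(f), a)   # evaluate the function position
        if not is_value(a):
            return ("app", f, step(a))   # then the argument
        assert f[0] == "lam"
        _, x, body = f
        return subst(body, x, a)         # beta reduction
    raise ValueError("stuck or value")

def run(t):
    while not is_value(t):
        t = step(t)
    return t

# (\x. x) 42
ident_app = ("app", ("lam", "x", ("var", "x")), ("num", 42))
```

In the paper's setup, a desugaring function first translates full JavaScript into such a core; the semantics then only needs rules for the core forms.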

239 citations


Journal ArticleDOI
TL;DR: A multithreaded functional language with session types is defined, which unifies, simplifies and extends previous work, and significantly simplifies session types in the functional setting, clarifies their essential features and provides a secure foundation for language developments such as polymorphism and object-orientation.
Abstract: Session types support a type-theoretic formulation of structured patterns of communication, so that the communication behaviour of agents in a distributed system can be verified by static typechecking. Applications include network protocols, business processes and operating system services. In this paper we define a multithreaded functional language with session types, which unifies, simplifies and extends previous work. There are four main contributions. First is an operational semantics with buffered channels, instead of the synchronous communication of previous work. Second, we prove that the session type of a channel gives an upper bound on the necessary size of the buffer. Third, session types are manipulated by means of the standard structures of a linear type theory, rather than by means of new forms of typing judgement. Fourth is a notion of subtyping, including the standard subtyping relation for session types (imported into the functional setting), and a novel form of subtyping between standard and linear function types, which allows the typechecker to handle linear types conveniently. Our new approach significantly simplifies session types in the functional setting, clarifies their essential features and provides a secure foundation for language developments such as polymorphism and object-orientation.
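The flavour of the second contribution (the session type bounds the buffer size) can be illustrated with a sketch. This is my drastic simplification, not the paper's formal system: a branch-free session type is written as a string of `!` (send) and `?` (receive) actions, and the maximum number of outstanding sends over any prefix bounds the buffer that a buffered-channel semantics ever needs.

```python
# A minimal sketch: for a simplified, branch-free session type given as a
# sequence of '!' (send) and '?' (receive) actions, the buffer occupancy at
# any point is the sends-minus-receives count over the prefix, so its
# maximum bounds the required buffer size.

def buffer_bound(session):
    depth = best = 0
    for action in session:
        if action == "!":
            depth += 1                      # a send enqueues one message
        elif action == "?":
            depth = max(depth - 1, 0)       # a receive dequeues one
        best = max(best, depth)
    return best
```

For example, the type `!!?!?` never has more than two messages in flight, so a two-slot buffer suffices.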

212 citations


Journal ArticleDOI
TL;DR: A formal operational semantics of the ω-calculus is given in terms of labeled transition systems and it is proved that the state reachability problem is decidable for finite-control ω-processes and that late bisimulation equivalence is a congruence.

114 citations


Proceedings ArticleDOI
Adam Chlipala1
17 Jan 2010
TL;DR: A verified compiler to an idealized assembly language from a small, untyped functional language with mutable references and exceptions is presented, based on a new approach to encoding operational semantics which delegates all concerns about substitution to the meta language, without using features incompatible with general-purpose type theories like Coq's logic.
Abstract: We present a verified compiler to an idealized assembly language from a small, untyped functional language with mutable references and exceptions. The compiler is programmed in the Coq proof assistant and has a proof of total correctness with respect to big-step operational semantics for the source and target languages. Compilation is staged and includes standard phases like translation to continuation-passing style and closure conversion, as well as a common subexpression elimination optimization. In this work, our focus has been on discovering and using techniques that make our proofs easy to engineer and maintain. While most programming language work with proof assistants uses very manual proof styles, all of our proofs are implemented as adaptive programs in Coq's tactic language, making it possible to reuse proofs unchanged as new language features are added.In this paper, we focus especially on phases of compilation that rearrange the structure of syntax with nested variable binders. That aspect has been a key challenge area in past compiler verification projects, with much more effort expended in the statement and proof of binder-related lemmas than is found in standard pencil-and-paper proofs. We show how to exploit the representation technique of parametric higher-order abstract syntax to avoid the need to prove any of the usual lemmas about binder manipulation, often leading to proofs that are actually shorter than their pencil-and-paper analogues. Our strategy is based on a new approach to encoding operational semantics which delegates all concerns about substitution to the meta language, without using features incompatible with general-purpose type theories like Coq's logic.
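The key idea, delegating substitution to the meta language, can be suggested even outside Coq. In the sketch below (illustrative only; the paper uses parametric HOAS in Coq, not Python), a lambda node carries a host-language function, so beta reduction is just host-language application and no substitution function or binder lemmas are needed.

```python
# A minimal sketch of higher-order abstract syntax: Lam carries a *Python
# function*, so the meta language handles all binding and substitution.

def Num(n):    return ("num", n)
def Lam(f):    return ("lam", f)       # f maps a value-term to a term
def App(f, a): return ("app", f, a)

def evaluate(t):
    tag = t[0]
    if tag in ("num", "lam"):
        return t                        # values evaluate to themselves
    if tag == "app":
        f = evaluate(t[1])
        a = evaluate(t[2])
        assert f[0] == "lam"
        return evaluate(f[1](a))        # beta step = host-language call
    raise ValueError(tag)

# (\x. x) 7 -- the bound variable is just the Python parameter `x`
prog = App(Lam(lambda x: x), Num(7))
```

Because there is no `subst` function over the syntax, there are no lemmas about it to state or prove; that is the payoff the paper exploits at scale in its compiler-correctness proofs.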

108 citations


Proceedings ArticleDOI
17 Jan 2010
TL;DR: Static typing guarantees that both sequences of messages on channels, and sequences of method calls on objects, conform to type-theoretic specifications, thus ensuring type-safety, for a small distributed class-based object-oriented language.
Abstract: Session types allow communication protocols to be specified type-theoretically so that protocol implementations can be verified by static type-checking. We extend previous work on session types for distributed object-oriented languages in three ways. (1) We attach a session type to a class definition, to specify the possible sequences of method calls. (2) We allow a session type (protocol) implementation to be modularized, i.e. partitioned into separately-callable methods. (3) We treat session-typed communication channels as objects, integrating their session types with the session types of classes. The result is an elegant unification of communication channels and their session types, distributed object-oriented programming, and a form of typestates supporting non-uniform objects, i.e. objects that dynamically change the set of available methods. We define syntax, operational semantics, a sound type system, and a correct and complete type checking algorithm for a small distributed class-based object-oriented language. Static typing guarantees that both sequences of messages on channels, and sequences of method calls on objects, conform to type-theoretic specifications, thus ensuring type-safety. The language includes expected features of session types, such as delegation, and expected features of object-oriented programming, such as encapsulation of local state. We also describe a prototype implementation as an extension of Java.
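The typestate idea behind contribution (1) can be sketched as a finite automaton over method names: a call sequence conforms to the class's session type if it follows the automaton's edges and ends in a completed state. The file protocol below is an invented example, not one from the paper.

```python
# A minimal sketch of a class session type as an automaton over methods:
# a file object must be opened before reads and must finally be closed.

FILE_PROTOCOL = {
    ("CLOSED", "open"):  "OPEN",
    ("OPEN",   "read"):  "OPEN",
    ("OPEN",   "close"): "CLOSED",
}

def conforms(calls, protocol=FILE_PROTOCOL, start="CLOSED", accept={"CLOSED"}):
    state = start
    for method in calls:
        key = (state, method)
        if key not in protocol:
            return False      # method not available in this typestate
        state = protocol[key]
    return state in accept    # the protocol must run to completion
```

In the paper this check happens statically, by typechecking, rather than over a recorded call trace as here.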

98 citations


Journal ArticleDOI
TL;DR: A formal approach to robot motion specification that takes into account three elementary behaviors that suffice to define any robot interaction with the environment, i.e. free motion, exerting generalized forces and the transition between both of these behaviors.
Abstract: In this paper we present a formal approach to robot motion specification. This motion specification takes into account three elementary behaviors that suffice to define any robot interaction with the environment, i.e. free motion, exerting generalized forces and the transition between both of these behaviors. These behaviors provide a foundation for general motion generation taking into account any sensors, any effectors and the capability to exchange information between embodied agents. This specification can be used both for the definition of robot tasks and implementation of robot control software, hence both of those aspects are presented in this paper. This formal approach was used for the implementation of the MRROC++ robot programming framework. Two-handed manipulation of a Rubik's cube is used as an exemplary task. Extensive experimentation both with the presented formalism and the MRROC++ framework showed that the imposed formal rigor eliminates many errors at the software specification phase, produces well-structured control software and significantly speeds up and simplifies its implementation. These advantages are mainly due to the fact that the proposed formal specification tool is derived from the operational semantics used in computer science for the definition of programming languages; a close relationship between the abstract definition and the implementation of the control system thus results.

84 citations


Proceedings ArticleDOI
09 Jan 2010
TL;DR: A type system that solves the open problem of context-sensitive may-happen-in-parallel analysis for languages with async-finish parallelism and proves the correctness of the type system.
Abstract: We present a core calculus with two of X10's key constructs for parallelism, namely async and finish. Our calculus forms a convenient basis for type systems and static analyses for languages with async-finish parallelism, and for tractable proofs of correctness. For example, we give a short proof of the deadlock-freedom theorem of Saraswat and Jagadeesan. Our main contribution is a type system that solves the open problem of context-sensitive may-happen-in-parallel analysis for languages with async-finish parallelism. We prove the correctness of our type system and we report experimental results of performing type inference on 13,000 lines of X10 code. Our analysis runs in polynomial time, takes a total of 28 seconds on our benchmarks, and produces a low number of false positives, which suggests that our analysis is a good basis for other analyses such as race detectors.
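A naive whole-program version of may-happen-in-parallel analysis for async-finish programs can be sketched as follows (a toy illustration of the problem the paper's type system solves compositionally, not the paper's algorithm). An `async` body may run in parallel with everything that follows it in its enclosing sequence, but `finish` joins all asyncs spawned inside it.

```python
# A minimal sketch of may-happen-in-parallel (MHP) for async-finish trees.
# Nodes: ("stmt", label), ("async", body), ("finish", body), ("seq", [children]).

def analyze(node):
    """Return (all_labels, escaping_labels, mhp_pairs), where
    escaping_labels are async statements not yet joined in this subtree."""
    tag = node[0]
    if tag == "stmt":
        return ({node[1]}, set(), set())
    if tag == "async":
        labels, _, pairs = analyze(node[1])
        return (labels, labels, pairs)     # whole body escapes upward
    if tag == "finish":
        labels, _, pairs = analyze(node[1])
        return (labels, set(), pairs)      # finish joins all inner asyncs
    if tag == "seq":
        all_l, esc, pairs = set(), set(), set()
        for child in node[1]:
            cl, ce, cp = analyze(child)
            pairs |= cp
            # escaped asyncs from earlier children run parallel to later ones
            pairs |= {frozenset((a, b)) for a in esc for b in cl if a != b}
            all_l |= cl
            esc |= ce
        return (all_l, esc, pairs)
    raise ValueError(tag)
```

For `finish { async s1; s2 }; s3`, only `s1` and `s2` may happen in parallel: the `finish` guarantees `s1` has completed before `s3` starts.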

77 citations


Proceedings ArticleDOI
05 Jun 2010
TL;DR: This paper presents a core language with speculation constructs and mutable state and presents a formal operational semantics for the language, which uses the semantics to define the notion of a correct speculative execution as one that is equivalent to a non-speculative execution.
Abstract: Execution order constraints imposed by dependences can serialize computation, preventing parallelization of code and algorithms. Speculating on the value(s) carried by dependences is one way to break such critical dependences. Value speculation has been used effectively at a low level, by compilers and hardware. In this paper, we focus on the use of speculation by programmers as an algorithmic paradigm to parallelize seemingly sequential code. We propose two new language constructs, speculative composition and speculative iteration. These constructs enable programmers to declaratively express speculative parallelism in programs: to indicate when and how to speculate, increasing the parallelism in the program, without concerning themselves with mundane implementation details. We present a core language with speculation constructs and mutable state and present a formal operational semantics for the language. We use the semantics to define the notion of a correct speculative execution as one that is equivalent to a non-speculative execution. In general, speculation requires a runtime mechanism to undo the effects of speculative computation in the case of mispredictions. We describe a set of conditions under which such rollback can be avoided. We present a static analysis that checks if a given program satisfies these conditions. This allows us to implement speculation efficiently, without the overhead required for rollbacks. We have implemented the speculation constructs as a C# library, along with the static checker for safety. We present an empirical evaluation of the efficacy of this approach to parallelization.
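The speculate/verify/re-execute cycle behind speculative iteration can be simulated sequentially. The function below is an invented illustration of the paradigm (the paper's actual constructs are C# library calls): each stage of a dependence chain needs the previous stage's output, so we predict that input, run all stages as if in parallel, then verify predictions and re-run any mispredicted stage.

```python
def speculative_map(f, chain, predict, x0):
    """Sketch of value speculation across a sequential dependence chain.
    Stage i computes f(chain[i], input_i) where input_i is stage i-1's
    output; `predict(i)` guesses that input so stages can run "in parallel".
    Mispredicted stages are recomputed (the rollback), and the count of
    rollbacks is returned alongside the results."""
    n = len(chain)
    guesses = [x0] + [predict(i) for i in range(1, n)]
    results = [f(chain[i], guesses[i]) for i in range(n)]  # speculative phase
    rollbacks = 0
    for i in range(1, n):                                  # verify in order
        actual = results[i - 1]
        if actual != guesses[i]:
            rollbacks += 1
            results[i] = f(chain[i], actual)               # re-execute stage
    return results, rollbacks
```

With perfect predictions the verify pass does no work, which corresponds to the paper's conditions under which rollback machinery can be elided entirely.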

Journal ArticleDOI
TL;DR: An operational semantics for a calculus of wireless systems is developed, using the calculus to describe and analyse a few properties of a version of the Alternating Bit Protocol.

Book ChapterDOI
01 Jan 2010
TL;DR: This chapter gives a detailed description of the semantics of SPARQL, the W3C query language for RDF, including all the features in the W3C specification, such as blank nodes in graph patterns and bag semantics for solutions.
Abstract: The Resource Description Framework (RDF) is the standard data model for representing information about World Wide Web resources. In January 2008, the W3C released its recommendation for querying RDF data, a query language called SPARQL. In this chapter, we give a detailed description of the semantics of this language. We start by focusing on the definition of a formal semantics for the core part of SPARQL, and then move to the definition for the entire language, including all the features in the specification of SPARQL by the W3C such as blank nodes in graph patterns and bag semantics for solutions.
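The core of the formal semantics, evaluating triple patterns to bags of solution mappings and joining compatible mappings, can be sketched directly (a toy illustration of the standard definitions; the chapter's treatment is far more complete):

```python
# A minimal sketch of SPARQL core semantics: triple patterns produce
# solution mappings (dicts from variables to values), kept in a list to
# preserve bag semantics; a join combines compatible mappings.

def match_pattern(triple_pattern, data):
    """Solution mappings for one triple pattern; '?'-prefixed terms are variables."""
    solutions = []
    for triple in data:
        mapping = {}
        for p, v in zip(triple_pattern, triple):
            if p.startswith("?"):
                if p in mapping and mapping[p] != v:
                    break                  # repeated variable, conflicting value
                mapping[p] = v
            elif p != v:
                break                      # constant does not match
        else:
            solutions.append(mapping)
    return solutions

def compatible(m1, m2):
    return all(m1[k] == m2[k] for k in m1.keys() & m2.keys())

def join(sols1, sols2):
    return [{**m1, **m2} for m1 in sols1 for m2 in sols2 if compatible(m1, m2)]

DATA = [("alice", "knows", "bob"),
        ("bob", "knows", "carol"),
        ("alice", "age", "30")]
```

Joining the patterns `?x knows ?y` and `?y knows ?z` over `DATA` yields the single two-hop mapping, mirroring how basic graph patterns compose in the formal semantics.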

DissertationDOI
03 Jun 2010
TL;DR: This thesis develops verification methods specifically for aspect-oriented programming languages, through an operational semantics that can be used to simulate a (partial) program; the approach exposes the steps involved in the execution of such a program, and the resulting labelled transition systems can be used with existing verification methods.
Abstract: Aspect-oriented software development aims at improving separation of concerns at all levels in the software development life-cycle, from architecture to code implementation. In this thesis we strive to develop verification methods specifically for aspect-oriented programming languages. For this purpose, we model the behaviour of these languages through an operational semantics. We use graph transformations to specify these semantics. Graph transformation has a mathematical foundation, and provides an intuitive way to describe component-based systems, such as software systems. In addition, graph transformations have an executable nature, and can be used to generate the execution state space of programs. We use these state spaces for the verification of programs with aspects. We start by defining an improvement of specification using rule-based systems. Pure rule-based systems typically consist of a single, unstructured set of rules. We propose so-called control automata, which can be added on top of pure rule-based systems. Then, we specify the runtime semantics of a number of aspect-oriented languages, namely of (1) Composition Filters, (2) Featherweight Java with assignments with an aspectual extension, and (3) a subset of multithreaded Java extended with a subset of AspectJ. We show that such semantics can be used to simulate a (partial) program, exposing the steps involved in its execution, and that the resulting labelled transition systems can be used for existing verification methods. Then, we propose two novel approaches that address complications caused by the use of aspect-oriented programming. The approaches are based on the given semantics. The first approach allows the detection of aspect interference on shared join points by performing a confluence analysis on the resulting state space. The second approach allows the verification of run-time properties that require tracking of individual objects over time.
This is achieved by extending the semantics with special rules for adding tracking information to the graphs. We show that the approach can be used for both object-oriented and aspect-oriented implementations.

Journal ArticleDOI
TL;DR: A set of novel formal semantics, such as deductive semantics, concept-algebra-based semantics, and visual semantics, is introduced that forms a theoretical and cognitive foundation for semantic computing.
Abstract: Semantics is the meaning of symbols, notations, concepts, functions, and behaviors, as well as their relations that can be deduced onto a set of predefined entities and/or known concepts. Semantic computing is an emerging computational methodology that models and implements computational structures and behaviors at semantic or knowledge level beyond that of symbolic data. In semantic computing, formal semantics can be classified into the categories of to be, to have, and to do semantics. This paper presents a comprehensive survey of formal and cognitive semantics for semantic computing in the fields of computational linguistics, software science, computational intelligence, cognitive computing, and denotational mathematics. A set of novel formal semantics, such as deductive semantics, concept-algebra-based semantics, and visual semantics, is introduced that forms a theoretical and cognitive foundation for semantic computing. Applications of formal semantics in semantic computing are presented in case studies on semantic cognition of natural languages, semantic analyses of computing behaviors, behavioral semantics of human cognitive processes, and visual semantic algebra for image and visual object manipulations.

Journal ArticleDOI
TL;DR: In the formal semantics based on modern type theories, common nouns are interpreted as types, rather than as functional subsets of entities as in Montague grammar, which brings about important advantages in linguistic interpretations but also leads to a limitation of expressive power.
Abstract: In the formal semantics based on modern type theories, common nouns are interpreted as types, rather than as functional subsets of entities as in Montague grammar. This brings about important advantages in linguistic interpretations but also leads to a limitation of expressive power because there are fewer operations on types as compared with those on functional subsets. The theory of coercive subtyping adequately extends the modern type theories with a notion of subtyping and, as shown in this paper, plays a very useful role in making type theories more expressive for formal semantics. In particular, it gives a satisfactory treatment of the type-theoretic interpretation of modified common nouns and allows straightforward interpretations of interesting linguistic phenomena such as copredication, whose interpretations have been found difficult in a Montagovian setting. We shall also study some type-theoretic constructs that provide useful representational tools for formal lexical semantics, including how the so-called dot-types for representing logical polysemy may be expressed in a type theory with coercive subtyping.

Proceedings ArticleDOI
11 Jul 2010
TL;DR: This work solves the open problem of the decidability of Boolean BI logic and deduces an embedding between trivial phase semantics for intuitionistic linear logic (ILL) and Kripke semantics for BBI.
Abstract: We solve the open problem of the decidability of Boolean BI logic (BBI), which can be considered as the core of separation and spatial logics. For this, we define a complete phase semantics for BBI and characterize it as trivial phase semantics. We deduce an embedding between trivial phase semantics for intuitionistic linear logic (ILL) and Kripke semantics for BBI. We single out a fragment of ILL which is both undecidable and complete for trivial phase semantics. Therefore, we obtain the undecidability of BBI.

Journal ArticleDOI
TL;DR: The proof system is shown to satisfy Dummett’s harmony property, justifying the ND rules as meaning conferring, and the semantics is suitable for incorporation into computational linguistics grammars, formulated in type-logical grammar.
Abstract: The paper presents a proof-theoretic semantics (PTS) for a fragment of natural language, providing an alternative to the traditional model-theoretic (Montagovian) semantics (MTS), whereby meanings are truth-condition (in arbitrary models). Instead, meanings are taken as derivability-conditions in a “dedicated” natural-deduction (ND) proof-system. This semantics is effective (algorithmically decidable), adhering to the “meaning as use” paradigm, not suffering from several of the criticisms formulated by philosophers of language against MTS as a theory of meaning. In particular, Dummett’s manifestation argument does not obtain, and assertions are always warranted, having grounds of assertion. The proof system is shown to satisfy Dummett’s harmony property, justifying the ND rules as meaning conferring. The semantics is suitable for incorporation into computational linguistics grammars, formulated in type-logical grammar.

Proceedings ArticleDOI
11 Jul 2010
TL;DR: The approach is to extend Plotkin and Power's structural operational semantics for algebraic effects with a primitive "basic preorder" on ground type computation trees and proves fundamental properties of contextual preorder including extensionality properties and a characterisation via applicative contexts.
Abstract: We provide a syntactic analysis of contextual preorder and equivalence for a polymorphic programming language with effects. Our approach applies uniformly across a range of algebraic effects, and incorporates, as instances: errors, input/output, global state, nondeterminism, probabilistic choice, and combinations thereof. Our approach is to extend Plotkin and Power's structural operational semantics for algebraic effects (FoSSaCS 2001) with a primitive "basic preorder" on ground type computation trees. The basic preorder is used to derive notions of contextual preorder and equivalence on program terms. Under mild assumptions on this relation, we prove fundamental properties of contextual preorder (hence equivalence) including extensionality properties and a characterisation via applicative contexts, and we provide machinery for reasoning about polymorphism using relational parametricity.
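One instance of a "basic preorder on ground type computation trees" can be sketched for the nondeterminism effect (my drastic simplification for illustration, not the paper's general construction): a tree whose nodes are binary choices and whose leaves return values, ordered by inclusion of possible results (the may-preorder).

```python
# A minimal sketch: computation trees for binary nondeterministic choice,
# with a basic "may" preorder given by inclusion of possible results.

def results(tree):
    """Set of values a computation tree may return."""
    if tree[0] == "ret":
        return {tree[1]}
    if tree[0] == "or":
        return results(tree[1]) | results(tree[2])
    raise ValueError(tree[0])

def basic_leq(t1, t2):
    """May-preorder: t1 is below t2 if every result of t1 is possible for t2."""
    return results(t1) <= results(t2)

t_det  = ("ret", 1)
t_coin = ("or", ("ret", 1), ("ret", 2))
```

In the paper, a preorder like this on ground trees is lifted through all program contexts to obtain contextual preorder and equivalence for the full polymorphic language.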

Book ChapterDOI
20 Mar 2010
TL;DR: A theory of simulation is developed and used to validate the legality of the above optimizations in any program context and describes a semantics for the jmm using standard programming language techniques that captures its full expressivity.
Abstract: The specification of the Java Memory Model (JMM) is phrased in terms of acceptors of execution sequences rather than the standard generative view of operational semantics. This creates a mismatch with language-based techniques, such as simulation arguments and proofs of type safety. We describe a semantics for the JMM using standard programming language techniques that captures its full expressivity. For data-race-free programs, our model coincides with the JMM. For lockless programs, our model is more expressive than the JMM. The stratification properties required to avoid causality cycles are derived, rather than mandated in the style of the JMM. The JMM is arguably non-canonical in its treatment of the interaction of data races and locks as it fails to validate roach-motel reorderings and various peephole optimizations. Our model differs from the JMM in these cases. We develop a theory of simulation and use it to validate the legality of the above optimizations in any program context.

Journal ArticleDOI
TL;DR: The framework provides definitions that allow modeling of applications and execution semantics separately and can be used to analyze and compare how an application would behave when executed using different execution semantics.
Abstract: IEC 61499 is a standard architecture, based on function blocks, for developing distributed control and measurement applications. However, the standard has no formal semantics and different interpretations of the standard have emerged. As a consequence, it is harder to transfer applications between different standard compliant platforms. This paper presents a formal framework for mathematical modeling and comparison of different execution semantics. The framework provides definitions that allow modeling of applications and execution semantics separately. Together, the models can be used to analyze and compare how an application would behave when executed using different execution semantics. In addition, a mathematical model made possible by the framework has been used as a basis for implementation of a runtime environment that can execute applications and a software tool that generates formal models suitable for formal verification, both assuming different execution semantics.

Journal ArticleDOI
TL;DR: This work proposes here an extension of the Faber-Leone-Pfeifer semantics, or FLP semantics, to the full propositional language, which reveals both common threads and differences between the FLP and stable-model semantics.

Book ChapterDOI
20 Mar 2010
TL;DR: The foundations and use of a completely new version of the Maude Church-Rosser Checker tool that addresses all the above-mentioned challenges and can deal effectively with complex conditional specifications modulo axioms are presented.
Abstract: The Church-Rosser property, together with termination, is essential for an equational specification to have good executability conditions, and also for having a complete agreement between the specification's initial algebra, mathematical semantics, and its operational semantics by rewriting. Checking this property for expressive specifications that are order-sorted, conditional with possibly extra variables in their condition, and whose equations can be applied modulo different combinations of associativity, commutativity and identity axioms is challenging. In particular, the resulting conditional critical pairs that cannot be joined often have an intuitively unsatisfiable condition or seem intuitively joinable, so that sophisticated tool support is needed to eliminate them. Another challenge is the presence of different combinations of associativity, commutativity and identity axioms, including the very challenging case of associativity without commutativity, for which no finitary unification algorithms exist. In this paper we present the foundations and illustrate the design and use of a completely new version of the Maude Church-Rosser Checker tool that addresses all the above-mentioned challenges and can deal effectively with complex conditional specifications modulo axioms.

Proceedings ArticleDOI
17 Jan 2010
TL;DR: This work gives a small-step operational semantics of a prototypical functional language supporting programmer-definable, layered effects, and shows how this semantics naturally supports reasoning by familiar syntactic techniques, such as showing soundness of a Curry-style effect-type system by the progress+preservation method.
Abstract: In functional programming, monadic characterizations of computational effects are normally understood denotationally: they describe how an effectful program can be systematically expanded or translated into a larger, pure program, which can then be evaluated according to an effect-free semantics. Any effect-specific operations expressible in the monad are also given purely functional definitions, but these definitions are only directly executable in the context of an already translated program. This approach thus takes an inherently Church-style view of effects: the nominal meaning of every effectful term in the program depends crucially on its type. We present here a complementary, operational view of monadic effects, in which an effect definition directly induces an imperative behavior of the new operations expressible in the monad. This behavior is formalized as additional operational rules for only the new constructs; it does not require any structural changes to the evaluation judgment. Specifically, we give a small-step operational semantics of a prototypical functional language supporting programmer-definable, layered effects, and show how this semantics naturally supports reasoning by familiar syntactic techniques, such as showing soundness of a Curry-style effect-type system by the progress+preservation method.
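The "operational view" of a monadic effect can be sketched for a global-state effect (an illustrative toy, not the paper's language): `get` and `put` are given directly as small-step rules over (term, store) configurations, with no translation of the whole program into pure code.

```python
# A minimal sketch: a state effect defined by extra small-step rules over
# (term, store) configurations. Terms: ("ret", v), ("get",), ("put", v),
# and ("bind", m, k) where k is a continuation taking the value of m.

def step(term, store):
    tag = term[0]
    if tag == "get":
        return ("ret", store), store        # get returns the current store
    if tag == "put":
        return ("ret", None), term[1]       # put replaces the store
    if tag == "bind":
        m, k = term[1], term[2]
        if m[0] == "ret":
            return k(m[1]), store           # bind passes the value along
        m2, store2 = step(m, store)         # otherwise step inside the bind
        return ("bind", m2, k), store2
    raise ValueError(tag)

def run(term, store):
    while term[0] != "ret":
        term, store = step(term, store)
    return term[1], store

# increment: get >>= \n -> put (n+1) >>= \_ -> ret n
incr = ("bind", ("get",),
        lambda n: ("bind", ("put", n + 1), lambda _: ("ret", n)))
```

Only the two effect operations get new rules; `ret` and `bind` are untouched, which is the point the paper makes about not restructuring the evaluation judgment.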

Book ChapterDOI
28 Nov 2010
TL;DR: The soundness of the inference is proved with respect to a novel operational semantics for partial evaluations to show that the inferred bounds hold for terminating as well as non-terminating computations and that run-time bounds also establish the termination of programs.
Abstract: This paper studies the problem of statically determining upper bounds on the resource consumption of first-order functional programs. Previous work approached the problem with an automatic type-based amortized analysis for polynomial resource bounds. The analysis is parametric in the resource and can be instantiated to heap space, stack space, or clock cycles. Experiments with a prototype implementation have shown that programs are analyzed efficiently and that the computed bounds exactly match the measured worst-case resource behavior for many functions. This paper describes the inference algorithm that is used in the implementation of the system. It can deal with resource-polymorphic recursion, which is required in the type derivation of many functions. The computation of the bounds is fully automatic if a maximal degree of the polynomials is given. The soundness of the inference is proved with respect to a novel operational semantics for partial evaluations to show that the inferred bounds hold for terminating as well as non-terminating computations. A corollary is that run-time bounds also establish the termination of programs.
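
To make the claim "computed bounds exactly match the measured worst-case behavior" concrete, here is a toy sketch (not from the paper): for cons-list append, a type-based amortized analysis would infer a linear bound such as 1·|xs| + 0 heap allocations, and measuring the actual allocation count confirms the bound is exact.

```python
# Illustrative sketch only: list append instrumented to count allocations.
# A linear amortized bound of the form a*len(xs) + b with a=1, b=0 is exact
# here: one cons cell is allocated per element of the first list.

def append_cost(xs, ys):
    """Return (xs ++ ys, number of cons cells allocated)."""
    if not xs:
        return ys, 0
    rest, cost = append_cost(xs[1:], ys)
    return [xs[0]] + rest, cost + 1  # one allocation for this cons cell

def bound(n, a=1, b=0):
    """The inferred linear resource bound a*n + b (hypothetical annotation)."""
    return a * n + b
```

Checking lists of several lengths shows the measured cost meeting the bound with equality, which is the situation the paper's experiments report for many functions.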

Proceedings ArticleDOI
23 Jan 2010
TL;DR: System F° is presented, an extension of System F that uses kinds to distinguish between linear and unrestricted types, simplifying the use of linearity for general-purpose programming and allowing otherwise-conflicting interpretations oflinearity to coexist peacefully.
Abstract: We present System F°, an extension of System F that uses kinds to distinguish between linear and unrestricted types, simplifying the use of linearity for general-purpose programming. We demonstrate through examples how System F° can elegantly express many useful protocols, and we prove that any protocol representable as a DFA can be encoded as an F° type. We supply mechanized proofs of System F°'s soundness and parametricity properties, along with a nonstandard operational semantics that formalizes common intuitions about linearity and aids in reasoning about protocols. We compare System F° to other linear systems, noting that the simplicity of our kind-based approach leads to a more explicit account of what linearity is meant to capture, allowing otherwise-conflicting interpretations of linearity (in particular, restrictions on aliasing versus restrictions on resource usage) to coexist peacefully. We also discuss extensions to System F° aimed at making the core language more practical, including the additive fragment of linear logic, algebraic datatypes, and recursion.
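
The DFA side of the encoding claim can be sketched independently of the type system (this example is not from the paper; the protocol and names are invented): a file-handle protocol open;(read)*;close is a DFA, and a dynamic checker rejects exactly the traces that a linear typing of the handle would rule out statically.

```python
# Illustrative sketch only: the protocol open;(read)*;close as a DFA.
# In System F°, such a protocol would be encoded as a (linear) type, so
# violating traces are rejected at compile time rather than checked at run time.

DELTA = {
    ("closed", "open"): "opened",
    ("opened", "read"): "opened",
    ("opened", "close"): "done",
}

def accepts(trace, start="closed", final=("done",)):
    """Run the DFA over a trace of actions; True iff the protocol is respected."""
    state = start
    for action in trace:
        state = DELTA.get((state, action))
        if state is None:
            return False  # no transition: protocol violation
    return state in final
```

A well-typed client corresponds to an accepted trace such as `["open", "read", "read", "close"]`, while using a handle after closing it, or never opening it, is rejected.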

Proceedings ArticleDOI
30 Sep 2010
TL;DR: This paper develops a new, much cleaner semantics for such future implementations of Erlang, and aims to stimulate some much-needed debate regarding a number of poorly understood features of current and future implementations of Erlang.
Abstract: The formal semantics of Erlang is a bit too complicated to be easily understandable. Much of this complication stems from the desire to accurately model the current implementations (Erlang/OTP R11-R14), which include features (and optimizations) developed during more than two decades. The result is a two-tier semantics where systems, and in particular messages, behave differently in a local and a distributed setting. With the introduction of multi-core hardware, multiple run-queues and efficient SMP support, the boundary between local and distributed is diffuse and should ultimately be removed. In this paper we develop a new, much cleaner semantics for such future implementations of Erlang. We hope that this paper can stimulate some much-needed debate regarding a number of poorly understood features of current and future implementations of Erlang.

Book ChapterDOI
Hagen Völzer1
13 Sep 2010
TL;DR: A key insight is that not all situations where two or more Or-joins seem to be mutually dependent are necessarily symmetric, and enabledness of an Or-join in the semantics can be decided in linear time in the size of the workflow graph.
Abstract: We propose a new semantics for the inclusive converging gateway (also known as Or-join). The new semantics coincides with the intuitive, widely agreed semantics for Or-joins on sound acyclic workflow graphs which is implied, for example, by dead path elimination on BPEL flows. The new semantics also coincides with the block-based semantics as used in BPEL on cyclic graphs that can be composed from sound acyclic graphs, repeat- and while-loops. Furthermore, we display several examples for unstructured workflow graphs for which Or-joins get the desired intuitive semantics. A key insight is that not all situations where two or more Or-joins seem to be mutually dependent (known as 'vicious circles') are necessarily symmetric. Many such situations are asymmetric and can be resolved naturally in favor of one of the Or-joins. Still, symmetric or almost-symmetric situations exist for which it is not clear what semantics is desirable and which result in a deadlock in our semantics. We show that enabledness of an Or-join in our semantics can be decided in linear time in the size of the workflow graph.
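
A common intuition behind Or-join enabledness can be sketched as follows (this is an invented approximation, not the paper's exact definition): the join is enabled iff some incoming edge carries a token and no token elsewhere can still reach an empty incoming edge. The "may still arrive" check is a backward reachability pass, which is linear in the size of the graph.

```python
# Illustrative sketch only: approximate Or-join enabledness via backward
# reachability from the join's empty incoming edges, stopping at the join
# itself. This is linear in the number of edges, matching the flavor of the
# paper's linear-time result, but it is NOT the paper's semantics.
from collections import deque

def or_join_enabled(edges, marked, join_inputs, join):
    """edges: list of (src, dst) pairs; marked: set of edges carrying a token;
    join_inputs: the Or-join's incoming edges; join: the Or-join node."""
    if not any(e in marked for e in join_inputs):
        return False  # no token waiting at the join at all
    empty_inputs = [e for e in join_inputs if e not in marked]
    preds = {}
    for (s, d) in edges:
        preds.setdefault(d, []).append((s, d))
    # Which edges could feed a token into an empty incoming edge?
    seen, queue = set(empty_inputs), deque(empty_inputs)
    while queue:
        (s, _) = queue.popleft()
        if s == join:
            continue  # tokens cannot flow back out through the join itself
        for e in preds.get(s, []):
            if e not in seen:
                seen.add(e)
                queue.append(e)
    # Enabled iff no marked edge elsewhere may still reach an empty input.
    return not any(e in seen and e not in join_inputs for e in marked)
```

On a simple diamond (source node feeding two branches into the join), the join is enabled when one branch holds a token and the other branch can no longer receive one, and disabled while a token upstream of the empty branch could still arrive.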

Book ChapterDOI
11 Oct 2010
TL;DR: This paper uses two different state-of-the-art proof techniques (explicit bisimulation construction versus borrowed contexts) to show that a given model transformation between two simple languages, both equipped with a graph transformation-based operational semantics, preserves bisimilarity.
Abstract: Model transformation is a prime technique in modern, model-driven software design. One of the most challenging issues is to show that the semantics of the models is not affected by the transformation. So far, there is hardly any research into this issue, in particular in those cases where the source and target languages are different. In this paper, we use two different state-of-the-art proof techniques (explicit bisimulation construction versus borrowed contexts) to show bisimilarity preservation of a given model transformation between two simple (self-defined) languages, both of which are equipped with a graph transformation-based operational semantics. The contrast between these proof techniques is interesting because they are based on different model transformation strategies: triple graph grammars versus in situ transformation. We proceed to compare the proofs and discuss scalability to a more realistic setting.
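
The "explicit bisimulation construction" technique exhibits a relation between source and target states and checks it is a bisimulation. For finite labelled transition systems this can be sketched as a naive greatest-fixpoint computation (an invented illustration, not the paper's construction):

```python
# Illustrative sketch only: compute the largest bisimulation between two
# finite LTSs, each given as {state: {(label, successor), ...}}. Start from
# the full relation and repeatedly discard pairs whose transitions cannot
# be matched, until a fixpoint is reached.

def largest_bisimulation(lts1, lts2):
    rel = {(p, q) for p in lts1 for q in lts2}
    changed = True
    while changed:
        changed = False
        for (p, q) in sorted(rel):
            # every move of p must be matched by q, and vice versa
            fwd = all(any(b == a and (p2, q2) in rel for (b, q2) in lts2[q])
                      for (a, p2) in lts1[p])
            bwd = all(any(a == b and (p2, q2) in rel for (a, p2) in lts1[p])
                      for (b, q2) in lts2[q])
            if not (fwd and bwd):
                rel.discard((p, q))
                changed = True
    return rel
```

A transformation preserves bisimilarity when each source state is related to its translated target state in such a relation; the borrowed-contexts technique reaches the same goal compositionally, without building the relation explicitly.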

Proceedings ArticleDOI
30 Sep 2010
TL;DR: This paper shows how call-by-need supercompilation can be recast to be based explicitly on an evaluator, in contrast with standard presentations, which are specified as algorithms that mix evaluation rules with reductions unique to supercompilation.
Abstract: This paper shows how call-by-need supercompilation can be recast to be based explicitly on an evaluator, contrasting with standard presentations which are specified as algorithms that mix evaluation rules with reductions that are unique to supercompilation. Building on standard operational-semantics technology for call-by-need languages, we show how to extend the supercompilation algorithm to deal with recursive let expressions.
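
The call-by-need machinery the paper builds on, including the treatment of recursive let, can be sketched outside of supercompilation itself (an invented illustration, not the paper's algorithm): the evaluator's heap holds updatable thunks, and `let rec` is handled by allocating a thunk first and filling it in afterwards, so the binding can refer to itself.

```python
# Illustrative sketch only: call-by-need via memoized thunks, with recursive
# let as "allocate, then tie the knot". Forcing a thunk evaluates it at most
# once; later forces return the cached value (sharing).

class Thunk:
    def __init__(self, compute=None):
        self.compute = compute
        self.value = None
        self.forced = False

    def force(self):
        if not self.forced:
            self.value = self.compute()
            self.forced = True
            self.compute = None  # drop the closure once evaluated
        return self.value

def letrec(body_of):
    """Recursive let: the thunk is allocated before its body is attached,
    so body_of may mention the thunk itself."""
    t = Thunk()
    t.compute = lambda: body_of(t)
    return t

def take(n, stream_thunk):
    """Force the first n elements of a lazy stream of (head, tail-thunk) pairs."""
    out = []
    for _ in range(n):
        head, tail = stream_thunk.force()
        out.append(head)
        stream_thunk = tail
    return out

# A self-referential binding: the infinite stream of ones.
ones = letrec(lambda self: (1, self))
```

Here `take(3, ones)` yields `[1, 1, 1]`, and because the cyclic structure is shared, the single thunk is evaluated only once however many elements are taken.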