
Showing papers on "Formal language published in 2007"


Proceedings Article
06 Jan 2007
TL;DR: Essence is a formal language for specifying combinatorial problems in a manner similar to natural rigorous specifications that use a mixture of natural language and discrete mathematics.
Abstract: ESSENCE is a new formal language for specifying combinatorial problems in a manner similar to natural rigorous specifications that use a mixture of natural language and discrete mathematics. ESSENCE provides a high level of abstraction, much of which is the consequence of the provision of decision variables whose values can be combinatorial objects, such as tuples, sets, multisets, relations, partitions and functions. ESSENCE also allows these combinatorial objects to be nested to arbitrary depth, thus providing, for example, sets of partitions, sets of sets of partitions, and so forth. Therefore, a problem that requires finding a complex combinatorial object can be directly specified by using a decision variable whose type is precisely that combinatorial object.
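The core idea, a decision variable whose value is itself a combinatorial object, can be imitated in a few lines of Python. The sketch below is illustrative brute force, not ESSENCE syntax, and the equal-block-sum partition problem is invented for the example:

```python
# Brute-force "decision variable whose value is a set partition":
# find a partition of {1..6} into 3 blocks with equal block sums.
# Illustrative Python, not ESSENCE syntax; the problem is invented.
from itertools import combinations

def partitions(elems):
    """Enumerate all set partitions of elems (as frozensets of frozensets)."""
    elems = list(elems)
    if not elems:
        yield frozenset()
        return
    first, rest = elems[0], elems[1:]
    # choose the block containing `first`, then partition the remainder
    for k in range(len(rest) + 1):
        for others in combinations(rest, k):
            block = frozenset((first,) + others)
            remaining = [x for x in rest if x not in block]
            for sub in partitions(remaining):
                yield sub | {block}

# The "specification": a partition of 1..6 with 3 blocks, all sums equal.
solutions = [p for p in partitions(range(1, 7))
             if len(p) == 3 and len({sum(b) for b in p}) == 1]
```

The single solution is {{1,6},{2,5},{3,4}}; an ESSENCE model would state the same problem declaratively with a partition-typed decision variable instead of enumerating.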

215 citations


Journal ArticleDOI
TL;DR: This paper presents the first industrial examples synthesized automatically from their specifications, demonstrating their practicality by synthesizing a generalized buffer and an arbiter for ARM's AMBA AHB bus from specifications given in PSL.

158 citations


Book
01 Jan 2007
TL;DR: This book proposes a distributed pi-calculus called Dpi, for describing the behaviour of mobile agents in a distributed world, based on an existing formal language, the pi-Calculus, to which it adds a network layer and a primitive migration construct.
Abstract: Distributed systems are fast becoming the norm in computer science. Formal mathematical models and theories of distributed behaviour are needed in order to understand them. This book proposes a distributed pi-calculus called Dpi, for describing the behaviour of mobile agents in a distributed world. It is based on an existing formal language, the pi-calculus, to which it adds a network layer and a primitive migration construct. A mathematical theory of the behaviour of these distributed systems is developed, in which the presence of types plays a major role. It is also shown how in principle this theory can be used to develop verification techniques for guaranteeing the behaviour of distributed agents. The text is accessible to computer scientists with a minimal background in discrete mathematics. It contains an elementary account of the pi-calculus, and the associated theory of bisimulations. It also develops the type theory required by Dpi from first principles. • First book on formal foundations of distributed computation • Accessible introduction to the theory of the pi-calculus, with many exercises • Contains many worked examples and over 70 exercises

150 citations


Journal ArticleDOI
TL;DR: In this article, it is shown that constructing a theory of physics is equivalent to finding a representation in a topos of a certain formal language that is attached to the system. The main thrust of the paper, however, is the more powerful language L(S) and its representation in an appropriate topos.
Abstract: This paper is the first in a series whose goal is to develop a fundamentally new way of constructing theories of physics. The motivation comes from a desire to address certain deep issues that arise when contemplating quantum theories of space and time. Our basic contention is that constructing a theory of physics is equivalent to finding a representation in a topos of a certain formal language that is attached to the system. Classical physics arises when the topos is the category of sets. Other types of theory employ a different topos. In this paper we discuss two different types of language that can be attached to a system, S. The first is a propositional language, PL(S); the second is a higher-order, typed language L(S). Both languages provide deductive systems with an intuitionistic logic. The reason for introducing PL(S) is that, as shown in paper II of the series, it is the easiest way of understanding, and expanding on, the earlier work on topos theory and quantum physics. However, the main thrust of our programme utilises the more powerful language L(S) and its representation in an appropriate topos.

144 citations


Proceedings ArticleDOI
16 Apr 2007
TL;DR: This paper presents the first industrial examples synthesized automatically from their specifications, demonstrating their practicality by synthesizing an arbiter for ARM's AMBA AHB bus and a generalized buffer from specifications given in PSL.
Abstract: We propose to use a formal specification language as a high-level hardware description language. Formal languages allow for compact, unambiguous representations and yield designs that are correct by construction. The idea of automatic synthesis from specifications is old, but used to be completely impractical. Recently, great strides towards efficient synthesis from specifications have been made. In this paper we extend these recent methods to generate compact circuits and we show their practicality by synthesizing an arbiter for ARM's AMBA AHB bus and a generalized buffer from specifications given in PSL. These are the first industrial examples that have been synthesized automatically from their specifications.
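For a flavour of what a synthesized design must satisfy, the sketch below hand-codes a two-client round-robin arbiter in Python and checks it against two PSL-style properties (mutual exclusion and bounded response) by exhaustive bounded simulation. The arbiter and property encodings are invented for illustration; this is not the synthesis procedure of the paper:

```python
# Toy stand-in for specification-based design (hypothetical, not the
# paper's synthesis procedure): a two-client round-robin arbiter as a
# simple state machine, checked against two PSL-style properties.
from itertools import product

def arbiter(requests):
    """Round-robin arbiter: grants at most one requester per step."""
    grants, turn = [], 0
    for r0, r1 in requests:
        g0 = g1 = False
        if r0 and (turn == 0 or not r1):
            g0, turn = True, 1
        elif r1:
            g1, turn = True, 0
        grants.append((g0, g1))
    return grants

def mutual_exclusion(grants):
    """Safety: never two grants in the same step."""
    return all(not (g0 and g1) for g0, g1 in grants)

def bounded_response(requests, grants, horizon=2):
    """Bounded liveness: a request at step i is granted within `horizon` steps."""
    for i in range(len(requests) - horizon + 1):
        r0, r1 = requests[i]
        window = grants[i:i + horizon]
        if r0 and not any(g0 for g0, _ in window):
            return False
        if r1 and not any(g1 for _, g1 in window):
            return False
    return True

pairs = [(a, b) for a in (False, True) for b in (False, True)]
mutex_ok = all(mutual_exclusion(arbiter(seq))
               for seq in product(pairs, repeat=4))   # all 4-step inputs
busy = [(True, True)] * 4
live_ok = bounded_response(busy, arbiter(busy))
```

A synthesis tool works in the opposite direction: it starts from the two properties and derives a circuit like `arbiter` automatically.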

123 citations


Book
01 Jan 2007
TL;DR: A presentation of the theory of computing, including coverage of the theory of formal languages and automata, computability, computational complexity, and deterministic parsing of context-free languages.
Abstract: A presentation of the theory of computing, including coverage of the theory of formal languages and automata, computability, computational complexity, and deterministic parsing of context-free languages.

122 citations


Book ChapterDOI
06 Jun 2007
TL;DR: This paper proposes an extension of the µ-calculus in order to capture the intuitive meaning of the deontic notions and to express concurrent actions, and provides a translation of the contract language into the logic, the semantics of which faithfully captures the meaning of obligation, permission and prohibition.
Abstract: In this paper we propose a formal language for writing electronic contracts, based on the deontic notions of obligation, permission, and prohibition. We take an ought-to-do approach, where deontic operators are applied to actions instead of state-of-affairs. We propose an extension of the µ-calculus in order to capture the intuitive meaning of the deontic notions and to express concurrent actions. We provide a translation of the contract language into the logic, the semantics of which faithfully captures the meaning of obligation, permission and prohibition. We also show how our language captures most of the intuitive desirable properties of electronic contracts, as well as how it avoids most of the classical paradoxes of deontic logic. We finally show its applicability on a contract example.
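The ought-to-do reading of the deontic operators can be illustrated with a toy trace checker in Python, a drastic simplification of the paper's µ-calculus translation; the contract and action names are invented:

```python
# Toy trace semantics for deontic clauses over atomic actions
# (hypothetical simplification of the paper's mu-calculus translation):
# O(a) obliges a to occur, F(a) forbids it, P(a) is never violated.

def violations(contract, trace):
    """Return the clauses of `contract` violated by the action trace."""
    bad = []
    for op, action in contract:
        if op == "O" and action not in trace:   # unfulfilled obligation
            bad.append((op, action))
        elif op == "F" and action in trace:     # violated prohibition
            bad.append((op, action))
        # op == "P": a permission cannot be violated by any trace
    return bad

contract = [("O", "pay"), ("P", "cancel"), ("F", "disclose")]
```

For example, the trace `["order", "pay"]` satisfies the contract, while `["order", "disclose"]` violates both the obligation to pay and the prohibition on disclosure.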

114 citations


Proceedings ArticleDOI
16 Apr 2007
TL;DR: This paper presents the first industrial examples synthesized automatically from their specifications, demonstrating their practicality by synthesizing an arbiter for ARM's AMBA AHB bus and a generalized buffer from specifications given in PSL.
Abstract: We propose to use a formal specification language as a high-level hardware description language. Formal languages allow for compact, unambiguous representations and yield designs that are correct by construction. The idea of automatic synthesis from specifications is old, but used to be completely impractical. Recently, great strides towards efficient synthesis from specifications have been made. In this paper we extend these recent methods to generate compact circuits and we show their practicality by synthesizing an arbiter for ARM's AMBA AHB bus and a generalized buffer from specifications given in PSL. These are the first industrial examples that have been synthesized automatically from their specifications.

90 citations


Proceedings ArticleDOI
04 Dec 2007
TL;DR: A formal language called the aircraft intent description language (AIDL) is proposed as a standard, interoperable means of describing and exchanging predicted aircraft trajectories in trajectory-based operations (TBO) and encompasses all other methods and formats that may be used to describe aircraft intent.
Abstract: In this paper, a formal language called the aircraft intent description language (AIDL) is proposed as a standard, interoperable means of describing and exchanging predicted aircraft trajectories in trajectory-based operations (TBO). The AIDL provides the necessary elements to unambiguously formulate aircraft intent, which, in the context of trajectory prediction, refers to the information that describes how the aircraft is to be operated within a certain time interval. By expressing aircraft intent according to the AIDL, it is ensured that each instance of aircraft intent defines a unique trajectory. It is anticipated that sharing aircraft intent information expressed in a structured and formal manner, e.g. according to the AIDL, can facilitate the synchronization of the predicted aircraft trajectories held by different automation systems in the context of TBO. The AIDL is characterized by an alphabet and a grammar. The definition of the alphabet and the grammar rules is based on a rigorous mathematical analysis of the trajectory computation process at the core of a trajectory predictor (TP). This analysis relies on a novel approach to modeling the trajectory computation process based on the theory of differential algebraic equations (DAEs). The paper presents the alphabet of the AIDL, which contains a set of instructions that capture the individual commands and guidance modes available to direct the motion of an aircraft in the ATM context. Then, the AIDL grammar rules, which define the possible combinations of the instructions in the alphabet, are defined and mathematically justified. The AIDL seeks to exploit the physical and mathematical foundations underlying the trajectory computation process to allow describing a priori (i.e. before a trajectory is actually computed) any possible motion behavior that can reasonably be elicited from an aircraft in the ATM context.
The objective is that the AIDL encompasses all other methods and formats that may be used to describe aircraft intent, i.e. they would be subsets of the AIDL. The AIDL could thus be seen as a metalanguage for aircraft intent description, containing any other language that may be used to describe aircraft intent in the context of TBO.

75 citations


Book ChapterDOI
Claude-Guy Quimper, Toby Walsh1
23 Sep 2007
TL;DR: Based on an AND/OR decomposition, it is shown that the GRAMMAR constraint can be converted into clauses in conjunctive normal form without hindering propagation and used as an efficient incremental propagator.
Abstract: A wide range of constraints can be specified using automata or formal languages. The GRAMMAR constraint restricts the values taken by a sequence of variables to be a string from a given context-free language. Based on an AND/OR decomposition, we show that this constraint can be converted into clauses in conjunctive normal form without hindering propagation. Using this decomposition, we can propagate the GRAMMAR constraint in O(n³) time. The decomposition also provides an efficient incremental propagator. Down a branch of the search tree of length k, we can enforce GAC k times in the same O(n³) time. On specialized languages, running time can be even better. For example, propagation of the decomposition requires just O(n|δ|) time for regular languages, where |δ| is the size of the transition table of the automaton recognizing the regular language. Experiments on a shift scheduling problem with a constraint solver and a state-of-the-art SAT solver show that we can solve problems using this decomposition that defeat existing constraint solvers.
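The semantics of the GRAMMAR constraint, and what GAC propagation must compute, can be sketched in Python with a CYK membership test plus brute-force support filtering. The grammar below (for aⁿbⁿ in Chomsky normal form) is invented for the example, and the brute-force filter is exponential, whereas the paper's AND/OR decomposition achieves the same filtering in O(n³):

```python
# CYK membership for a CNF grammar of { a^n b^n : n >= 1 }
# (S -> AB | AT, T -> SB, A -> a, B -> b), plus brute-force GAC
# filtering for the GRAMMAR constraint.
from itertools import product

UNIT = {"A": "a", "B": "b"}
BIN = {"S": [("A", "B"), ("A", "T")], "T": [("S", "B")]}

def cyk(word):
    """CYK table: table[i][j] = nonterminals deriving word[i..j]."""
    n = len(word)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, c in enumerate(word):
        table[i][i] = {lhs for lhs, t in UNIT.items() if t == c}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for lhs, rhss in BIN.items():
                    for x, y in rhss:
                        if x in table[i][k] and y in table[k + 1][j]:
                            table[i][j].add(lhs)
    return "S" in table[0][n - 1]

def grammar_gac(domains):
    """Values at each position supported by some word in the language.
    Exponential brute force; the paper's decomposition is O(n^3)."""
    support = [set() for _ in domains]
    for word in product(*domains):
        if cyk(word):
            for pos, c in enumerate(word):
                support[pos].add(c)
    return support
```

With four variables each ranging over {a, b}, the only supported word is "aabb", so GAC prunes every domain to a single value.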

73 citations


Book ChapterDOI
03 Apr 2007
TL;DR: It is shown that finite automata over infinite words offer an analyzable representation and can capture many interesting interface specifications such as exponential stability of switched linear systems.
Abstract: We propose the use of formal languages of infinite words over the alphabet of task identifiers as an interface between control designs and software implementations. We argue that this approach is more flexible than the classical real-time scheduling framework based on periodic tasks, and allows composition of interfaces by language-theoretic operations. We show that finite automata over infinite words offer an analyzable representation and can capture many interesting interface specifications such as exponential stability of switched linear systems.
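Acceptance of an ultimately periodic word u·vω by a deterministic Büchi automaton is decidable by running u and then iterating v until a state repeats; a short Python sketch makes this concrete. The two-state automaton for "the control task occurs infinitely often" is invented for the example:

```python
# A deterministic Buchi automaton over task identifiers {"c", "o"}
# accepting schedules in which the control task "c" occurs infinitely
# often (invented example).  Acceptance of an ultimately periodic word
# u v^omega is decided by iterating v until a state repeats.
DELTA = {("q0", "c"): "q0", ("q0", "o"): "q1",
         ("q1", "c"): "q0", ("q1", "o"): "q1"}
ACCEPTING = {"q0"}

def run(state, word):
    """Run the automaton on a finite word; return end state and states visited."""
    visited = set()
    for a in word:
        state = DELTA[(state, a)]
        visited.add(state)
    return state, visited

def accepts_lasso(u, v):
    """Does the automaton accept u followed by v repeated forever?"""
    state, _ = run("q0", u)
    seen, visits = {}, []
    while state not in seen:
        seen[state] = len(visits)
        state, visited = run(state, v)
        visits.append(visited)
    cycle = set().union(*visits[seen[state]:])   # states seen in the loop
    return bool(cycle & ACCEPTING)
```

So the schedule o·(co)ω is accepted (the control task keeps recurring), while c·oω is rejected.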

Book ChapterDOI
R. H. Baayen1
01 Jan 2007
TL;DR: This chapter shows that the pocket calculator provides a fundamentally flawed metaphor for understanding morphological structure and processing in the mental lexicon, and provides an indication of the kind of formal mathematical model that may help to better understand processing and representation in the mental lexicon.
Abstract: This chapter shows that the pocket calculator provides a fundamentally flawed metaphor for understanding morphological structure and processing in the mental lexicon. To this end, it surveys the evidence from experimental studies of lexical processing, and then considers the fine phonetic detail that is present in the acoustic signal. The chapter provides an indication of the kind of formal mathematical model that may help to better understand processing and representation in the mental lexicon. It also discusses the theory of speech production, WEAVER models, phonetic evidence, and Hierarchical Temporal Memory (HTM). Whereas the mathematics of formal languages has been a key source of inspiration for morphological theory and models of the mental lexicon, new advances at the intersection of statistics, information science and the neurosciences such as HTM can constitute an important source of inspiration for research on the mental lexicon during the coming years.
Keywords: acoustic signal; Lieber; mathematical model; mental lexicon; pocket calculator; Selkirk

Journal ArticleDOI
TL;DR: This paper shows that several versions of exhaustive DPLL search correspond to such well-known languages as FBDD, OBDD, and a precisely-defined subset of d-DNNF.
Abstract: This paper is concerned with a class of algorithms that perform exhaustive search on propositional knowledge bases. We show that each of these algorithms defines and generates a propositional language. Specifically, we show that the trace of a search can be interpreted as a combinational circuit, and a search algorithm then defines a propositional language consisting of circuits that are generated across all possible executions of the algorithm. In particular, we show that several versions of exhaustive DPLL search correspond to such well-known languages as FBDD, OBDD, and a precisely-defined subset of d-DNNF. By thus mapping search algorithms to propositional languages, we provide a uniform and practical framework in which successful search techniques can be harnessed for compilation of knowledge into various languages of interest, and a new methodology whereby the power and limitations of search algorithms can be understood by looking up the tractability and succinctness of the corresponding propositional languages.
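The trace-as-circuit idea can be sketched in Python: an exhaustive DPLL that records each branch as an if-then-else node yields a DAG on which model counting is linear. This is a minimal invented illustration; real knowledge compilers add caching and pruning:

```python
# Exhaustive DPLL whose trace is recorded as a decision diagram: each
# branch on variable x becomes an if-then-else node (x, hi, lo), and the
# recorded trace supports linear-time model counting.

def simplify(cnf, lit):
    """Assign literal `lit` true and simplify the clause set."""
    out = []
    for clause in cnf:
        if lit in clause:
            continue                       # clause satisfied
        out.append([l for l in clause if l != -lit])
    return out

def dpll(cnf, order):
    if any(clause == [] for clause in cnf):
        return False                       # 0-sink: contradiction
    if not order:
        return True                        # 1-sink: satisfied
    x, rest = order[0], order[1:]
    return (x, dpll(simplify(cnf, x), rest), dpll(simplify(cnf, -x), rest))

def count(node):
    """Count satisfying complete assignments from the recorded trace."""
    if node is True:
        return 1
    if node is False:
        return 0
    _, hi, lo = node
    return count(hi) + count(lo)

cnf = [[1, 2], [-1, 3]]                    # (x1 or x2) and (not x1 or x3)
trace = dpll(cnf, [1, 2, 3])
```

Because the variable order is fixed along every path, the recorded trace is ordered in the OBDD sense; allowing different orders per branch would yield an FBDD-style trace instead.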

Proceedings ArticleDOI
09 Dec 2007
TL;DR: A framework for region-based synthesis of Petri nets from languages is presented, which integrates almost all known approaches and fills several remaining gaps in the literature.
Abstract: In this paper we present a survey on methods for the synthesis of Petri nets from behavioral descriptions given as languages. We consider place/transition Petri nets, elementary Petri nets and Petri nets with inhibitor arcs. For each net class we consider classical languages, step languages and partial languages as behavioral description. All methods are based on the notion of regions of languages. We identify two different types of regions and two different principles of computing from the set of regions of a language a finite Petri net generating this language. For finite or regular languages almost each combination of Petri net class, language type, region type and computation principle can be considered to compute such a net. Altogether, we present a framework for region-based synthesis of Petri nets from languages which integrates almost all known approaches and fills several remaining gaps in the literature.

Journal ArticleDOI
TL;DR: This paper describes Christiansen grammar evolution (CGE), a new evolutionary automatic programming algorithm that extends standard grammar evolution by replacing context-free grammars with Christiansen grammars.
Abstract: This paper describes Christiansen grammar evolution (CGE), a new evolutionary automatic programming algorithm that extends standard grammar evolution (GE) by replacing context-free grammars with Christiansen grammars. GE only takes into account syntactic restrictions to generate valid individuals. CGE adds semantics to ensure that both semantically and syntactically valid individuals are generated. It is empirically shown that our approach improves GE performance and even allows the solution of some problems that are difficult to tackle with GE.
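For context, the standard GE genotype-to-phenotype mapping that CGE builds on can be sketched in a few lines of Python; the grammar and codons are invented for the example:

```python
# Standard grammatical-evolution mapping (the GE baseline that CGE
# extends): integer codons choose production rules left to right.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["x"], ["1"]],
    "<op>": [["+"], ["*"]],
}

def ge_map(codons, start="<expr>", max_steps=50):
    out, stack, i = [], [start], 0
    for _ in range(max_steps):
        if not stack:
            return "".join(out)
        sym = stack.pop(0)
        if sym in GRAMMAR:
            rules = GRAMMAR[sym]
            rule = rules[codons[i % len(codons)] % len(rules)]
            i += 1                      # consume one codon per choice
            stack = list(rule) + stack  # expand leftmost nonterminal
        else:
            out.append(sym)             # terminal symbol
    return None                         # ran out of steps (mapping failed)

phenotype = ge_map([0, 1, 2, 0, 1])
```

CGE's change is in the grammar itself: a Christiansen grammar's rules can depend on attributes of the derivation so far, which lets the mapping reject semantically invalid expansions that a context-free grammar would allow.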

Journal ArticleDOI
01 Jun 2007
TL;DR: This paper presents a goal-oriented requirements engineering framework that combines complementary semi-formal and formal notations, allowing the analyst to formalize only when and where needed while preserving optimal communication with stakeholders and developers.
Abstract: Complex software and systems are pervasive in today's world. In a growing number of fields they come to play a critical role. In order to provide a high assurance level, verification and validation (V&V) should be considered early in the development process. This paper shows how this can be achieved based on a goal-oriented requirements engineering framework which combines complementary semi-formal and formal notations. This allows the analyst to formalize only when and where needed and also preserves optimal communication with stakeholders and developers. For the industrial application of the methodology, a supporting toolbox was developed. It consists of a number of tightly integrated tools for performing V&V tasks at the requirements level. This is achieved through the use of (1) a roundtrip mapping between the requirements language and the specific formal languages used in the underlying formal tools (such as SAT or constraint solvers) and (2) graphical views using domain-based representations. This paper focuses on two major and representative tools: the Refinement Checker (for verification) and the Animator (for validation).

11 Jun 2007
TL;DR: This work presents a novel approach to multilingual verbalization of logical theories, axiomatizations, and other specifications such as business rules; the flexibility, extensibility, and maintainability of its verbalization templates allow easy augmentation with languages beyond the 10 currently supported.
Abstract: Verbalization is the process of writing the semantics captured in axioms into natural language sentences, which enables domain experts (who are not trained to understand technical/formal languages) to participate in the modeling and validation of their domain knowledge. We present a novel approach to support multilingual verbalization of logical theories, axiomatizations, and other specifications such as business rules. This engineering solution is demonstrated with the Object Role Modeling language and the ontology engineering tool DogmaModeler, although its underlying principles can be reused with other conceptual models and formal languages, such as Description Logics, to improve their understandability and usability for domain experts. Our engineering solution for multilingual verbalization is characterized by the flexibility, extensibility and maintainability of its verbalization templates, which allow for easy augmentation with languages beyond the 10 currently supported.
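The template mechanism can be illustrated with a minimal Python sketch: one template per language for a mandatory-role constraint. The languages, wording, and constraint encoding here are invented for the example, not DogmaModeler's actual templates:

```python
# One template per language for a mandatory-role constraint
# ("each A must have at least one B").  Wording is illustrative.
TEMPLATES = {
    "en": "Each {subject} must have at least one {object}.",
    "nl": "Elke {subject} moet ten minste één {object} hebben.",
    "fr": "Chaque {subject} doit avoir au moins un {object}.",
}

def verbalize(constraint, lang):
    kind, subject, obj = constraint
    assert kind == "mandatory"       # only one clause type in this sketch
    return TEMPLATES[lang].format(subject=subject, object=obj)

sentence = verbalize(("mandatory", "Order", "OrderLine"), "en")
```

Adding a language is then just adding one template string per constraint type, which is the maintainability property the paper emphasizes.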

Book
01 Jan 2007
TL;DR: This book presents the theory of automata and computation, covering finite state machines and regular languages, context-free languages and pushdown automata, and Turing machines and undecidability, together with applications ranging from programming languages to natural language processing.
Abstract: PART I: INTRODUCTION
1 Why Study Automata Theory?
2 Review of Mathematical Concepts 2.1 Logic 2.2 Sets 2.3 Relations 2.4 Functions 2.5 Closures 2.6 Proof Techniques 2.7 Reasoning about Programs 2.8 References
3 Languages and Strings 3.1 Strings 3.2 Languages
4 The Big Picture: A Language Hierarchy 4.1 Defining the Task: Language Recognition 4.2 The Power of Encoding 4.3 A Hierarchy of Language Classes
5 Computation 5.1 Decision Procedures 5.2 Determinism and Nondeterminism 5.3 Functions on Languages and Programs
PART II: FINITE STATE MACHINES AND REGULAR LANGUAGES
6 Finite State Machines 6.2 Deterministic Finite State Machines 6.3 The Regular Languages 6.4 Programming Deterministic Finite State Machines 6.5 Nondeterministic FSMs 6.6 Interpreters for FSMs 6.7 Minimizing FSMs 6.8 Finite State Transducers 6.9 Bidirectional Transducers 6.10 Stochastic Finite Automata 6.11 Finite Automata, Infinite Strings: Buchi Automata 6.12 Exercises
7 Regular Expressions 7.1 What is a Regular Expression? 7.2 Kleene's Theorem 7.3 Applications of Regular Expressions 7.4 Manipulating and Simplifying Regular Expressions
8 Regular Grammars 8.1 Definition of a Regular Grammar 8.2 Regular Grammars and Regular Languages 8.3 Exercises
9 Regular and Nonregular Languages 9.1 How Many Regular Languages Are There? 9.2 Showing That a Language Is Regular 9.3 Some Important Closure Properties of Regular Languages 9.4 Showing That a Language is Not Regular 9.5 Exploiting Problem-Specific Knowledge 9.6 Functions on Regular Languages 9.7 Exercises
10 Algorithms and Decision Procedures for Regular Languages 10.1 Fundamental Decision Procedures 10.2 Summary of Algorithms and Decision Procedures for Regular Languages 10.3 Exercises
11 Summary and References
PART III: CONTEXT-FREE LANGUAGES AND PUSHDOWN AUTOMATA
12 Context-Free Grammars 12.1 Introduction to Grammars 12.2 Context-Free Grammars and Languages 12.3 Designing Context-Free Grammars 12.4 Simplifying Context-Free Grammars 12.5 Proving That a Grammar is Correct 12.6 Derivations and Parse Trees 12.7 Ambiguity 12.8 Normal Forms 12.9 Stochastic Context-Free Grammars 12.10 Exercises
13 Pushdown Automata 13.1 Definition of a (Nondeterministic) PDA 13.2 Deterministic and Nondeterministic PDAs 13.3 Equivalence of Context-Free Grammars and PDAs 13.4 Nondeterminism and Halting 13.5 Alternative Definitions of a PDA 13.6 Exercises
14 Context-Free and Noncontext-Free Languages 14.1 Where Do the Context-Free Languages Fit in the Big Picture? 14.2 Showing That a Language is Context-Free 14.3 The Pumping Theorem for Context-Free Languages 14.4 Some Important Closure Properties of Context-Free Languages 14.5 Deterministic Context-Free Languages 14.6 Other Techniques for Proving That a Language is Not Context-Free 14.7 Exercises
15 Algorithms and Decision Procedures for Context-Free Languages 15.1 Fundamental Decision Procedures 15.2 Summary of Algorithms and Decision Procedures for Context-Free Languages
16 Context-Free Parsing 16.1 Lexical Analysis 16.2 Top-Down Parsing 16.3 Bottom-Up Parsing 16.4 Parsing Natural Languages 16.5 Stochastic Parsing 16.6 Exercises
17 Summary and References
PART IV: TURING MACHINES AND UNDECIDABILITY
18 Turing Machines 18.1 Definition, Notation and Examples 18.2 Computing With Turing Machines 18.3 Turing Machines: Extensions and Alternative Definitions 18.4 Encoding Turing Machines as Strings 18.5 The Universal Turing Machine 18.6 Exercises
19 The Church-Turing Thesis 19.1 The Thesis 19.2 Examples of Equivalent Formalisms
20 The Unsolvability of the Halting Problem 20.1 The Language H is Semidecidable but Not Decidable 20.2 Some Implications of the Undecidability of H 20.3 Back to Turing, Church, and the Entscheidungsproblem
21 Decidable and Semidecidable Languages 21.2 Subset Relationships between D and SD 21.3 The Classes D and SD Under Complement 21.4 Enumerating a Language 21.5 Summary 21.6 Exercises
22 Decidability and Undecidability Proofs 22.1 Reduction 22.2 Using Reduction to Show that a Language is Not Decidable 22.3 Rice's Theorem 22.4 Undecidable Questions About Real Programs 22.5 Showing That a Language is Not Semidecidable 22.6 Summary of D, SD/D and (R)SD Languages that Include Turing Machine Descriptions 22.7 Exercises
23 Undecidable Languages That Do Not Ask Questions about Turing Machines 23.1 Hilbert's 10th Problem 23.2 Post Correspondence Problem 23.3 Tiling Problems 23.4 Logical Theories 23.5 Undecidable Problems about Context-Free Languages
APPENDIX C: HISTORY, PUZZLES, AND POEMS
43 Part I: Introduction 43.1 The 15-Puzzle
44 Part II: Finite State Machines and Regular Languages 44.1 Finite State Machines Predate Computers 44.2 The Pumping Theorem Inspires Poets
REFERENCES
INDEX
Appendices for Automata, Computability and Complexity: Theory and Applications: Math Background * Working with Logical Formulas * Finite State Machines and Regular Languages * Context-Free Languages and PDAs * Turing Machines and Undecidability * Complexity * Programming Languages and Compilers * Tools for Programming, Databases and Software Engineering * Networks * Security * Computational Biology * Natural Language Processing * Artificial Intelligence and Computational Reasoning * Art & Entertainment: Music & Games * Using Regular Expressions * Using Finite State Machines and Transducers * Using Grammars

Journal Article
TL;DR: This paper formalises code mutation techniques (polymorphism and metamorphism) by means of formal grammars, using the Chomsky classification to determine which families of mutation techniques are likely to be detected and which are bound to remain undetected.
Abstract: This paper presents a formalisation of the different existing code mutation techniques (polymorphism and metamorphism) by means of formal grammars. While very few theoretical results are known about the detection complexity of viral mutation techniques, we exhaustively address this critical issue by considering the Chomsky classification of formal grammars. This enables us to determine which families of code mutation techniques are likely to be detected or, on the contrary, are bound to remain undetected. As an illustration we then present, on a formal basis, a proof-of-concept metamorphic mutation engine denoted PB MOT, whose detection has been proven to be undecidable. Keywords—Polymorphism, Metamorphism, Formal Grammars, Formal Languages, Language Decision, Code Mutation, Word Problem. It has been shown that the set Di of polymorphic viruses with an infinite number of forms is a Σ3-complete set. Unfortunately, no results are known for other classes of polymorphic viruses or for the general case of metamorphism, and many open problems still remain. Up to now, only very few examples of metamorphic codes are known to exist. The most sophisticated one is the MetaPHOR engine, whose essential feature is a certain amount of non-determinism. Experiments in our laboratory showed that existing antivirus software can be very easily defeated by MetaPHOR-like technology. However, the analysis of this engine (9, Chap. 4) has proved that its metamorphic techniques still belong to trivial classes. Our research thus focused on the formalisation of metamorphism by means of formal grammars and languages. We aimed at identifying the different possible classes of code mutation techniques. The first results, which are presented in this paper, enable us to assert that the detection complexity of code mutation techniques can be far higher than NP-complete and that, for some well-chosen classes, detection is an undecidable problem.

Proceedings Article
01 Jan 2007
TL;DR: A dosage device 2 or 102 which includes a body 20 on which a syringe 4 and a container of medication 6 may be mounted and structure for indicating the position of the stop member relative to the body to a visually impaired or blind person.
Abstract: A dosage device 2 or 102 which includes a body 20 on which a syringe 4 and a container of medication 6 may be mounted. An adjustable stop member 42 or 142 is located in back of the plunger 12 of syringe 4. The dosage level can be changed by varying the position of the stop member. In addition, the dosage device includes structure for indicating the position of the stop member relative to the body to a visually impaired or blind person. This position indicating structure is effective whenever the stop member moves through one of a plurality of discrete positions to allow the blind person to change the dosage level of the syringe by sensing and counting the indications generated by the position indicator.

01 Jan 2007
TL;DR: The BLOG model as discussed by the authors is a formal language for defining probability models with unknown objects and identity uncertainty, and it can be used to describe a generative process in which some steps add objects to the world, and others determine attributes and relations on these objects.
Abstract: We introduce BLOG, a formal language for defining probability models with unknown objects and identity uncertainty. A BLOG model describes a generative process in which some steps add objects to the world, and others determine attributes and relations on these objects. Subject to certain acyclicity constraints, a BLOG model specifies a unique probability distribution over first-order model structures that can contain varying and unbounded numbers of objects. Furthermore, inference algorithms exist for a large class of BLOG models.
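The flavour of a BLOG-style generative process, first sampling how many objects exist and then their attributes and observations, can be sketched in plain Python. The model, names, and distributions are invented, and real BLOG also supports inference, not just forward sampling:

```python
# A BLOG-flavoured generative process with an unknown number of objects
# (model, names and distributions invented): sample how many aircraft
# exist, then an altitude per aircraft, then one noisy blip per aircraft.
import random

def sample_world(rng):
    n = rng.randint(1, 4)                               # unknown number of objects
    aircraft = [{"altitude": rng.uniform(0, 10_000)} for _ in range(n)]
    blips = [a["altitude"] + rng.gauss(0, 50) for a in aircraft]
    return aircraft, blips

rng = random.Random(0)                                  # reproducible
worlds = [sample_world(rng) for _ in range(1000)]
sizes = [len(aircraft) for aircraft, _ in worlds]
```

Each sampled world is a first-order structure with its own domain size, which is exactly the kind of distribution over structures that the acyclicity conditions in the paper make well-defined.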

Journal ArticleDOI
01 Aug 2007
TL;DR: It turns out that, in order to take full advantage of the four values of Belnap's logic, one has to consider "sub-normalised" necessity measures.
Abstract: The use of positive and negative reasons in inference and decision aiding is a recurrent issue of investigation as far as the type of formal language to use within a DSS is concerned. A language that enables such reasons to be taken into account explicitly is Belnap's logic, together with the four-valued logics derived from it. In this paper, we explore the interpretation of a continuous extension of a four-valued logic as a necessity degree (in possibility theory). It turns out that, in order to take full advantage of the four values, we have to consider "sub-normalised" necessity measures. Under such a hypothesis, four-valued logics become the natural logical frame for such an approach.
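Belnap's four values are conveniently encoded as pairs (evidence for, evidence against); conjunction and disjunction are then meet and join in the truth ordering. A minimal Python sketch follows; the continuous extension and necessity measures discussed in the paper are not modelled here:

```python
# Belnap's four values encoded as (evidence for, evidence against):
# T=(1,0), F=(0,1), B=(1,1) "both", N=(0,0) "neither".  Conjunction and
# disjunction are meet and join in the truth order; negation swaps the
# two evidence components (fixing B and N).
VALUES = {"T": (1, 0), "F": (0, 1), "B": (1, 1), "N": (0, 0)}
NAMES = {v: k for k, v in VALUES.items()}

def neg(x):
    p, n = VALUES[x]
    return NAMES[(n, p)]

def conj(x, y):
    (p1, n1), (p2, n2) = VALUES[x], VALUES[y]
    return NAMES[(min(p1, p2), max(n1, n2))]

def disj(x, y):
    (p1, n1), (p2, n2) = VALUES[x], VALUES[y]
    return NAMES[(max(p1, p2), min(n1, n2))]
```

Note the hallmark of the logic: B ∧ N = F and B ∨ N = T, so "both" and "neither" are genuinely distinct values rather than interchangeable unknowns.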

Patent
Kiran Pal Sagoo1, Kyung-Im Jung1
08 Feb 2007
TL;DR: A device is provided comprising a capability-monitoring unit that monitors application capabilities, a behavior-monitoring unit that monitors application behaviors, and a controlling unit that controls execution of applications using a formal language specifying those capabilities and behaviors.
Abstract: A device for using information on malicious application behaviors is provided. The device includes a capability-monitoring unit that monitors application capabilities, a behavior-monitoring unit that monitors application behaviors, an mBDL-generating unit that generates a document in a formal language specifying the application capabilities and the application behaviors, and a controlling unit that controls execution of applications using the formal language.

Journal ArticleDOI
24 Oct 2007
TL;DR: In this paper, a comprehensive theory for optimal control of finite-state probabilistic processes is presented, and it is shown that the resulting discrete-event supervisor is optimal in the sense of elementwise maximizing the renormalized language measure vector for the controlled plant behavior and is efficiently computable.
Abstract: Supervisory control theory for discrete event systems, introduced by Ramadge and Wonham, is based on a non-probabilistic formal language framework. Building on the concept of signed real measure of regular languages, this paper formulates a comprehensive theory for optimal control of finite-state probabilistic processes. It is shown that the resulting discrete-event supervisor is optimal in the sense of elementwise maximizing the renormalized language measure vector for the controlled plant behavior and is efficiently computable.
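Under one common formulation of the signed language measure (our reading, not quoted from the paper), the renormalized measure is nu = theta * (I - (1 - theta) * Pi)^(-1) * chi for a row-stochastic state transition matrix Pi and a characteristic vector chi in [-1, 1]^n marking good and bad states. A minimal numerical sketch with an invented toy plant:

```python
def language_measure(Pi, chi, theta=0.05, iters=2000):
    """Approximate nu = theta * (I - (1-theta) Pi)^{-1} chi via the series
    nu = theta * sum_k (1-theta)^k Pi^k chi, for a row-stochastic transition
    probability matrix Pi and a characteristic vector chi in [-1, 1]^n."""
    n = len(chi)
    v = list(chi)                   # running Pi^k chi
    nu = [theta * x for x in chi]   # k = 0 term
    w = theta
    for _ in range(iters):
        v = [sum(Pi[i][j] * v[j] for j in range(n)) for i in range(n)]
        w *= 1 - theta
        nu = [nu[i] + w * v[i] for i in range(n)]
    return nu

# Toy 3-state plant (rows sum to 1); state 2 is "bad" (chi = -1).
Pi = [[0.5, 0.5, 0.0],
      [0.2, 0.3, 0.5],
      [0.0, 0.4, 0.6]]
nu = language_measure(Pi, [1.0, 0.0, -1.0])

# Sanity check: with chi = 1 everywhere the measure is identically 1 for any
# row-stochastic Pi, since theta * sum_k (1-theta)^k = 1.
assert all(abs(x - 1.0) < 1e-9 for x in language_measure(Pi, [1.0, 1.0, 1.0]))
```

The optimal supervisor of the paper then searches over event-disabling patterns to maximize this vector elementwise; the sketch above only evaluates the measure for a fixed controlled plant.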

Book ChapterDOI
07 Jun 2007
TL;DR: This paper provides an inconsistency-tolerant semantics for DLs, and studies the computational complexity of consistent query answering over ontologies specified in DL-Lite, a family of DLs specifically tailored to deal with large amounts of data.
Abstract: Description Logics (DLs) have been widely used in recent years as formal languages for specifying ontologies over the web. Due to the dynamic nature of this setting, it may frequently happen that data retrieved from the web contradict the intensional knowledge provided by the ontology through which they are collected, which may therefore become inconsistent. In this paper, we analyze the problem of consistent query answering over DL ontologies, i.e., the problem of providing meaningful answers to queries posed over inconsistent ontologies. We provide an inconsistency-tolerant semantics for DLs, and study the computational complexity of consistent query answering over ontologies specified in DL-Lite, a family of DLs specifically tailored to deal with large amounts of data. We show that the above problem is coNP-complete w.r.t. data complexity, i.e., the complexity measured w.r.t. the size of the data only. Towards the identification of tractable cases of consistent query answering over DL-Lite ontologies, we then study the problem of consistent instance checking, i.e., the instance checking problem considered under our inconsistency-tolerant semantics. We provide an algorithm for it that runs in time polynomial in the size of the data, thus showing that the problem is in PTIME w.r.t. data complexity.
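The repair-based intuition behind inconsistency-tolerant semantics can be made concrete with a brute-force checker: a ground fact is a consistent answer if it holds in every repair, i.e., every maximal conflict-free subset of the data. The sketch below uses abstract propositional conflicts and is not the paper's polynomial DL-Lite algorithm:

```python
from itertools import combinations

def repairs(facts, conflicts):
    """All maximal subsets of `facts` containing no conflicting set.
    `conflicts` is a set of frozensets of facts that cannot co-occur."""
    facts = list(facts)
    maximal = []
    for r in range(len(facts), -1, -1):     # larger subsets first
        for subset in combinations(facts, r):
            s = set(subset)
            if any(c <= s for c in conflicts):
                continue                    # inconsistent
            if any(s < t for t in maximal):
                continue                    # not maximal
            maximal.append(s)
    return maximal

def consistent_answer(fact, facts, conflicts):
    # A fact is a consistent answer iff it survives in every repair.
    return all(fact in rep for rep in repairs(facts, conflicts))

# A(a) and B(a) violate a disjointness axiom; C(a) is untouched.
facts = {'A(a)', 'B(a)', 'C(a)'}
conflicts = {frozenset({'A(a)', 'B(a)'})}
assert consistent_answer('C(a)', facts, conflicts)
assert not consistent_answer('A(a)', facts, conflicts)
```

Enumerating repairs is exponential in general, which matches the coNP-hardness the paper proves for query answering; the paper's PTIME result for instance checking rests on avoiding this enumeration.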

Book ChapterDOI
03 Jul 2007
TL;DR: This paper presents a requirement analysis tool, RAT, which supports quality assurance of formal specifications and lets a designer interactively explore the requirements' semantics and automatically check the specification against assertions and possibilities.
Abstract: Formal languages are increasingly used to describe the functional requirements of circuits. Although formal requirements can be hard to understand and subtle, they are seldom the object of verification. In this paper we present our requirement analysis tool, RAT. Our tool supports quality assurance of formal specifications. A designer can interactively explore the requirements' semantics and automatically check the specification against assertions (which must be satisfied) and possibilities (which describe allowed corner-case behavior). Using RAT, a designer can also investigate the realizability of a specification. RAT was successfully applied in several industrial projects.
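The assertion/possibility distinction can be illustrated on a tiny nondeterministic transition system: an assertion (here, an eventuality property) must hold on every trace the specification admits, while a possibility must hold on at least one. The model and names below are invented for illustration; RAT itself works on temporal-logic specifications:

```python
def traces(transitions, init, steps):
    """All state sequences of `steps` transitions from `init` in a
    nondeterministic transition relation {state: set_of_successors}."""
    runs = [[init]]
    for _ in range(steps):
        runs = [r + [s] for r in runs for s in transitions[r[-1]]]
    return runs

def holds_as_assertion(prop, runs):
    # An assertion (eventuality) must hold somewhere on every trace.
    return all(any(prop(s) for s in r) for r in runs)

def holds_as_possibility(prop, runs):
    # A possibility must hold somewhere on at least one trace.
    return any(any(prop(s) for s in r) for r in runs)

# Toy arbiter: requests may be ignored forever, but grants are reachable.
transitions = {'idle': {'idle', 'req'}, 'req': {'grant'}, 'grant': {'idle'}}
runs = traces(transitions, 'idle', 3)
grant = lambda s: s == 'grant'
assert not holds_as_assertion(grant, runs)  # some trace never grants
assert holds_as_possibility(grant, runs)    # some trace does grant
```

A failed possibility is often as informative as a failed assertion: it reveals that the specification forbids a corner-case behavior the designer intended to allow.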

Journal ArticleDOI
TL;DR: The author argues that Frege made very serious use of semantical concepts such as reference and truth in his formal theory, and that Frege explicitly committed himself to offering a justification that appeals to the notion of reference.
Abstract: In recent work on Frege, one of the most salient issues has been whether he was prepared to make serious use of semantical notions such as reference and truth. I argue here that Frege did make very serious use of semantical concepts. I argue, first, that Frege had reason to be interested in the question how the axioms and rules of his formal theory might be justified and, second, that he explicitly commits himself to offering a justification that appeals to the notion of reference. I then discuss the justifications Frege offered, focusing on his discussion of inferences involving free variables, in section 17 of Grundgesetze, and his argument, in sections 29–32, that every well-formed expression of his formal language has a unique reference.

Proceedings ArticleDOI
01 Jan 2007

Book ChapterDOI
27 Jun 2007
TL;DR: The extended finite state machine (EFSM) model has the capability of representing a class of timing faults that may otherwise go undetected in an IUT; the Hit-or-Jump algorithm is applied to the SDL specification based on the EFSM model to generate a test sequence that can detect these timing faults.
Abstract: In this paper, we apply our timing fault modeling strategy to writing formal specifications for communication protocols. Using the formal language of the Specification and Description Language (SDL), we specify the Controller process of the rail-road crossing system, a popular benchmark for real-time systems. Our extended finite state machine (EFSM) model has the capability of representing a class of timing faults which otherwise may not be detected in an IUT. The Hit-or-Jump algorithm is applied to the SDL specification based on our EFSM model to generate a test sequence that can detect these timing faults. This application of fault modeling to the SDL specification ensures synchronization among the timing constraints of different processes, and enables the generation of portable test sequences, since they can be easily represented in other formal notations such as TTCN or MSC.
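The role of a timing-fault-detecting test sequence can be sketched with two variants of a toy crossing-controller EFSM that differ only in a timer guard; a test sequence exposes the fault when the two machines produce different output sequences. This is an invented miniature, not the paper's SDL Controller:

```python
class Controller:
    """Tiny EFSM: after a 'train' input the gate must finish closing within
    `deadline` ticks; 'tick' inputs advance the timer variable."""
    def __init__(self, deadline):
        self.state, self.timer, self.deadline = 'open', 0, deadline

    def step(self, event):
        if event == 'train' and self.state == 'open':
            self.state, self.timer = 'closing', 0
            return 'lower'
        if event == 'tick' and self.state == 'closing':
            self.timer += 1
            if self.timer >= self.deadline:
                self.state = 'closed'
                return 'closed'
        return 'none'

def run(machine, seq):
    return [machine.step(e) for e in seq]

spec, faulty = Controller(deadline=2), Controller(deadline=3)  # timing fault
test = ['train', 'tick', 'tick']
assert run(spec, test) != run(faulty, test)  # the sequence exposes the fault
```

A test generator such as Hit-or-Jump searches the specification's state space for exactly this kind of distinguishing sequence, rather than constructing it by hand.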

Proceedings ArticleDOI
25 Jun 2007
TL;DR: This paper shows how to further increase the interaction in the FLA course with the expansion of additional theoretical topics in JFLAP, and how it has added grading support into JFLap for instructors.
Abstract: The introduction of educational software such as JFLAP into the course Formal Languages and Automata (FLA) has created a learning environment with automatic feedback on theoretical topics. In this paper we show how we further increase the interaction in the FLA course with the expansion of additional theoretical topics in JFLAP, and how we have added grading support into JFLAP for instructors.
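Grading automata exercises typically reduces to language-equivalence checking. A minimal sketch (our own illustration, not JFLAP's implementation) compares a student DFA against a reference DFA by breadth-first search over the product automaton, looking for a reachable state pair that disagrees on acceptance:

```python
from collections import deque

def equivalent(dfa1, dfa2, alphabet):
    """Each DFA is (start, accepting_set, delta) with delta[(state, sym)].
    BFS over reachable state pairs; a disagreement witnesses inequivalence."""
    seen = {(dfa1[0], dfa2[0])}
    queue = deque(seen)
    while queue:
        p, q = queue.popleft()
        if (p in dfa1[1]) != (q in dfa2[1]):
            return False
        for a in alphabet:
            pair = (dfa1[2][(p, a)], dfa2[2][(q, a)])
            if pair not in seen:
                seen.add(pair)
                queue.append(pair)
    return True

# Reference: even number of 'a's.  Student: same language, renamed states.
ref = (0, {0}, {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1})
student = ('e', {'e'}, {('e', 'a'): 'o', ('e', 'b'): 'e',
                        ('o', 'a'): 'e', ('o', 'b'): 'o'})
assert equivalent(ref, student, 'ab')
```

When the check fails, the shortest path to the offending state pair yields a counterexample string, which is the kind of feedback an automated grader can show the student.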