
Showing papers on "Formal language published in 1991"


Journal ArticleDOI
TL;DR: It is shown that the class of programs possessing a total well-founded model properly includes the previously studied classes of "stratified" and "locally stratified" programs, and the method is compared with other proposals in the literature.
Abstract: A general logic program (abbreviated to "program" hereafter) is a set of rules that have both positive and negative subgoals. It is common to view a deductive database as a general logic program consisting of rules (IDB) sitting above elementary relations (EDB, facts). It is desirable to associate one Herbrand model with a program and think of that model as the "meaning of the program," or its "declarative semantics." Ideally, queries directed to the program would be answered in accordance with this model. Recent research indicates that some programs do not have a "satisfactory" total model; for such programs, the question of an appropriate partial model arises. Unfounded sets and well-founded partial models are introduced and the well-founded semantics of a program is defined to be its well-founded partial model. If the well-founded partial model is in fact a total model, it is called the well-founded model. It is shown that the class of programs possessing a total well-founded model properly includes previously studied classes of "stratified" and "locally stratified" programs. The method in this paper is also compared with other proposals in the literature, including Clark's "program completion," Fitting's and Kunen's 3-valued interpretations of it, and the "stable models" of Gelfond and Lifschitz.
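To make the construction concrete, here is a minimal sketch of the well-founded semantics for a ground program, computed with Van Gelder's alternating-fixpoint characterization; the rule encoding and the example program are our own illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the well-founded semantics of a ground normal
# logic program via the alternating-fixpoint construction.
# A rule is (head, positive_body, negative_body); atoms are strings.

def least_model(rules, assumed_false):
    """Least model of the reduct w.r.t. a set of atoms assumed false:
    keep a rule only if every negative subgoal 'not b' succeeds
    (b is assumed false), then iterate the positive consequences."""
    reduct = [(h, pos) for (h, pos, neg) in rules
              if all(b in assumed_false for b in neg)]
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in reduct:
            if head not in model and all(p in model for p in pos):
                model.add(head)
                changed = True
    return model

def well_founded(rules, atoms):
    def F(S):                      # antimonotone: assume atoms outside S false
        return least_model(rules, atoms - S)
    true = set()
    while True:                    # least fixpoint of the monotone F o F
        nxt = F(F(true))
        if nxt == true:
            break
        true = nxt
    possible = F(true)             # atoms that are true or undefined
    return true, atoms - possible, possible - true  # true / false / undefined

# p is undefined (p <- not p); q is true (q <- not r); r is false.
rules = [("p", [], ["p"]), ("q", [], ["r"])]
print(well_founded(rules, {"p", "q", "r"}))   # ({'q'}, {'r'}, {'p'})
```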

1,908 citations


Proceedings Article
01 Jan 1991
TL;DR: In this article, a complexity analysis of concept satisfiability and subsumption for a wide class of concept languages is presented, together with algorithms for these inferences that comply with the worst-case complexity of the reasoning task they perform.
Abstract: A basic feature of Terminological Knowledge Representation Systems is to represent knowledge by means of taxonomies, here called terminologies, and to provide a specialized reasoning engine to do inferences on these structures. The taxonomy is built through a representation language called a concept language (or description logic), which is given a well-defined set-theoretic semantics. The efficiency of reasoning has often been advocated as a primary motivation for the use of such systems. The main contributions of the paper are: (1) a complexity analysis of concept satisfiability and subsumption for a wide class of concept languages; (2) algorithms for these inferences that comply with the worst-case complexity of the reasoning task they perform.
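As a deliberately tiny illustration of the two inferences analysed, the sketch below handles only conjunctions of atomic concepts and atomic negations, for which satisfiability is a clash test and subsumption is set containment; the concept names are our own assumptions, and the richer languages studied in the paper need tableau-style algorithms.

```python
# A hedged sketch for a toy concept language: a concept is a set of
# literals such as 'Human' or '~Human', read as a conjunction. This is
# far weaker than the concept languages analysed in the paper.

def satisfiable(concept):
    """A conjunction is satisfiable iff it has no clash {A, ~A}."""
    return not any('~' + a in concept for a in concept if not a.startswith('~'))

def subsumes(d, c):
    """In this fragment, D subsumes C iff C is unsatisfiable or every
    conjunct of D already appears in C (structural subsumption)."""
    return not satisfiable(c) or d <= c

print(subsumes({"Human"}, {"Human", "Male"}))   # True
print(subsumes({"Female"}, {"Human", "Male"}))  # False
print(satisfiable({"Human", "~Human"}))         # False: a clash
```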

366 citations


Book ChapterDOI
03 Jun 1991
TL;DR: The definition provides a simple, and yet powerful, way to annotate state-transition graphs with timing constraints using finitely many real-valued clocks to model the behavior of real-time systems over time.
Abstract: We propose timed automata to model the behavior of real-time systems over time. Our definition provides a simple, and yet powerful, way to annotate state-transition graphs with timing constraints using finitely many real-valued clocks. A timed automaton accepts timed words — strings in which a real-valued time of occurrence is associated with each symbol. We study timed automata from the perspective of formal language theory: we consider closure properties, decision problems, and subclasses. We discuss the application of this theory to automatic verification of real-time requirements of finite-state systems.
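A minimal sketch of the acceptance condition may help: below, a timed automaton is a list of transitions with interval guards on clocks, and a timed word is checked by a nondeterministic search over runs. The encoding (guards as closed intervals, clocks stored as their last reset time) is our own simplification of the definition.

```python
# A hedged sketch of timed-word acceptance. A transition is
# (source, symbol, guard, resets, target); guard maps a clock to a
# closed interval [lo, hi] its value must lie in when the symbol is read.

def accepts(transitions, start, finals, timed_word):
    clocks0 = {x: 0.0 for (_, _, g, r, _) in transitions for x in list(g) + list(r)}

    def run(state, clocks, word):
        if not word:
            return state in finals
        (sym, t), rest = word[0], word[1:]
        for (src, a, guard, resets, dst) in transitions:
            if src == state and a == sym:
                vals = {x: t - clocks[x] for x in clocks}   # time since last reset
                if all(lo <= vals[x] <= hi for x, (lo, hi) in guard.items()):
                    nclocks = dict(clocks)
                    for x in resets:
                        nclocks[x] = t
                    if run(dst, nclocks, rest):
                        return True
        return False

    return run(start, clocks0, list(timed_word))

# Accept words where each 'b' comes within 2 time units of the last 'a'.
trans = [("q0", "a", {}, ["x"], "q1"),
         ("q1", "b", {"x": (0, 2)}, [], "q0")]
print(accepts(trans, "q0", {"q0"}, [("a", 1.0), ("b", 2.5)]))  # True
print(accepts(trans, "q0", {"q0"}, [("a", 1.0), ("b", 4.0)]))  # False
```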

279 citations


Book
01 Jan 1991
TL;DR: In this paper, an introduction for undergraduates to the theory of computation is given, which emphasizes formal languages, automata and abstract models of computation, and computability; it also includes an introduction to computational complexity and NP-completeness.
Abstract: From the Publisher: This book is an introduction for undergraduates to the theory of computation. It emphasizes formal languages, automata and abstract models of computation, and computability. It also includes an introduction to computational complexity and NP-completeness.

249 citations


Proceedings ArticleDOI
01 Jul 1991
TL;DR: This work introduces a temporal language that can constrain the time difference between events only with finite, yet arbitrary, precision and show the resulting logic to be EXPSPACE-complete.
Abstract: The most natural, compositional, way of modeling real-time systems uses a dense domain for time. The satisfiability of timing constraints that are capable of expressing punctuality in this model, however, is known to be undecidable. We introduce a temporal language that can constrain the time difference between events only with finite, yet arbitrary, precision and show the resulting logic to be EXPSPACE-complete. This result allows us to develop an algorithm for the verification of timing properties of real-time systems with a dense semantics. Categories and Subject Descriptors: C.3 (Special-Purpose and Application-Based Systems): real-time systems; D.2.1 (Software Engineering): Requirements/Specifications-languages; F.3.1 (Logics and Meanings of Programs): Specifying and Verifying and Reasoning about Programs-logics of programs; mechanical verification; specification techniques; F.4.3 (Mathematical Logic and Formal Languages): Formal Languages-classes of decision problems

231 citations


Journal ArticleDOI
TL;DR: Two approaches are presented for integrating structured analysis and the Vienna development method as surrogates for informal and formal languages, respectively, and the issues that emerge from the use of the two approaches are reported.
Abstract: The differences between informal and formal requirements specification languages are noted, and the issue of bridging the gap between them is discussed. Using structured analysis (SA) and the Vienna development method (VDM) as surrogates for informal and formal languages, respectively, two approaches are presented for integrating the two. The first approach uses the SA model of a system to guide the analyst's understanding of the system and the development of the VDM specifications. The second approach proposes a rule-based method for generating VDM specifications from a set of corresponding SA specifications. The two approaches are illustrated through a simplified payroll system case. The issues that emerge from the use of the two approaches are reported.

144 citations


Proceedings ArticleDOI
15 Jul 1991
TL;DR: A general modularity result, which allows as particular cases primitive recursive functionals of higher types, transfinite recursion of higher types, and inheritance for all types, is proved.
Abstract: The combination of polymorphically typed lambda-calculi with first-order as well as higher-order rewrite rules is considered. The need of such a combination for exploiting the benefits of algebraically defined data types within functional programming is demonstrated. A general modularity result, which allows as particular cases primitive recursive functionals of higher types, transfinite recursion of higher types, and inheritance for all types, is proved. The class of languages considered is first defined, and it is shown how to reduce the Church-Rosser and termination properties of an algebraic functional language to a so-called principal lemma whose proof depends on the property to be proved and on the language considered. The proof of the principal lemma is then sketched for various languages. The results allow higher order rules defining the higher-order constants by a certain generalization of primitive recursion. A prototype of such primitive recursive definitions is provided by the definition of the map function for lists.
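The closing example, map on lists, is the classic instance of this generalized primitive recursion: one defining equation per list constructor. A hedged sketch (in Python rather than a typed rewrite system, so only the recursion scheme carries over):

```python
# map defined by primitive recursion over the list constructors:
#   map f nil        = nil
#   map f (cons x l) = cons (f x) (map f l)

def map_rec(f, xs):
    if not xs:                               # the nil case
        return []
    return [f(xs[0])] + map_rec(f, xs[1:])   # the cons case

print(map_rec(lambda n: n * n, [1, 2, 3]))   # [1, 4, 9]
```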

139 citations


Journal ArticleDOI
TL;DR: A proof method based on a notion of transfinite semantic trees is presented and it is shown how to apply it to prove the completeness of refutational theorem proving methods for first order predicate calculus with equality.
Abstract: In this paper, a proof method based on a notion of transfinite semantic trees is presented and it is shown how to apply it to prove the completeness of refutational theorem proving methods for first order predicate calculus with equality. To demonstrate how this method is used, the completeness of two theorem-proving strategies, both refinements of resolution and paramodulation, is proved. Neither strategy needs the functionally reflexive axioms or paramodulation into variables. Therefore the Wos-Robinson conjecture follows as a corollary. Another strategy for Horn logic with equality is also presented.

134 citations


Book ChapterDOI
01 Jan 1991
TL;DR: The situation calculus as discussed by the authors is a methodology for expressing facts about action and change in formal languages of mathematical logic, and it involves expressions for situations and actions (events), and the function Result that relates them to each other.
Abstract: The situation calculus [8] is a methodology for expressing facts about action and change in formal languages of mathematical logic. It involves expressions for situations and actions (events), and the function Result that relates them to each other. The possibilities and limitations of this methodology have never been systematically investigated, and some of the commonly accepted views on this subject seem to be inaccurate.
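For readers new to the formalism, the sketch below shows the core vocabulary: situations are terms built from an initial situation with the Result function, and fluents are evaluated by recursion over that term structure. The single-light domain is our own toy example, not one from the chapter.

```python
# A hedged sketch of situation-calculus terms in Python. Situations
# are nested tuples; Result(a, s) is the situation after doing a in s.

S0 = ("S0",)                               # the initial situation

def result(action, situation):
    return ("Result", action, situation)   # Result : Action x Sit -> Sit

def holds_on(situation):
    """Effect and frame axioms for the fluent 'the light is on'."""
    if situation == S0:
        return False                       # initially off
    _, action, prev = situation
    if action == "toggle":
        return not holds_on(prev)          # effect axiom for toggle
    return holds_on(prev)                  # frame axiom: nothing else changes it

s = result("toggle", result("wait", result("toggle", S0)))
print(holds_on(s))                         # False: toggled twice, waited once
```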

92 citations


Book ChapterDOI
02 Jan 1991
TL;DR: A number of ideas that originated the evolution of programming from arts and crafts to a science are presented, and an axiomatic definition of program execution is introduced.
Abstract: Publisher Summary This chapter presents a number of ideas that originated the evolution of programming from arts and crafts to a science. The chapter describes computer arithmetic in two stages. In the first stage, axioms are given for arithmetic operations on natural numbers, which are valid independently of their computer representation, and choices of supplementary axioms are proposed for characterizing various possible implementations. In the second stage, an axiomatic definition of program execution is introduced. An axiomatic approach is indispensable for achieving program reliability. The usefulness of program proving is advocated in view of the cost of programming errors and program testing. The chapter discusses the definition of formal language. The axioms and rules of inference can be understood as the ultimate definitive specification of the meaning of the language.
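The axiomatic treatment of program execution alluded to here is usually presented with Hoare triples; as a reminder (our addition, in the standard notation rather than quoted from the chapter), the assignment axiom and the iteration rule read:

```latex
% Assignment axiom and while rule, in the usual Hoare-logic notation.
\[
\{P[E/x]\}\; x := E\; \{P\}
\qquad
\frac{\{P \land B\}\; S\; \{P\}}
     {\{P\}\ \mathbf{while}\ B\ \mathbf{do}\ S\ \{P \land \lnot B\}}
\]
```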

86 citations


Patent
27 Nov 1991
TL;DR: In this patent, natural language input is processed by generating a chained-function structure that expresses the relationships among the concepts of the natural-language elements in the input character strings, looking up a rule table that stores in advance rules associating chained-function structures with forms of a command language the processor system can execute, converting the generated structure into an executable form, and obtaining the command language as the result of the analysis by incorporating information about the operating status of the processor system into the converted form.
Abstract: The invention processes natural language as follows: a chained-function structure expressing the relationships among the concepts of the natural-language elements that make up the input character strings is generated; a rule table, which stores in advance rules associating chained-function structures with forms of a command language that the processor system can execute, is looked up; the generated chained-function structure is converted into a form the processor can execute; and the commands are obtained as results of the analysis by incorporating information about the operating status of the processor system into the converted form. The invention enables the computer to operate in a desired mode from ordinary natural-language input, without the formal languages usually given to the computer (special languages and commands defined for the computer).
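A hedged, much-simplified sketch of that pipeline follows: parse an utterance into a small concept structure, look the structure's shape up in a rule table, and fill in a command template. The toy grammar, table entries, and command syntax are all our own illustrative assumptions, not the patent's.

```python
# A toy rule-table conversion from a natural-language utterance to a
# command; everything here (grammar, table, commands) is hypothetical.

RULE_TABLE = {
    ("copy", "file", "directory"): "cp {obj} {dest}",
    ("delete", "file", None):      "rm {obj}",
}

def analyze(utterance):
    """Toy 'chained-function structure': (action, object-kind, dest-kind)."""
    words = utterance.lower().split()
    action = words[0]
    obj = words[1] if len(words) > 1 else None
    dest = words[3] if "to" in words else None
    kind = lambda w: "directory" if w and w.endswith("/") else ("file" if w else None)
    return (action, kind(obj), kind(dest)), obj, dest

def to_command(utterance):
    shape, obj, dest = analyze(utterance)
    template = RULE_TABLE.get(shape)    # rule-table lookup on the structure
    return template.format(obj=obj, dest=dest) if template else None

print(to_command("copy report.txt to backup/"))  # cp report.txt backup/
print(to_command("delete report.txt"))           # rm report.txt
```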

Proceedings ArticleDOI
Fernando Pereira
18 Jun 1991
TL;DR: In this article, a finite-state approximation algorithm for context-free grammars and equivalent augmented phrase-structure grammar formalisms is presented, since such grammars are computationally too demanding for use as language models in real-time speech recognition.
Abstract: Phrase-structure grammars are an effective representation for important syntactic and semantic aspects of natural languages, but are computationally too demanding for use as language models in real-time speech recognition. An algorithm is described that computes finite-state approximations for context-free grammars and equivalent augmented phrase-structure grammar formalisms. The approximation is exact for certain context-free grammars generating regular languages, including all left-linear and right-linear context-free grammars. The algorithm has been used to construct finite-state language models for limited-domain speech recognition tasks.
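The classical special case where the approximation is exact is easy to show in code: a right-linear context-free grammar converts directly to a finite automaton, with nonterminals as states. A minimal sketch under our own encoding assumptions (productions of the forms A -> a B, A -> a, A -> epsilon):

```python
# A hedged sketch of the exact case: right-linear CFG -> NFA.
# A production is (lhs, rhs) with rhs a tuple of 0, 1, or 2 symbols;
# nonterminals become states, plus one accepting sink state 'ACC'.

def rl_grammar_to_nfa(productions, start):
    delta, final = {}, {"ACC"}
    for lhs, rhs in productions:
        if rhs == ():                       # A -> epsilon
            final.add(lhs)
        elif len(rhs) == 1:                 # A -> a
            delta.setdefault((lhs, rhs[0]), set()).add("ACC")
        else:                               # A -> a B
            a, b = rhs
            delta.setdefault((lhs, a), set()).add(b)
    return delta, start, final

def nfa_accepts(nfa, word):
    delta, start, finals = nfa
    states = {start}
    for a in word:
        states = set().union(*(delta.get((q, a), set()) for q in states))
    return bool(states & finals)

# S -> a S | b : the regular language a*b
g = [("S", ("a", "S")), ("S", ("b",))]
nfa = rl_grammar_to_nfa(g, "S")
print(nfa_accepts(nfa, "aab"))   # True
print(nfa_accepts(nfa, "aba"))   # False
```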

Journal ArticleDOI
01 Sep 1991
TL;DR: The authors describe extensions to SML that allow a compositional approach to model checking which can substantially reduce its complexity and discuss the specification and verification of a simple central-processing-unit (CPU) controller.
Abstract: The authors consider the state machine language (SML) for describing complex finite state hardware controllers. It provides many of the standard control structures found in modern programming languages. The state tables produced by the SML compiler can be used as input to a temporal logic model checker that can automatically determine whether a specification in the logic CTL is satisfied. The authors describe extensions to SML for the design of modular controllers. These extensions allow a compositional approach to model checking which can substantially reduce its complexity. To demonstrate these methods, the authors discuss the specification and verification of a simple central-processing-unit (CPU) controller.
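To give a feel for what the model checker does with those state tables, here is a minimal sketch of one CTL check, EF p ("p is reachable"), computed as a least fixpoint over a transition relation; the three-state controller fragment is our own invention.

```python
# A hedged sketch of one CTL model-checking step: compute the states
# satisfying EF p as a least fixpoint (keep adding any state with a
# successor already known to satisfy EF p).

def ef(states, trans, labeled):
    """States satisfying EF p, where labeled = states satisfying p."""
    sat = set(labeled)
    changed = True
    while changed:
        changed = False
        for (s, t) in trans:
            if t in sat and s not in sat:
                sat.add(s)
                changed = True
    return sat

states = {"idle", "fetch", "exec"}
trans = {("idle", "fetch"), ("fetch", "exec"), ("exec", "idle")}
print(ef(states, trans, {"exec"}))   # all three states can reach 'exec'
```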

Journal ArticleDOI
H. Ney
TL;DR: A dynamic programming algorithm for recognizing and parsing spoken word strings of a context-free grammar is presented that performs all functions simultaneously, namely, time alignment, word boundary detection, recognition, and parsing, and provides a closed-form solution.
Abstract: The use of context-free grammars in automatic speech recognition is discussed. A dynamic programming algorithm for recognizing and parsing spoken word strings of a context-free grammar is presented. The time alignment is incorporated into the parsing algorithm. The algorithm performs all functions simultaneously, namely, time alignment, word boundary detection, recognition, and parsing. As a result, no postprocessing is required. From the probabilistic point of view, the algorithm finds the most likely explanation or derivation for the observed input string, which amounts to Viterbi scoring rather than Baum-Welch scoring in the case of regular or finite-state languages. The algorithm provides a closed-form solution. The computational complexity of the algorithm is studied. Details of the implementation and experimental tests are described.
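The Viterbi-scoring idea is easy to isolate from the time-alignment machinery: for a grammar in Chomsky normal form, a dynamic program over substrings keeps, per nonterminal, the probability of its best derivation. A hedged sketch (without the time alignment and word-boundary detection the paper integrates):

```python
# A hedged sketch of Viterbi CYK: probability of the most likely
# derivation of a (non-empty) word string under a CNF grammar.

def viterbi_cyk(words, lexical, binary, start="S"):
    n = len(words)
    best = [[{} for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):                 # A -> w rules
        for (a, word), p in lexical.items():
            if word == w:
                best[i][i + 1][a] = max(best[i][i + 1].get(a, 0.0), p)
    for span in range(2, n + 1):                  # A -> B C rules
        for i in range(n - span + 1):
            k = i + span
            for j in range(i + 1, k):
                for (a, b, c), p in binary.items():
                    pb, pc = best[i][j].get(b, 0.0), best[j][k].get(c, 0.0)
                    if pb and pc and p * pb * pc > best[i][k].get(a, 0.0):
                        best[i][k][a] = p * pb * pc
    return best[0][n].get(start, 0.0)

lexical = {("N", "time"): 0.5, ("V", "flies"): 0.5}
binary = {("S", "N", "V"): 1.0}
print(viterbi_cyk(["time", "flies"], lexical, binary))  # 0.25
```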

Journal ArticleDOI
01 Feb 1991
TL;DR: In this paper, it is shown that the class of concepts defined by formal systems consisting of at most n expressions is inferable from positive data, provided the semantic mapping is monotonic and formal systems reduced with respect to a finite set X can define only finitely many concepts for any finite set X and any n.
Abstract: A formal system is a finite set of expressions, such as a grammar or a Prolog program. A semantic mapping from formal systems to concepts is said to be monotonic if it maps larger formal systems to larger concepts. A formal system Γ is said to be reduced with respect to a finite set X if the concept defined by Γ contains X but the concepts defined by any proper subset Γ′ of Γ cannot contain some part of X. Assume a semantic mapping is monotonic and formal systems consisting of at most n expressions that are reduced with respect to X can define only finitely many concepts for any finite set X and any n. Then, the class of concepts defined by formal systems consisting of at most n expressions is shown to be inferable from positive data. As corollaries, the class of languages defined by length-bounded elementary formal systems consisting of at most n axioms, the class of languages generated by context-sensitive grammars consisting of at most n productions, and the class of minimal models of linear Prolog programs consisting of at most n definite clauses are all shown to be inferable from positive data.

Journal ArticleDOI
TL;DR: The syntax, lexicon, and semantics of a formal language are analogous to the configuration, components, and behavior of an engineering design and the computational complexity of various grammatical formalisms might provide a foundation upon which to base complexity measures in design.
Abstract: A grammar is a definition of a language written in a transformational form. To the extent that design requirements and designed artifacts can be represented by some language, and to the extent that design is a transformation from function to form, grammars might facilitate the development of theories and methods for design. The syntax, lexicon, and semantics of a formal language are analogous to the configuration, components, and behavior of an engineering design. Furthermore, the computational complexity of various grammatical formalisms might provide a foundation upon which to base complexity measures in design. We discuss grammatical formalisms and give examples of how grammars might facilitate design automation.

Journal ArticleDOI
TL;DR: It is shown that, for unambiguous context-free languages, such a computation is "easy" and can be carried out by efficient parallel algorithms; on the contrary, for some context-free languages of ambiguity degree two, the problem becomes intractable.

Journal ArticleDOI
TL;DR: A representation of trace monoids, a string-based formalism for describing the behaviour of distributed systems, in terms of a subclass of labelled event structures is obtained.

Journal ArticleDOI
TL;DR: It is shown that non-determinism resolves some difficulties concerning the expressive power of deterministic languages: there are non-deterministic languages expressing low complexity classes of queries/updates, whereas no such deterministic languages are known.
Abstract: The use of non-determinism in logic-based languages is motivated using pragmatic and theoretical considerations. Non-deterministic database queries and updates occur naturally, and there exist non-deterministic implementations of various languages. It is shown that non-determinism resolves some difficulties concerning the expressive power of deterministic languages: there are non-deterministic languages expressing low complexity classes of queries/updates, whereas no such deterministic languages are known. Various mechanisms yielding non-determinism are reviewed. The focus is on two closely related families of non-deterministic languages. The first consists of extensions of Datalog with negations in bodies and/or heads of rules, with non-deterministic fixpoint semantics. The second consists of non-deterministic extensions of first-order logic and fixpoint logics, using the witness operator. The expressive power of the languages is characterized. In particular, languages expressing exactly the (deterministic and non-deterministic) queries/updates computable in polynomial time are exhibited, whereas it is conjectured that no analogous deterministic language exists. The connection between non-deterministic languages and determinism is also explored. Several problems of practical interest are examined, such as checking (statically or dynamically) if a given program is deterministic, detecting coincidence of deterministic and non-deterministic semantics, and verifying termination for non-deterministic programs.
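The witness operator is the easiest of the reviewed mechanisms to sketch: from all answers to a query, commit nondeterministically to one. The relational encoding and the office-assignment example below are our own illustrative assumptions.

```python
# A hedged sketch of the witness operator W: given the answer set of
# a query, nondeterministically pick one answer instead of all.

import random

def witness(tuples):
    """W: return one element of a non-empty answer set, arbitrarily."""
    tuples = list(tuples)
    return random.choice(tuples) if tuples else None

employees = {"ann", "bob"}
offices = {"o1", "o2", "o3"}

# Non-deterministic update: assign each employee some free office.
assignment, free = {}, set(offices)
for e in sorted(employees):
    office = witness(free)       # any free office is an acceptable answer
    assignment[e] = office
    free.discard(office)
print(assignment)                # e.g. {'ann': 'o2', 'bob': 'o1'}
```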

Journal ArticleDOI
TL;DR: The approach establishes the formal connection of rules to Chomsky grammars and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy to allow for a unified representation of stochastic and syntactic pattern constraints.
Abstract: A general method is proposed for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach establishes the formal connection of rules to Chomsky grammars and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum entropy probabilistic view leads to Gibbs representations with potentials which have their number of minima growing at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. This coupling yields the result that fully parallel stochastic cellular automata can be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor consisting of 1024 mesh-connected bit-serial processing elements for performing automated segmentation of electron-micrograph images.

Proceedings ArticleDOI
15 Jul 1991
TL;DR: The study starts with a language L, a set of interpretations M and a satisfaction relation, and the key idea is to define, for each structured theory, a preorder on interpretations.
Abstract: Starting from a logic which specifies how to make deductions from a set of sentences (a flat theory), a way to generalize this to a partially ordered bag of sentences (a structured theory) is given. The partial order is used to resolve conflicts. If phi occurs below psi, then psi is accepted only insofar as it does not conflict with phi. The study starts with a language L, a set of interpretations M and a satisfaction relation. The key idea is to define, for each structured theory, a preorder on interpretations. Models of the structured theory are defined to be maximal interpretations in the ordering. A revision operator that takes a structured theory and a sentence and returns a structured theory is defined. The consequence relation has the properties of weak monotonicity, weak cut, and weak reflexivity with respect to this operator, but fails their strong counterparts.


Proceedings Article
01 Jan 1991
TL;DR: The purpose is to exhibit a modular systematic method of representing nonmonotonic problems with the Well Founded semantics of logic programs and use this method to represent and solve some classical nonmonotonic problems.
Abstract: Well Founded Semantics is adequate to capture nonmonotonic reasoning if we interpret the Well Founded model of a program P as a (possibly incomplete) view of the world. Thus the Well Founded model may be accepted to be a definite view of the world and the extended stable models as alternative enlarged consistent belief models an agent may have about the world. Our purpose is to exhibit a modular systematic method of representing nonmonotonic problems with the Well Founded semantics of logic programs. In this paper we use this method to represent and solve some classical nonmonotonic problems. This leads us to consider our method quite generic.

Journal ArticleDOI
TL;DR: A more general result is shown: given an arbitrary integer k > 0 and k nonnegative integers n1, ..., nk, there exist k regular languages L1, ..., Lk such that each Li is accepted by an ni-state DFA and any DFA accepting their intersection requires at least n1 · ... · nk states.
Abstract: The following problem has been considered in [3] and [1]: For n regular languages each of which is accepted by an n-state DFA, what is the number of states of a minimum DFA that accepts the intersection of the n languages in the worst case? Birget in [1] tried to prove that the lower bound for the above problem is n^n. Unfortunately, his proof is incorrect. In the following, we first show what is wrong in Birget's proof, and then we give two proofs for the problem. The first proof is a modification of Birget's, which shows that n^n − n + 1 is a lower bound for the problem. In the second proof, we use a very different approach and show that n^n is indeed the lower bound for the problem in the worst case. With the second approach, we are able to show a more general result: given an arbitrary integer k > 0 and k nonnegative integers n1, ..., nk, there exist k regular languages L1, ..., Lk such that L1 is accepted by an n1-state DFA, ..., Lk is accepted by an nk-state DFA, and any DFA accepting the intersection of L1, ..., Lk requires at least n1 · ... · nk states. Denote the set of all binary relations on N by Bn and the set of all total functions on N by Fn. For R1, R2 ∈ Bn, R1 • R2 denotes the composition of R1 and R2. Birget uses the following n regular languages in his proof in [1]:
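The upper bound matching this lower bound is the standard product automaton, whose state set is the full Cartesian product of the component state sets; a minimal sketch, under our own encoding of DFAs as 5-tuples:

```python
# A hedged sketch of the product construction for DFA intersection.
# A DFA is (states, alphabet, delta, start, finals), with delta keyed
# by (state, symbol). The product has n1 * ... * nk states, matching
# the paper's lower bound in the worst case.

from itertools import product

def intersect_dfas(dfas):
    alphabet = dfas[0][1]
    all_states = set(product(*(d[0] for d in dfas)))
    delta = {(combo, a): tuple(d[2][(q, a)] for d, q in zip(dfas, combo))
             for combo in all_states for a in alphabet}
    start = tuple(d[3] for d in dfas)
    finals = {c for c in all_states if all(q in d[4] for d, q in zip(dfas, c))}
    return all_states, alphabet, delta, start, finals

# Two 2-state DFAs over {'a'}: even number of a's, and odd number of a's.
d1 = ({0, 1}, {"a"}, {(0, "a"): 1, (1, "a"): 0}, 0, {0})
d2 = ({0, 1}, {"a"}, {(0, "a"): 1, (1, "a"): 0}, 0, {1})
states, _, _, _, finals = intersect_dfas([d1, d2])
# 4 product states; the final state (0, 1) is unreachable from (0, 0),
# so the intersection (even AND odd number of a's) is empty.
print(len(states), finals)
```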

Journal ArticleDOI
TL;DR: It is shown that log-bounded rudimentary reductions (defined and studied by Jones in 1975) characterize Dlogtime-uniform AC^0.


Book ChapterDOI
21 Oct 1991
TL;DR: In this paper, the authors describe techniques for specification of switching systems, including how to model the state space of a feature-rich switching system, how to use the schema calculus for organizing a complex operation set, and how to realize the potential benefits of a partial specification.
Abstract: This paper reports on results obtained by specifying the connection patterns within a small PBX using Z. We discuss techniques for specification of switching systems, including how to model the state space of a feature-rich switching system, how to use the schema calculus for organizing a complex operation set, and how to realize the potential benefits of a partial specification. We also outline a new approach to constructing a specification as a composition of partial specifications written in different formal languages.

Proceedings ArticleDOI
01 Sep 1991
TL;DR: Under assumptions about triple-exponential time, incoherent sets are constructed in NP; without any assumptions, incoherent sets are constructed in DSPACE(n^(log n)), yielding the first uncheckable and non-random-self-reducible sets in that space.
Abstract: Languages in NP are presented for which it is harder to prove membership interactively than it is to decide this membership. Similarly, languages where checking is harder than computing membership are presented. Under assumptions about triple-exponential time, incoherent sets in NP are constructed. Without any assumptions, incoherent sets are constructed in DSPACE(n^(log n)), yielding the first uncheckable and non-random-self-reducible sets in that space.

Journal ArticleDOI
TL;DR: It is argued that the method is simple (in particular, no powerdomains are needed), yet more expressive than existing methods (it is the first exact collecting interpretation for either nonstrict or higher order languages).
Abstract: A collecting interpretation of expressions is an interpretation of a program that allows one to answer questions of the sort: "What are all possible values to which an expression might evaluate during program execution?" Answering such questions in a denotational framework is akin to traditional data flow analysis and, when used in the context of abstract interpretation, allows one to infer properties that approximate the run-time behavior of expression evaluation. Exact collecting interpretations of expressions are developed for three abstract functional languages: a strict first-order language, a nonstrict first-order language, and a nonstrict higher order language (the full untyped lambda calculus with constants). It is argued that the method is simple (in particular, no powerdomains are needed), natural (it captures the intuitive operational behavior of a cache), yet more expressive than existing methods (it is the first exact collecting interpretation for either nonstrict or higher order languages). Correctness of the interpretations with respect to the standard semantics is shown via a generalization of the notion of strictness. It is further shown how to form abstractions of these exact interpretations, using as an example a collecting strictness analysis which yields compile-time information not previously captured by conventional strictness analyses.
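The "cache" intuition can be made concrete in a few lines: an evaluator for a tiny expression language that records every value each subexpression takes across runs. This miniature is our own, operational rather than denotational, and only gestures at the paper's construction.

```python
# A hedged sketch of a collecting interpretation as a cache: evaluate
# a tiny expression language and record all observed values per
# subexpression (expressions are hashable tuples).

cache = {}   # expression -> set of values it has evaluated to

def ev(e, env):
    if isinstance(e, int):
        v = e
    elif isinstance(e, str):                 # variable lookup
        v = env[e]
    else:
        op, l, r = e
        lv, rv = ev(l, env), ev(r, env)
        v = lv + rv if op == "+" else lv * rv
    cache.setdefault(e, set()).add(v)        # collect the observed value
    return v

expr = ("+", ("*", "x", 2), 1)               # x * 2 + 1
for x in (1, 3):
    ev(expr, {"x": x})
print(cache[("*", "x", 2)])                  # {2, 6}: all values it took
```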

Book ChapterDOI
04 Jun 1991
TL;DR: This paper first points out the basic concepts of computations in tree contexts and their dependencies; then more elaborate methods are presented for the systematic development of AG specifications.
Abstract: Attribute Grammars (AGs) are a formal and practical method for rule based specifications of computations on tree structures. A typical application area is the analysis and translation of formal languages. In this paper we first point out the basic concepts of computations in tree contexts and their dependencies. Then more elaborate methods are presented for the systematic development of AG specifications.
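A minimal sketch of the idea, in the spirit of Knuth's classic binary-number grammar (synthesized attributes only, evaluated bottom-up over the parse tree; the encoding is our own):

```python
# A hedged sketch of attribute evaluation: synthesized attributes
# 'val' and 'len' of a bit string, computed over its parse tree.
# A node is a bit '0'/'1' (leaf) or a pair (left_subtree, bit).

def attributes(node):
    if isinstance(node, str):            # B -> 0 | 1 : val = bit, len = 1
        return int(node), 1
    left, bit = node                     # L -> L B : val = 2*val(L) + val(B)
    lval, llen = attributes(left)
    bval, _ = attributes(bit)
    return 2 * lval + bval, llen + 1

tree = (("1", "0"), "1")                 # parse tree of the string "101"
print(attributes(tree))                  # (5, 3): value 5, length 3
```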