
Showing papers on "Formal language" published in 1988


Book
01 Jan 1988
TL;DR: This volume is written for undergraduate students who have taken a first course in Formal Language Theory and presents the basic concepts of structural complexity, thus providing the background necessary for the understanding of complexity theory.
Abstract: This is the first volume of a systematic two-volume presentation of the various areas of research on structural complexity. The theory of algorithmic complexity, a part of the mathematical theory of computation, can be approached from several points of view, one of which is the structural one. This volume is written for undergraduate students who have taken a first course in Formal Language Theory. It presents the basic concepts of structural complexity, thus providing the background necessary for the understanding of complexity theory. The second corrected edition has been extended by an appendix with recent results on nondeterministic space classes and updated with regard to the bibliographical remarks and the references.

811 citations


Book ChapterDOI
01 Jul 1988
TL;DR: Computational techniques have proved to be powerful tools for both experimental and theoretical investigations of the mind in the sciences, where computers and computational languages have improved the authors' ability to develop and test process theories of complex natural phenomena.
Abstract: A unique aspect of computers is that they not only represent process but also naturally keep track of the actions used to carry out a given task, so that the process with its trace can become an object of study in its own right. One effect of this can be seen vividly in the sciences, where computers and computational languages have improved our ability to develop and test process theories of complex natural phenomena. Before powerful computers became readily available as scientific tools, process models were expressed in mathematical languages, such as differential equations— languages primarily effective in capturing a static “snapshot” of a process. Computation provided formal languages that are more flexible than mathematics but just as precise. In part because computation is itself dynamic, it provides an ideal medium for representing and testing richer, more varied, and more detailed theories of process. The use of this medium for process modeling has radically changed the nature of many current theories in both the physical and social sciences. Particularly in the arena of the cognitive sciences, computational techniques have proved to be powerful tools for both experimental and theoretical investigations of the mind.

298 citations


Journal ArticleDOI
TL;DR: Three ways in which formal languages can be defined by Thue systems with the Church-Rosser property are studied, and some general results about the three families of languages so determined are established.
Abstract: Since about 1971, much research has been done on Thue systems that have properties that ensure viable and efficient computation. The strongest of these is the Church-Rosser property, which states that two equivalent strings can each be brought to a unique canonical form by a sequence of length-reducing rules. In this paper three ways in which formal languages can be defined by Thue systems with this property are studied, and some general results about the three families of languages so determined are established.
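The reduction process is easy to sketch. The following toy snippet (an illustration, not from the paper) applies length-reducing rules until no redex remains; for a Church-Rosser system such as {aa → ε, bb → ε} over {a, b}, the resulting canonical form is independent of the order in which redexes are contracted:

```python
def reduce_word(w, rules):
    """Apply length-reducing rules until the word is irreducible.
    For a Church-Rosser system the canonical form reached is unique,
    whatever the order in which redexes are contracted."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules:
            if lhs in w:
                w = w.replace(lhs, rhs, 1)  # contract one redex
                changed = True
                break
    return w

# A confluent, length-reducing Thue system over {a, b}.
RULES = [("aa", ""), ("bb", "")]
```

For example, "abba" reduces to the empty word while "abab" is already canonical; one of the ways to define a language from such a system is to collect all strings whose canonical form is a fixed word.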

149 citations


Journal Article
TL;DR: This work provides in effect a unitary class of data structures for the representation of syntactic categories in a range of diverse grammatical frameworks and examines the questions posed by set-valued features and sharing of values between distinct feature specifications, both of which fall outside the scope of the formal system developed in this paper.
Abstract: This paper outlines a simple and general notion of syntactic category on a metatheoretical level, independent of the notations and substantive claims of any particular grammatical framework. We define a class of formal objects called "category structures" where each such object provides a constructive definition for a space of syntactic categories. A unification operation and subsumption and identity relations are defined for arbitrary syntactic categories. In addition, a formal language for the statement of constraints on categories is provided. By combining a category structure with a set of constraints, we show that one can define the category systems of several well-known grammatical frameworks: phrase structure grammar, tagmemics, augmented phrase structure grammar, relational grammar, transformational grammar, generalized phrase structure grammar, systemic grammar, categorial grammar, and indexed grammar. The problem of checking a category for conformity to constraints is shown to be solvable in linear time. This work provides in effect a unitary class of data structures for the representation of syntactic categories in a range of diverse grammatical frameworks. Using such data structures should make it possible for various pseudo-issues in natural language processing research to be avoided. We conclude by examining the questions posed by set-valued features and sharing of values between distinct feature specifications, both of which fall outside the scope of the formal system developed in this paper.

81 citations


Proceedings ArticleDOI
18 Apr 1988
TL;DR: The author presents methods to map a formal trust specification onto mechanisms for its implementation in the distributed system, and shows how security specification and verification methods can be integrated into the presented theory of trust.
Abstract: The author introduces basic notions about developing a logic or a theory, and shows that modal logics of belief, with their Kripke-style possible-worlds semantics, are appropriate for basing a theory of trust on. He reviews a modal logic of belief, and constructs a model of the distributed system so that the logic is sound and complete with respect to the model. Any sentences in the logic may then be added to the logic as axioms, and these axiomatic sentences are considered as trust specifications. He presents methods to map a formal trust specification onto mechanisms for its implementation in the distributed system. Trust and security are closely related in distributed systems. It is shown how security specification and verification methods can be integrated into the presented theory of trust. The author analyzes the trusts required in public-key-based secure communication.

71 citations


Book ChapterDOI
11 Jul 1988
TL;DR: It is shown that the languages definable in this framework are precisely the regular languages whose syntactic monoids contain only solvable groups.
Abstract: We study an extension of first-order logic obtained by adjoining quantifiers that count with respect to an integer modulus. It is shown that the languages definable in this framework are precisely the regular languages whose syntactic monoids contain only solvable groups. We obtain an analogous result for regular ω-languages and establish some connections with complexity theory for fixed-depth families of circuits.
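A concrete illustration (mine, not the paper's): the language of strings over {a, b} containing an even number of a's is definable once a mod-2 counting quantifier is available, and its syntactic monoid is the solvable group Z/2Z. A minimal membership test:

```python
def even_as(w):
    """Membership in {w over {a,b} : |w|_a ≡ 0 (mod 2)} — a regular
    language whose syntactic monoid is the (solvable) group Z/2Z."""
    return w.count("a") % 2 == 0
```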

69 citations


Journal ArticleDOI
Howard Straubing
TL;DR: An effective criterion for determining whether a given language has dot-depth 2 is conjectured; the condition is shown to be necessary in general, and sufficient for languages over a two-letter alphabet.

61 citations


Proceedings ArticleDOI
01 Dec 1988
TL;DR: It is shown that the problem of learning a subfamily of regular languages can be reduced to the problem of learning its finite members, and this reduction shows that the family of κ-bounded regular languages is learnable in polynomial time.
Abstract: We study the problem of learning an unknown language given a teacher which can only answer equivalence queries. The teacher presenting a language L can test (in unit time) whether a conjectured language L ′ is equal to L and, if L ′ ≠ L , provide a counterexample (i.e., a string in the symmetric difference of L and L ′). It has recently been shown that the family of regular languages and the family of pattern languages are not learnable in polynomial time under this protocol. We consider the learnability of subfamilies of regular languages. It is shown that the problem of learning a subfamily of regular languages can be reduced to the problem of learning its finite members. Using this reduction, we show that the family of κ-bounded regular languages is learnable in polynomial time. We investigate how a partial ordering on counterexamples affects the learnability of the family of regular languages and the family of pattern languages. Two partial orderings are considered: ordering by length and lexicographical ordering. We show that the first ordering on counterexamples does not reduce the complexity of learning the family of regular languages. In contrast, the family of pattern languages is learnable in polynomial time if the teacher always provides counterexamples of minimal length and the family of regular languages is learnable in polynomial time if the teacher always provides the lexicographically first counterexamples.
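The finite case at the heart of the reduction can be sketched in a few lines (my illustration, with the teacher simulated by direct access to the target set): the learner conjectures a set, and each counterexample from the symmetric difference is toggled into or out of the hypothesis, so a finite target is identified after finitely many queries.

```python
def learn_finite(target):
    """Learn a finite language from equivalence queries alone.
    The (simulated) teacher checks a hypothesis and, if it is wrong,
    returns a counterexample from the symmetric difference."""
    hypothesis, queries = set(), 0
    while True:
        queries += 1
        diff = hypothesis ^ target          # equivalence query
        if not diff:
            return hypothesis, queries      # teacher answers "equal"
        counterexample = min(diff)          # teacher picks some witness
        hypothesis ^= {counterexample}      # add if missing, drop if spurious
```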

55 citations


Book ChapterDOI
01 Sep 1988
TL;DR: A discrete-event system is modelled as a controlled state-machine (generator) of a formal language and concepts of controllability, observability, and decentralized and hierarchical control architecture are defined and explored.
Abstract: A discrete-event system is modelled as a controlled state-machine (generator) of a formal language. Concepts of controllability, observability, and decentralized and hierarchical control architecture are defined and explored. A guide is provided to the software package TCT for controller synthesis of small systems.

49 citations


Journal ArticleDOI
01 Jun 1988
TL;DR: Strong grammatical abstraction tightens the correspondence so that top-down construction of incrementally-parsable internal representations is possible in any program that transforms a linguistic object from a representation in its concrete syntax to a representation in its abstract syntax or vice versa.
Abstract: Processors for programming languages and other formal languages typically use a concrete syntax to describe the user's view of a program and an abstract syntax to represent language structures internally. Grammatical abstraction is defined as a relationship between two context-free grammars. It formalizes the notion of one syntax being “more abstract” than another. Two variants of abstraction are presented. Weak grammatical abstraction supports (i) the construction during LR parsing of an internal representation that is closely related to the abstract syntax and (ii) incremental LR parsing using that internal representation as its base. Strong grammatical abstraction tightens the correspondence so that top-down construction of incrementally-parsable internal representations is possible. These results arise from an investigation into language-based editing systems, but apply to any program that transforms a linguistic object from a representation in its concrete syntax to a representation in its abstract syntax or vice versa.
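The concrete-to-abstract mapping can be pictured with a toy example (the node shapes below are my own, not the paper's notation): a concrete parse tree for "(1+2)" keeps every token, while the abstract tree keeps only the operator structure.

```python
def to_abstract(node):
    """Map a concrete parse tree to an abstract syntax tree by
    dropping bracketing tokens and keeping operator structure."""
    if isinstance(node, int):
        return node
    tag, children = node
    if tag == "paren":                      # "(" expr ")"  ->  expr
        return to_abstract(children[1])
    if tag == "add":                        # expr "+" term  ->  ("add", l, r)
        return ("add", to_abstract(children[0]), to_abstract(children[2]))
    return node

concrete = ("paren", ["(", ("add", [1, "+", 2]), ")"])
```

Here `to_abstract(concrete)` yields `("add", 1, 2)`, the abstract counterpart of the fully tokenized concrete tree.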

31 citations


Proceedings ArticleDOI
01 Jan 1988
TL;DR: A model of international electronic contracting is proposed whereby parties express contractual terms and conditions in a restricted form of their native language, which is parsed into a formalized language, called Candid, for transmission over the network.
Abstract: A model of international electronic contracting is proposed whereby parties express contractual terms and conditions in a restricted form of their native language. This is parsed into a formalized language, called Candid, for transmission over the network. At the receiving end, the Candid representation is rephrased into a similarly restricted form of the receiver's native language. An operating prototype is presented with examples in English, German, and Portuguese. Further extensions are discussed.

Journal ArticleDOI
TL;DR: It is shown that this problem remains undecidable even if the Thue systems under consideration contain only length-reducing rules plus a single length-preserving rule, which is a commutation rule of the form (ab, ba), where a and b are distinct letters.

Proceedings ArticleDOI
14 Jun 1988
TL;DR: It is shown that the two notions of uniformity are equivalent, leading to a natural notion of uniformity for low-level circuit complexity classes, and that recent results on the structure of NC^1 still hold true in this very uniform setting.
Abstract: The study of circuit complexity classes within NC^1 in a uniform setting requires a uniformity condition that is more restrictive than those in common use. Two such conditions, stricter than NC^1 uniformity, have appeared in recent research. It is shown that the two notions are equivalent, leading to a natural notion of uniformity for low-level circuit complexity classes, and that recent results on the structure of NC^1 still hold true in this very uniform setting. A parallel notion of uniformity, still more restrictive, that is based on the regular languages is investigated. Characterizations of subclasses of the regular languages based on their logical expressibility are given.

Book ChapterDOI
Lars Löfgren
01 Jan 1988
TL;DR: In this paper, the linguistic complementarity is taken as a basis for a general concept of language, permitting particularizations like programming languages, formal languages, genetic languages, and natural communication languages.
Abstract: Early cybernetics emphasized control and communication in the animal and the machine. Subsequent understandings of linguistic phenomena in the animal have shown them not to be reducible to purely mechanistic models. The linguistic complementarity, with its possibilities for transcendence, provides such an understanding, indicating relativistic approaches within modern systems theory. Comparisons are made with Bohr’s concept of complementarity for quantum physics, again an area where linguistic objectifications are developing. The linguistic complementarity is taken as a basis for a general concept of language, permitting particularizations like programming languages, formal languages, genetic languages, and natural communication languages.

Book ChapterDOI
01 Jan 1988
TL;DR: An overview of the modeling of discrete event systems using formal languages is presented and a standard coordination problem for a product system is shown to be of polynomial complexity.
Abstract: We present an overview of the modeling of discrete event systems using formal languages. Some new results on the controllability of sequential behaviors are presented and a standard coordination problem for a product system is shown to be of polynomial complexity.
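The "product system" mentioned here is built from the synchronous product of component generators. A minimal reachability-based sketch (my own, assuming full synchronization on shared events):

```python
from collections import deque

def sync_product(t1, s1, t2, s2):
    """Reachable synchronous product of two deterministic generators,
    each given as {state: {event: next_state}}. An event occurs in the
    product only when both components can execute it."""
    trans, seen = {}, {(s1, s2)}
    queue = deque([(s1, s2)])
    while queue:
        p, q = queue.popleft()
        # events enabled in both components at the current joint state
        for e in set(t1.get(p, {})) & set(t2.get(q, {})):
            nxt = (t1[p][e], t2[q][e])
            trans[((p, q), e)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return trans
```

The generated language of the product is the intersection of the component languages, which is why coordination questions can be phrased over the product system.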

Book
01 Sep 1988
TL;DR: A global introduction to language technology and the areas of computer science where language technology plays a role, and the social forces which influenced the development of the various topics.
Abstract: A global introduction to language technology and the areas of computer science where language technology plays a role. Surveyed in this volume are issues related to the parsing problem in the fields of natural languages, programming languages, and formal languages. Throughout the book attention is paid to the social forces which influenced the development of the various topics. Also illustrated are the developments of the theory of language analysis, its role in compiler construction, and its role in computer applications with a natural language interface between man and machine. Parts of the material in this book have been used in courses on computational linguistics, computers and society, and formal approaches to languages.

11 Jul 1988
TL;DR: A general specification language, Z, based on set theory and developed at Oxford University is presented and some conclusions are drawn about the advantages and disadvantages of using a formal approach.
Abstract: A general specification language, Z, based on set theory and developed at Oxford University is presented. A major advantage of a formal notation is that it is precise and unambiguous, and thus the formal notation always provides the definitive description in the case of any misunderstanding. A number of examples are discussed, including network services, window systems, and microprocessor instruction sets. This paper is split into two main parts. The first half deals with the nature of formal specification and why it should be used. Additionally, a brief introduction to Z and how it is used is also presented in general terms, without covering the notation itself. The second half of the paper deals with the experience gained using Z for the design and documentation of network services and during some case studies of existing systems. Finally some conclusions are drawn about the advantages and disadvantages of using a formal approach.

Book
01 Jan 1988
TL;DR: This volume covers the elements of language theory, algorithms on graphs, regular and context-free languages, and parsing, including the construction of strong LL(k) parsers and applications to lexical analysis.
Abstract: 1. Elements of Language Theory: 1.1 Mathematical Preliminaries. 1.2 Languages. 1.3 Random Access Machines. 1.4 Decision Problems. 1.5 Computational Complexity. 1.6 Rewriting Systems. Exercises. Bibliographic Notes. 2. Algorithms on Graphs: 2.1 Basic Algorithms. 2.2 Finding Strongly Connected Components. 2.3 Computing Functions Defined on Graphs. 2.4 Computing Relational Expressions. Exercises. Bibliographic Notes. 3. Regular Languages: 3.1 Regular Expressions. 3.2 Finite Automata. 3.3 Regular Grammars. 3.4 Deterministic Finite Automata. 3.5 Decision Problems on Regular Languages. 3.6 Applications to Lexical Analysis. Exercises. Bibliographic Notes. 4. Context-free Languages: 4.1 Context-free Grammars. 4.2 Leftmost and Rightmost Derivations. 4.3 Ambiguity of Grammars. 4.4 Useless and Nullable Symbols. 4.5 Canonical Two-form Grammars. 4.6 Derivational Complexity. 4.7 Context-free Language Recognition. Exercises. Bibliographic Notes. 5. Parsing: 5.1 Pushdown Automata. 5.2 Left Parsers and Right Parsers. 5.3 Strong LL(k) Parsing. 5.4 Strong LL(k) Grammars. 5.5 Construction of Strong LL(1) Parsers. 5.6 Implementation of Strong LL(1) Parsers. 5.7 Simple Precedence Parsing. Exercises. Bibliographic Notes. Bibliography to Volume I. Index to Volume I.

Book ChapterDOI
21 Mar 1988
TL;DR: The PSG programming system generator developed at the Technical University of Darmstadt as discussed by the authors produces interactive, language-specific programming environments from formal language definitions, including a full-screen editor, which allows both structure and text editing.
Abstract: The PSG programming system generator developed at the Technical University of Darmstadt produces interactive, language-specific programming environments from formal language definitions. All language-dependent parts of the environment are generated from an entirely nonprocedural specification of the language's syntax, context conditions, and dynamic semantics. The generated environment consists of a language-based editor, supporting systematic program development by named program fragments, an interpreter, and a fragment library system. The major component of the environment is a full-screen editor, which allows both structure and text editing. In structure mode the editor guarantees prevention of both syntactic and semantic errors, whereas in textual mode it guarantees their immediate recognition. PSG editors employ a novel algorithm for incremental semantic analysis which is based on unification. The algorithm will immediately detect semantic errors even in incomplete program fragments. The dynamic semantics of the language are defined in denotational style using a functional language based on the lambda calculus. Program fragments are compiled to terms of the functional language which are executed by an interpreter. The PSG generator has been used to produce environments for Pascal, ALGOL 60, MODULA-2, and the formal language definition language itself.

Journal ArticleDOI
Jarkko Kari
TL;DR: A cryptanalytic method applicable to some public-key cryptosystems based on the theory of formal languages is discussed, in particular systems based on iterated morphisms and repeated finite substitutions [2,3].

Journal ArticleDOI
TL;DR: It is shown that every context-sensitive language can be regarded as a class of finite structures with good model-theoretic properties.
Abstract: It is shown that every context-sensitive language can be regarded as a class of finite structures with good model-theoretic properties. Free local languages are defined, and local sentences that satisfy stretching theorems independent of strong axioms are studied.

Proceedings ArticleDOI
05 Jul 1988
TL;DR: A precise notion of a formal framework meant to capture the intuition of an open-ended range of deductive interpreted languages is proposed, and a particular framework called the logical theory of constructions (LTC) is developed as an example.
Abstract: A precise notion of a formal framework, meant to capture the intuition of an open-ended range of deductive interpreted languages, is proposed. A particular framework called the logical theory of constructions (LTC) is developed as an example. A series of languages in the LTC framework is defined, demonstrating how a language can be thought of as gradually evolving.

Proceedings ArticleDOI
11 Apr 1988
TL;DR: The author deals with the properties of a tool called G^2F, an editor generator for two-dimensional graphical formulas that facilitates syntactic-correctness-preserving operations on the abstract syntax trees of formulas and produces hardcopies of whole operation sequences on a laser printer.
Abstract: The author deals with the properties of a tool called G^2F (an editor generator for two-dimensional graphical formulas). G^2F makes it possible to define two-dimensional grammars graphically and to generate a corresponding syntax-directed editor. It facilitates syntactic-correctness-preserving operations on the abstract syntax trees of formulas and produces hardcopies of whole operation sequences on a laser printer. Thus, G^2F can be used to create user interfaces for a variety of applications. It is well suited to support a clear and surveyable representation of complex expressions which occur in every formal framework and to invoke procedures of an application transforming its abstract syntax.

Journal ArticleDOI
TL;DR: The following vehicles for communication (forms of specification language) are discussed: graphic diagrams, tables, natural language and formal languages, and the various forms of expression are compared, and weaknesses and strengths are highlighted.

Book ChapterDOI
04 Sep 1988

Proceedings ArticleDOI
22 Aug 1988
TL;DR: This paper describes Word Manager, a system which supports the definition, access and maintenance of lexical databases and comprises a formal language for the implementation of morphological knowledge which is integrated in a graphics-oriented, high-level user interface and is language independent.
Abstract: This paper describes Word Manager, a system which is currently the object of a research project at the University of Zurich Computer Science Department. Word Manager supports the definition, access and maintenance of lexical databases. It comprises a formal language for the implementation of morphological knowledge. This formal language is integrated in a graphics-oriented, high-level user interface and is language independent. The project is now in the prototyping phase where parts of the software are pretty far advanced (the user interface) and others are still rudimentary (the rule compiler/runtime system). The design of the system was strongly influenced by Koskenniemi's two-level model /Koskenniemi 1983/, its successors /Bear 1986/, /Black 1986/, /Borin 1986/, /Darymple 1987/, the ANSI-SPARC 3-Schema Concept /ANSI-X3-SPARC 1975/ and visual programming techniques /Bocker 1986/, /Myers 1986/. We will focus the discussion on one aspect: the user interfacing for the construction of the lexical database.

Journal ArticleDOI
TL;DR: Embodiments of key theoretical concepts such as directed graphs, automata, and cellular automata have many potential uses, including simplifying the development of simulation models; permitting greater use of formal languages as modelling tools; and stimulating the development in particular areas of application.

Book ChapterDOI
01 Jan 1988
TL;DR: This chapter shall discuss the elements of discrete mathematics and formal language theory, emphasizing those issues that are of importance from the point of view of context-free parsing, and devote a considerable part of this chapter to matters such as random access machines and computational complexity.
Abstract: In this chapter we shall review the mathematical and computer science background on which the presentation in this book is based. We shall discuss the elements of discrete mathematics and formal language theory, emphasizing those issues that are of importance from the point of view of context-free parsing. We shall devote a considerable part of this chapter to matters such as random access machines and computational complexity. These will be relevant later when we derive efficient algorithms for parsing theoretic problems or prove lower bounds for the complexity of these problems. In this chapter we shall also discuss a general class of formal language descriptors called “rewriting systems” or “semi-Thue systems”. Later in the book we shall consider various language descriptors and language recognizers as special cases of a general rewriting system. As this approach is somewhat unconventional, we advise even the experienced reader to go through the definitions given in this chapter if he or she wishes to appreciate fully the presentation in this book.

Proceedings ArticleDOI
01 Feb 1988
TL;DR: The course is best positioned within the curriculum at the Junior level, recognizing that Junior level students are rarely mathematically sophisticated, and the treatment is not as rigorous as that of a more advanced course on the theory of computation.
Abstract: Theory of computation courses have traditionally been taught at the advanced-undergraduate/graduate level, primarily due to the level of mathematical rigor associated with the topics involved. The topics covered include automata theory, formal languages, computability, uncomputability, and computational complexity. If the essentials of these topics are introduced earlier in the undergraduate computer science curriculum, students gain deeper insights and better comprehend the underlying computational issues associated with the material covered in subsequent computer science courses. Such a course is required of all computer science majors at the University of North Florida. Experience has demonstrated that a minimum background for the course includes Freshman-Sophomore mathematics (presently calculus) and a typical introduction to computer science. Thus the course is best positioned within the curriculum at the Junior level. Recognizing that Junior level students are rarely mathematically sophisticated, the treatment is not as rigorous as that of a more advanced course on the theory of computation. To reinforce the “theory” covered in class, an integral portion of the course is devoted to “hands-on” exercises using simulation tools designed for construction of a variety of automata. The exercises generally require the construction of automata of various forms, with observation of their step by step operation. Further exercises illustrate the connections between various automata and areas such as hardware design and compiler construction. The paper describes the course and the nature of the simulation tools used in the “hands-on” component of the course.
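The step-by-step observation the exercises call for is easy to mimic in a few lines (an illustrative sketch, not the course's actual simulation tool): run a DFA over an input and record the state after each symbol.

```python
def run_dfa(delta, start, accepting, word):
    """Step a DFA through `word`, recording the state after each symbol."""
    state, trace = start, [start]
    for ch in word:
        state = delta[(state, ch)]
        trace.append(state)
    return state in accepting, trace

# DFA accepting strings over {a, b} with an even number of a's.
DELTA = {(0, "a"): 1, (1, "a"): 0, (0, "b"): 0, (1, "b"): 1}
```

Printing the trace alongside the input gives exactly the kind of step-by-step run the exercises ask students to observe.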

Journal ArticleDOI
TL;DR: The formal semantics of a novel language, called EqL, are presented for first-order functional and Horn logic programming, and the operational semantics for solving equations is an extension of reduction, called object refinement.
Abstract: The formal semantics of a novel language, called EqL, are presented for first-order functional and Horn logic programming. An EqL program is a set of conditional pattern-directed rules, where the conditions are expressed as a conjunction of equations. The programming paradigm provided by this language may be called equational programming. The declarative semantics of equations is given in terms of their complete set of solutions, and the operational semantics for solving equations is an extension of reduction, called object refinement. The correctness of the operational semantics is established through the soundness and completeness theorems. Examples are given to illustrate the language and its semantics.