Topic
Formal grammar
About: Formal grammar is a research topic. Over its lifetime, 1,262 publications have appeared within this topic, receiving 28,796 citations.
Papers published on a yearly basis
Papers
01 Jul 1978
TL;DR: This volume is intended to serve as a text for upper-undergraduate and graduate-level students. Special emphasis is given to the role of algebraic techniques in formal language theory, through a chapter devoted to the fixed-point approach to the analysis of context-free languages.
Abstract: From the Publisher:
Formal language theory was first developed in the mid-1950s in an attempt to develop theories of natural language acquisition. It was soon realized that this theory (particularly the context-free portion) was quite relevant to the artificial languages that had originated in computer science. Since those days, the theory of formal languages has been developed extensively, and has several discernible trends, which include applications to the syntactic analysis of programming languages, program schemes, models of biological systems, and relationships with natural languages.
1,415 citations
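The "fixed point approach" named in the TL;DR can be made concrete: several context-free analyses (productivity, nullability, FIRST sets) are least fixed points of a monotone step function over sets of nonterminals. A minimal sketch, using a hypothetical grammar (uppercase symbols are nonterminals, lowercase are terminals):

```python
# Compute the set of "productive" nonterminals (those deriving at least one
# terminal string) as the least fixed point of one monotone step.
# The grammar below is hypothetical, chosen to include an unproductive symbol.
RULES = {
    'S': [['A', 'b'], ['B']],
    'A': [['a']],
    'B': [['B', 'b']],      # B can never bottom out in terminals: unproductive
}

def productive(rules):
    """Iterate the step function until nothing new is added (fixed point)."""
    prod = set()
    changed = True
    while changed:
        changed = False
        for nt, alternatives in rules.items():
            # nt becomes productive if some alternative uses only terminals
            # (lowercase) and already-productive nonterminals
            if nt not in prod and any(
                    all(sym.islower() or sym in prod for sym in alt)
                    for alt in alternatives):
                prod.add(nt)
                changed = True
    return prod
```

Here `productive(RULES)` returns `{'A', 'S'}`: `A` is added in the first pass, `S` in the second via `A b`, and `B` never qualifies.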
01 Jan 1990
TL;DR: The chapter describes the framework of linear temporal logic, which has been widely employed in the specification and verification of programs and explains how temporal logic structures can be used to model concurrent programs using non-determinism and fairness.
Abstract: Publisher Summary This chapter discusses temporal and modal logic. It presents a multiaxis classification of systems of temporal logic and describes the framework of linear temporal logic, which, in both its propositional and first-order forms, has been widely employed in the specification and verification of programs. It also covers the competing framework of branching temporal logic, which has likewise seen wide use, and explains how temporal logic structures can be used to model concurrent programs using non-determinism and fairness. Other modal and temporal logics in computer science are discussed as well. The chapter gives the formal syntax and semantics of Propositional Linear Temporal Logic (PLTL) and of two representative systems of propositional branching-time temporal logics.
966 citations
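The PLTL semantics the chapter formalizes can be sketched in a few lines. Note the simplification: PLTL is defined over infinite state sequences, while this sketch evaluates formulas over a finite trace for illustration. A trace is a list of sets of atomic propositions; formulas are nested tuples:

```python
# Hedged sketch of PLTL semantics over a finite trace (the standard
# definition uses infinite sequences). Formula encoding (an assumption of
# this sketch): ('atom', p), ('not', f), ('and', f, g), ('X', f) "next",
# ('F', f) "eventually", ('G', f) "always", ('U', f, g) "until".

def holds(formula, trace, i=0):
    """Does formula hold at position i of the trace?"""
    op = formula[0]
    if op == 'atom':
        return formula[1] in trace[i]
    if op == 'not':
        return not holds(formula[1], trace, i)
    if op == 'and':
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == 'X':      # next: the subformula holds one step later
        return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
    if op == 'F':      # eventually: holds at some position >= i
        return any(holds(formula[1], trace, k) for k in range(i, len(trace)))
    if op == 'G':      # always: holds at every position >= i
        return all(holds(formula[1], trace, k) for k in range(i, len(trace)))
    if op == 'U':      # f U g: g occurs at some k, f holds from i up to k
        return any(holds(formula[2], trace, k)
                   and all(holds(formula[1], trace, j) for j in range(i, k))
                   for k in range(i, len(trace)))
    raise ValueError(f"unknown operator: {op}")
```

For example, on the trace `[{'req'}, set(), {'grant'}]`, the formula `('F', ('atom', 'grant'))` holds at position 0, while `('G', ('atom', 'req'))` does not.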
TL;DR: This chapter discusses several classes of sentence-generating devices that are closely related, in various ways, to the grammars of both natural languages and artificial languages of various kinds.
Abstract: Publisher Summary This chapter discusses several classes of sentence-generating devices that are closely related, in various ways, to the grammars of both natural languages and artificial languages of various kinds. By a language is meant simply a set of strings over some finite set V of symbols, called the vocabulary of the language; by a grammar, a set of rules that gives a recursive enumeration of the strings belonging to the language. The grammar is said to generate these strings. The chapter discusses one aspect of the structural description of a sentence, namely, its subdivision into phrases belonging to various categories. A major concern of the general theory of natural languages is to define the class of possible strings, the class of possible grammars, the class of possible structural descriptions, and a procedure for assigning structural descriptions to sentences given a grammar, and to do all of this in such a way that the structural description assigned to a sentence by the grammar of a natural language will provide the basis for explaining how a speaker of this language would understand this sentence.
819 citations
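The abstract's definition of a grammar as "a set of rules that gives a recursive enumeration of the strings belonging to the language" can be sketched directly: repeatedly rewrite nonterminals until only vocabulary symbols remain. The toy grammar below is hypothetical (S → aSb | ab, generating aⁿbⁿ for n ≥ 1):

```python
from collections import deque

# Enumerate, breadth-first, the terminal strings a toy context-free grammar
# generates, up to a length bound. Uppercase = nonterminal, lowercase =
# vocabulary symbol. The rules here are a hypothetical example.
RULES = {'S': ['aSb', 'ab']}   # S -> a S b | a b

def enumerate_strings(start='S', max_len=6):
    """Recursive enumeration of the language: rewrite leftmost nonterminal."""
    seen, generated = set(), []
    queue = deque([start])
    while queue:
        form = queue.popleft()
        if len(form) > max_len or form in seen:
            continue
        seen.add(form)
        nonterminals = [i for i, ch in enumerate(form) if ch in RULES]
        if not nonterminals:            # all vocabulary symbols: a string
            generated.append(form)
            continue
        i = nonterminals[0]             # rewrite the leftmost nonterminal
        for rhs in RULES[form[i]]:
            queue.append(form[:i] + rhs + form[i + 1:])
    return sorted(generated, key=len)
```

Calling `enumerate_strings()` yields `['ab', 'aabb', 'aaabbb']`: exactly the strings aⁿbⁿ of length at most 6, illustrating that the grammar generates its language.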
TL;DR: Four experiments used the head-turn preference procedure to assess whether infants could extract and remember information from auditory strings produced by a miniature artificial grammar. Infants generalized to new structure, discriminating new grammatical strings from ungrammatical ones after less than 2 min of exposure to the grammar.
603 citations
TL;DR: This paper examined two possible bases for grammatical judgments following syntactical learning: implicit representations of a formal grammar, as in Reber's (1976) hypothesis of implicit learning, and conscious rules within informal grammars.
Abstract: This study examined two possible bases for grammatical judgments following syntactical learning: unconscious representations of a formal grammar, as in Reber's (1976) hypothesis of implicit learning, and conscious rules within informal grammars. Experimental subjects inspected strings generated by a finite-state grammar, viewed either one at a time or all at once, with implicit or explicit learning instructions. In a transfer test, experimental and control subjects judged the grammaticality of grammatical and nongrammatical strings, reporting on every trial the bases for their judgments. In replication of others' results, experimental subjects met the critical test for grammatical abstraction: significantly correct classification of novel strings. We found, however, that reported rules predicted those grammatical judgments without significant residual. Subjects evidently acquired correlated grammars, personal sets of conscious rules, each of limited scope and many of imperfect validity. Those rules themselves were shown to embody abstractions, consciously represented novelty that could account for
513 citations
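The "strings generated by a finite-state grammar" in studies like this one come from a small labeled automaton: grammatical strings are exactly the label sequences along paths from the start state to an accepting state. A minimal sketch, with a hypothetical transition table (not Reber's actual grammar):

```python
import random

# A finite-state grammar as used in artificial-grammar-learning experiments.
# States map to (letter, next-state) edges; a state with no edges accepts.
# This particular table is hypothetical.
GRAMMAR = {
    0: [('T', 1), ('P', 2)],
    1: [('S', 1), ('X', 2)],   # self-loop on S permits strings of any length
    2: [('V', 3)],
    3: [],                     # accepting state
}

def generate(rng):
    """Random walk from the start state, emitting edge labels."""
    state, letters = 0, []
    while GRAMMAR[state]:
        letter, state = rng.choice(GRAMMAR[state])
        letters.append(letter)
    return ''.join(letters)

def is_grammatical(s, state=0):
    """Can the automaton consume s exactly and end in an accepting state?"""
    if not GRAMMAR[state]:
        return s == ''
    return any(s.startswith(letter) and is_grammatical(s[1:], nxt)
               for letter, nxt in GRAMMAR[state])
```

Every string produced by `generate` is grammatical by construction (e.g. `'TSXV'`), while strings such as `'TV'` are rejected; the transfer test in the study above hinges on this grammatical/nongrammatical distinction for novel strings.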