
Showing papers on "Formal grammar published in 1992"


Journal ArticleDOI
TL;DR: Syntax: A Functional-Typological Introduction II by Talmy Givon is reviewed; it is the second volume of a two-volume syntactic survey of language from a functional perspective.
Abstract: Reviews Syntax: A Functional-Typological Introduction II by Talmy Givon. Philadelphia: John Benjamins, 1990. Reviewed by Howard Williams, University of California, Los Angeles. Syntax: Volume II is the second book in Givon's two-volume syntactic survey of language from a functional perspective. (For a review of the first volume, see Heath, 1986.) As a functionally oriented grammarian, Givon concerns himself not with formal syntax but with the systematic uses to which constructions are put. Syntax is for him functional in a strong sense: the form of language is claimed to be a direct reflection of users' communicative needs at all levels of analysis. While the heavily English-oriented second volume may be read independently of the first, some grounding is in order. For Givon, the levels of analysis appropriate to syntax are the discourse-pragmatic, the propositional-semantic, the lexical-semantic, and the phrasal-semantic; the four have individual requirements which occasionally conflict. To understand syntax is to understand these levels and the conflicts among them. Knowledge of diachronic change is also essential to a proper understanding of structure. Chapter 12, the opening chapter, deals with the coherence of noun phrases (NPs). The order of pre- and post-nominal modifiers is held to be determined on a scale of relevance, as in Bybee (1985); there is a partial parallel to the placement of complements and adjuncts in formal approaches. Elements of NPs tend to be contiguous rather than scattered through a clause for iconic reasons, to preserve functional unity. Conjunction of NPs is limited to NPs of equal thematic status with similar case roles. Separate events tend to be encoded by separate clauses (p. 488); a fairly detailed section illustrates the pragmatic-cognitive difficulties of this phenomenon.
In a section on nominalization of clauses, a scalar order of nominal-like phrases is presented, with that-clauses at the bottom and "the enemy's destruction of the city"-type nominals at the top. Exactly what this would mean in syntactic terms (e.g., the inability of infinitives to serve well as the subjects of yes-no questions) is not addressed. Chapter 13 deals with verbal complementation, investigating the semantic nature of the relationship between main and embedded

104 citations


Journal ArticleDOI
TL;DR: This paper gives a progression of automata and shows that it corresponds exactly to the language hierarchy defined by control grammars, whose first member is the family of context-free languages.

63 citations



Journal ArticleDOI
TL;DR: Dynamic parsers and growing grammars allow a syntactic-only parsing of programs written in powerful, problem-adaptable programming languages, and easily perform purely syntactic strong type checking and operator overloading.
Abstract: We define "evolving grammars" as successions of static grammars, and "dynamic parsers" as parsers able to follow the evolution of a grammar during source-program parsing. A growing context-free grammar progressively incorporates production rules specific to the source program under parsing and evolves, following the context created by the source program itself, toward a program-specific context-free grammar. Dynamic parsers and growing grammars allow a syntactic-only parsing of programs written in powerful, problem-adaptable programming languages. Moreover, dynamic parsers easily perform purely syntactic strong type checking and operator overloading. The language used to specify grammar evolution and residual semantic actions can be the evolving language itself. The user can introduce new syntactic operators using a bootstrap procedure supported by the previously defined syntax. A dynamic parser ("ZzParser") has been developed and successfully employed by the APE 100 INFN group to develop a programming language ("ApeseLanguage") and other system software tools for the 100-GigaFlops SIMD parallel machine under development.

31 citations
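The grammar-growing mechanism above can be illustrated with a minimal sketch: a parser whose set of valid operands starts empty and grows each time the source program itself declares one, so a later statement parses only because of an earlier declaration. The `def`-statement syntax and the `EvolvingParser` class are illustrative assumptions, not the ZzParser API.

```python
# Minimal "evolving grammar" sketch: the parser starts with a small static
# grammar and adds a production (a new valid operand) whenever the source
# declares one. Hypothetical toy syntax, not the paper's ZzParser.

class EvolvingParser:
    def __init__(self):
        self.operands = set()   # terminals added by declarations so far

    def parse(self, stmts):
        """Parse a sequence of ('def', NAME) and (NAME, '+', NAME) statements."""
        results = []
        for stmt in stmts:
            if stmt[0] == "def":                      # grammar-growing rule
                self.operands.add(stmt[1])            # operand -> stmt[1]
                results.append(("declared", stmt[1]))
            else:                                     # expr -> operand '+' operand
                a, op, b = stmt
                if op != "+" or not {a, b} <= self.operands:
                    raise SyntaxError(f"unknown operand in {stmt}")
                results.append(("sum", a, b))
        return results

p = EvolvingParser()
prog = [("def", "x"), ("def", "y"), ("x", "+", "y")]
print(p.parse(prog))   # last statement parses only because of the two defs
```

The same program with the declarations removed raises a SyntaxError, mirroring how the evolved grammar, not a fixed one, licenses the final statement.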


Journal ArticleDOI
01 Jul 1992
TL;DR: An overview of the theory of Functional Grammar is given and its value for knowledge representation and information systems is assessed by comparing its formalism with some influential theories of knowledge representation and lexical semantics.
Abstract: Under a linguistic perspective on information systems the primary goal of such systems is to support communication and coordination in organizational contexts. The formalized communication is supposed to be grounded in an informal natural language discourse. This suggests a convergence of natural language semantics and knowledge representation. We give an overview of the theory of Functional Grammar and assess its value for knowledge representation and information systems by comparing its formalism with some influential theories of knowledge representation and lexical semantics.

12 citations


Proceedings ArticleDOI
23 Mar 1992
TL;DR: The authors present PARSEC-a system for generating connectionist parsing networks from example parses that learn to parse, generalize well compared to hand-coded grammars, and tolerate several types of noise.
Abstract: The authors present PARSEC-a system for generating connectionist parsing networks from example parses. PARSEC is not based on formal grammar systems and has been geared towards spoken language tasks. PARSEC networks exhibit three strengths important for application to speech processing: they learn to parse, and generalize well compared to hand-coded grammars; they tolerate several types of noise; and they can learn to use multimodal input. The authors also present the PARSEC architecture, its training algorithms, and performance analyses along several dimensions that demonstrate PARSEC's features. They compare PARSEC's performance to that of traditional grammar-based parsing systems.

12 citations


Journal ArticleDOI
TL;DR: The results showed that these second language writers engaged in restating content, constructing meaning, and higher- and lower-level planning as they completed sentence-combining tasks, and between-task comparisons indicated that open sentence-combining tasks required significantly more higher-level planning than controlled sentence-combining tasks.

12 citations


Proceedings Article
30 Nov 1992
TL;DR: Here it is shown that formal languages too can be specified by Harmonic Grammars, rather than by conventional serial rewrite rule systems.
Abstract: Basic connectionist principles imply that grammars should take the form of systems of parallel soft constraints defining an optimization problem the solutions to which are the well-formed structures in the language. Such Harmonic Grammars have been successfully applied to a number of problems in the theory of natural languages. Here it is shown that formal languages too can be specified by Harmonic Grammars, rather than by conventional serial rewrite rule systems.

9 citations
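The soft-constraint idea above can be made concrete with a toy Harmonic Grammar: each constraint assigns a weighted penalty, harmony is the negated weighted sum, and the "language" at each string length is the set of harmony maximizers. The two constraints and their weights below are illustrative assumptions, not the paper's grammar; under them the a^n b^n pattern emerges purely from optimization.

```python
from itertools import product

# Toy Harmonic Grammar: parallel soft constraints define an optimization
# problem; well-formed strings are the harmony maximizers. The constraints
# and weights are assumed for illustration only.

CONSTRAINTS = [
    (2.0, lambda s: s.count("ba")),                     # *ba: penalize b before a
    (1.0, lambda s: abs(s.count("a") - s.count("b"))),  # balance a's and b's
]

def harmony(s):
    return -sum(w * c(s) for w, c in CONSTRAINTS)

def optimal(length):
    """All strings over {a, b} of the given length with maximal harmony."""
    strings = ["".join(p) for p in product("ab", repeat=length)]
    best = max(harmony(s) for s in strings)
    return {s for s in strings if harmony(s) == best}

print(optimal(4))   # -> {'aabb'}: only a^2 b^2 violates no constraint
```

Brute-force enumeration stands in for the connectionist optimization: at length 4, only "aabb" avoids both the "ba" bigram penalty and the imbalance penalty.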


Journal ArticleDOI
TL;DR: The paper presents an overview of the formalism, followed by examples of its usage in linguistic applications, and a critical evaluation is presented, in which the authors discuss the shortcomings of the system and present the directions that are being taken to achieve a more realistic, and more simplified, model of machine translation.
Abstract: A description of the CAT2 machine translation system is presented as an example of a model which stresses simplicity over complexity. The design of the formalism encompasses a minimum of formal constructs, yet is powerful enough to describe complex linguistic and translational phenomena. The paper presents an overview of the formalism, followed by examples of its usage in linguistic applications. A critical evaluation is presented, in which the authors discuss the shortcomings of the system and present the directions that are being taken to achieve a more realistic, and more simplified, model of machine translation.

7 citations



Proceedings ArticleDOI
30 Aug 1992
TL;DR: In this paper, for several formal languages of various types, equivalent transformation systems are presented; these examples suggest that transformation systems give shorter and more informative structural class descriptions than formal grammars.
Abstract: The concept of the transformation system was introduced earlier by the author as a basic part of a general model for pattern learning. In this paper, for several formal languages (of various types) the equivalent transformation systems are presented. From these examples one can draw the conclusion that the transformation systems give shorter and more informative structural class descriptions than the formal grammars.

Journal ArticleDOI
TL;DR: Different notions of acceptance of a sentential form as a solution in cooperating/distributed grammar systems are introduced and statements on the hierarchy according to the number of grammars in the systems with different acceptance styles are presented.
Abstract: Cooperating/distributed grammar systems are a formal model of blackboard architectures for problem solving. We introduce different notions of acceptance of a sentential form as a solution. We compare them with respect to the generative power and present statements on the hierarchy according to the number of grammars in the systems with different acceptance styles.


Book ChapterDOI
07 Sep 1992
TL;DR: The concepts and techniques used for two of Natural Language Processing's main tasks, viz. syntax analysis and semantic analysis, are illustrated with concrete examples.
Abstract: Natural Language Processing, one of the most important branches of Artificial Intelligence, has close links with Logic Programming. It is one of the two roots from which Logic Programming developed, and still shares important core concepts with it (e.g. unification). The concepts and techniques used for two of its main tasks, viz. syntax analysis and semantic analysis, are illustrated with concrete examples. For syntax analysis, it is the notations of Definite Clause Grammars and Extraposition Grammars, for semantic analysis, the concepts of Montague Grammar and Generalized Quantifier Theory. Some of these techniques can also be applied to the analysis of formal languages.
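The DCG technique mentioned above treats each nonterminal as a predicate over a token stream and its remainder (Prolog's difference lists). A minimal Python analogue can mimic this: each rule maps a token list to the list of possible remainders, and a sentence is accepted when the empty remainder is reachable. The grammar below is an assumed toy fragment, not one from the paper.

```python
# Python sketch of DCG-style parsing: each nonterminal is a function from a
# token list to the list of possible remainders (difference-list style).
# The grammar is an illustrative toy fragment.

def terminal(word):
    def rule(tokens):
        return [tokens[1:]] if tokens[:1] == [word] else []
    return rule

def seq(*rules):                      # rule1, rule2 in a DCG body
    def rule(tokens):
        states = [tokens]
        for r in rules:
            states = [rest for s in states for rest in r(s)]
        return states
    return rule

def alt(*rules):                      # alternative clauses for a nonterminal
    return lambda tokens: [rest for r in rules for rest in r(tokens)]

# s --> np, vp.   np --> [the], noun.   noun --> [cat] ; [dog].   vp --> [sleeps].
noun = alt(terminal("cat"), terminal("dog"))
np = seq(terminal("the"), noun)
vp = terminal("sleeps")
s = seq(np, vp)

def accepts(sentence):
    return [] in s(sentence.split())

print(accepts("the cat sleeps"))   # True
print(accepts("cat the sleeps"))   # False
```

In Prolog the same grammar would be four DCG clauses; the list-of-remainders encoding here plays the role of Prolog's backtracking over difference lists.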

Proceedings ArticleDOI
16 Nov 1992
TL;DR: A survey of existing widely accepted FDTs is presented and a selection is made for an appropriate FDT to be used as the basis for future research work on FDT-based protocol converter.
Abstract: Computer networks are large and complex systems. The use of natural language to design and analyse communications protocols leads to ambiguous representations. This inadequacy of natural language has motivated much recent research on formal description techniques (FDTs). FDTs are methods to define the behaviour of an information processing system in a language with formal syntax and semantics. FDT has been chosen as a basis for the design of a protocol converter. A survey of existing widely accepted FDTs is presented. From this survey, a selection is made of an appropriate FDT to be used as the basis for future research work on FDT-based protocol converters.

Journal Article
TL;DR: The problem proposed by Gauss of characterizing the code of a simple crossing closed curve (SCCC) can be considered a formal language question and three related infinite languages are defined.
Abstract: The problem proposed by Gauss of characterizing the code of a simple crossing closed curve (SCCC, for short) can be considered a formal language question. We define three related infinite languages. Two of them are regular; the type of the third is an open problem.
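Treating Gauss codes as strings, as the abstract suggests, makes simple necessary conditions checkable mechanically: each crossing label must occur exactly twice, and by Gauss's classical parity observation the two occurrences of every label must be separated by an even number of symbols (necessary but not sufficient for realizability). The membership test below checks only these string properties; it is an illustration of the formal-language view, not the paper's characterization.

```python
# The Gauss code of a simple crossing closed curve lists crossing labels in
# traversal order, so each label occurs exactly twice; Gauss's parity
# condition (necessary, not sufficient) requires an even number of symbols
# between the two occurrences of every label.

def parity_ok(code):
    """Check Gauss's necessary conditions on a sequence of crossing labels."""
    positions = {}
    for i, label in enumerate(code):
        positions.setdefault(label, []).append(i)
    if any(len(p) != 2 for p in positions.values()):
        return False                      # every crossing is visited twice
    return all((j - i - 1) % 2 == 0 for i, j in positions.values())

print(parity_ok(list("abab")))   # False: one symbol lies between the two a's
print(parity_ok(list("aabb")))   # True
```

Both conditions are regular-language properties over a fixed label alphabet, which is in the spirit of the paper's observation that two of its three related languages are regular.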

01 Jan 1992
TL;DR: An automated system called BOZ is described that begins with a logical problem representation and solution procedure, and generates an informationally-equivalent visual problem representations and procedure that allows the human user to obtain the solution more efficiently.
Abstract: Automated reasoning about the design of effective visual problem representations is possible when we adopt the view that visual problem representations, along with the perceptual procedures that humans use to manipulate them, can be described using information-processing models of the sort introduced by Newell and Simon (1972). This approach provides us not only with a means of characterizing visual problem representations in a formal syntax but also with a means of automatically mapping between "logical" and "perceptual" problem representations and procedures. An automated system called BOZ is described that begins with a logical problem representation and solution procedure, and generates an informationally-equivalent visual problem representation and procedure that allows the human user to obtain the solution more efficiently. BOZ's representation mapping technique proceeds by: (1) replacing demanding logical inferences in the solution procedure with efficient perceptual inferences; and (2) structuring information in the visual representation such that search is minimized. The extent to which the visual representations and procedures produced by BOZ agree with what users actually see and do is discussed.

Journal ArticleDOI
TL;DR: An algorithm is presented that, when given a Chomsky Normal Form context-free grammar G, constructs a Chomsky Normal Form context-free grammar, G_T, that derives strings containing transposition errors, allowing the detection, classification, and correction of transposition errors occurring in the syntactic description of patterns.
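The paper's G_T construction builds the error handling into the grammar itself; as a rough brute-force stand-in (not the paper's algorithm), the same detect-and-correct behaviour can be sketched by pairing CYK membership for a CNF grammar with a search over adjacent-symbol swaps. The toy grammar below is an assumption for illustration.

```python
# CYK membership for a Chomsky-Normal-Form grammar, plus brute-force
# correction of a single adjacent transposition. This is an illustrative
# stand-in for the paper's G_T construction, not its algorithm.

def cyk(grammar, start, s):
    """grammar maps nonterminal -> list of (A, B) pairs or terminal strings."""
    n = len(s)
    if n == 0:
        return False
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(s):
        table[i][i] = {A for A, rules in grammar.items() if ch in rules}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for A, rules in grammar.items():
                    for r in rules:
                        if (isinstance(r, tuple)
                                and r[0] in table[i][k]
                                and r[1] in table[k + 1][j]):
                            table[i][j].add(A)
    return start in table[0][n - 1]

def correct_transposition(grammar, start, s):
    """If s is in the language, return it; else try every adjacent swap."""
    if cyk(grammar, start, s):
        return s
    for i in range(len(s) - 1):
        t = s[:i] + s[i + 1] + s[i] + s[i + 2:]
        if cyk(grammar, start, t):
            return t
    return None   # no single transposition explains the error

# Toy CNF grammar for the language {ab}: S -> A B, A -> a, B -> b
G = {"S": [("A", "B")], "A": ["a"], "B": ["b"]}
print(correct_transposition(G, "S", "ba"))   # -> "ab"
```

Encoding the errors into G_T instead, as the paper does, lets a single parse both accept the erroneous string and classify which transposition occurred, avoiding this reparse-per-swap search.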