About: Regular grammar is a research topic. Over its lifetime, 1463 publications have been published within this topic, receiving 33501 citations.
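For orientation, the topic's defining property can be sketched in a few lines: a regular (right-linear) grammar, in which every production has the form A → tB or A → t, can be recognized by tracking a set of live nonterminals, which is exactly the finite-automaton view. The toy grammar below (for the language (ab)+) is purely illustrative and not drawn from any paper listed here.

```python
# Hypothetical toy right-linear grammar for (ab)+, for illustration only.
# Productions: nonterminal -> [(terminal, next_nonterminal_or_None), ...]
GRAMMAR = {
    "S": [("a", "A")],
    "A": [("b", "S"), ("b", None)],  # None marks a terminating production A -> b
}

def derives(string, start="S"):
    """Recognize `string` by tracking the set of live nonterminals after each
    character, the standard regular-grammar-to-NFA correspondence."""
    live = {start}
    ended = False  # did some derivation terminate exactly at the last character?
    for ch in string:
        nxt, ended = set(), False
        for nt in live:
            for term, rest in GRAMMAR.get(nt, []):
                if term != ch:
                    continue
                if rest is None:
                    ended = True
                else:
                    nxt.add(rest)
        live = nxt
    return ended

print(derives("abab"), derives("aba"))  # → True False
```

Because the right-hand side of each rule consumes exactly one terminal, no backtracking is ever needed; this is the property that separates regular grammars from the context-free grammars discussed in several of the papers below.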
01 Dec 1996
TL;DR: For linguistics, the promise has been to situate language relative to the social/interactional matrix in which it is to be understood as inescapably as it is relative to the organization of the mind/brain.
Abstract: Introduction From early in its development, conversation-analytic work on interaction has declined to accord language any principled primacy as an object of inquiry (e.g., Schegloff and Sacks, 1973: 290). Although not derived from them, this view was in accord with the stances of such intellectual forebears as Garfinkel's ethnomethodology (1967) and Goffman's several approaches to interaction. It may be recalled, for example, that in “The neglected situation” Goffman (1964) injected into the “coming-out party” of the embryonic subfield known as the ethnography of speaking or communication the observation that speaking occurs most proximately in “situations,” in which it need not occur; speaking, then, had to be understood by reference to exigencies of contexts not designed for speaking in particular (as elaborated, for example, in the earlier Goffman, 1961, 1963, and the later Goffman, 1971). In both of these modalities of work, and in their predecessors, language was not a privileged object of inquiry, however interesting an object of inquiry it might be. Still, the accessibility of conversation (and talk-in-interaction more generally) to systematic inquiry has brought with it a need to explore the mutual bearing of the various organizations of “language” on the one hand (whatever that notion might turn out to refer to; cf. Schegloff, 1979: 282) and the organizations of interaction and talking-in-interaction on the other. For linguistics, the promise has been to situate language relative to the social/interactional matrix in which it is to be understood as inescapably as it is relative to the organization of the mind/brain.
TL;DR: This article developed a formal grammatical system called a link grammar, showed how English grammar can be encoded in such a system, and gave algorithms for efficiently parsing with a link grammar.
Abstract: We develop a formal grammatical system called a link grammar, show how English grammar can be encoded in such a system, and give algorithms for efficiently parsing with a link grammar. Although the expressive power of link grammars is equivalent to that of context-free grammars, encoding natural language grammars appears to be much easier with the new system. We have written a program for general link parsing and written a link grammar for the English language. The performance of this preliminary system -- both in the breadth of English phenomena that it captures and in the computational resources used -- indicates that the approach may have practical uses as well as linguistic significance. Our program is written in C and may be obtained through the internet.
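The linkage idea can be illustrated with a deliberately brute-force sketch. The dictionary entries and connector labels below are hypothetical, and this is not the authors' efficient parsing algorithm, only a toy checker of the same well-formedness conditions (matching connectors, planarity, connectivity):

```python
from itertools import combinations

# Hypothetical toy link-grammar dictionary, not the paper's actual lexicon.
# Each word maps to a list of disjuncts: (left-connectors, right-connectors).
DICT = {
    "the": [((), ("D",))],       # determiner links rightward with D
    "cat": [(("D",), ("S",))],   # noun takes D on its left, supplies S rightward
    "ran": [(("S",), ())],       # verb takes a subject link S on its left
}

def linkages(words):
    """Enumerate all valid linkages of a short sentence by brute force."""
    n = len(words)
    # Candidate links (i, j, label) with i < j and a shared connector label.
    cands = []
    for i in range(n):
        for j in range(i + 1, n):
            for _, rights in DICT[words[i]]:
                for lefts, _ in DICT[words[j]]:
                    for lab in set(rights) & set(lefts):
                        cands.append((i, j, lab))
    found = []
    for k in range(1, len(cands) + 1):
        for sub in combinations(cands, k):
            if valid(words, sub, n):
                found.append(sub)
    return found

def valid(words, links, n):
    # Planarity: no two links may cross.
    for (a, b, _), (c, d, _) in combinations(links, 2):
        if a < c < b < d or c < a < d < b:
            return False
    # Connectivity: the links must join all words into one component.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for a, b, _ in links:
        parent[find(a)] = find(b)
    if len({find(i) for i in range(n)}) != 1:
        return False
    # Each word must satisfy one of its disjuncts (compared as multisets here,
    # a simplification of the real ordered-connector discipline).
    for i, w in enumerate(words):
        left = [lab for a, b, lab in links if b == i]
        right = [lab for a, b, lab in links if a == i]
        if not any(sorted(l) == sorted(left) and sorted(r) == sorted(right)
                   for l, r in DICT[w]):
            return False
    return True

print(linkages(["the", "cat", "ran"]))  # → [((0, 1, 'D'), (1, 2, 'S'))]
```

The single linkage found connects "the" to "cat" with D and "cat" to "ran" with S; the real parser achieves the same effect with dynamic programming rather than enumeration.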
TL;DR: Two applications of stochastic context-free grammars, trained automatically via the Inside-Outside Algorithm, to speech recognition are described: SCFGs are used to model VQ-encoded speech for isolated word recognition and are compared directly to HMMs used for the same task.
Abstract: This paper describes two applications in speech recognition of the use of stochastic context-free grammars (SCFGs) trained automatically via the Inside-Outside Algorithm. First, SCFGs are used to model VQ-encoded speech for isolated word recognition and are compared directly to HMMs used for the same task. It is shown that SCFGs can model this low-level VQ data accurately and that a regular-grammar-based pre-training algorithm is effective both for reducing training time and obtaining robust solutions. Second, an SCFG is inferred from a transcription of the speech used to train a phoneme-based recognizer in an attempt to model phonotactic constraints. When used as a language model, this SCFG gives improved performance over a comparable regular grammar or bigram.
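The inside pass of the Inside-Outside Algorithm, the dynamic program that computes a string's probability under an SCFG, can be sketched as follows. The toy grammar (in Chomsky normal form) and its rule probabilities are hypothetical stand-ins, not the paper's VQ-trained model:

```python
from collections import defaultdict

# Hypothetical toy SCFG in Chomsky normal form; probabilities sum to 1
# per left-hand side. Not the paper's trained grammar.
BINARY = {                       # A -> B C : prob
    ("S", ("NP", "VP")): 1.0,
    ("VP", ("V", "NP")): 1.0,
}
UNARY = {                        # A -> terminal : prob
    ("NP", "she"): 0.6,
    ("NP", "fish"): 0.4,
    ("V", "eats"): 1.0,
}

def inside_probability(words, start="S"):
    """Inside (CYK-style) dynamic program: probability that `start`
    derives the whole word sequence."""
    n = len(words)
    # inside[i][j][A] = probability that A derives words[i..j]
    inside = [[defaultdict(float) for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        for (A, term), p in UNARY.items():
            if term == w:
                inside[i][i][A] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):            # split point between B and C
                for (A, (B, C)), p in BINARY.items():
                    inside[i][j][A] += p * inside[i][k][B] * inside[k + 1][j][C]
    return inside[0][n - 1][start]

print(inside_probability(["she", "eats", "fish"]))  # ≈ 0.24 (= 1.0 * 0.6 * 1.0 * 0.4)
```

The full Inside-Outside Algorithm pairs this with a symmetric outside pass to re-estimate the rule probabilities by expectation-maximization; the inside table alone already suffices for the language-model scoring role described in the abstract.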
01 Jan 2004
TL;DR: This work has adopted the Competence Hypothesis as a methodological principle, and assumes that an explanatory model of human language performance will incorporate a theoretically justified representation of the native speaker's linguistic knowledge (a grammar) as a component separate both from the computational mechanisms that operate on it (a processor) and from other nongrammatical processing parameters that might influence the processor's behavior.
Abstract: In learning their native language, children develop a remarkable set of capabilities. They acquire knowledge and skills that enable them to produce and comprehend an indefinite number of novel utterances and to make quite subtle judgments about certain of their properties. The major goal of psycholinguistic research is to devise an explanatory account of the mental operations that underlie these linguistic abilities. In pursuing this goal, we have adopted what we call the Competence Hypothesis as a methodological principle. We assume that an explanatory model of human language performance will incorporate a theoretically justified representation of the native speaker's linguistic knowledge (a grammar) as a component separate both from the computational mechanisms that operate on it (a processor) and from other nongrammatical processing parameters that might influence the processor's behavior. To a certain extent, the various components that we postulate can be studied independently, guided where appropriate by the well-established methods and evaluation standards of linguistics, computer science, and experimental psychology. However, the requirement that the various components ultimately must fit together in a consistent and coherent model imposes even stronger constraints on their structure and operation.