
Showing papers on "Natural language published in 1981"


Book
01 Jan 1981
TL;DR: The Monitor Theory of adult second language acquisition, as presented in this book, hypothesizes two independent systems for developing second-language ability, subconscious acquisition and conscious learning, with subconscious acquisition far more important.
Abstract: All Rights Reserved. This publication may be downloaded and copied without charge for all reasonable, non-commercial educational purposes, provided no alterations in the text are made. I have had a great deal of help and feedback from many people in writing this book. Among the many scholars and friends I am indebted to, I would especially like to express my thanks to those whose work has stimulated my own thinking in the early stages of the research reported on here: John Upshur, Leonard Newmark, and S. Pit Corder all recognized the reality of language "acquisition" in the adult long before I did. Introduction: This book is concerned with what has been called the "Monitor Theory" of adult second language acquisition. Monitor Theory hypothesizes that adults have two independent systems for developing ability in second languages, subconscious language acquisition and conscious language learning, and that these systems are interrelated in a definite way: subconscious acquisition appears to be far more important. The introduction is devoted to a brief statement of the theory and its implications for different aspects of second language acquisition theory and practice. We define acquisition and learning, and present the Monitor Model for adult second language performance. Following this, brief summaries of research results in various areas of second language acquisition serve both as an overview of Monitor Theory research over the last few years and as an introduction to the essays that follow. Language acquisition is very similar to the process children use in acquiring first and second languages. It requires meaningful interaction in the target language (natural communication), in which speakers are concerned not with the form of their utterances but with the messages they are conveying and understanding.
Error correction and explicit teaching of rules are not relevant to language acquisition, but caretakers and native speakers can modify their utterances addressed to acquirers to help them understand, and these modifications are thought to help the acquisition process (Snow and Ferguson, 1977). It has been hypothesized that there is a fairly stable order of acquisition of structures in language acquisition, that is, one can see clear similarities across acquirers as to which structures tend to be acquired early and which tend to be acquired late (Brown, 1973; Dulay and Burt, 1975). Acquirers need not have a conscious awareness of the "rules" they possess, and may self-correct only on the basis of …

4,609 citations



Book
01 Jan 1981
TL;DR: A process-model embodying this theory has been implemented in a computer system, POLITICS, as discussed by the authors; it models human ideological reasoning in understanding the natural language text of international political events.
Abstract: Modeling human understanding of natural language requires a model of the processes underlying human thought. No two people think exactly alike; different people subscribe to different beliefs and are motivated by different goals in their activities. A theory of subjective understanding has been proposed to account for subjectively-motivated human thinking ranging from ideological belief to human discourse and personality traits. A process-model embodying this theory has been implemented in a computer system, POLITICS. POLITICS models human ideological reasoning in understanding the natural language text of international political events. POLITICS can model either liberal or conservative ideologies. Each ideology produces a different interpretation of the input event. POLITICS demonstrates its understanding by answering questions in natural language question-answer dialogs.

214 citations


Book
01 Jan 1981

213 citations


Journal ArticleDOI
Lance A. Miller1
TL;DR: The objective in this study was to obtain detailed empirical information about the nature of natural language “programming” to bring to bear on the issue of increasing the usability of computer language interfaces.
Abstract: Our objective in this study was to obtain detailed empirical information about the nature of natural language “programming” to bring to bear on the issue of increasing the usability of computer language interfaces. Although we expected numerous difficulties to be detected concerning the potential of actually implementing a system to interpret natural language programs, we were not prepared for the magnitude of what we see as being the three major obstacles: style, semantics, and world knowledge. Concerning the first, there is little way in which the vast differences in styles could be decreased: programming-language style is simply alien to natural specification. With respect to semantics, we also were unprepared to find out the extent to which the selection of the appropriate “meaning” (of a word, phrase, or sentence) is dependent upon the immediate and prior context. And as for world knowledge, we suspect that the extent to which shared experiences and knowledge are critical to procedural communication and understanding among people has barely been hinted at by our present data.

192 citations


Proceedings ArticleDOI
29 Jun 1981
TL;DR: This paper describes initial work on a methodology for creating natural-language processing capabilities for new databases without the need for intervention by specially trained experts.
Abstract: Several computer systems have now been constructed that allow users to access databases by posing questions in natural languages, such as English. When used in the restricted domains for which they have been especially designed, these systems have achieved reasonably high levels of performance. However, these systems require the encoding of knowledge about the domain of application in complex data structures that typically can be created for a new database only with considerable effort on the part of a computer professional who has had special training in computational linguistics and the use of databases. This paper describes initial work on a methodology for creating natural-language processing capabilities for new databases without the need for intervention by specially trained experts. The approach is to acquire logical schemata and lexical information through simple interactive dialogues with someone who is familiar with the form and content of the database, but unfamiliar with the technology of natural-language interfaces. A prototype system using this methodology is described and an example transcript is presented.

181 citations



DOI
15 May 1981
TL;DR: It is demonstrated, using protocols of actual interactions with a question-answering system, that users of these systems expect to engage in a conversation whose coherence is manifested in the interdependence of their (often unstated) plans and goals with those of the system.
Abstract: We demonstrate, using protocols of actual interactions with a question-answering system, that users of these systems expect to engage in a conversation whose coherence is manifested in the interdependence of their (often unstated) plans and goals with those of the system. Since these problems are even more obvious in other forms of natural-language understanding systems, such as task-oriented dialogue systems, techniques for engaging in question-answering conversation should be special cases of general conversational abilities. We characterize dimensions along which language understanding systems might differ and, based partly on this analysis, propose a new system architecture, centered around recognizing the user's plans and planning helpful responses, which can be applied to a number of possible application areas. To illustrate progress to date, we discuss two implemented systems, one operating in a simple question-answering framework, and the other in a decision support framework for which both graphic and linguistic means of communication are available. (Author)

167 citations


Proceedings Article
04 Nov 1981
TL;DR: Research is being done on the development of a methodology for representing the information in texts and of procedures for relating the linguistic structure of a request to the corresponding representations.
Abstract: This paper describes research on the development of a methodology for representing the information in texts and of procedures for relating the linguistic structure of a request to the corresponding representations. The work is being done in the context of a prototype system that will allow physicians and other health professionals to access information in a computerized textbook of hepatitis through natural language dialogues. The interpretation of natural language queries is derived from DIAMOND/DIAGRAM, a linguistically motivated, domain-independent natural language interface developed at SRI. A text access component is being developed that uses representations of the propositional content of text passages and of the hierarchical structure of the text as a whole to retrieve relevant information.

165 citations



01 Oct 1981
TL;DR: The state of the art in practical computer systems for natural-language processing is described, discussing when language-processing technology at various levels of capability is likely to be commercially practical, and what it may cost to develop and use applications of that technology.
Abstract: This paper describes the state of the art in practical computer systems for natural-language processing. We first consider why one would want to use natural language to communicate with computers at all, looking at both general issues and specific applications. Next we examine what it really means for a system to have a natural-language capability. This is followed by a discussion of some major limitations of current technology. The bulk of the paper is devoted to looking in detail at a single application of natural-language processing: database retrieval by natural-language query. We lay out an overall system architecture, explaining what types of processing and information are required. Then we look at two general classes of systems, special-purpose and general-purpose, explaining how they differ and their relative advantages and disadvantages. Afterwards we point out some remaining problems that will require additional basic research. Finally we conclude by discussing when language-processing technology at various levels of capability is likely to be commercially practical, and what it may cost to develop and use applications of that technology.

Book
01 Jan 1981
TL;DR: The overall problem is considered to be one of refining the specification of an illocutionary act into a surface syntactic form, emphasizing the problems of achieving multiple goals in a single utterance.
Abstract: This dissertation presents the results of research on a planning formalism for a theory of natural language generation that incorporates generation of utterances that satisfy multiple goals. Previous research in the area of computer generation of natural language utterances has concentrated on one of two aspects of language production: (1) the process of producing surface syntactic forms from an underlying representation, and (2) the planning of illocutionary acts to satisfy the speaker's goals. This work concentrates on the interaction between these two aspects of language generation and considers the overall problem to be one of refining the specification of an illocutionary act into a surface syntactic form, emphasizing the problems of achieving multiple goals in a single utterance. Planning utterances requires an ability to do detailed reasoning about what the hearer knows and wants. A formalism, based on a possible worlds semantics of an intensional logic of knowledge and action, was developed for representing the effects of illocutionary acts and the speaker's beliefs about the hearer's knowledge of the world. Techniques are described that enable a planning system to use the representation effectively. The language planning theory and knowledge representation are embodied in a computer system called KAMP (Knowledge And Modalities Planner) which plans both physical and linguistic actions, given a high level description of the speaker's goal. The research has application to the design of gracefully interacting computer systems, multiple-agent planning systems, and planning to acquire knowledge.

Journal ArticleDOI
TL;DR: In this paper, case marking and the nature of language are discussed. Published in the Australian Journal of Linguistics, Vol. 1, No. 2, pp. 227-244.
Abstract: (1981). Case marking and the nature of language. Australian Journal of Linguistics: Vol. 1, No. 2, pp. 227-244.

04 May 1981
TL;DR: The central thesis is that human inference processes are governed by the same analogical mappings manifest as metaphors in language, resulting in an induced invariance hierarchy.
Abstract: Interpreting metaphors is an integral and inescapable process in human understanding of natural language. Part I of this paper discusses a method of analyzing metaphors based on the existence of a small number of generalized metaphor mappings. Each generalized metaphor contains a recognition network, a basic mapping, additional transfer mappings, and an implicit intention component. It is argued that this method reduces metaphor interpretation from a reconstruction to a recognition task. Steps towards automating certain aspects of language learning are also discussed. Part II analyzes analogical mappings underlying metaphors and implications for inference and memory organization. Regularities have been observed indicating that certain types of conceptual relations are much more apt to remain invariant in analogical mappings than other relations, resulting in an induced invariance hierarchy. The central thesis is that human inference processes are governed by the same analogical mappings manifest as metaphors in language. (Author)

Journal Article
TL;DR: This paper reports recent research into methods for creating natural language text using a new processing paradigm called Fragment-and-Compose and the computational methods of KDS, which embodies this paradigm.
Abstract: This paper reports recent research into methods for creating natural language text. A new processing paradigm called Fragment-and-Compose has been created and an experimental system implemented in it. The knowledge to be expressed in text is first divided into small propositional units, which are then composed into appropriate combinations and converted into text. KDS (Knowledge Delivery System), which embodies this paradigm, has distinct parts devoted to creation of the propositional units, to organization of the text, to prevention of excess redundancy, to creation of combinations of units, to evaluation of these combinations as potential sentences, to selection of the best among competing combinations, and to creation of the final text. The Fragment-and-Compose paradigm and the computational methods of KDS are described.
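The Fragment-and-Compose idea described above can be illustrated with a toy sketch: knowledge is broken into small propositional units, which are then composed into candidate sentences. All names, data, and composition rules below are hypothetical and far simpler than KDS itself.

```python
# Illustrative sketch of Fragment-and-Compose: propositional units are
# grouped by shared subject and composed into sentences. The grouping
# rule stands in for KDS's much richer evaluation and selection stages.
from dataclasses import dataclass


@dataclass
class Proposition:
    subject: str
    predicate: str


def compose(props):
    """Group propositions that share a subject into one candidate sentence."""
    by_subject = {}
    for p in props:
        by_subject.setdefault(p.subject, []).append(p.predicate)
    sentences = []
    for subject, predicates in by_subject.items():
        if len(predicates) == 1:
            sentences.append(f"{subject} {predicates[0]}.")
        else:
            joined = ", ".join(predicates[:-1]) + " and " + predicates[-1]
            sentences.append(f"{subject} {joined}.")
    return " ".join(sentences)


props = [
    Proposition("The alarm", "sounds"),
    Proposition("The alarm", "flashes red"),
    Proposition("The operator", "must leave the room"),
]
print(compose(props))
# -> The alarm sounds and flashes red. The operator must leave the room.
```

Composing two fragments about "The alarm" into one sentence is the kind of redundancy-prevention step the abstract attributes to KDS, here reduced to a single grouping rule.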

Proceedings ArticleDOI
29 Jun 1981
TL;DR: A multi-strategy approach is shown to provide a much higher degree of flexibility, redundancy, and ability to bring task-specific domain knowledge to bear on both grammatical and ungrammatical input.
Abstract: Robust natural language interpretation requires strong semantic domain models, "fail-soft" recovery heuristics, and very flexible control structures. Although single-strategy parsers have met with a measure of success, a multi-strategy approach is shown to provide a much higher degree of flexibility, redundancy, and ability to bring task-specific domain knowledge (in addition to general linguistic knowledge) to bear on both grammatical and ungrammatical input. A parsing algorithm is presented that integrates several different parsing strategies, with case-frame instantiation dominating. Each of these parsing strategies exploits different types of knowledge; and their combination provides a strong framework in which to process conjunctions, fragmentary input, and ungrammatical structures, as well as less exotic, grammatically correct input. Several specific heuristics for handling ungrammatical input are presented within this multi-strategy framework.
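The multi-strategy approach described above can be sketched as a priority-ordered list of strategies, with case-frame instantiation tried first and a fail-soft fragment strategy as a last resort. The frames, tokens, and heuristics below are invented for illustration and are not the paper's actual parser.

```python
# Sketch of a multi-strategy parser: each strategy returns a parse or None,
# and strategies are tried in priority order. Case-frame instantiation
# dominates; a fail-soft fragment strategy guarantees some result even
# for ungrammatical input.

CASE_FRAMES = {
    "send": {"agent": None, "object": None, "recipient": None},
}


def case_frame_strategy(tokens):
    """Instantiate a case frame when a known head verb is present."""
    for i, tok in enumerate(tokens):
        if tok in CASE_FRAMES:
            frame = dict(CASE_FRAMES[tok])
            frame["agent"] = " ".join(tokens[:i]) or None
            rest = tokens[i + 1:]
            if "to" in rest:
                j = rest.index("to")
                frame["object"] = " ".join(rest[:j])
                frame["recipient"] = " ".join(rest[j + 1:])
            else:
                frame["object"] = " ".join(rest) or None
            return ("frame", tok, frame)
    return None


def fragment_strategy(tokens):
    """Fail-soft fallback: accept any remaining input as a fragment."""
    return ("fragment", " ".join(tokens))


def parse(sentence):
    tokens = sentence.lower().split()
    for strategy in (case_frame_strategy, fragment_strategy):
        result = strategy(tokens)
        if result is not None:
            return result


print(parse("John send the file to Mary"))
print(parse("the file please"))  # ungrammatical input still yields a parse
```

Each strategy exploits a different kind of knowledge (here, verb frames versus none at all), which is the essential point of the multi-strategy framework: grammatical input gets a structured parse, and fragments degrade gracefully instead of failing.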

Journal ArticleDOI
TL;DR: This paper reports two experiments examining the common suggestion that deaf children have a particular problem in understanding metaphorical uses of natural language.
Abstract: Researchers and educators of the deaf often suggest that deaf children have a particular problem in understanding metaphorical uses of natural language. This paper reports two experiments whose res...


Journal ArticleDOI
TL;DR: This article reviews the major universalist theories of human language to provide the framework for the data discussed, arguing that a characterization of the constraints on cross-language variation is as much a part of a theory of human language as the characterization of commonly occurring patterns.
Abstract: The paradox of human language is that it is, at once, both fixed and free; universals of structure and process coexist with diversity and change. True universals are so deeply a part of language, so basic to the ways in which we think about language, that, like fish in water, we find it difficult to recognize them or, when we do, they seem obvious or trivial. Not surprisingly, progress has been slow toward the goal of characterizing human language. In phonology, exceptionless universals include linearity of units that are analyzable into hierarchical structures, rule-governed systems, redundancy, and variation in sounds by context, speaker, social setting, and over time. Easier to discover are the widespread tendencies of languages to pattern in certain ways. The term “universal” will be used here, as it often is used in linguistics, to describe such cross-language patterns: ones that have a high probability of occurrence but are not without exception, such as Trubetzkoy’s typology of vowel systems, Ferguson’s universal states and normal tendencies for nasals, and Greenberg’s implicational universals for glottalized consonants. In phonology, many such universals derive from universal properties of human articulatory and perceptual systems. Yet the hallmark of human language is its range of variation, across particular languages and within the individual. A characterization of the constraints on this variation is as much a part of a theory of human language as is the characterization of the commonly occurring patterns. The same paradox exists for language development: it too is both fixed and free. The determination of true universals likewise is difficult, the statement of general patterns somewhat easier. And once again, the hallmark is variation, across languages and in the individual. Much research in child phonology is directed toward finding regularities in this variation.
The formulation of a theory of acquisition lies in the distant future, and behind us lies a history of the sequential demise of elegant, yet ultimately too simple, acquisitional theories. We begin this paper by briefly reviewing the major universalist theories, both to provide the necessary framework for the data we discuss and also

Journal ArticleDOI
TL;DR: The general conclusion of the study is that increased experience leads to increased use of natural language features in classroom communication.
Abstract: This study examined the ability of teachers to produce grammatical manual representations of English under normal classroom conditions. Three groups of teachers—inexperienced signing hearing teachers, experienced signing hearing teachers, and deaf teachers—were observed three times using a live-observation system. Twenty-three teachers were observed with an inter-coder reliability of .896. Differences in teachers' ability to use separate signs for English grammatical endings and the use of ASL-like signing were found. The general conclusion of the study is that increased experience leads to increased use of natural language features in classroom communication.

01 Mar 1981
TL;DR: The proposed answer is that the notion of truth-conditions can be explicated and made precise by identifying them with a particular kind of abstract procedure and that such procedures can serve as the meaning bearing elements of a theory of semantics suitable for computer implementation.
Abstract: This report addresses fundamental issues of semantics for computational systems. The question at issue is 'What is it that machines can have that would correspond to the knowledge of meanings that people have and that we seem to refer to by the ordinary language term 'meaning'?' The proposed answer is that the notion of truth-conditions can be explicated and made precise by identifying them with a particular kind of abstract procedure and that such procedures can serve as the meaning bearing elements of a theory of semantics suitable for computer implementation. This theory, referred to as 'procedural semantics', has been the basis of several successful computerized systems and is acquiring increasing interest among philosophers of language. (Author)
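The core idea of procedural semantics, that the meaning of an expression is a procedure for computing its truth-conditions, can be illustrated in miniature. The mini-language and the toy database below are invented for this sketch and are far simpler than any of the systems the report refers to.

```python
# Toy illustration of procedural semantics: the 'meaning' of a sentence is
# a procedure that, applied to a model (here a small database), computes
# the sentence's truth value.

DATABASE = {"block_a": {"color": "red"}, "block_b": {"color": "green"}}


def meaning_is_color(entity, color):
    """Truth-conditions of '<entity> is <color>' as a procedure over a model."""
    return lambda db: db.get(entity, {}).get("color") == color


def meaning_exists(color):
    """Truth-conditions of 'some object is <color>'."""
    return lambda db: any(e.get("color") == color for e in db.values())


red_a = meaning_is_color("block_a", "red")
some_blue = meaning_exists("blue")

print(red_a(DATABASE))      # True: block_a is red in this model
print(some_blue(DATABASE))  # False: no blue object exists in this model
```

The point of the sketch is that the two `meaning_*` functions return procedures, not values: the same procedure can be applied to different models, which is what lets truth-conditions serve as meaning-bearing elements.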

Journal ArticleDOI
TL;DR: The authors explored novel ways of using the media for education, especially second language instruction, and compared various combinations of visual and auditory presentations of messages, for example, both written script and spoken dialogue in subjects' first language (L1), or script and/or dialogue in a second language (L2), and so on.
Abstract: This study explored novel ways of using the media for education, especially second language instruction. Various combinations of visual and auditory presentations of messages were compared, for example, both written script and spoken dialogue in subjects’ first language (L1), or script and/or dialogue in a second language (L2), and so on. Subjects were elementary pupils with advanced training in L2. The dialogues of radio programs were transcribed, permitting such combinations as: dialogue in L2, script in L1 (the normal subtitling format); dialogue in L1, script in L2 (reversed subtitling); both dialogue and script in L1 or in L2; and so forth. On L1 and L2 tests of memory, certain combinations (e.g., reversed subtitling) were much more promising for the development or maintenance of second language skills, or for literacy training, than was conventional subtitling. Theoretical and practical implications are discussed.

Journal Article
TL;DR: A logic-programmed analyser that translates Spanish into this system, which equates semantic agreement with syntactic well-formedness and can detect certain presuppositions, resolve certain ambiguities, and reflect relations among sets, is presented.
Abstract: We discuss the use of logic for natural language (NL) processing, both as an internal query language and as a programming tool. Some extensions of standard predicate calculus are motivated by the first of these roles. A logical system including these extensions is informally described. It incorporates semantic as well as syntactic NL features, and its semantics in a given interpretation (or data base) determines the answer-extraction process. We also present a logic-programmed analyser that translates Spanish into this system. It equates semantic agreement with syntactic well-formedness, and can detect certain presuppositions, resolve certain ambiguities and reflect relations among sets.
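The idea of logic as an internal query language, where answer extraction means finding the bindings that satisfy a goal against a database, can be shown in miniature. The mini-logic, facts, and variable convention below are invented for illustration and are far simpler than the paper's extended predicate calculus.

```python
# Toy answer extraction over a fact base: a question is represented as a
# goal tuple with variables (strings starting with '?'), and the answer
# is the set of variable bindings that make the goal true.

FACTS = {
    ("capital_of", "madrid", "spain"),
    ("capital_of", "lima", "peru"),
    ("city", "madrid"),
    ("city", "lima"),
}


def query(goal):
    """Return all bindings of goal variables satisfied by some fact."""
    answers = []
    for fact in FACTS:
        if len(fact) != len(goal):
            continue
        binding = {}
        for g, f in zip(goal, fact):
            if g.startswith("?"):
                binding[g] = f       # variable: bind it to this term
            elif g != f:
                binding = None       # constant mismatch: fact rejected
                break
        if binding is not None:
            answers.append(binding)
    return answers


# "What is the capital of Spain?" -> capital_of(?x, spain)
print(query(("capital_of", "?x", "spain")))  # [{'?x': 'madrid'}]
```

This is the sense in which, as the abstract puts it, the semantics of a query in a given database determines the answer-extraction process: evaluating the goal and extracting answers are the same operation.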

Journal Article
TL;DR: In this paper, a bottom-up pattern matching parser for restricted natural language input to a limited-domain computer system is described, along with a set of parsing flexibilities that such a system should provide.
Abstract: When people use natural language in natural settings, they often use it ungrammatically, leaving out or repeating words, breaking off and restarting, speaking in fragments, etc. Their human listeners are usually able to cope with these deviations with little difficulty. If a computer system is to accept natural language input from its users on a routine basis, it should be similarly robust. In this paper, we outline a set of parsing flexibilities that such a system should provide. We go on to describe FlexP, a bottom-up pattern matching parser that we have designed and implemented to provide many of these flexibilities for restricted natural language input to a limited-domain computer system.
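One parsing flexibility of the kind argued for above can be sketched as a pattern matcher that tolerates extra "noise" words, so that fragmentary or ungrammatical input still matches a known command pattern. The patterns and matching rules below are illustrative only, not FlexP's actual ones.

```python
# Sketch of flexible pattern matching: pattern words must appear in order,
# but interspersed noise words (repetitions, fillers, false starts) are
# skipped rather than causing a parse failure.

def flexible_match(pattern, tokens):
    """Match pattern words in order, skipping interspersed noise words.
    '*' in the pattern captures one argument token."""
    bindings, p = [], 0
    for tok in tokens:
        if p == len(pattern):
            break
        if pattern[p] == "*":
            bindings.append(tok)
            p += 1
        elif pattern[p] == tok:
            p += 1
        # otherwise: treat tok as noise and keep scanning
    return bindings if p == len(pattern) else None


PATTERN = ["delete", "file", "*"]

# Grammatical input matches...
assert flexible_match(PATTERN, "delete file notes".split()) == ["notes"]
# ...and so does input with a filler, a repeated word, and an extra article.
print(flexible_match(PATTERN, "uh delete delete the file notes please".split()))
# -> ['notes']
```

A single skip rule already recovers from several of the deviations the abstract lists (repeated words, fillers, extra material); a system like the one described would combine many such flexibilities.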

Journal ArticleDOI
TL;DR: As discussed in this paper, inadequate attention has been given to the distinction between principles and theories, and to date no sharp line distinguishing them has been drawn, although experimental science has taught us to distrust the absoluteness of principles in that some long-standing ones have in fact been questioned.
Abstract: Science, as practiced in various disciplines such as physics, astronomy, biology, etc., uses the same method of arriving at trustworthy statements. The basic concepts used in science are referred to by the terms 'principle', 'theory', 'hypothesis', 'law' and 'datum'. My primary purpose here is to clarify the meaning of 'theory'. To do so, the other terms will be considered parenthetically at least inasmuch as they are all conceptually or logically interrelated. Inadequate attention has been given to the distinction between principles and theories, and to date no sharp line distinguishing them has been drawn. A principle should supersede and be more general than a theory in the sense that it should regulate and control the bounds of theory. Principles should come first in explanatory potency in that each should partially justify or make reasonable a number of theories. There should be two kinds of principles: those that justify the form of abstract reasoning or analytic reasoning such as mathematics and logic and those pertaining to concrete matters of fact or to synthetic reasoning. The first kind states what is logically possible and impossible, such as the principles of identity, contradiction and excluded middle (sometimes called laws). When applied to propositions these principles are: P ⊃ P; ¬(P · ¬P); P ∨ ¬P. A similar set of principles applies to classes and particulars: a particular is identical with itself; no particular is itself and not itself; any particular, X, is either an a or a non-a. Principles can be neither proved nor disproved, but they are used as a basis for proving other statements and for formulating theories and laws. Traditionally principles have often been considered to be self-evident, directly intuited truths if not innate ideas. Experimental science has taught us to distrust the necessity or absoluteness of principles in that some long-standing ones have in fact been questioned.
Although it is with great reluctance that old principles are discarded, from time to time new principles are formulated or postulated or discovered, some in opposition to old ones. Archimedes 'discovered' the principle of the lever. To date it has not been

Journal ArticleDOI
08 May 1981-Science
TL;DR: The data indicate that the modification of natural perceptual categories after language acquisition is not bound to a particular transmission modality, but rather can be a more general consequence of acquiring a formal linguistic system.
Abstract: Hearing subjects unfamiliar with American Sign Language and deaf native signers made triadic comparisons of movements of the hands and arms isolated from American Sign Language. Clustering and scaling of subjects' judgments revealed different psychological representations of movement form for deaf and hearing observers. Linguistically relevant dimensions acquired modified salience for users of a visual-gestural language. The data indicate that the modification of natural perceptual categories after language acquisition is not bound to a particular transmission modality, but rather can be a more general consequence of acquiring a formal linguistic system.

Journal ArticleDOI
TL;DR: The present tutorial describes an approach to interpreting a ‘3-view’ drawing for the construction of its ‘3-D’ representation.