Book

语义学引论 = Linguistic Semantics

01 Jan 2000
About: This book was published on 2000-01-01 and is currently open access. It has received 258 citations to date.
Citations
Journal Article
TL;DR: A method for measuring the semantic similarity of texts using a corpus-based measure of semantic word similarity and a normalized and modified version of the Longest Common Subsequence string matching algorithm is presented.
Abstract: We present a method for measuring the semantic similarity of texts using a corpus-based measure of semantic word similarity and a normalized and modified version of the Longest Common Subsequence (LCS) string matching algorithm. Existing methods for computing text similarity have focused mainly on either large documents or individual words. We focus on computing the similarity between two sentences or two short paragraphs. The proposed method can be exploited in a variety of applications involving textual knowledge representation and knowledge discovery. Evaluation results on two different data sets show that our method outperforms several competing methods.
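
To make the string-matching component concrete, the sketch below computes a normalized Longest Common Subsequence similarity in Python. The normalization shown (LCS length divided by the longer string's length) is one common choice and is not necessarily the authors' modified version; the corpus-based word-similarity component of their method is not reproduced here.

    # Minimal sketch of a normalized Longest Common Subsequence (LCS) similarity.
    # The normalization used here (LCS length over the longer string's length) is
    # one common choice; it is not necessarily the "modified" version in the paper.

    def lcs_length(a: str, b: str) -> int:
        """Classic dynamic-programming LCS length, O(len(a) * len(b))."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
        return dp[m][n]

    def normalized_lcs(a: str, b: str) -> float:
        """LCS length scaled to [0, 1] by the longer string's length."""
        if not a or not b:
            return 0.0
        return lcs_length(a, b) / max(len(a), len(b))

    if __name__ == "__main__":
        # Character-level LCS rewards near-matches such as morphological variants.
        print(normalized_lcs("endangerment", "endangered"))  # 0.75
        print(normalized_lcs("semantic", "syntactic"))       # about 0.56

A sentence-level score could then, for example, average each word's best normalized-LCS match against the other sentence's words, blending in corpus-based word similarity where the paper's method calls for it.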

519 citations

Book
01 Jan 2004
TL;DR: The book is divided into a philosophical part I and a practical part II; the authors critique many alternative views of semantics and, in part II, present their text-meaning representation (TMR) and demonstrate how it is used in language analysis.
Abstract: In this book, Nirenburg and Raskin present an important body of work in computational linguistics that they and their colleagues have been developing over the past 20 years. For a unifying perspective, they organize their assumptions, theories, and techniques around the theme of ontological semantics. Along the way, they critique many alternative views of semantics, which they distinguish from their own. Their analyses contribute to a much-needed debate about the history and future of computational linguistics, but to preserve some balance, teachers and students should keep a few of the alternatives on their reference shelf. The book is divided into two parts: a philosophical part I and a practical part II. The first part consists of an introductory chapter 1 and four chapters that survey important but controversial issues about linguistics, both theoretical and computational. In those chapters, the authors make a good case for their version of ontological semantics, but the alternatives are not treated in detail. In part II, the authors present their text-meaning representation (TMR) and demonstrate how it is used in language analysis. Any discussion of technical material must use some notation, and TMR is sufficiently flexible to illustrate a wide range of semantic-based methods that could be adapted to many other formalisms. For most readers, part II would be the more important. Chapter 1 is a good 25-page overview of computational linguistics with an emphasis on semantics. Students and novices, however, need examples, and none are given until chapter 6. The authors suggest that "a well-prepared and/or uninterested reader" skip the remainder of part I and go straight to chapter 6, which begins with an excellent five-page example. The authors follow that advice when they teach courses from this text. In Chapter 2, the authors present their "Prolegomena to the Philosophy of Linguistics." Their ideas are well taken, and some are as old as Socrates: Examine the assumptions, challenge conventional wisdom, and test conclusions against experience. The basis of their approach is what they call the four components of a scientific theory:
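
As a rough illustration of what a frame-style text-meaning representation looks like, the toy Python sketch below encodes one sentence as a concept with named slots. The concept and slot names are invented for illustration and do not reproduce Nirenburg and Raskin's actual TMR notation.

    # Toy frame-style meaning representation, for illustration only; the slot
    # names and structure are invented and are not the book's actual TMR syntax.

    from dataclasses import dataclass, field

    @dataclass
    class Frame:
        """A minimal frame: an ontological concept plus named slots."""
        concept: str
        slots: dict = field(default_factory=dict)

    # Hypothetical analysis of "The professor taught the seminar yesterday."
    tmr = Frame(
        concept="TEACH",
        slots={
            "agent": Frame("PROFESSOR"),
            "theme": Frame("SEMINAR"),
            "time": Frame("TIME", {"relation": "before-speech-time"}),
        },
    )

    print(tmr.slots["agent"].concept)  # PROFESSOR

In the book's framework such structures are grounded in an ontology; the sketch above only conveys the general shape of a slot-filler representation.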

408 citations


Cites background from "语义学引论 = Linguistic Semantics"

  • ...Practically any book or article on formal semantics has been devoted to a subset of this inventory (see Montague 1974; Dowty 1979; Dowty et al. 1981; Partee 1973, 1976; Hornstein 1984; Bach 1989; Chierchia and McConnell-Ginet 1990; Frawley 1992; Cann 1993; Chierchia 1995; Heim and Kratzer 1998)....


Journal Article
TL;DR: The authors developed a situative approach to explain the transfer of learning, illustrating it using a challenging-to-explain case from a Fostering Communities of Learners classroom, which involved a group of 5th graders who learned and then transferred a more sophisticated way of explaining species survival and endangerment despite participating in a unit in which many activities designed to foster transfer were short circuited.
Abstract: This article develops a situative approach to explaining the transfer of learning, illustrating it using a challenging-to-explain case from a Fostering Communities of Learners classroom. The case involved a group of 5th graders who learned and then transferred a more sophisticated way of explaining species survival and endangerment despite participating in a unit in which many activities designed to foster transfer were short circuited. The explanation included 2 kinds of analyses: (a) how students participated in the learning of relevant content, and (b) how learning contexts were framed interactionally. The content analysis took a situative perspective on commonly investigated transfer mechanisms such as quality of initial learning, engagement with multiple examples, comparison between examples, and formation of generalizations. There was strong evidence for a few of these mechanisms and weak or inconclusive evidence for others. The context analysis explored 2 aspects of the relatively new hypothesis th...

363 citations


Cites background from "语义学引论 = Linguistic Semantics"

  • ...In addition, when learning and transfer contexts are framed interactionally as ongoing activities rather than as temporally bounded events (see Frawley, 1992; and C. Goodwin, 2002, respectively, for linguistic and gestural examples), the chances increase that both are viewed as part of the same…...


Journal Article
TL;DR: This article presents a possible solution to the mind-brain correspondence problem by suggesting that complex psychological states such as emotion and cognition can be thought of as constructed events that can be causally reduced to a set of more basic, psychologically primitive ingredients that are more clearly respected by the brain.
Abstract: Psychological states such as thoughts and feelings are real. Brain states are real. The problem is that the two are not real in the same way, creating the mind-brain correspondence problem. In this article, I present a possible solution to this problem that involves two suggestions. First, complex psychological states such as emotion and cognition can be thought of as constructed events that can be causally reduced to a set of more basic, psychologically primitive ingredients that are more clearly respected by the brain. Second, complex psychological categories like emotion and cognition are the phenomena that require explanation in psychology, and, therefore, they cannot be abandoned by science. Describing the content and structure of these categories is a necessary and valuable scientific activity.

293 citations


Cites background from "语义学引论 = Linguistic Semantics"

  • ...A nominal kind is a category, denoted by a word, that is a combination of more fundamental properties (Frawley, 1992)....


Journal Article
TL;DR: In simulations, DevLex develops topographically organized representations for linguistic categories over time, models lexical confusion as a function of word density and semantic similarity, and shows age-of-acquisition effects in the course of learning a growing lexicon.
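
For readers unfamiliar with topographic organization, the sketch below trains a generic self-organizing map in Python with NumPy so that two artificial clusters of word vectors settle into separate regions of a 2-D grid. It only illustrates the general mechanism; it is not the DevLex architecture, and all parameters are arbitrary.

    # Generic self-organizing map (SOM) sketch illustrating how topographically
    # organized representations emerge; this is NOT the DevLex model itself.

    import numpy as np

    def train_som(data, grid_w=8, grid_h=8, epochs=50, lr=0.5, sigma=3.0, seed=0):
        """Return node grid coordinates and trained weights for a grid_w x grid_h map."""
        rng = np.random.default_rng(seed)
        coords = np.array([(x, y) for x in range(grid_w) for y in range(grid_h)], float)
        weights = rng.normal(size=(grid_w * grid_h, data.shape[1]))
        for epoch in range(epochs):
            decay = np.exp(-epoch / epochs)          # shrink learning rate and radius over time
            for v in data:
                bmu = np.argmin(np.linalg.norm(weights - v, axis=1))       # best-matching unit
                grid_dist = np.linalg.norm(coords - coords[bmu], axis=1)   # distance on the 2-D grid
                influence = np.exp(-grid_dist**2 / (2 * (sigma * decay) ** 2))
                weights += lr * decay * influence[:, None] * (v - weights)
        return coords, weights

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        # Two artificial "semantic clusters" of 20-dimensional word vectors.
        words = np.vstack([rng.normal(0, 1, (30, 20)), rng.normal(5, 1, (30, 20))])
        coords, weights = train_som(words)
        bmu_a = np.argmin(np.linalg.norm(weights - words[0], axis=1))
        bmu_b = np.argmin(np.linalg.norm(weights - words[-1], axis=1))
        print(coords[bmu_a], coords[bmu_b])  # the two clusters typically land far apart on the grid

With real lexical vectors in place of the artificial clusters, nearby map nodes come to respond to similar words, which is the sense in which such representations are "topographically organized."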

223 citations


Cites methods from "语义学引论 = Linguistic Semantics"

  • ...Harm extracted relevant semantic features from WordNet for nouns and verbs, but for adjectives he hand-coded the semantic features according to a taxonomy of features given by Frawley (1992)....

