Topic

Antecedent (grammar)

About: Antecedent (grammar) is a research topic. Over its lifetime, 1,392 publications have appeared on this topic, receiving 41,824 citations.


Papers
Book Chapter
01 Oct 2017
TL;DR: The interaction between preposition stranding, case morphology, and sluicing (ellipsis in constituent questions, as in "Someone left, but I don't know who") in the Bulgarian of at least some speakers strongly supports an account of sluicing under which (a) the intuitively missing part of the question is syntactically represented, and (b) the missing part of the sentence is elided under semantic rather than syntactic identity with the antecedent.
Abstract: The paper shows that the interaction between preposition stranding, case morphology, and sluicing in the Bulgarian of – at least – some speakers strongly supports an account of sluicing under which (a) the intuitively missing part of the question is syntactically represented, (b) the missing part of the sentence is elided under semantic rather than syntactic identity with the antecedent, and (c) the pronounced wh-phrase has to fit in a specific sense discussed in the paper into the antecedent. Assumptions (a) and (b) constitute Merchant’s approach to sluicing. As pointed out by Lasnik, (a) and (b) by themselves do not derive important, well-established, central properties of sluicing. Assumption (c) is intended to fix this gap in Merchant’s account. The conjunction of (a), (b), and (c) makes novel predictions not shared by competing accounts of sluicing like a Lasnik-style syntactic identity account or a Culicover-and-Jackendoff-style account with no syntax at the ellipsis site. The Bulgarian data presented here show that the specific expectations of the present account are borne out, giving it the empirical edge over its competitors.

12 citations

Journal Article
TL;DR: A critical examination of Dik’s account of discourse anaphora within the framework of Functional Grammar, showing that his description of the underlying structure of anaphors, which includes both the referential index of their actual referent/antecedent and a variable specifying the latter's entity-order, does not allow for the necessary flexibility and dynamic character of anaphor use and interpretation.
Abstract: This article is a critical examination of Dik’s (1997b: ch. 10) account of discourse anaphora, within the framework of the theory of Functional Grammar (but it highlights features of anaphora theory which hold more generally). I show first that Dik’s definitions of the phenomenon involve two contradictory conceptions of this discourse procedure (the anaphor refers to a mental representation of its referent within a mental model of the ongoing discourse, yet at the same time needs first to connect up with a segment of co-text - its linguistic antecedent); second, that Dik’s account of the relationship between given (pronominal) anaphor types and the “entity-order” of their potential referents is both too rigid and too narrow; and third, that his description of the underlying structure of anaphors, which includes both the referential index of their actual referent/antecedent and a variable specifying the latter’s entity-order, does not allow for the necessary flexibility and dynamic character of anaphor use and interpretation. A discursively more realistic account of discourse anaphora needs to specify the necessary interaction between ‘bottom-up’ factors of these kinds, on the one hand, and ‘top-down’ relationships involving the wider discourse context, on the other. This is what I briefly outline at the end of the article.

12 citations

Posted Content
TL;DR: In this paper, a reinforcement learning agent is used to learn a policy for selecting antecedents in a sequential manner, so that useful information provided by earlier predicted antecedents can be utilized when making later coreference decisions.
Abstract: Deep neural network models for Chinese zero pronoun resolution learn semantic information for zero pronoun and candidate antecedents, but tend to be short-sighted---they often make local decisions. They typically predict coreference chains between the zero pronoun and one single candidate antecedent one link at a time, while overlooking their long-term influence on future decisions. Ideally, modeling useful information of preceding potential antecedents is critical when later predicting zero pronoun-candidate antecedent pairs. In this study, we show how to integrate local and global decision-making by exploiting deep reinforcement learning models. With the help of the reinforcement learning agent, our model learns the policy of selecting antecedents in a sequential manner, where useful information provided by earlier predicted antecedents could be utilized for making later coreference decisions. Experimental results on OntoNotes 5.0 dataset show that our technique surpasses the state-of-the-art models.

12 citations
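To make the sequential-selection idea concrete, here is a minimal, self-contained Python sketch (an illustration under assumed names such as score and resolve_document, not the authors' architecture): a softmax policy scores each zero pronoun against its candidate antecedents, and a running average of previously chosen antecedent vectors stands in for the global information that earlier decisions pass on to later ones; a plain linear scorer replaces the paper's deep network.

```python
# Sketch of sequential antecedent selection with a learned policy.
# All names and the linear scorer are hypothetical stand-ins.
import math
import random

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

def score(zp_vec, cand_vec, history_vec, w):
    # Linear score over concatenated features of the zero pronoun,
    # the candidate antecedent, and the decision history.
    feats = zp_vec + cand_vec + history_vec
    return sum(wi * fi for wi, fi in zip(w, feats))

def resolve_document(zero_pronouns, candidates, w, dim):
    """Pick one antecedent per zero pronoun, left to right."""
    history = [0.0] * dim   # summary of earlier selections
    chosen = []
    for t, zp in enumerate(zero_pronouns):
        probs = softmax([score(zp, c, history, w) for c in candidates])
        idx = random.choices(range(len(candidates)), probs)[0]  # sample action
        chosen.append(idx)
        sel = candidates[idx]
        # Running average: earlier decisions inform later ones.
        history = [(h * t + s) / (t + 1) for h, s in zip(history, sel)]
    return chosen

# Toy usage with random 4-dimensional "embeddings".
random.seed(0)
dim = 4
zps = [[random.random() for _ in range(dim)] for _ in range(3)]
cands = [[random.random() for _ in range(dim)] for _ in range(5)]
w = [random.random() for _ in range(3 * dim)]
print(resolve_document(zps, cands, w, dim))
```

In training, the sampled selections would receive a reward from the resulting coreference evaluation and the scorer's weights would be updated with a policy-gradient method such as REINFORCE; the sketch shows only the decision loop.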

01 Nov 1989
TL;DR: In this paper, a two-stage model of sentence comprehension for building a computer model of language is proposed: in the first stage, an intermediate level of representation called logical form is derived; in the second stage, that representation is updated with additional information (e.g., quantifier scoping).
Abstract: Several researchers in artificial intelligence have recognized the usefulness of a two-stage model of sentence comprehension for building a computer model of language. In the first stage, an intermediate level of representation called logical form is derived. During the second stage, logical form is updated with additional information (e.g., quantifier scoping). We introduce three constraints we consider necessary to make this model of language computationally feasible:

1. Logical form should compactly represent ambiguity.
2. Logical form should be initially computable from syntax and local (sentence-level) semantics. In particular, logical form should not be dependent on pragmatics, which requires inference and hence internal representation.
3. Further processing of logical form should only disambiguate or further specify logical form. Logical form has a meaning; any further processing must respect that meaning.

Within this framework, we have devised logical-form representations for pronouns, singular definite noun phrases, and singular indefinite noun phrases. For example, we represent a pronoun as a function of all of the variables corresponding to operators that can bind the pronoun. This representation allows us to indicate a meaning for the pronoun without deciding on the antecedent for the pronoun. Later, when we can determine the antecedent for the pronoun, we replace the pronoun function with the variable or function used to represent its antecedent. Like pronouns, definites are represented as functions. However, indefinites cannot initially be represented as a function in logical form. Initially, we represent an indefinite as an existentially quantified variable. Later, when more information is available about the meaning of a noun phrase, the initial representation is limited to indicate the intended meaning of that noun phrase. We demonstrate that these representations both model the appropriate linguistic behavior and satisfy our computational constraints. This work has been implemented and tested on a wide variety of examples.

12 citations
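The deferred-resolution mechanism described above can be illustrated directly. The Python sketch below (hypothetical names throughout, not the paper's implementation) keeps a pronoun as an unresolved function term over the variables that could bind it, and substitutes the antecedent's term once the antecedent is determined.

```python
# Sketch of deferring pronoun resolution in logical form: the pronoun
# stays a function of the in-scope variables until its antecedent is
# known. Class and function names here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class PronounFn:
    name: str
    scope_vars: tuple  # variables of operators that could bind the pronoun

def resolve(term, pronoun, antecedent):
    """Replace every occurrence of `pronoun` with the antecedent's term."""
    if term == pronoun:
        return antecedent
    if isinstance(term, tuple):  # compound terms as nested tuples
        return tuple(resolve(t, pronoun, antecedent) for t in term)
    return term

# "Every farmer who owns a donkey beats it": before resolution, 'it'
# is a function p(x, y) of the in-scope variables; afterwards it is
# replaced by y, the variable representing 'a donkey'.
x, y = Var("x"), Var("y")
it = PronounFn("p", (x, y))
lf = ("beats", x, it)
print(resolve(lf, it, y))   # ('beats', Var(name='x'), Var(name='y'))
```

Per the abstract, the same substitution step would apply to definites, which the paper also represents as functions; indefinites would instead start out as existentially quantified variables and be narrowed later.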

Book Chapter
21 Sep 2009
TL;DR: This paper studies the general meaning of the additive particle too and argues that besides its well-known presuppositional content, too also conveys information about the similarity between its host and the antecedent of its presupposition in the discourse.
Abstract: This paper studies the general meaning of the additive particle too. It is argued that besides its well-known presuppositional content, too also conveys information regarding the similarity between its host and the antecedent of its presupposition in the discourse. We couch our proposal in an argumentative framework. The proposal is then related to recent accounts of the obligatoriness of too.

12 citations
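For concreteness, the "well-known presuppositional content" mentioned above is standardly rendered as follows (a textbook anaphoric formulation, not this paper's own notation): with focus on Peter in the host sentence,

```latex
% "Peter came too" is defined only if some contextually salient
% antecedent x satisfies
\exists x \, [\, x \neq \mathit{peter} \;\wedge\; \mathit{came}(x) \,]
% and, if defined, the sentence is true iff
\mathit{came}(\mathit{peter})
```

The paper's further claim is that this presupposition is not the whole story: too additionally conveys that its host and the antecedent of the presupposition are similar.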


Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2022    2
2021    59
2020    52
2019    57
2018    63
2017    62