
Showing papers in "Linguistics and Philosophy" in 1994


Journal ArticleDOI
Maria Bittner
TL;DR: A universal system for type-driven translation, dubbed Cross-Linguistic Semantics (XLS), is proposed by adding two further innovations: local type determination for gaps and a set of semantic filters.
Abstract: Rooth & Partee (1982) and Rooth (1985) have shown that the English-specific rule-by-rule system of PTQ can be factored out into function application plus two transformations for resolving type mismatch (type lifting and variable binding). Building on these insights, this article proposes a universal system for type-driven translation, by adding two more innovations: local type determination for gaps (generalizing Montague 1973) and a set of semantic filters (extending Cooper 1983). This system, dubbed Cross-Linguistic Semantics (XLS), is shown to account for various phenomena, including scope relations in English and Greenlandic Eskimo, internally headed relative clauses in Lakhota, serial verbs in Yoruba and VP ellipsis in English.
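
For orientation, the kind of type-mismatch resolution referred to here can be illustrated with the textbook Partee-Rooth lifting operation (an illustrative example, not Bittner's own formulation): an individual-denoting expression is raised to a generalized quantifier so that combination can proceed by function application.

    % type lifting: raise type e to type <<e,t>,t>
    \mathrm{lift}(x) \;=\; \lambda P_{\langle e,t\rangle}.\,P(x)
    % e.g. [[John]] = j : e  lifts to  \lambda P.\,P(j) : <<e,t>,t>,
    % which can then apply to a VP meaning of type <e,t>.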

153 citations


Journal ArticleDOI
Wlodek Zadrozny
TL;DR: It is shown that when compositional semantics is required to be “systematic” (that is, the meaning function cannot be arbitrary, but must belong to some class), it is possible to distinguish between compositional and noncompositional semantics.
Abstract: We prove a theorem stating that any semantics can be encoded as a compositional semantics, which means that, essentially, the standard definition of compositionality is formally vacuous. We then show that when compositional semantics is required to be “systematic” (that is, the meaning function cannot be arbitrary, but must belong to some class), it is possible to distinguish between compositional and noncompositional semantics. As a result, we believe that the paper clarifies the concept of compositionality and opens the possibility of making systematic formal comparisons of different systems of grammar.
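
The encoding result is usually stated along the following lines (a sketch of the standard formulation, which may differ in detail from the paper's own): given any meaning function m on a set of expressions closed under concatenation, one can solve for a function \mu satisfying

    \mu(s \cdot t) = \mu(s)(\mu(t))    % \mu is compositional (homomorphic)
    \mu(s)(s) = m(s)                   % the original meaning is recoverable from \mu(s)

The system of equations is solvable (for instance via the solution lemma of non-well-founded set theory), which is why compositionality by itself places no substantive constraint on the meaning function.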

117 citations


Journal ArticleDOI
TL;DR: This article discusses the dimensions of phonology, syntax, and semantics/pragmatics, each of which gives rise to a dimension-specific problem of compositionality.
Abstract: Through language, we are able to assign symbolic analyses to linguistic entities (physical objects and events) whose complexity has no intrinsic upper bound. Such symbolic analyses are abstract, since a single physical entity can support distinct analyses. Yet we have partial intuitive access to the properties of these analyses through their projections in different 'dimensions', including the widely recognized and studied dimensions of phonology, syntax, and semantics/pragmatics. Each of these dimensions gives rise to a dimension-specific problem of compositionality:

94 citations


Journal ArticleDOI
TL;DR: In this article, the author shows that the availability of what some authors have called the weak reading and the strong reading of donkey sentences with relative clauses is systematically related to monotonicity properties of the determiner.
Abstract: In this paper, I show that the availability of what some authors have called the weak reading and the strong reading of donkey sentences with relative clauses is systematically related to monotonicity properties of the determiner. The correlation is different from what has been observed in the literature in that it concerns not only right monotonicity, but also left monotonicity (persistence/antipersistence). I claim that the reading selected by a donkey sentence with a double monotone determiner is in fact the one that validates inference based on the left monotonicity of the determiner. This accounts for the lack of strong reading in donkey sentences with ↑MON↓ determiners, which have been neglected in the literature. I consider the relevance of other natural forms of inference as well, but also suggest how monotonicity inference might play a central role in the actual process of interpretation. The formal theory is couched in dynamic predicate logic with generalized quantifiers.
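
For orientation, the weak/strong contrast at issue, illustrated for "Every farmer who owns a donkey beats it" in plain first-order notation (the paper itself works in dynamic predicate logic with generalized quantifiers):

    % strong reading: every farmer beats every donkey he owns
    \forall x\,\forall y\,[(\mathit{farmer}(x) \wedge \mathit{donkey}(y) \wedge \mathit{own}(x,y)) \rightarrow \mathit{beat}(x,y)]
    % weak reading: every donkey-owning farmer beats at least one donkey he owns
    \forall x\,[(\mathit{farmer}(x) \wedge \exists y\,(\mathit{donkey}(y) \wedge \mathit{own}(x,y))) \rightarrow \exists y\,(\mathit{donkey}(y) \wedge \mathit{own}(x,y) \wedge \mathit{beat}(x,y))]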

88 citations



Journal ArticleDOI
TL;DR: This paper proposes a discourse grammar that characterizes discourse cohesion by means of a syntactic/semantic matching procedure recognizing parallel structures in discourse; this procedure yields the resolution of verb phrase anaphora as a side effect.
Abstract: We argue that an adequate treatment of verb phrase anaphora (VPA) must depart in two major respects from the standard approaches. First of all, VP anaphors cannot be resolved by simply identifying the anaphoric VP with an antecedent VP. The resolution process must establish a syntactic/semantic parallelism between larger units (clauses or discourse constituent units) that the VPs occur in. Secondly, discourse structure has a significant influence on the reference possibilities of VPA. This influence must be accounted for. We propose a treatment which meets these requirements. It builds on a discourse grammar which characterizes discourse cohesion by means of a syntactic/semantic matching procedure which recognizes parallel structures in discourse. It turns out that this independently motivated procedure yields the resolution of VPA as a side effect.
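
As a toy illustration of the parallelism idea (the clause representation and matching criterion below are invented for this sketch and are not the authors' discourse grammar), an elided VP is filled in by matching the whole anaphoric clause against a structurally parallel antecedent clause rather than by copying a bare VP:

    # Toy sketch: resolve the elided VP of "Mary did too" against "John left early"
    # by matching parallel clause structures (illustrative representation only).

    def parallel(c1, c2):
        """Crude structural parallelism check: the two clauses fill the same slots."""
        return set(c1) == set(c2)

    def resolve_vpa(antecedent, anaphor):
        """Fill the missing VP of the anaphoric clause via clause-level matching."""
        if anaphor.get('vp') is None and parallel(antecedent, anaphor):
            return {**anaphor, 'vp': antecedent['vp']}
        return anaphor

    source = {'subject': 'John', 'vp': ('leave', 'early')}
    target = {'subject': 'Mary', 'vp': None}          # "Mary did too"
    print(resolve_vpa(source, target))
    # {'subject': 'Mary', 'vp': ('leave', 'early')}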

67 citations


Journal ArticleDOI
TL;DR: The paper concentrates on sentence as opposed to text processing, and shows how dynamics can lead to novel parsing algorithms and to new possibilities for the formal description of syntactic constructions, in particular, for the description of non-constituent coordination.
Abstract: Dynamics is the formal study of systems involving states and transitions between states. A natural application of dynamics is to the study of language processing, where words or morphemes can be thought of as actions which perform transitions between states of the language processor. The paper concentrates on sentence as opposed to text processing, and shows how dynamics can lead to novel parsing algorithms and to new possibilities for the formal description of syntactic constructions, in particular, for the description of non-constituent coordination. The use of dynamics in algorithm development will be illustrated by presenting a particular parsing algorithm which was developed for a 'core' lexicalised grammar, Lexicalised Dependency Grammar (Milward 1992), which resembles both simplified HPSG (Pollard and Sag 1993) and dependency grammar as formalised by Gaifman (see Hays 1964). The algorithm is fully incremental, providing a semantic representation word by word. It also exhibits reduced non-determinism relative to other almost-incremental algorithms by using types instead of partial parse trees. These can be packed further using graph structuring (cf. Tomita 1985). Although dynamics specifies the states of a process and the possible mappings between states, it does not specify the control strategy (how the state space is traversed). Suitable languages for dynamics are thus both formal and declarative, and can be used to express linguistic generalisations. The final part of the paper regards the dynamics provided for the core lexicalised grammar as a grammar in its own right, Dynamic Dependency Grammar.
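
A minimal sketch of the "words as transitions between states" idea (the state representation below is invented for illustration and is much cruder than the types used in the paper): a state is just the list of categories still expected, and parsing folds the word transitions over the sentence strictly incrementally.

    # Words act as transitions between parser states; parsing is incremental.
    # A state here is the list of categories still needed to complete the sentence.

    def transition(state, word):
        lexicon = {
            'mary':  ('np', []),       # a proper name satisfies an NP expectation
            'john':  ('np', []),
            'likes': ('v', ['np']),    # a transitive verb then expects an object NP
        }
        cat, needs = lexicon[word]
        if state and state[0] == cat:
            return needs + state[1:]   # discharge the first expectation, add new ones
        raise ValueError(f'unexpected {word!r} in state {state}')

    state = ['np', 'v']                # initial expectation: subject NP, then verb
    for w in 'mary likes john'.split():
        state = transition(state, w)
        print(w, '->', state)          # an intermediate state after every word
    print('accepted' if not state else 'rejected')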

62 citations


Book ChapterDOI
TL;DR: In this paper, the author discusses some general questions involving mass terms in English and quantificational mass NPs like 'most gold' and 'little water', and the relation between them.
Abstract: In this article I discuss some general questions involving mass terms in English and quantificational mass NPs like most gold and little water.

54 citations





Journal ArticleDOI
TL;DR: Evidence is presented that the currently accepted approaches to conditionals are basically wrong about the semantic forms they attribute to 'if P, Q', and that a particular interpretation of the suppositional theory of 'if' is needed to account for 'only if' and a type of metalinguistic negation of 'Q if P'.
Abstract: A comprehensive theory of 'even if' needs to account for consequent-'entailing' 'even if's and in particular those of the 'if'-focused variety. This is where the theory of 'even if' ceases to be neutral between conditional theories. I have argued that 'if'-focused 'even if's, 'especially if' and 'only if' can only be accounted for through the suppositional theory of 'if'. Furthermore, a particular interpretation of this theory, the conditional assertion theory, is needed to account for 'only if' and a type of metalinguistic negation of 'Q if P'. We therefore have evidence that the currently accepted approaches to conditionals are basically wrong about the semantic forms they attribute to 'if P, Q'.
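
For readers unfamiliar with the conditional assertion idea, one standard (trivalent) way of spelling it out, given here only for orientation and not necessarily in the author's exact formulation: asserting 'Q if P' asserts Q only under the supposition that P, so nothing is asserted when P fails.

    P      Q      'Q if P'
    true   true   true    (Q is asserted and holds)
    true   false  false   (Q is asserted and fails)
    false  true   void    (nothing is asserted)
    false  false  void    (nothing is asserted)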

Journal ArticleDOI
TL;DR: The facts about such, then, indicate not just that such is a pro-adjective, but also that binding conditions apply broadly to pro-ADJs and pro-CNs, as well as to a wide range of pro-arguments.
Abstract: The facts about such, then, indicate not just that such is a pro-adjective, but also that binding conditions apply broadly to pro-ADJs and pro-CNs, as well as to a wide range of pro-arguments. If this is true, the CN binding process accomplished by rules (40) and (41) might better be expressed in a system that uses a Cooper (1979) store mechanism. In fact, Stump (p. 144) notes that this could easily be done. Meanings of the type of ∨Pn could be stored, just as NP meanings are, until an appropriate binding CN phrase was encountered. Binding conditions would simply require that a ∨Pn meaning not come out of storage until the derivation had emerged from its governing category. The behavior of the pro-adjective such suggests that an expression of any category, if it is legitimately translatable as a variable, may be a full-fledged proform; many principles and mechanisms described to account for the widely studied pronouns in fact apply to nonargument categories.

Journal ArticleDOI
TL;DR: The author aims to give a clear view of how the "by"-locution works, and examines the "simple act proposal", on which a "by"-statement reports that a certain relation obtains between two acts (e.g., giving a signal and producing an arm-wave).
Abstract: In this paper I shall try to give a clear view of how this vigorous and powerful form of speech works. Most past work on this question has given primacy to the act concept. When she signals by waving her arm, she gives a signal (an act) and she gives or produces an arm-wave (also an act), and it has commonly been thought that the "by"-statement reports that a certain relation obtains between these acts. I shall call this "the simple act proposal" regarding the "by"-locution. It says:

Journal ArticleDOI
TL;DR: The use of evolving algebra methods, in particular distributed evolving algebras, for specifying grammars for natural languages is considered, and a reconstruction of some classic grammar formalisms in directly dynamic terms is given.
Abstract: We consider the use of evolving algebra methods of specifying grammars for natural languages. We are especially interested in distributed evolving algebras. We provide the motivation for doing this, and we give a reconstruction of some classic grammar formalisms in directly dynamic terms. Finally, we consider some technical questions arising from the use of direct dynamism in grammar formalisms.
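
As a rough sketch of the abstract-state-machine flavour of an evolving algebra (a generic illustration, not one of the formalism reconstructions given in the paper): the state is a collection of locations, and a step fires every rule whose guard holds, applying the resulting updates in one synchronous move.

    # Generic evolving-algebra-style step (illustrative only).
    def step(state, rules):
        updates = {}
        for guard, update in rules:
            if guard(state):
                updates.update(update(state))   # collect updates from enabled rules
        new_state = dict(state)
        new_state.update(updates)               # apply them simultaneously
        return new_state

    # One toy grammar rule: shift the next input word onto a stack of recognised items.
    rules = [(
        lambda s: bool(s['input']),
        lambda s: {'input': s['input'][1:], 'stack': s['stack'] + [s['input'][0]]},
    )]

    state = {'input': ['the', 'dog', 'barks'], 'stack': []}
    while state['input']:
        state = step(state, rules)
    print(state['stack'])                       # ['the', 'dog', 'barks']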

Journal ArticleDOI
Mark Johnson1
TL;DR: This paper focuses on two widely-used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem and the parsing problem.
Abstract: A grammar is a formal device which both identifies a certain set of utterances as well-formed, and which also defines a transduction relation between these utterances and their linguistic representations. This paper focuses on two widely-used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem (the problem of determining if an utterance is in the language generated by a grammar) and the parsing problem (the problem of finding the analyses assigned by a grammar to an utterance). Although both approaches are 'constraint-based', one of them is based on a logical consequence relation, and the other is based on satisfiability. The main goal of this paper is to point out the different conceptual bases of these two ways of formalizing grammars, and discuss some of their properties.
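
A toy contrast between the two problems for a two-rule grammar (illustration only; it deliberately glosses over everything that distinguishes Definite Clause Grammars from Feature Structure Grammars in the paper). Grammar assumed: S -> NP VP, NP -> 'mary' | 'john', VP -> 'sleeps'.

    def recognize(words):
        """Recognition problem: is the utterance in the language of the grammar?"""
        return len(words) == 2 and words[0] in ('mary', 'john') and words[1] == 'sleeps'

    def parse(words):
        """Parsing problem: which analyses does the grammar assign to the utterance?"""
        if recognize(words):
            return [('S', ('NP', words[0]), ('VP', words[1]))]
        return []

    print(recognize(['mary', 'sleeps']))   # True
    print(parse(['mary', 'sleeps']))       # [('S', ('NP', 'mary'), ('VP', 'sleeps'))]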


Journal ArticleDOI
TL;DR: The author discusses a version of supposition theory that aims at producing analyses of sentences containing quantified terms, as articulated around 1400 by Paul of Venice, and as further developed by certain logicians such as de Soto and Celaya in the 1400's and early 1500's.
Abstract: This paper arose from an attempt to determine how the very late medieval supposition theorists treated anaphoric pronouns, pronouns whose significance is derivative from their antecedents. Modern researches into pronouns were stimulated in part by the problem of "donkey sentences" discussed by Geach (1962) in a section explaining what is wrong with medieval supposition theory. So there is some interest in seeing exactly what the medieval account comes to, especially if it turns out, as I suspect, to work as well as contemporary ones. Besides, finding a good analysis of pronouns has proved to be very difficult, and so we might possibly find some insight in a historically different kind of approach. I discuss a version of supposition theory that aims at producing analyses of sentences containing quantified terms, as articulated around 1400 by Paul of Venice, and as further developed by certain logicians such as de Soto and Celaya in the 1400's and early 1500's. Much of what I will say also applies indirectly to earlier versions of supposition theory (before

Journal ArticleDOI
TL;DR: The domain of feature structures possibly containing set values is known to form a "2/3 SFP domain" when equipped with an appropriate notion of subsumption: that is, for any finite set S of feature structures, there is a finite set M of minimal upper bounds of S such that any upper bound of S is approximated by a member of M. This article shows that this domain moreover satisfies the added condition of profiniteness.
Abstract: It is well-known that feature structures (Rounds and Kasper 1986) can be fruitfully viewed as forming a Scott domain (Moshier 1988). Once a linguistically motivated notion of “set value” in feature structures is countenanced, however, this is no longer possible inasmuch as unification of set values in general fails to yield a unique result. In Pollard and Moshier 1990 it was shown that, while falling short of forming a Scott domain, the set of feature structures possibly containing set values satisfies the weaker condition of forming a “2/3 SFP domain” when equipped with an appropriate notion of subsumption: that is, for any finite set S of feature structures, there is a finite set M of minimal upper bounds of S such that any upper bound of S is approximated by a member of M. Unfortunately, the 2/3 SFP domains are not as pleasant to work with as Scott domains since they are not closed under all the familiar domain constructions; and the question has remained open whether the feature structure domain satisfies the added condition of profiniteness. (The profinite ω-algebraic domains with least elements are a subclass of the 2/3 SFP domains which enjoy the pleasant property of being the largest full subcategory of ω-algebraic domains that is closed under the usual domain constructions.) In this paper we resolve this question in the affirmative.
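
A toy illustration of why set values spoil unique unification (invented flat feature structures; the paper's domain-theoretic treatment is far more general): unifying the set value {[CASE nom], [NUM pl]} with the singleton set value {[PER 3]} can identify [PER 3] with either member, yielding two incomparable minimal results.

    def unify_fs(f, g):
        """Unify two flat feature structures (dicts); return None on a clash."""
        out = dict(f)
        for k, v in g.items():
            if k in out and out[k] != v:
                return None
            out[k] = v
        return out

    def unify_sets(s1, s2):
        """Toy set-value unification: fold the single member of s2 into some member
        of s1; each consistent choice gives a distinct minimal result."""
        assert len(s2) == 1, 'toy version handles singleton s2 only'
        g = s2[0]
        results = []
        for i, f in enumerate(s1):
            u = unify_fs(f, g)
            if u is not None:
                results.append([u if j == i else h for j, h in enumerate(s1)])
        return results

    for r in unify_sets([{'CASE': 'nom'}, {'NUM': 'pl'}], [{'PER': 3}]):
        print(r)
    # [{'CASE': 'nom', 'PER': 3}, {'NUM': 'pl'}]
    # [{'CASE': 'nom'}, {'NUM': 'pl', 'PER': 3}]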