Author

김상혁 (Kim Sang Hyeog)

Bio: 김상혁 (Kim Sang Hyeog) is an academic researcher. The author has an h-index of 1 and has co-authored 1 publication, which has received 586 citations.

Papers

Cited by
Book
01 Jan 2005
TL;DR: This book proposes a 'linking algorithm' that relates syntactic and semantic representations to each other in both simple and complex sentences, with discourse-pragmatics playing a role in the linking.
Abstract: Language is a system of communication in which grammatical structures function to express meaning in context. While all languages can achieve the same basic communicative ends, they each use different means to achieve them, particularly in the divergent ways that syntax, semantics and pragmatics interact across languages. This book looks in detail at how structure, meaning, and communicative function interact in human languages. Working within the framework of Role and Reference Grammar (RRG), Van Valin proposes a set of rules, called the 'linking algorithm', which relates syntactic and semantic representations to each other, with discourse-pragmatics playing a role in the linking. Using this model, he discusses the full range of grammatical phenomena, including the structures of simple and complex sentences, verb and argument structure, voice, reflexivization and extraction restrictions. Clearly written and comprehensive, this book will be welcomed by all those working on the interface between syntax, semantics and pragmatics.

575 citations
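The abstract above describes RRG's 'linking algorithm' only in prose. As a rough illustration of what relating semantic and syntactic representations involves, here is a minimal Python sketch in which semantic macroroles (actor, undergoer) are mapped onto syntactic positions, with voice modulating the mapping. The function name, role labels, and voice treatment are simplifying assumptions for illustration, not Van Valin's actual algorithm.

```python
# Toy sketch of syntax-semantics linking: map semantic macroroles onto
# syntactic positions, with voice modulating the mapping. Illustrative
# only; the data structures and rules are assumptions, not RRG's actual
# linking algorithm.

def link_semantics_to_syntax(actor, undergoer, voice="active"):
    """Return a crude mapping from macroroles to grammatical relations."""
    if voice == "active":
        return {"subject": actor, "object": undergoer}
    if voice == "passive":
        # Passive promotes the undergoer; the actor surfaces as an adjunct.
        return {"subject": undergoer, "adjunct": actor}
    raise ValueError(f"unsupported voice: {voice}")

print(link_semantics_to_syntax("Kim", "the ball"))
# {'subject': 'Kim', 'object': 'the ball'}
print(link_semantics_to_syntax("Kim", "the ball", voice="passive"))
# {'subject': 'the ball', 'adjunct': 'Kim'}
```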

BookDOI
01 Jan 2001
TL;DR: This handbook surveys contemporary syntactic theory, with chapters organized around derivation versus representation, movement, argument structure and phrase structure (including nonconfigurationality), functional projections and the structure of DPs, the interface with interpretation, and the external evaluation of syntax.
Abstract: Contributors. Introduction. Part I: Derivation Versus Representation: 1. Explaining Morphosyntactic Competition: Joan Bresnan (Stanford University). 2. Economy Conditions in Syntax: Chris Collins (Cornell University). 3. Derivation and Representation in Modern Transformational Syntax: Howard Lasnik (University of Connecticut). 4. Relativized Minimality Effects: Luigi Rizzi (Université de Genève). Part II: Movement: 5. Head Movement: Ian Roberts (University of Stuttgart). 6. Object Shift and Scrambling: Höskuldur Thráinsson (University of Iceland). 7. Wh-in-situ Languages: Akira Watanabe (University of Tokyo). 8. A-Movements: Mark Baltin (New York University). Part III: Argument Structure and Phrase Structure: 9. Thematic Relations in Syntax: Jeffrey S. Gruber (independent scholar). 10. Predication: John Bowers (Cornell University). 11. Case: Hiroyuki Ura. 12. Phrase Structure: Naoki Fukui (University of California). 13. The Natures of Nonconfigurationality: Mark C. Baker (McGill University). 14. What VP Ellipsis Can Do, and What it Can't, but not Why: Kyle Johnson (University of Massachusetts at Amherst). Part IV: Functional Projections: 15. Agreement Projections: Adriana Belletti (Università di Siena). 16. Sentential Negation: Raffaella Zanuttini (Georgetown University). 17. The DP Hypothesis: Identifying Clausal Properties in the Nominal Domain: Judy B. Bernstein (Syracuse University). 18. The Structure of DPs: Some Principles, Parameters and Problems: Giuseppe Longobardi (University of Trieste). Part V: Interface With Interpretation: 19. The Syntax of Scope: Anna Szabolcsi (New York University). 20. Deconstructing Binding: Eric Reuland and Martin Everaert (both Utrecht Institute of Linguistics). 21. Syntactic Reconstruction Effects: Andrew Barss (University of Arizona). Part VI: External Evaluation of Syntax: 22. Syntactic Change: Anthony S. Kroch (University of Pennsylvania). 23. Setting Syntactic Parameters: Janet Dean Fodor (City University of New York). Bibliography. Index.

568 citations

Journal ArticleDOI
TL;DR: A neurocognitive model of online comprehension that accounts for cross-linguistic unity and diversity in the processing of core constituents (verbs and arguments) and can derive the appearance of similar neurophysiological and neuroanatomical processing correlates in seemingly disparate structures in different languages.
Abstract: Real-time language comprehension is a principal cognitive ability and thereby relates to central properties of the human cognitive architecture. Yet how do the presumably universal cognitive and neural substrates of language processing relate to the astounding diversity of human languages (over 5,000)? The authors present a neurocognitive model of online comprehension, the extended argument dependency model (eADM), that accounts for cross-linguistic unity and diversity in the processing of core constituents (verbs and arguments). The eADM postulates that core constituent processing proceeds in three hierarchically organized phases: (1) constituent structure building without relational interpretation, (2) argument role assignment via a restricted set of cross-linguistically motivated information types (e.g., case, animacy), and (3) completion of argument interpretation using information from further domains (e.g., discourse context, plausibility). This basic architecture is assumed to be universal, with cross-linguistic variation deriving primarily from the information types applied in Phase 2 of comprehension. This conception can derive the appearance of similar neurophysiological and neuroanatomical processing correlates in seemingly disparate structures in different languages and, conversely, of cross-linguistic differences in the processing of similar sentence structures.

389 citations
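As a schematic illustration of the three-phase architecture the eADM abstract describes, the following Python sketch runs a toy pipeline: phase 1 builds constituents without relational interpretation, phase 2 assigns argument roles from a small cue set (here just morphological case), and phase 3 stubs in the integration of discourse context. The cue table, role labels, and example sentence are assumptions made for illustration, not the authors' implementation.

```python
# Schematic three-phase pipeline loosely following the eADM's division of
# labour. All data structures and the case-based cue rule are illustrative
# assumptions, not the model's actual mechanics.

def phase1_build_structure(words):
    # Phase 1: constituent structure only; no "who did what to whom" yet.
    return [{"form": w} for w in words]

def phase2_assign_roles(constituents, case_cues):
    # Phase 2: assign argument roles from a restricted cue set (here: case).
    for c in constituents:
        case = case_cues.get(c["form"])
        if case is None:
            c["role"] = "predicate"      # no case cue: treat as the verb
        elif case == "nominative":
            c["role"] = "actor"
        else:
            c["role"] = "undergoer"
    return constituents

def phase3_interpret(constituents, discourse_context=None):
    # Phase 3: integrate discourse context and plausibility (stubbed here).
    return {"arguments": constituents, "context": discourse_context}

words = ["der Mann", "sieht", "den Hund"]   # German: "the man sees the dog"
cues = {"der Mann": "nominative", "den Hund": "accusative"}
print(phase3_interpret(phase2_assign_roles(phase1_build_structure(words), cues)))
```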

Journal ArticleDOI
TL;DR: It is argued that morphological case realizes abstract Case features in a postsyntactic morphology, according to the Elsewhere Condition, and proposals that case and agreement are purely morphological phenomena are critiqued.
Abstract: This article explores the relationship between abstract Case and morphological case. I argue that abstract Case features are determined syntactically and realized in a postsyntactic morphological component. This morphological realization of abstract Case features is governed by the Elsewhere Condition (Anderson 1969, Kiparsky 1973, Halle and Marantz 1993, Halle 1997), resulting in an imperfect relationship between syntax and morphology, but one that is as faithful as possible given the morphological resources of the language. The data used in the argumentation come primarily from ergative languages. I identify a class of prima facie ergative-absolutive languages in which absolutive (that is, a case that groups together intransitive subjects and transitive objects) does not exist, either as an abstract Case or as a morphological case. Instead, the "absolutive" is the default morphological realization of abstract Case features, used when no realization of the specific Case feature is available. This morphological default is inserted for both nominative Case on the intransitive subject and accusative Case on the transitive object. The situation is thus entirely parallel to that …

294 citations
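The realization logic described in this abstract (a specific exponent wins where one exists; otherwise an underspecified default spells out the remaining Case features, yielding an apparent "absolutive") can be pictured with a small sketch. The vocabulary items and exponents below are invented for illustration and are not drawn from the paper or from any particular language.

```python
# Toy "elsewhere"-style spell-out: vocabulary items are tried from most to
# least specific, and the default (underspecified) item applies only when
# nothing more specific matches. Items and exponents are invented for
# illustration.

# (case feature it matches, exponent); None matches anything (the default).
VOCABULARY = [
    ("ergative", "-erg"),   # specific exponent for ergative Case
    (None, "-0"),           # elsewhere item: the apparent "absolutive"
]

def realize(case_feature):
    """Return the exponent of the most specific matching vocabulary item."""
    for match, exponent in VOCABULARY:
        if match is None or match == case_feature:
            return exponent
    raise RuntimeError("no vocabulary item matched")

# Nominative (intransitive subject) and accusative (transitive object) both
# fall through to the default, so they surface identically as "absolutive".
for feature in ["ergative", "nominative", "accusative"]:
    print(feature, "->", realize(feature))
```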

Journal ArticleDOI
TL;DR: This paper attempts to articulate the essential nature of the notion ‘root’ in the morphosyntax, and argues that roots must be individuated purely abstractly, as independent indices on the √ node in the syntactic computation that serves as the linkage between a particular set of spell-out instructions and a particular set of interpretive instructions.
Abstract: This paper attempts to articulate the essential nature of the notion ‘root’ in the morphosyntax. Adopting a realizational (Late Insertion) view of the morphosyntactic model, the questions of whether roots are phonologically individuated, semantically individuated, or not individuated at all in the syntactic component are addressed in turn. It is argued that roots cannot be phonologically identified, since there are suppletive roots, and they cannot be semantically identified, since there are roots with highly variable semantic content, analogous to ‘semantic suppletion’. And yet, they must be individuated in the syntax, since without such individuation, suppletive competition would be impossible. Roots must therefore be individuated purely abstractly, as independent indices on the √ node in the syntactic computation that serves as the linkage between a particular set of spell-out instructions and a particular set of interpretive instructions. It is further argued that the syntactic √ node behaves in a syntactically unexceptional way, merging with complement phrases and projecting a √P. The correct formulation of locality restrictions on idiosyncratic phonological and semantic interpretations is also discussed.

250 citations
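To make the abstract's central proposal concrete, here is a minimal data-structure sketch: a root is just an abstract index, and that index links a table of spell-out (phonological) instructions to a table of interpretive (semantic) instructions, either of which may be context-sensitive (suppletion). The index, the contexts, and the entries are illustrative assumptions, not examples taken from the paper.

```python
# A root individuated only by an abstract index: the index keys both a
# spell-out table and an interpretation table, and spell-out may be
# context-sensitive (suppletion). Index, contexts, and entries are
# illustrative assumptions.

SPELL_OUT = {
    # A suppletive root: its phonology depends on the morphosyntactic context.
    (309, "past"): "went",
    (309, None): "go",
}

INTERPRET = {
    (309, None): "motion away from a reference point",
}

def spell_out(root_index, context=None):
    # Prefer the context-specific entry; otherwise fall back to the default.
    return SPELL_OUT.get((root_index, context), SPELL_OUT[(root_index, None)])

def interpret(root_index, context=None):
    return INTERPRET.get((root_index, context), INTERPRET[(root_index, None)])

print(spell_out(309), "/", spell_out(309, "past"))   # go / went
print(interpret(309))                                # motion away from ...
```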