scispace - formally typeset

Head (linguistics)

About: Head (linguistics) is a research topic. Over its lifetime, 2,540 publications have been published within this topic, receiving 29,023 citations. The topic is also known as: nucleus.


Papers
Book Chapter · 03 Apr 2016
TL;DR: This paper employs discriminative large-margin and sequence modeling with pivot features for issue sentence classification and issue phrase boundary extraction, and demonstrates the effectiveness of the proposed approach on real-world Amazon.com reviews.
Abstract: Subjective expression extraction is a central problem in fine-grained sentiment analysis. Most existing works focus on generic subjective expression extraction as opposed to aspect-specific opinion phrase extraction. Given the ever-growing product reviews domain, extracting aspect-specific opinion phrases is important as it yields the key product issues that are often mentioned via phrases (e.g., “signal fades very quickly,” “had to flash the firmware often”). In this paper, we solve the problem using a combination of generative and discriminative modeling. The generative model performs first-level processing facilitating (1) discovery of potential head aspects containing issues, (2) generation of a labeled dataset of issue phrases, and (3) feeding of latent semantic features to subsequent discriminative modeling. We then employ discriminative large-margin and sequence modeling with pivot features for issue sentence classification and issue phrase boundary extraction. Experimental results using real-world reviews from Amazon.com demonstrate the effectiveness of the proposed approach.
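The phrase boundary extraction step described above amounts to sequence labeling over tokens. The sketch below shows how BIO-style tags can be decoded into opinion phrases; the `B-ISSUE`/`I-ISSUE`/`O` tag names and the hand-written example tags are illustrative assumptions, not the authors' actual tag set or model output.

```python
# Minimal sketch of decoding BIO-style tags into issue phrases. A sequence
# model (e.g. a CRF) would predict the tags; here they are written by hand.
def extract_phrases(tokens, tags):
    """Collect token spans labeled B-ISSUE / I-ISSUE into phrases."""
    phrases, current = [], []
    for tok, tag in zip(tokens, tags):
        if tag == "B-ISSUE":
            if current:                      # close any open phrase
                phrases.append(" ".join(current))
            current = [tok]                  # start a new phrase
        elif tag == "I-ISSUE" and current:
            current.append(tok)              # extend the open phrase
        else:
            if current:                      # an O tag closes the phrase
                phrases.append(" ".join(current))
            current = []
    if current:
        phrases.append(" ".join(current))
    return phrases

tokens = "signal fades very quickly after update".split()
tags = ["B-ISSUE", "I-ISSUE", "I-ISSUE", "I-ISSUE", "O", "O"]
# extract_phrases(tokens, tags) → ["signal fades very quickly"]
```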

5 citations

Journal Article
TL;DR: A statistical model based on three head words (the verb head, the preposition, and the noun head) is proposed for detecting the article errors that Japanese learners of English often make in English writing.
Abstract: In this paper, we propose a statistical model for detecting article errors, which Japanese learners of English often make in English writing. It is based on three head words: the verb head, the preposition, and the noun head. To overcome the data sparseness problem, we apply the backed-off estimate to it. Experiments show that its performance (F-measure = 0.70) is better than that of other methods. Apart from the performance, it has two advantages: (i) rules for detecting article errors are automatically generated as conditional probabilities once a corpus is given; (ii) its recall and precision rates are adjustable.
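The backed-off estimate mentioned in the abstract handles sparseness by falling back to a smaller conditioning context when the full (verb head, preposition, noun head) context is rarely observed. The sketch below is an illustrative simplification with toy counts and a hypothetical back-off chain, not the paper's exact estimation procedure.

```python
from collections import Counter

# Toy training tuples: (verb_head, preposition, noun_head, article).
# "NONE" stands for the zero article. These counts are invented for
# illustration only.
data = [
    ("look", "at", "picture", "the"),
    ("look", "at", "picture", "a"),
    ("look", "at", "picture", "the"),
    ("go", "to", "school", "NONE"),
]

full = Counter()        # (verb, prep, noun, article) counts
noun_only = Counter()   # (noun, article) counts — first back-off level
uni = Counter()         # bare article counts — last resort

for v, p, n, art in data:
    full[(v, p, n, art)] += 1
    noun_only[(n, art)] += 1
    uni[art] += 1

def backed_off_prob(v, p, n, art, threshold=1):
    """P(article | context), backing off when the fuller context is sparse."""
    ctx_count = sum(c for key, c in full.items() if key[:3] == (v, p, n))
    if ctx_count > threshold:
        return full[(v, p, n, art)] / ctx_count
    n_count = sum(c for key, c in noun_only.items() if key[0] == n)
    if n_count > 0:                      # back off to the noun head alone
        return noun_only[(n, art)] / n_count
    return uni[art] / sum(uni.values())  # back off to the unigram estimate
```

With these counts, `backed_off_prob("look", "at", "picture", "the")` uses the full context (seen 3 times), while `backed_off_prob("go", "to", "school", "NONE")` backs off to the noun-only counts because the full context was seen only once.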

5 citations

Journal Article · 04 Mar 2021 · PLOS ONE
TL;DR: This paper investigated the temporal evolution of noun and adjective representations during speech planning and found that nouns were generally more decodable than adjectives, suggesting that noun representations were stronger and/or more consistent across trials than those of adjectives.
Abstract: In language, stored semantic representations of lexical items combine into an infinitude of complex expressions. While the neuroscience of composition has begun to mature, we do not yet understand how the stored representations evolve and morph during composition. New decoding techniques allow us to crack open this very hard question: we can train a model to recognize a representation in one context or time-point and assess its accuracy in another. We combined the decoding approach with magnetoencephalography recorded during a picture naming task to investigate the temporal evolution of noun and adjective representations during speech planning. We tracked semantic representations as they combined into simple two-word phrases, using single words and two-word lists as non-combinatory controls. We found that nouns were generally more decodable than adjectives, suggesting that noun representations were stronger and/or more consistent across trials than those of adjectives. When training and testing across contexts and times, the representations of isolated nouns were recoverable when those nouns were embedded in phrases, but not so if they were embedded in lists. Adjective representations did not show a similar consistency across isolated and phrasal contexts. Noun representations in phrases also sustained over time in a way that was not observed for any other pairing of word class and context. These findings offer a new window into the temporal evolution and context sensitivity of word representations during composition, revealing a clear asymmetry between adjectives and nouns. The impact of phrasal contexts on the decodability of nouns may be due to the nouns' status as head of phrase, an intriguing hypothesis for future research.
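The core decoding move in the abstract — train a classifier on responses from one context and test it on another — can be sketched with synthetic data. Everything below is an assumption-laden stand-in: random vectors play the role of MEG sensor patterns, and a simple nearest-centroid decoder replaces whatever classifier the study actually used.

```python
import numpy as np

# Synthetic stand-in for cross-context decoding, not the study's pipeline.
rng = np.random.default_rng(0)

n_trials, n_sensors = 100, 20
labels = rng.integers(0, 2, size=n_trials)   # which of two nouns was named
pattern = rng.normal(size=(2, n_sensors))    # one sensor pattern per noun

# Same underlying patterns in both contexts, independent noise samples.
X_isolated = pattern[labels] + 0.5 * rng.normal(size=(n_trials, n_sensors))
X_phrase = pattern[labels] + 0.5 * rng.normal(size=(n_trials, n_sensors))

# "Train" a nearest-centroid decoder in the isolated-word context...
centroids = np.stack([X_isolated[labels == k].mean(axis=0) for k in (0, 1)])

# ...and assess its accuracy in the phrasal context, as in the
# train-here / test-there generalization analyses the abstract describes.
dists = np.linalg.norm(X_phrase[:, None, :] - centroids[None, :, :], axis=2)
cross_acc = (dists.argmin(axis=1) == labels).mean()
```

Because the phrasal data here share the isolated-context patterns by construction, the decoder generalizes; in the study, the presence or absence of such generalization is the finding itself.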

5 citations

01 Jan 2014
TL;DR: This dissertation provides novel evidence that the atoms of syntax are smaller than morphophonological words, which leads to the conclusion that words are built out of syntactic objects and, at least in part, by syntactic mechanisms.
Author(s): Harizanov, Boris | Advisor(s): Chung, Sandra
Abstract: If both words and phrases are internally complex and can be decomposed into hierarchically organized constituents, what is the relation between the syntactically motivated constituency of phrases and the morphophonologically motivated constituency of words? In particular, is the correspondence between syntactic atoms and morphophonological words one-to-one or, in other words, does syntax only manipulate objects that are as small as words? These questions have generated a long line of productive research that has identified various mismatches between syntax and morphophonology: e.g. while some syntactic atoms are realized as autonomous morphophonological words, others are realized as subparts of words. Such results have, in turn, motivated approaches to word construction that are syntactic in nature. In this dissertation I provide novel evidence that the atoms of syntax are smaller than morphophonological words, which leads to the conclusion that words are built out of syntactic objects and, at least in part, by syntactic mechanisms. As far as the cases investigated here are concerned, what gives words their distinctive character and causes them to behave differently from phrases with respect to morphophonology is the application of Morphological Merger. Specifically, syntactically independent objects become the constituent parts of morphophonological words as the result of Morphological Merger, an operation that produces complex heads as part of the mapping from syntax to morphophonology. The evidence I provide in this dissertation allows a particularly direct diagnosis of the syntactic independence of various subconstituents of morphophonological words. More specifically, it involves, for example, the interaction of subwords with syntactic operations (like movement), quantifier stranding, various kinds of binding, and thematic interpretation.
Furthermore, while much previous work on complex word formation has centered on words constructed by the combination of a head with its complement (e.g. "incorporation") or with the head of its complement (e.g. "head movement"), this dissertation focuses on a less studied correspondence between syntax and morphophonology: words constructed out of a head and its specifier. The particular view of the syntax-morphophonology interface espoused in this dissertation is developed on the basis of case studies from Bulgarian, a South Slavic language. As a result, a major concern throughout is the description and analysis of a number of important phenomena attested in Bulgarian: cliticization and clitic doubling, deverbal nominalization, and denominal adjectivization, among others. This dissertation provides a unified understanding of these phenomena to the extent that they all involve the syntactic construction of morphophonological words, which are produced by a mapping procedure that involves the application of Morphological Merger.

5 citations


Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2022         2
2021        68
2020        90
2019        86
2018        90
2017        90