Topic

Semantic similarity

About: Semantic similarity is a research topic. Over its lifetime, 14,605 publications have been published within this topic, receiving 364,659 citations. The topic is also known as: semantic relatedness.
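The core notion behind this topic can be illustrated with a minimal sketch: one common way to quantify semantic similarity is the cosine similarity between word embedding vectors. The vectors below are made-up toy values for illustration, not output from any real embedding model.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional embeddings (illustrative values only).
vectors = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.8, 0.9, 0.2],
    "car": [0.1, 0.2, 0.9],
}

# Semantically related words get a higher score than unrelated ones.
print(cosine_similarity(vectors["cat"], vectors["dog"]))  # high (~0.99)
print(cosine_similarity(vectors["cat"], vectors["car"]))  # lower (~0.30)
```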


Papers
Journal ArticleDOI
Mike Nelson
TL;DR: The authors examined the semantic associations of words in the business lexical environment using a one-million-word corpus of spoken and written Business English. They found that words in the business environment not only exhibit semantic prosody (they regularly collocate with word groups that share semantic similarity) but also have prosodies unique to business, distinct from the prosodies those words generate in general English.

111 citations


Posted Content
TL;DR: A mathematical analysis of learning dynamics in deep linear networks yields a simple neural model that qualitatively recapitulates many diverse regularities underlying semantic development, while providing analytic insight into how the statistical structure of an environment can interact with nonlinear deep-learning dynamics to give rise to these regularities.
Abstract: An extensive body of empirical research has revealed remarkable regularities in the acquisition, organization, deployment, and neural representation of human semantic knowledge, thereby raising a fundamental conceptual question: what are the theoretical principles governing the ability of neural networks to acquire, organize, and deploy abstract knowledge by integrating across many individual experiences? We address this question by mathematically analyzing the nonlinear dynamics of learning in deep linear networks. We find exact solutions to this learning dynamics that yield a conceptual explanation for the prevalence of many disparate phenomena in semantic cognition, including the hierarchical differentiation of concepts through rapid developmental transitions, the ubiquity of semantic illusions between such transitions, the emergence of item typicality and category coherence as factors controlling the speed of semantic processing, changing patterns of inductive projection over development, and the conservation of semantic similarity in neural representations across species. Thus, surprisingly, our simple neural model qualitatively recapitulates many diverse regularities underlying semantic development, while providing analytic insight into how the statistical structure of an environment can interact with nonlinear deep learning dynamics to give rise to these regularities.

111 citations
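The analysis summarized above concerns gradient-descent learning in deep linear networks. A minimal numerical sketch of that setting is below; the specific matrix, network sizes, learning rate, and step count are assumptions chosen for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input-output map with nested structure: the rows are mutually
# orthogonal, and the broad distinctions carry larger singular values
# (row norms 2, 2, sqrt(2), sqrt(2)).
Sigma = np.array([
    [1.0,  1.0,  1.0,  1.0],
    [1.0,  1.0, -1.0, -1.0],
    [1.0, -1.0,  0.0,  0.0],
    [0.0,  0.0,  1.0, -1.0],
])

# Two-layer linear network y = W2 @ W1 @ x with small random initialization.
W1 = 0.01 * rng.standard_normal((4, 4))
W2 = 0.01 * rng.standard_normal((4, 4))

lr = 0.05
singular_value_trajectories = []
for step in range(400):
    # Full-batch gradient descent on squared error with identity inputs,
    # so the target linear map is Sigma itself.
    err = Sigma - W2 @ W1
    W1_new = W1 + lr * W2.T @ err
    W2 = W2 + lr * err @ W1.T
    W1 = W1_new
    # Track the singular values of the learned map over training.
    singular_value_trajectories.append(
        np.linalg.svd(W2 @ W1, compute_uv=False))

# Modes with larger singular values (broad distinctions) tend to rise toward
# their targets earlier than weaker ones, producing stage-like transitions.
```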

Proceedings ArticleDOI
04 Dec 2012
TL;DR: This paper proposes a novel unsupervised context-based approach to detecting emotion from text at the sentence level that is flexible enough to classify sentences beyond Ekman's model of six basic emotions.
Abstract: Emotion detection from text is a relatively new classification task. This paper proposes a novel unsupervised context-based approach to detecting emotion from text at the sentence level. The proposed methodology does not depend on any existing manually crafted affect lexicons such as WordNet-Affect, thereby rendering our model flexible enough to classify sentences beyond Ekman's model of six basic emotions. Our method computes an emotion vector for each potential affect-bearing word based on the semantic relatedness between words and various emotion concepts. The scores are then fine-tuned using the syntactic dependencies within the sentence structure. Extensive evaluation on various data sets shows that our framework is a more generic and practical solution to the emotion classification problem and yields significantly more accurate results than recent unsupervised approaches.

111 citations
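The core step described in the abstract, scoring each potential affect-bearing word by its semantic relatedness to a set of emotion concepts, can be sketched as follows. The embeddings and word lists here are hypothetical toy values; the paper derives relatedness from corpus-based measures, not hand-set vectors.

```python
import math

# Hypothetical toy embeddings (illustrative values only).
EMBEDDINGS = {
    "joy":     [0.9, 0.1, 0.0],
    "sadness": [0.1, 0.9, 0.1],
    "anger":   [0.0, 0.2, 0.9],
    "smile":   [0.8, 0.2, 0.1],
    "tears":   [0.2, 0.8, 0.2],
}

EMOTION_CONCEPTS = ["joy", "sadness", "anger"]

def relatedness(w1, w2):
    """Cosine similarity between two words' toy embeddings."""
    u, v = EMBEDDINGS[w1], EMBEDDINGS[w2]
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def emotion_vector(word):
    """One relatedness score per emotion concept for a candidate affect word."""
    return {e: relatedness(word, e) for e in EMOTION_CONCEPTS}

print(emotion_vector("smile"))  # highest score for "joy"
```

In the paper, these per-word scores are subsequently refined using syntactic dependencies within the sentence; that step is omitted here.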

Journal ArticleDOI
TL;DR: Number of features and number of contexts consistently facilitated word recognition, but the effects of semantic neighborhood density and number of associates were less robust; these results indicate that the effects of different semantic dimensions are selectively and adaptively modulated by task-specific demands.
Abstract: Evidence from large-scale studies (Pexman, Hargreaves, Siakaluk, Bodner, & Pope, 2008) suggests that semantic richness, a multidimensional construct reflecting the extent of variability in the information associated with a word's meaning, facilitates visual word recognition. Specifically, recognition is better for words that (1) have more semantic neighbors, (2) possess referents with more features, and (3) are associated with more contexts. The present study extends Pexman et al. (2008) by examining how two additional measures of semantic richness, number of senses and number of associates (Pexman, Hargreaves, Edwards, Henry, & Goodyear, 2007), influence lexical decision, speeded pronunciation, and semantic classification performance, after controlling for an array of lexical and semantic variables. We found that number of features and contexts consistently facilitated word recognition but that the effects of semantic neighborhood density and number of associates were less robust. Words with more senses also elicited faster lexical decisions but less accurate semantic classifications. These findings point to how the effects of different semantic dimensions are selectively and adaptively modulated by task-specific demands.

111 citations


Network Information
Related Topics (5)
Web page
50.3K papers, 975.1K citations
84% related
Graph (abstract data type)
69.9K papers, 1.2M citations
84% related
Unsupervised learning
22.7K papers, 1M citations
83% related
Feature vector
48.8K papers, 954.4K citations
83% related
Web service
57.6K papers, 989K citations
82% related
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    202
2022    522
2021    641
2020    837
2019    866
2018    787