Proceedings ArticleDOI
A word at a time: computing word relatedness using temporal semantic analysis
Kira Radinsky, Eugene Agichtein, Evgeniy Gabrilovich, Shaul Markovitch
pp. 337–346
TL;DR: This paper proposes a new semantic relatedness model, Temporal Semantic Analysis (TSA), which captures temporal information in word semantics by representing each word as a vector of concepts over a corpus of temporally-ordered documents.

Citations
Journal ArticleDOI
Improving Distributional Similarity with Lessons Learned from Word Embeddings
TL;DR: It is revealed that much of the performance gain of word embeddings is due to certain system design choices and hyperparameter optimizations rather than the embedding algorithms themselves, and that these modifications can be transferred to traditional distributional models, yielding similar gains.
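One of the transferable design choices that work highlights can be sketched for a count-based model: a PPMI word-context matrix with context-distribution smoothing and a shifted-PMI constant carried over from embedding training. The counts and hyperparameter values below are hypothetical illustrations, not the paper's actual setup.

```python
import numpy as np

def ppmi(counts, alpha=0.75, shift=1.0):
    """Positive PMI over a word-by-context count matrix, with two
    embedding-inspired hyperparameters: context-distribution
    smoothing (alpha) and PMI shifting (shift ~ k negative samples)."""
    total = counts.sum()
    p_w = counts.sum(axis=1) / total
    ctx = counts.sum(axis=0) ** alpha          # smoothed context counts
    p_c = ctx / ctx.sum()
    p_wc = counts / total
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(p_wc / np.outer(p_w, p_c)) - np.log(shift)
    return np.maximum(pmi, 0.0)

# Toy 2x2 count matrix: each word co-occurs with only one context.
counts = np.array([[10., 0.], [0., 10.]])
m = ppmi(counts)
```

With `alpha=1.0` and `shift=1.0` this reduces to plain PPMI; the smoothed and shifted variants are the knobs the cited work argues account for much of the embedding methods' advantage.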
Journal ArticleDOI
Multimodal distributional semantics
TL;DR: This work proposes a flexible architecture to integrate text- and image-based distributional information, and shows in a set of empirical tests that the integrated model is superior to the purely text-based approach, and it provides somewhat complementary semantic information with respect to the latter.
Proceedings ArticleDOI
Improving Vector Space Word Representations Using Multilingual Correlation
Manaal Faruqui, Chris Dyer
TL;DR: This paper argues that lexico-semantic content should additionally be invariant across languages and proposes a simple technique based on canonical correlation analysis (CCA) for incorporating multilingual evidence into vectors generated monolingually.
Proceedings ArticleDOI
Learning Gender-Neutral Word Embeddings
TL;DR: This article proposes a novel training procedure for learning gender-neutral word embeddings, which aims to preserve gender information in certain dimensions of word vectors while compelling other dimensions to be free of gender influence.
Proceedings Article
Question Answering Using Enhanced Lexical Semantic Models
TL;DR: This work focuses on improving question-answering performance using models built from lexical semantic resources, and shows that such systems can be consistently and significantly improved with rich lexical semantic information, regardless of the choice of learning algorithm.
References
Journal ArticleDOI
WordNet : an electronic lexical database
TL;DR: The lexical database is presented through chapters by contributors including Katherine J. Miller, covering nouns in WordNet, a semantic network of English verbs, and applications of WordNet such as building semantic concordances.
Journal ArticleDOI
Indexing by Latent Semantic Analysis
TL;DR: A new method for automatic indexing and retrieval is described that takes advantage of implicit higher-order structure in the association of terms with documents (“semantic structure”) in order to improve the detection of relevant documents on the basis of terms found in queries.
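The core of latent semantic analysis can be sketched with a truncated SVD of a toy term-document count matrix (the matrix and term list below are hypothetical): terms that share contexts end up close in the latent space even without identical document distributions, while unrelated terms stay orthogonal.

```python
import numpy as np

# Hypothetical term-document count matrix (rows: terms, cols: documents).
terms = ["car", "auto", "engine", "banana"]
X = np.array([
    [2., 0., 1., 0.],   # "car"
    [0., 2., 1., 0.],   # "auto"
    [1., 1., 2., 0.],   # "engine"
    [0., 0., 0., 3.],   # "banana" appears only in an unrelated document
])

# Truncated SVD: keep the k largest singular triplets as the
# "semantic structure" of the term-document associations.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]    # latent term representations

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

In this toy space `cos(term_vecs[0], term_vecs[1])` ("car" vs. "auto") is high because both load on the same latent dimension via "engine", while "car" vs. "banana" is near zero.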
Book
Modern Information Retrieval
TL;DR: The authors present a rigorous and complete textbook for a first course on information retrieval from a computer science (as opposed to a user-centred) perspective, providing an up-to-date, student-oriented treatment of the subject.
Proceedings Article
An Information-Theoretic Definition of Similarity
TL;DR: This work presents an information-theoretic definition of similarity that is applicable as long as there is a probabilistic model, and demonstrates how this definition can be used to measure similarity in a number of different domains.
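Lin's definition measures similarity as the ratio of the information shared by two objects to the information needed to describe each of them. A minimal sketch under a toy probabilistic model (the probabilities below are made up for illustration, e.g. taxonomy node frequencies):

```python
import math

def lin_similarity(p_a, p_b, p_common):
    """Information-theoretic similarity in the style of Lin (1998):
    twice the information content of the most specific description
    shared by A and B, divided by the summed information content
    of A and B. Returns a value in [0, 1]."""
    return 2 * math.log(p_common) / (math.log(p_a) + math.log(p_b))

# Hypothetical taxonomy probabilities: two concepts whose lowest
# common subsumer (e.g. "vehicle") has probability 0.05.
sim = lin_similarity(p_a=0.01, p_b=0.02, p_common=0.05)
```

The same ratio form applies outside taxonomies whenever a probabilistic model supplies the "commonality" and "description" terms, which is the generality the TL;DR refers to.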
Proceedings Article
Using dynamic time warping to find patterns in time series
Donald J. Berndt, James Clifford
TL;DR: Preliminary experiments with a dynamic programming approach to pattern detection in databases, based on the dynamic time warping technique used in the speech recognition field, are described.
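The dynamic-programming recurrence behind dynamic time warping can be sketched as follows: a minimal O(n·m) version without the refinements (e.g. warping-window constraints) used in the speech recognition literature.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping via dynamic programming: D[i][j] is the
    minimal cumulative cost of aligning a[:i] with b[:j], where each
    step may repeat an element of either sequence (stretching time)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # stretch b
                                 D[i, j - 1],      # stretch a
                                 D[i - 1, j - 1])  # advance both
    return float(D[n, m])

# A time-shifted copy of a pattern aligns at zero cost, whereas a
# point-by-point (Euclidean) comparison would penalize the shift.
print(dtw_distance([0, 0, 1, 2, 1, 0], [0, 1, 2, 1, 0, 0]))  # -> 0.0
```

This elasticity in the time axis is what lets the technique find similar patterns in time series that are locally stretched or shifted.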