Better Word Representations with Recursive Neural Networks for Morphology
Citations
Cites background or methods from "Better Word Representations with Recursive Neural Networks for Morphology"
...These include WordSim-353 (Finkelstein et al., 2001), MC (Miller and Charles, 1991), RG (Rubenstein and Goodenough, 1965), SCWS (Huang et al., 2012), and RW (Luong et al., 2013)....
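All of these benchmarks are scored the same way: rank word pairs by the cosine similarity of their embeddings and compare that ranking against the human ratings with Spearman's rho. A minimal sketch of the evaluation, assuming a hypothetical `embeddings` dict of numpy vectors and a list of `(word1, word2, rating)` triples:

```python
import numpy as np
from scipy.stats import spearmanr

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def evaluate_similarity(embeddings, pairs):
    """Score a word-similarity benchmark such as WordSim-353 or RW.

    `embeddings`: dict mapping word -> numpy vector (illustrative input).
    `pairs`: iterable of (word1, word2, human_rating) triples.
    Returns Spearman's rho between model and human rankings,
    skipping pairs with out-of-vocabulary words.
    """
    model_scores, human_scores = [], []
    for w1, w2, rating in pairs:
        if w1 in embeddings and w2 in embeddings:
            model_scores.append(cosine(embeddings[w1], embeddings[w2]))
            human_scores.append(rating)
    rho, _ = spearmanr(model_scores, human_scores)
    return rho
```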
[...]
...We conduct experiments on the word analogy task of Mikolov et al. (2013a), a variety of word similarity tasks, as described in (Luong et al., 2013), and on the CoNLL-2003 shared benchmark dataset for NER (Tjong Kim Sang and De Meulder, 2003)....
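The analogy task of Mikolov et al. (2013a) is answered with vector arithmetic: for "a is to b as c is to ?", return the vocabulary word whose embedding is closest by cosine to b - a + c, excluding the three query words. A minimal sketch under the same hypothetical `embeddings` dict:

```python
import numpy as np

def analogy(embeddings, a, b, c):
    """Answer "a : b :: c : ?" by the vector-offset method of
    Mikolov et al. (2013a): nearest neighbour of b - a + c,
    with the three query words excluded from the candidates."""
    words = [w for w in embeddings if w not in (a, b, c)]
    matrix = np.stack([embeddings[w] for w in words]).astype(float)
    matrix /= np.linalg.norm(matrix, axis=1, keepdims=True)
    target = embeddings[b] - embeddings[a] + embeddings[c]
    target = target / np.linalg.norm(target)
    return words[int(np.argmax(matrix @ target))]

# Benchmark accuracy is the fraction of analogy questions for which
# the returned word matches the gold answer.
```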
[...]
...Table 2: Results on the word analogy task, given as percent accuracy....
[...]
Cites background from "Better Word Representations with Recursive Neural Networks for Morphology"
...For example, Luong, Socher, and Manning (2013) apply a recursive neural network over morpheme embeddings to obtain the embedding for a single word....
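That composition builds a word vector bottom-up from its morphemes: the running parent vector is combined with the next morpheme vector through a single shared layer, p = tanh(W[p; x] + b). A minimal sketch, assuming a pre-segmented word and illustrative parameter names:

```python
import numpy as np

def compose_word(morpheme_vecs, W, b):
    """Recursively build a word vector from morpheme vectors, in the
    spirit of Luong, Socher, and Manning (2013):
        p = tanh(W @ [p; x_morpheme] + b)
    `morpheme_vecs`: list of d-dimensional numpy vectors for the
    segmented morphemes (the paper obtains segmentations with the
    unsupervised Morfessor tool), e.g. "un", "fortunate", "ly".
    `W` has shape (d, 2d) and `b` shape (d,); names are illustrative.
    """
    parent = morpheme_vecs[0]
    for x in morpheme_vecs[1:]:
        parent = np.tanh(W @ np.concatenate([parent, x]) + b)
    return parent
```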
[...]
...A specific class of FNLMs leverages morphemic information by viewing a word as a function of its (learned) morpheme embeddings (Luong, Socher, and Manning 2013; Botha and Blunsom 2014; Qiu et al. 2014)....
[...]
...Unlike previous works that utilize subword information via morphemes (Botha and Blunsom 2014; Luong et al. 2013), our model does not require morphological tagging as a pre-processing step....
[...]
...Unlike previous works that utilize subword information via morphemes (Botha and Blunsom 2014; Luong, Socher, and Manning 2013), our model does not require morphological tagging as a pre-processing step....
[...]
...A specific class of FNLMs leverages morphemic information by viewing a word as a function of its (learned) morpheme embeddings (Luong et al. 2013; Botha and Blunsom 2014; Qiu et al. 2014)....
[...]
References
"Better Word Representations with Re..." refers background in this paper
...First, a WordNet synset of word1 is randomly selected, and we construct a set of candidates which connect to that synset through various relations, e.g., hypernyms, hyponyms, holonyms, meronyms, and attributes....
[...]
...To counter such problems, each word selected is required to have a non-zero number of synsets in WordNet (Miller, 1995)....
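Both steps are straightforward with NLTK's WordNet interface: filter out words with no synsets, then gather lemmas reachable from a randomly chosen synset through the listed relations. A minimal sketch; NLTK exposes several holonym/meronym accessors, and the ones used below are illustrative rather than the paper's exact recipe:

```python
import random
from nltk.corpus import wordnet as wn

def related_candidates(word):
    """Pick a random synset of `word` and collect lemmas of synsets
    connected to it through hypernym, hyponym, holonym, meronym,
    and attribute relations, as in the excerpt above."""
    synsets = wn.synsets(word)
    if not synsets:          # the non-zero-synset requirement
        return set()
    synset = random.choice(synsets)
    related = (synset.hypernyms() + synset.hyponyms()
               + synset.member_holonyms() + synset.part_meronyms()
               + synset.attributes())
    return {lemma.name() for s in related for lemma in s.lemmas()}
```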
[...]
"Better Word Representations with Re..." refers background or methods in this paper
...In our experiments, we make use of two publicly-available embeddings (50-dimensional) provided by Collobert et al. (2011) (denoted as C&W) and Huang et al. (2012) (referred to as HSMN)....
[...]
...This is particularly true in deep neural network models (Collobert et al., 2011), but it is also true in conventional feature-based models (Koo et al., 2008; Ratinov and Roth, 2009)....
[...]
...Such a ranking criterion influences the model to assign higher scores to valid ngrams than to invalid ones and has been demonstrated in (Collobert et al., 2011) to be both efficient and effective in learning word representations....
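The criterion referred to here is the pairwise hinge loss of Collobert et al. (2011): an observed n-gram should score at least a margin higher than the same n-gram with its centre word replaced by a random vocabulary word. A minimal sketch, with `score` standing in for whatever scoring network is used; all names are illustrative:

```python
import random

def ranking_loss(score, ngram, vocab, margin=1.0):
    """Hinge loss in the style of Collobert et al. (2011).
    `score`: function from an n-gram (tuple of words) to a real number.
    `vocab`: list of words used to corrupt the centre position.
    Returns max(0, margin - score(valid) + score(corrupted))."""
    mid = len(ngram) // 2
    corrupted = list(ngram)
    corrupted[mid] = random.choice(vocab)   # replace the centre word
    return max(0.0, margin - score(tuple(ngram)) + score(tuple(corrupted)))
```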
[...]
...Collobert et al. (2011) enhanced word vectors with additional character-level features such as capitalization but still cannot recover more detailed semantics for very rare or unseen words, which is the focus of this work....
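The capitalization feature mentioned here is a small discrete signal fed alongside the lowercased word's embedding. One common encoding, shown as an illustrative sketch rather than Collobert et al.'s exact scheme:

```python
def cap_feature(word):
    """Categorical capitalization feature: the word itself is
    lowercased for the embedding lookup, and its original casing
    is kept as a separate feature like the one returned here."""
    if word.isupper():
        return "ALLCAPS"
    if word[:1].isupper():
        return "INITCAP"
    if any(c.isupper() for c in word):
        return "HASCAP"
    return "LOWER"
```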
[...]