Universal Sentence Encoder
Citations
4,020 citations
Cites background or methods from "Universal Sentence Encoder"
...In this section, we compare SBERT to average GloVe embeddings, InferSent (Conneau et al., 2017), and Universal Sentence Encoder (Cer et al., 2018)....
[...]
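The average-GloVe baseline mentioned in the excerpt above can be sketched in a few lines. The toy vectors below are illustrative stand-ins for real GloVe vectors, which would normally be loaded from a pretrained file such as `glove.6B.300d.txt`.

```python
import numpy as np

# Toy stand-ins for pretrained GloVe vectors (real ones are e.g. 300-d,
# loaded from a GloVe release file); keys and values here are illustrative.
glove = {
    "the": np.array([0.1, 0.3, -0.2]),
    "cat": np.array([0.7, -0.1, 0.4]),
    "sat": np.array([0.2, 0.5, 0.1]),
}

def average_embedding(tokens, vectors, dim=3):
    """Baseline sentence embedding: the mean of the word vectors,
    skipping out-of-vocabulary tokens."""
    known = [vectors[t] for t in tokens if t in vectors]
    if not known:
        return np.zeros(dim)
    return np.mean(known, axis=0)

sent = average_embedding("the cat sat".split(), glove)
```

Despite its simplicity, this mean-of-word-vectors baseline is the one SBERT is compared against in the excerpt.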
...We fine-tune SBERT on NLI data, which creates sentence embeddings that significantly outperform other state-of-the-art sentence embedding methods like InferSent (Conneau et al., 2017) and Universal Sentence Encoder (Cer et al., 2018)....
[...]
...Universal Sentence Encoder (Cer et al., 2018) trains a transformer network and augments unsupervised learning with training on SNLI....
[...]
682 citations
Cites methods from "Universal Sentence Encoder"
...In particular, most of the systems adopted models known to be particularly suitable for dealing with texts, from Recurrent Neural Networks to recently proposed language models (Sabour et al., 2017; Cer et al., 2018)....
[...]
...They trained an SVM model with RBF kernel only on the provided data, exploiting sentence embeddings from Google’s Universal Sentence Encoder (Cer et al., 2018) as features....
[...]
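The pipeline described in that excerpt (USE sentence embeddings fed as features to an RBF-kernel SVM) might look like the sketch below. In practice the 512-dimensional embeddings come from the Universal Sentence Encoder module on TensorFlow Hub; random vectors stand in for them here so the example stays self-contained.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-ins for Universal Sentence Encoder outputs: in practice these
# 512-d vectors would come from the TF-Hub USE module; here they are random.
X_train = rng.normal(size=(40, 512))
y_train = np.array([0, 1] * 20)  # illustrative binary labels
X_test = rng.normal(size=(5, 512))

# RBF is SVC's default kernel; shown explicitly to match the excerpt.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
```

The design keeps the encoder frozen: only the SVM is trained on the task data, which is why the approach works with small labeled sets.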
668 citations
Cites background or methods from "Universal Sentence Encoder"
...Following (Cer et al., 2018), we draw the mixed corpus from Chinese Wikipedia, Baidu Baike, Baidu news and Baidu Tieba....
[...]
...As shown in figure 3, our method introduces dialogue embedding to identify the roles in the dialogue, which is different from that of universal sentence encoder (Cer et al., 2018)....
[...]
...Universal sentence encoder (Cer et al., 2018) adopts heterogeneous training data drawn from Wikipedia, web news, web QA pages and discussion forum....
[...]
655 citations
Cites methods from "Universal Sentence Encoder"
...The encoded candidates are flattened into vectors using the normalization from Cer et al. (2018) to produce an attention prediction over the memory....
[...]
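The normalization referenced in that excerpt is, in the USE paper, a division of the summed word embeddings by the square root of the sentence length, so that short sentences are not over-weighted relative to long ones. A minimal sketch:

```python
import numpy as np

def sqrt_n_normalize(word_vectors):
    """Combine word vectors as in the USE DAN-style encoder: sum them
    and divide by sqrt(n) rather than n, damping the length effect
    without fully averaging it away."""
    word_vectors = np.asarray(word_vectors, dtype=float)
    n = word_vectors.shape[0]
    return word_vectors.sum(axis=0) / np.sqrt(n)

v = sqrt_n_normalize([[1.0, 0.0], [1.0, 2.0]])
```

Dividing by sqrt(n) sits between summing (no length correction) and averaging (full correction), which is the trade-off the excerpt's attention mechanism inherits.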
References
72,897 citations
"Universal Sentence Encoder" refers methods in this paper
...The Skip-Thought task replaces the LSTM (Hochreiter and Schmidhuber, 1997) used in the original formulation with a model based on the Transformer architecture....
[...]
30,558 citations
"Universal Sentence Encoder" refers background in this paper
...Similar to GloVe, our model reproduces human associations between flowers vs. insects and pleasantness vs. unpleasantness....
[...]
...However, our model demonstrates weaker associations than GloVe for probes targeted at revealing ageism, racism and sexism. The differences in word association patterns can be attributed to differences in the training data composition and the mixture of tasks used to train the sentence embeddings....
[...]
...Table 4 contrasts Caliskan et al. (2017)’s findings on bias within GloVe embeddings with the DAN variant of the universal encoder....
[...]
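The word-association probes discussed in those excerpts follow Caliskan et al. (2017)'s WEAT method, in which a target word's association is the difference between its mean cosine similarity to one attribute set and to another. A toy sketch with illustrative 2-d vectors (real probes use actual GloVe or USE embeddings):

```python
import numpy as np

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(w, attr_a, attr_b):
    """WEAT-style association s(w, A, B): mean cosine similarity of w
    to attribute set A minus its mean similarity to attribute set B."""
    return (np.mean([cos(w, a) for a in attr_a])
            - np.mean([cos(w, b) for b in attr_b]))

# Illustrative vectors only: a "flower" vector probed against
# "pleasant" vs. "unpleasant" attribute sets, as in the excerpt.
flower = np.array([1.0, 0.2])
pleasant = [np.array([1.0, 0.0])]
unpleasant = [np.array([0.0, 1.0])]
score = association(flower, pleasant, unpleasant)
```

A positive score means the target sits closer to the first attribute set; aggregating such scores over target sets gives the effect sizes contrasted in Table 4.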
...Many models address the problem by implicitly performing limited transfer learning through the use of pre-trained word embeddings such as those produced by word2vec (Mikolov et al., 2013) or GloVe (Pennington et al., 2014)....
[...]
24,012 citations
"Universal Sentence Encoder" refers background or methods in this paper
...For word level transfer, we use word embeddings from a word2vec skip-gram model trained on a corpus of news data (Mikolov et al., 2013)....
[...]
...Many models address the problem by implicitly performing limited transfer learning through the use of pre-trained word embeddings such as those produced by word2vec (Mikolov et al., 2013) or GloVe (Pennington et al., 2014)....
[...]