Efficient Natural Language Response Suggestion for Smart Reply
Citations
1,259 citations
Cites background or methods from "Efficient Natural Language Response..."
...supported tasks include: a SkipThought like task (Kiros et al., 2015) for the unsupervised learning from arbitrary running text; a conversational input-response task for the inclusion of parsed conversational data (Henderson et al., 2017); and classification tasks for training on supervised data....
[...]
1,083 citations
Cites background from "Efficient Natural Language Response..."
...Using labeled pairs of queries and documents, discriminatively trained dense encoders have become popular recently (Yih et al., 2011; Huang et al., 2013; Gillick et al., 2019), with applications to cross-lingual document retrieval, ad relevance prediction, Web search and entity retrieval....
[...]
876 citations
Cites background or methods from "Efficient Natural Language Response..."
...See Yang et al. (2018) and Henderson et al. (2017) for additional architectural details of models similar to those presented here....
[...]
...2015); conversational response prediction (Henderson et al., 2017); and a select supervised classification task that improves sentence embeddings. The transformer encoder achieves the best transfer performance....
[...]
655 citations
Cites methods from "Efficient Natural Language Response..."
...The model is trained to minimize the cross-entropy loss, where the negative candidates for each example are the responses to the other examples in the batch (Henderson et al., 2017)....
[...]
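The snippet above describes the in-batch negatives objective from Henderson et al. (2017): each input is scored against every response in the batch, and the softmax cross-entropy treats the matching response as the positive and the other batch responses as negatives. A minimal NumPy sketch of that objective follows; the function name and toy data are illustrative, not from any of the cited papers.

```python
import numpy as np

def in_batch_softmax_loss(input_vecs, response_vecs):
    """Cross-entropy loss with in-batch negatives: for input i, response i is
    the positive and the other responses in the batch act as negatives."""
    # Score every input against every response: S[i, j] = x_i . y_j, shape (B, B).
    scores = input_vecs @ response_vecs.T
    # Row-wise log-softmax with max subtraction for numerical stability.
    scores = scores - scores.max(axis=1, keepdims=True)
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    # Negative log-likelihood of the correct (diagonal) input-response pairs.
    return -np.mean(np.diag(log_probs))

# Toy batch: 3 input/response encoding pairs of dimension 4 (made-up numbers).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
y = x + 0.01 * rng.normal(size=(3, 4))  # responses nearly aligned with inputs
loss = in_batch_softmax_loss(x, y)
print(f"batch loss: {loss:.4f}")
```

Reusing the rest of the batch as negatives avoids mining explicit negative candidates, which is why the construction appears in the retrieval and response-ranking papers listed here.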
References
72,897 citations
"Efficient Natural Language Response..." refers background in this paper
...[11] is a long short-term memory (LSTM) recurrent neural network [8] – an application of the sequence-to-sequence learning framework (Seq2Seq) [23]....
[...]
30,558 citations
"Efficient Natural Language Response..." refers background in this paper
...Vector representations of words, or word embeddings, have been widely adopted, particularly since the introduction of efficient computational learning algorithms that can derive meaningful embeddings from unlabeled text [15, 17, 20]....
[...]
11,936 citations
"Efficient Natural Language Response..." refers background in this paper
...This framework provides a direct path for end-to-end learning [23]....
[...]
...[11] is a long short-term memory (LSTM) recurrent neural network [8] – an application of the sequence-to-sequence learning framework (Seq2Seq) [23]....
[...]