Open Access · Posted Content

Efficient Natural Language Response Suggestion for Smart Reply

TLDR
A computationally efficient machine-learned method for natural language response suggestion, in which feed-forward neural networks with n-gram embedding features achieve the same quality as a sequence-to-sequence approach at a small fraction of the computational requirements and latency.
Abstract
This paper presents a computationally efficient machine-learned method for natural language response suggestion. Feed-forward neural networks using n-gram embedding features encode messages into vectors which are optimized to give message-response pairs a high dot-product value. An optimized search finds response suggestions. The method is evaluated in a large-scale commercial e-mail application, Inbox by Gmail. Compared to a sequence-to-sequence approach, the new system achieves the same quality at a small fraction of the computational requirements and latency.
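
To make the scoring scheme concrete, here is a minimal sketch of the dual-encoder idea: each side maps hashed n-gram features to a vector, and candidate responses are ranked by their dot product with the message vector. The encoders here are just averaged random embeddings, a simplification of the paper's trained feed-forward networks; all names, sizes, and the hashing are hypothetical scaffolding, not the paper's model.

    import numpy as np

    rng = np.random.default_rng(0)
    VOCAB_BUCKETS = 1 << 16   # hashed n-gram vocabulary size (assumed)
    EMBED_DIM = 64            # embedding dimension (assumed)

    # Hypothetical random parameters; in the paper these are trained so
    # that true message-response pairs receive a high dot product.
    W_msg = rng.normal(0.0, 0.1, (VOCAB_BUCKETS, EMBED_DIM))
    W_rsp = rng.normal(0.0, 0.1, (VOCAB_BUCKETS, EMBED_DIM))

    def ngrams(text, n=2):
        toks = text.lower().split()
        return toks + [" ".join(toks[i:i + n]) for i in range(len(toks) - n + 1)]

    def encode(text, table):
        # Bag of hashed n-grams -> mean embedding -> L2 normalization.
        ids = [hash(g) % VOCAB_BUCKETS for g in ngrams(text)]
        v = table[ids].mean(axis=0)
        return v / np.linalg.norm(v)

    def suggest(message, candidates, k=3):
        m = encode(message, W_msg)
        return sorted(candidates,
                      key=lambda r: -float(np.dot(m, encode(r, W_rsp))))[:k]

    print(suggest("are you free for lunch tomorrow?",
                  ["sure, sounds good!", "thanks for the update.",
                   "see you then!"]))

In the deployed system, the exhaustive loop over candidates above is replaced by the "optimized search" the abstract mentions, run over a precomputed set of response vectors.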

Citations
Proceedings Article

COUGH: A Challenge Dataset and Models for COVID-19 FAQ Retrieval

TL;DR: COUGH is a large, challenging dataset for COVID-19 FAQ retrieval, consisting of a FAQ Bank, a Query Bank, and a Relevance Set; it is used to test FAQ retrieval models built on top of BM25 and BERT.
Proceedings ArticleDOI

Training Effective Neural Sentence Encoders from Automatically Mined Paraphrases

TL;DR: A method for training effective language-specific sentence encoders without manually labeled data: a dataset of paraphrase pairs is automatically constructed from sentence-aligned bilingual text corpora and used to tune a Transformer language model with an additional recurrent pooling layer.
Journal ArticleDOI

RISE: Leveraging Retrieval Techniques for Summarization Evaluation

David C. Uthus, +1 more
- 17 Dec 2022 - 
TL;DR: RISE is a new approach for evaluating summaries that leverages techniques from information retrieval: a model is first trained on a retrieval task in a dual-encoder setup and can then score a generated summary against its input document, without gold reference summaries.
Journal ArticleDOI

Joint Representations of Text and Knowledge Graphs for Retrieval and Evaluation

Teven Le Scao, +1 more
- 28 Feb 2023 - 
TL;DR: This article proposed EREDAT (Ensembled Representations for Evaluation of DAta-to-Text), a similarity metric between English text and knowledge base (KB) graphs.
Proceedings Article

A Conditional Generative Matching Model for Multi-lingual Reply Suggestion.

TL;DR: Conditional Generative Matching models (CGM), optimized within a variational autoencoder framework, are proposed to address the challenges of multilingual reply suggestion (RS) and achieve state-of-the-art performance.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
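
As a rough companion to this summary, the following is a minimal single-step LSTM cell in numpy. Note that it is the later forget-gate variant rather than the original 1997 formulation, and all names, shapes, and initializations are illustrative.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h, c, P):
        """One LSTM step; P holds weights/biases for the input (i),
        forget (f), output (o) gates and the candidate input (g)."""
        z = np.concatenate([x, h])
        i = sigmoid(P["Wi"] @ z + P["bi"])   # input gate
        f = sigmoid(P["Wf"] @ z + P["bf"])   # forget gate
        o = sigmoid(P["Wo"] @ z + P["bo"])   # output gate
        g = np.tanh(P["Wg"] @ z + P["bg"])   # candidate cell input
        c = f * c + i * g                    # additive cell-state update
        h = o * np.tanh(c)
        return h, c

    # Hypothetical sizes: 4-dim input, 8-dim hidden state.
    rng = np.random.default_rng(0)
    nx, nh = 4, 8
    P = {"W" + k: rng.normal(0.0, 0.1, (nh, nx + nh)) for k in "ifog"}
    P.update({"b" + k: np.zeros(nh) for k in "ifog"})
    h = c = np.zeros(nh)
    for _ in range(3):
        h, c = lstm_step(rng.normal(size=nx), h, c, P)
    print(h)

The additive cell update c = f * c + i * g is the "constant error carousel" the summary refers to: error signals can flow back through the cell state without being repeatedly squashed by a nonlinearity.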
Proceedings ArticleDOI

GloVe: Global Vectors for Word Representation

TL;DR: A new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
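
For reference, the weighted least-squares objective of this global log-bilinear model can be written in LaTeX as

    J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^{2}

where X_{ij} is the word-word co-occurrence count, w and \tilde{w} are word and context vectors with biases b and \tilde{b}, and f is a saturating weighting function: f(x) = (x / x_max)^alpha for x < x_max and 1 otherwise.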
Posted Content

Efficient Estimation of Word Representations in Vector Space

TL;DR: Two novel model architectures for computing continuous vector representations of words from very large data sets are proposed; the quality of the representations is measured on a word similarity task, and the results are compared to the previously best-performing techniques based on different types of neural networks.
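
As a small illustration of the skip-gram variant among these architectures, the sketch below generates the (center, context) training pairs in which each word predicts its neighbors; the function name and window size are illustrative.

    def skipgram_pairs(tokens, window=2):
        """Yield (center, context) pairs: each word predicts its neighbors."""
        for i, center in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    yield center, tokens[j]

    print(list(skipgram_pairs("the cat sat on the mat".split())))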
Posted Content

Sequence to Sequence Learning with Neural Networks

TL;DR: This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions about sequence structure, and finds that reversing the order of the words in all source sentences markedly improves the LSTM's performance, since the reversal introduces many short-term dependencies between the source and target sentences that make the optimization problem easier.
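
The reversal trick from this summary is a plain preprocessing step; a sketch (the tokenization is illustrative):

    def reverse_source(src_tokens):
        # The encoder sees the source reversed; the target stays in natural
        # order, so the first source words sit close to the first target words.
        return src_tokens[::-1]

    print(reverse_source("je suis etudiant".split()))  # ['etudiant', 'suis', 'je']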
Proceedings ArticleDOI

TensorFlow: a system for large-scale machine learning

TL;DR: TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments, using dataflow graphs to represent computation, shared state, and the operations that mutate that state.
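
A minimal sketch of the dataflow model, written against the TF1-style graph-and-session API (available in modern TensorFlow via tensorflow.compat.v1); shapes and values are illustrative.

    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()

    # Build a dataflow graph: nodes are operations, edges carry tensors.
    x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
    w = tf.Variable(tf.ones([3, 1]), name="w")   # mutable shared state
    y = tf.matmul(x, w)                          # an op that reads that state

    # Execute the graph in a session.
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))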