scispace - formally typeset

Macduff Hughes

Researcher at Google

Publications - 17
Citations - 8601

Macduff Hughes is an academic researcher from Google. The author has contributed to research in topics: Machine translation & Sentence. The author has an h-index of 9, co-authored 16 publications receiving 6977 citations. Previous affiliations of Macduff Hughes include Adobe Systems.

Papers
Posted Content

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation

TL;DR: GNMT, Google's Neural Machine Translation system, is presented, which attempts to address many of the weaknesses of conventional phrase-based translation systems and provides a good balance between the flexibility of "character"-delimited models and the efficiency of "word"-delimited models.
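The balance between character- and word-delimited models that the summary mentions comes from subword ("wordpiece") vocabularies. A minimal illustrative sketch of greedy longest-match-first subword segmentation, in the spirit of such models (the toy vocabulary and the `##` continuation marker are assumptions for illustration, not GNMT's actual conventions):

```python
def subword_tokenize(word: str, vocab: set) -> list:
    """Greedily split a word into the longest subword pieces found in vocab.
    Pieces after the first are marked with a '##' continuation prefix."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        match = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece
            if piece in vocab:
                match = piece
                break
            end -= 1
        if match is None:
            return ["[UNK]"]  # no piece covers this span
        tokens.append(match)
        start = end
    return tokens

# A frequent word stays whole; a rarer word falls back to smaller pieces.
vocab = {"trans", "##lation", "##la", "##tion", "the"}
print(subword_tokenize("the", vocab))          # -> ['the']
print(subword_tokenize("translation", vocab))  # -> ['trans', '##lation']
```

This gives character-level flexibility on rare words (everything decomposes into small pieces) while keeping word-level efficiency on common ones (frequent words remain single tokens).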
Journal ArticleDOI

Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

TL;DR: This work proposes a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages using a shared wordpiece vocabulary, and introduces an artificial token at the beginning of the input sentence to specify the required target language.
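The artificial-token mechanism described above is simple enough to sketch: the target language is signaled by prepending a special token to the source sentence, so a single shared model can serve every language pair. A minimal sketch (the `<2xx>` token format follows the paper's convention; the function name is illustrative):

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prefix the source sentence with an artificial token naming the
    required target language, e.g. '<2es>' for Spanish."""
    return f"<2{target_lang}> {source_sentence}"

# The same model input differs only in its leading token:
print(add_target_token("Hello, how are you?", "es"))
# -> <2es> Hello, how are you?
print(add_target_token("Hello, how are you?", "ja"))
# -> <2ja> Hello, how are you?
```

Because the model sees language identity only through this token, it can be prompted at inference time for language pairs never observed together in training, which is what enables zero-shot translation.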
Posted Content

Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

TL;DR: The authors propose to add an artificial token at the beginning of the input sentence to specify the required target language, which improves the translation quality of all involved language pairs, even while keeping the total number of model parameters constant.
Proceedings ArticleDOI

The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation

TL;DR: In this article, the authors identify several key modeling and training techniques, and apply them to the RNN architecture, yielding a new RNMT+ model that outperforms all of the three fundamental architectures on the benchmark WMT’14 English to French and English to German tasks.
Posted Content

The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation

TL;DR: This paper identifies several key modeling and training techniques, and applies them to the RNN architecture, yielding a new RNMT+ model that outperforms all of the three fundamental architectures on the benchmark WMT’14 English to French and English to German tasks.