
Minh-Thang Luong

Researcher at Google

Publications -  60
Citations -  21035

Minh-Thang Luong is an academic researcher at Google. His research focuses on topics including machine translation and supervised learning. He has an h-index of 35 and has co-authored 60 publications receiving 15204 citations. His previous affiliations include the National University of Singapore and Stanford University.

Papers
Proceedings Article (DOI)

Effective Approaches to Attention-based Neural Machine Translation

TL;DR: Examines two attentional mechanisms: a global approach that always attends to all source words and a local one that looks at only a subset of source words at a time, demonstrating the effectiveness of both approaches on the WMT English-German translation tasks in both directions.
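The global variant can be sketched with dot-product scoring: score the current decoder state against every source hidden state, then form a context vector as the attention-weighted sum. This is a minimal NumPy sketch; the function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, src_states):
    """Global (dot-product) attention: score the decoder state h_t
    against every source hidden state, then return the context vector
    as the attention-weighted sum of source states."""
    scores = src_states @ h_t          # (S,) one score per source word
    weights = softmax(scores)          # attention distribution over source
    context = weights @ src_states     # (d,) weighted sum of source states
    return context, weights

rng = np.random.default_rng(0)
src = rng.standard_normal((5, 4))      # 5 source words, hidden size 4
h_t = rng.standard_normal(4)
ctx, w = global_attention(h_t, src)
```

The local variant differs only in restricting the softmax to a window of source positions around a predicted alignment point instead of the full sequence.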
Proceedings Article (DOI)

Self-Training With Noisy Student Improves ImageNet Classification

TL;DR: A simple self-training method that achieves 88.4% top-1 accuracy on ImageNet, 2.0% better than the prior state-of-the-art model, which requires 3.5B weakly labeled Instagram images.
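The self-training loop can be sketched as follows. This is a minimal sketch under toy assumptions: `train`, the nearest-neighbour "model", and all other names are illustrative stand-ins, not the paper's implementation, and the noise injected into the student is only indicated in a comment.

```python
def noisy_student(train_teacher, pseudo_label, train_student,
                  labeled, unlabeled, iterations=3):
    """Iterated self-training: a teacher pseudo-labels the unlabeled
    data, a student (heavily noised during training in the real method)
    learns from labeled + pseudo-labeled data, then the student becomes
    the next teacher."""
    teacher = train_teacher(labeled)
    for _ in range(iterations):
        pseudo = [(x, pseudo_label(teacher, x)) for x in unlabeled]
        student = train_student(labeled + pseudo)
        teacher = student  # iterate: student becomes the new teacher
    return teacher

# Toy stand-in: a 1-D nearest-neighbour "classifier".
def train(data):
    return lambda x: min(data, key=lambda p: abs(p[0] - x))[1]

labeled = [(0.1, 0), (0.9, 1)]
unlabeled = [0.2, 0.8]
model = noisy_student(train, lambda m, x: m(x), train, labeled, unlabeled)
```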
Proceedings Article

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

TL;DR: Proposes a more sample-efficient pre-training task called replaced token detection, which corrupts the input by replacing some tokens with plausible alternatives sampled from a small generator network, then trains the model to predict whether each token in the corrupted input was replaced by a generator sample.
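The labeling scheme for replaced token detection can be sketched as below. In this sketch, `generator_sample` is an assumed placeholder for the small masked-LM generator; all names are illustrative, not from the ELECTRA codebase.

```python
def replaced_token_detection(tokens, mask_positions, generator_sample):
    """Build the discriminator's training signal: corrupt the input at
    the masked positions with generator samples, and label each token
    1 if it was replaced (differs from the original), else 0."""
    corrupted, labels = [], []
    for i, tok in enumerate(tokens):
        if i in mask_positions:
            sampled = generator_sample(tok)
            corrupted.append(sampled)
            # A generator that happens to sample the original token
            # counts as "not replaced".
            labels.append(int(sampled != tok))
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

toks = ["the", "chef", "cooked", "the", "meal"]
corrupt, labels = replaced_token_detection(
    toks, {2}, generator_sample=lambda _: "ate")
# corrupt = ["the", "chef", "ate", "the", "meal"], labels = [0, 0, 1, 0, 0]
```

Because the binary loss is defined over every input position rather than only the masked subset, each example yields more training signal, which is the source of the claimed sample efficiency.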
Posted Content

Unsupervised Data Augmentation for Consistency Training

TL;DR: Presents a new perspective on how to effectively noise unlabeled examples and argues that the quality of the noising, specifically that produced by advanced data augmentation methods, plays a crucial role in semi-supervised learning.
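The core consistency-training term can be sketched as a divergence between the model's predictions on an unlabeled example and on an augmented version of it. This is a minimal sketch: the random linear "model" and jitter "augmentation" are toy stand-ins (the paper uses strong augmentations such as back-translation and RandAugment).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def kl_div(p, q, eps=1e-9):
    """KL(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def consistency_loss(predict, example, augment):
    """Unsupervised consistency term: the model's prediction on an
    unlabeled example should stay close to its prediction on an
    augmented (noised) version of the same example."""
    p_clean = predict(example)
    p_noisy = predict(augment(example))
    return kl_div(p_clean, p_noisy)

# Toy stand-ins: a random linear "model" and small-jitter "augmentation".
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
predict = lambda x: softmax(W @ x)
augment = lambda x: x + 0.01 * rng.standard_normal(x.shape)

x = rng.standard_normal(4)
loss = consistency_loss(predict, x, augment)
```

This term is added to the ordinary supervised loss on the labeled data, so the augmentation quality directly shapes the unsupervised signal.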
Posted Content

Effective Approaches to Attention-based Neural Machine Translation

TL;DR: Proposes two simple and effective classes of attentional mechanisms: a global approach that always attends to all source words and a local one that looks at only a subset of source words at a time.