Stephan Gouws
Researcher at Google
Publications - 22
Citations - 8174
Stephan Gouws is an academic researcher at Google. He has contributed to research in topics including deep learning and language models. He has an h-index of 15 and has co-authored 22 publications receiving 6,881 citations. Previous affiliations of Stephan Gouws include Stellenbosch University and the Information Sciences Institute.
Papers
Posted Content
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, Mike Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, Wolfgang Macherey, Maxim Krikun, Yuan Cao, Qin Gao, Klaus Macherey, Jeff Klingner, Apurva Shah, Melvin Johnson, Xiaobing Liu, Łukasz Kaiser, Stephan Gouws, Yoshikiyo Kato, Taku Kudo, Hideto Kazawa, Keith Stevens, George Kurian, Nishant Patil, Wei Wang, Cliff Young, Jason A. Smith, Jason Riesa, Alex Rudnick, Oriol Vinyals, Greg S. Corrado, Macduff Hughes, Jeffrey Dean +30 more
TL;DR: GNMT, Google's Neural Machine Translation system, is presented, which attempts to address many of the weaknesses of conventional phrase-based translation systems and provides a good balance between the flexibility of "character"-delimited models and the efficiency of "word"-delimited models.
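The balance between character- and word-delimited models comes from GNMT's use of subword "wordpiece" units. A minimal sketch of the wordpiece idea at inference time is greedy longest-match-first segmentation against a fixed subword vocabulary; the tiny vocabulary and the "##" continuation marker below are illustrative assumptions (GNMT's own marker convention differs), not the paper's implementation:

```python
def wordpiece_split(word, vocab):
    """Greedy longest-match-first segmentation into subword units.
    Continuation pieces carry a "##" prefix (a BERT-style convention,
    assumed here purely for illustration)."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece       # mark non-initial pieces
            if piece in vocab:
                pieces.append(piece)       # longest matching piece wins
                start = end
                break
            end -= 1                       # shrink candidate and retry
        else:
            return ["[UNK]"]               # no piece matched at this offset
    return pieces

# toy vocabulary: rare words decompose, so the model never sees true OOVs
vocab = {"trans", "##lat", "##ion", "un", "##able"}
print(wordpiece_split("translation", vocab))  # → ['trans', '##lat', '##ion']
```

Because every word decomposes into known subwords (or `[UNK]` in the worst case), the vocabulary stays word-sized and efficient while retaining character-level flexibility for rare words.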
Posted Content
Universal Transformers
TL;DR: The authors proposed the Universal Transformer model, which employs a self-attention mechanism in every recursive step to combine information from different parts of a sequence, and further employs an adaptive computation time (ACT) mechanism to dynamically adjust the number of times the representation of each position in a sequence is revised.
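The adaptive computation time (ACT) mechanism summarized above can be sketched as follows. This is a minimal illustration under stated assumptions: `step_fn` stands in for the shared self-attention transition, and `halt_fn` for a learned per-position halting network; both are placeholder lambdas here, not the paper's architecture:

```python
import numpy as np

def act_refine(x, step_fn, halt_fn, threshold=0.99, max_steps=8):
    """Repeatedly apply one weight-shared transition to every position,
    letting each position stop once its cumulative halting probability
    crosses the threshold (Graves-style adaptive computation time)."""
    x = x.copy()
    halted = np.zeros(x.shape[0], dtype=bool)   # per-position halting flags
    cum_halt = np.zeros(x.shape[0])             # cumulative halting probability
    for _ in range(max_steps):
        active = ~halted
        if not active.any():
            break                               # all positions have halted
        x[active] = step_fn(x[active])          # same weights at every step
        cum_halt[active] += halt_fn(x[active])  # per-position halt probability
        halted |= cum_halt >= threshold
    return x

# toy usage: a "step" that decays states, a halt net that always outputs 0.4,
# so every position halts after ceil(0.99 / 0.4) = 3 refinement steps
rng = np.random.default_rng(0)
states = rng.standard_normal((5, 16))
out = act_refine(states,
                 step_fn=lambda h: 0.5 * h,
                 halt_fn=lambda h: np.full(h.shape[0], 0.4))
```

The key property is that easy positions can stop being revised early while hard positions keep receiving refinement steps, which is what "dynamically adjust the number of times the representation of each position is revised" refers to.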
Proceedings Article
BilBOWA: Fast Bilingual Distributed Representations without Word Alignments
TL;DR: This paper proposed BilBOWA (Bilingual Bag-of-Words without Alignments), a simple and computationally efficient model for learning bilingual distributed representations of words which can scale to large monolingual datasets and does not require word-aligned parallel training data.
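The cross-lingual part of a bag-of-words alignment objective like the one described can be sketched as a gradient step that pulls the mean embedding of a source sentence toward the mean embedding of its translation. This is a hypothetical illustration of the idea only (the table sizes, learning rate, and update rule are assumptions, not BilBOWA's training code, which also includes monolingual skip-gram-style terms):

```python
import numpy as np

def bow_align_step(E_src, E_tgt, src_ids, tgt_ids, lr=0.1):
    """One gradient step on 0.5 * ||mean(E_src[sent]) - mean(E_tgt[sent])||^2,
    pulling the two sentence-level bag-of-words means toward each other."""
    mu_src = E_src[src_ids].mean(axis=0)
    mu_tgt = E_tgt[tgt_ids].mean(axis=0)
    diff = mu_src - mu_tgt                      # gradient w.r.t. mu_src
    E_src[src_ids] -= lr * diff / len(src_ids)  # each word moves a little
    E_tgt[tgt_ids] += lr * diff / len(tgt_ids)
    return 0.5 * float(diff @ diff)             # alignment loss before update

# toy tables for two languages; word ids play the role of sentence contents
rng = np.random.default_rng(1)
E_en = rng.standard_normal((100, 8))   # toy "English" embedding table
E_fr = rng.standard_normal((100, 8))   # toy "French" embedding table
losses = [bow_align_step(E_en, E_fr, [3, 7, 2], [5, 9]) for _ in range(50)]
```

Because the objective only needs sentence-aligned (not word-aligned) parallel text for the cross-lingual term, the expensive word-alignment step is avoided, which is the efficiency claim in the abstract.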
Proceedings Article
Tensor2Tensor for Neural Machine Translation
Ashish Vaswani, Samy Bengio, Eugene Brevdo, François Chollet, Aidan N. Gomez, Stephan Gouws, Llion Jones, Łukasz Kaiser, Nal Kalchbrenner, Niki Parmar, Ryan Sepassi, Noam Shazeer, Jakob Uszkoreit +12 more
TL;DR: Tensor2Tensor is a library of deep learning models that is well-suited for neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.
Posted Content
BilBOWA: Fast Bilingual Distributed Representations without Word Alignments
TL;DR: It is shown that bilingual embeddings learned using the proposed BilBOWA model outperform state-of-the-art methods on a cross-lingual document classification task as well as a lexical translation task on WMT11 data.