Llion Jones
Researcher at Google
Publications - 28
Citations - 64176
Llion Jones is an academic researcher at Google. The author has contributed to research in the topics of machine translation and deep learning, has an h-index of 17, and has co-authored 26 publications receiving 31,515 citations.
Papers
Proceedings Article
Attention is All you Need
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
TL;DR: This paper proposes a simple network architecture based solely on an attention mechanism, dispensing with recurrence and convolutions entirely, and achieves state-of-the-art performance on English-to-French translation.
Posted Content
Attention Is All You Need
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
TL;DR: A new simple network architecture, the Transformer, is proposed; based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, it also generalizes well to other tasks, as shown by applying it successfully to English constituency parsing with both large and limited training data.
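The scaled dot-product attention at the heart of the Transformer can be sketched as follows. This is a minimal NumPy illustration of the published formula softmax(QKᵀ/√d_k)V, not the authors' implementation; the function and variable names are ours.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one attended vector per query
```

Each output row is a convex combination of the value rows, with weights determined by how strongly the corresponding query matches each key.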
Journal ArticleDOI
Natural Questions: A Benchmark for Question Answering Research
Tom Kwiatkowski, Jennimaria Palomaki, Olivia Redfield, Michael Collins, Ankur P. Parikh, Chris Alberti, Danielle Epstein, Illia Polosukhin, Jacob Devlin, Kenton Lee, Kristina Toutanova, Llion Jones, Matthew Kelcey, Ming-Wei Chang, Andrew M. Dai, Jakob Uszkoreit, Quoc V. Le, Slav Petrov
TL;DR: The Natural Questions corpus, a question answering data set, is presented; the work introduces robust metrics for evaluating question answering systems, demonstrates high human upper bounds on these metrics, and establishes baseline results using competitive methods drawn from the related literature.
Posted ContentDOI
ProtTrans: Towards Cracking the Language of Life’s Code Through Self-Supervised Deep Learning and High Performance Computing
Ahmed Elnaggar, Michael Heinzinger, Christian Dallago, Ghalia Rihawi, Yu Wang, Llion Jones, Tom Gibbs, Tamas Feher, Christoph Angerer, Debsindhu Bhowmik, Burkhard Rost
TL;DR: In this paper, the authors trained two auto-regressive language models (Transformer-XL and XLNet) on 80 billion amino acids from 200 million protein sequences (UniRef100) and one auto-encoder model on 393 billion amino acids from 2.1 billion protein sequences taken from the Big Fantastic Database (BFD).
Proceedings Article
Tensor2Tensor for Neural Machine Translation
Ashish Vaswani, Samy Bengio, Eugene Brevdo, François Chollet, Aidan N. Gomez, Stephan Gouws, Llion Jones, Łukasz Kaiser, Nal Kalchbrenner, Niki Parmar, Ryan Sepassi, Noam Shazeer, Jakob Uszkoreit
TL;DR: Tensor2Tensor is a library of deep learning models that is well suited to neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.