Llion Jones

Researcher at Google

Publications - 28
Citations - 64176

Llion Jones is an academic researcher from Google. He has contributed to research in topics including Machine translation and Deep learning. He has an h-index of 17 and has co-authored 26 publications receiving 31515 citations.

Papers
Proceedings Article

Attention is All you Need

TL;DR: This paper proposed a simple network architecture based solely on an attention mechanism, dispensing with recurrence and convolutions entirely, and achieved state-of-the-art performance on English-to-French translation.
Posted Content

Attention Is All You Need

TL;DR: The Transformer, a new simple network architecture based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as demonstrated by applying it successfully to English constituency parsing with both large and limited training data.
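At the core of this architecture is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The following is a minimal PyTorch sketch of that single operation for illustration only (toy shapes, one head, no masking or learned projections), not the paper's reference implementation:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5  # similarity of each query to every key
    weights = F.softmax(scores, dim=-1)            # attention distribution over positions
    return weights @ v                             # weighted sum of the values

# Toy self-attention over a sequence of 5 positions with model width 16.
x = torch.randn(5, 16)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # torch.Size([5, 16])
```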
Journal Article

Natural Questions: A Benchmark for Question Answering Research

TL;DR: The Natural Questions corpus, a question answering data set, is presented. The work introduces robust metrics for evaluating question answering systems, demonstrates high human upper bounds on these metrics, and establishes baseline results using competitive methods drawn from related literature.
Posted Content

ProtTrans: Towards Cracking the Language of Life’s Code Through Self-Supervised Deep Learning and High Performance Computing

TL;DR: In this paper, the authors trained two auto-regressive language models (Transformer-XL and XLNet) on 80 billion amino acids from 200 million protein sequences (UniRef100) and one auto-encoder model on 393 billion amino acids from 2.1 billion protein sequences taken from the Big Fantastic Database (BFD).
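To make the auto-regressive setup concrete, the sketch below shows next-residue prediction over the 20 standard amino acids with a tiny causal Transformer in PyTorch. It is an illustrative assumption about the training objective, not the authors' code; every model size, name, and sequence in it is invented for the example:

```python
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # 20 standard residues as the token vocabulary
stoi = {a: i for i, a in enumerate(AMINO_ACIDS)}

class TinyProteinLM(nn.Module):
    """Toy causal language model over amino-acid tokens (illustrative only)."""
    def __init__(self, vocab_size=20, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # Causal mask: each position may only attend to earlier residues.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.encoder(self.embed(tokens), mask=mask)
        return self.head(h)  # logits for the next residue at each position

seq = "MKTAYIAKQR"  # made-up protein fragment
tokens = torch.tensor([[stoi[a] for a in seq]])
model = TinyProteinLM()
logits = model(tokens[:, :-1])  # predict residue t+1 from the prefix up to t
loss = nn.functional.cross_entropy(logits.reshape(-1, 20), tokens[:, 1:].reshape(-1))
print(float(loss))
```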
Proceedings Article

Tensor2Tensor for Neural Machine Translation

TL;DR: Tensor2Tensor is a library for deep learning models that is well-suited for neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.
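For context on how the library is typically driven from Python, here is a minimal sketch using its problem registry as shown in the project's introductory notebook; treat these helper names (`problems.available()`, `problems.problem(...)`) as assumptions from that older walkthrough rather than a verified current API:

```python
# Sketch only: assumes `pip install tensor2tensor` and the registry helpers
# from the project's introductory notebook (not verified against current releases).
from tensor2tensor import problems

# Some of the registered datasets/tasks (translation, parsing, image tasks, ...).
print(problems.available()[:10])

# The WMT English-German translation problem commonly paired with the Transformer.
ende = problems.problem("translate_ende_wmt32k")
print(ende.name)
```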