Jakob Uszkoreit
Researcher at Google
Publications - 85
Citations - 83076
Jakob Uszkoreit is an academic researcher at Google. He has contributed to research on machine translation and the Transformer (machine learning model). He has an h-index of 36 and has co-authored 84 publications receiving 37,432 citations. His previous affiliations include the University of California, Berkeley.
Papers
Proceedings Article
Distributed Word Clustering for Large Scale Class-Based Language Modeling in Machine Translation
Jakob Uszkoreit, Thorsten Brants +1 more
TL;DR: This paper introduces a modification of the exchange clustering algorithm with improved efficiency for certain partially class-based models, along with a distributed version of the algorithm that efficiently obtains automatic word classifications for large vocabularies from large training corpora.
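The core idea can be illustrated with a minimal, non-distributed sketch of the exchange algorithm: words are greedily moved between classes to maximize a class-bigram likelihood criterion. This is a toy version; the paper's contribution is precisely the efficient and distributed variants that this naive O(|bigrams|)-per-move recomputation does not attempt.

```python
import math
from collections import Counter

def objective(bigrams, assign):
    """Class-bigram likelihood criterion (up to a constant):
    sum_{c1,c2} N(c1,c2)*log N(c1,c2) - 2 * sum_c N(c)*log N(c)."""
    cb, cu = Counter(), Counter()
    for (w1, w2), n in bigrams.items():
        cb[(assign[w1], assign[w2])] += n
        cu[assign[w1]] += n
    return (sum(n * math.log(n) for n in cb.values())
            - 2 * sum(n * math.log(n) for n in cu.values()))

def exchange(words, bigrams, k, iters=10):
    """Plain exchange algorithm: repeatedly move each word to the
    class that most improves the objective, until no move helps."""
    assign = {w: i % k for i, w in enumerate(words)}  # round-robin init
    for _ in range(iters):
        moved = False
        for w in words:
            orig = assign[w]
            best_c, best_obj = orig, objective(bigrams, assign)
            for c in range(k):
                if c == orig:
                    continue
                assign[w] = c  # tentatively move w to class c
                obj = objective(bigrams, assign)
                if obj > best_obj:
                    best_obj, best_c = obj, c
            assign[w] = best_c
            moved = moved or (best_c != orig)
        if not moved:
            break
    return assign
```

Because each step only accepts objective-improving moves, the criterion is non-decreasing over iterations, which is what makes the greedy loop converge.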
Posted Content
KERMIT: Generative Insertion-Based Modeling for Sequences
TL;DR: KERMIT is presented, a simple insertion-based approach to generative modeling for sequences and sequence pairs that is capable of matching or exceeding the performance of dedicated state-of-the-art systems across a wide range of tasks without the need for problem-specific architectural adaptation.
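The decoding side of insertion-based generation can be sketched as a greedy loop that repeatedly scores every (slot, token) insertion and applies the best one. The scorer below is a hypothetical stand-in for a trained model, not KERMIT's actual architecture.

```python
def insertion_decode(score_fn, vocab, max_steps=20, eos="<eos>"):
    """Greedy insertion-based decoding loop (sketch): score every
    possible (slot, token) insertion, apply the best one, and stop
    when the best action is to emit the end marker."""
    seq = []
    for _ in range(max_steps):
        slot, token = max(
            ((i, t) for i in range(len(seq) + 1) for t in vocab),
            key=lambda st: score_fn(seq, *st),
        )
        if token == eos:
            break
        seq.insert(slot, token)
    return seq

# Hypothetical stand-in for a trained model: rewards insertions that
# extend the prefix of a fixed target, and <eos> once it is complete.
TARGET = ["a", "b", "c"]

def toy_score(seq, slot, token):
    if token == "<eos>":
        return 10.0 if seq == TARGET else -10.0
    cand = seq[:slot] + [token] + seq[slot:]
    return 1.0 if cand == TARGET[:len(cand)] else 0.0
```

Unlike left-to-right decoding, the loop is free to fill in tokens at any position, which is what lets insertion models handle sequence pairs and infilling without architectural changes.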
Patent
Fast decoding in sequence models using discrete latent variables
Posted Content
Music Transformer
Cheng-Zhi Anna Huang, Ashish Vaswani, Jakob Uszkoreit, Noam Shazeer, Ian Simon, Curtis Hawthorne, Andrew M. Dai, Matthew D. Hoffman, Monica Dinculescu, Douglas Eck +9 more
TL;DR: It is demonstrated that a Transformer with the modified relative attention mechanism can generate minute-long compositions with compelling structure, generate continuations that coherently elaborate on a given motif, and in a seq2seq setup generate accompaniments conditioned on melodies.
Proceedings Article
Inducing Sentence Structure from Parallel Corpora for Reordering
John DeNero, Jakob Uszkoreit +1 more
TL;DR: This paper presents a method for inducing parse trees automatically from a parallel corpus instead of using a supervised parser trained on a treebank, showing that the syntactic structure relevant to MT pre-ordering can be learned automatically from parallel text and thereby establishing a new application for unsupervised grammar induction.
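How an induced tree drives pre-ordering can be sketched as an ITG-style traversal: at each binary node, the children are either kept in order or swapped before translation. The tree encoding and swap decisions below are illustrative stand-ins for what the learned model would provide.

```python
def preorder(tree, swap):
    """Reorder source words by walking a binary tree and optionally
    swapping the children at each node. A tree is either a word (leaf)
    or a (node_id, left, right) tuple; 'swap' maps node ids to booleans
    (in the paper these decisions come from the induced structure)."""
    if isinstance(tree, str):          # leaf: a single source word
        return [tree]
    node_id, left, right = tree
    l, r = preorder(left, swap), preorder(right, swap)
    return r + l if swap.get(node_id) else l + r

# Illustrative example: English adjective-noun order rewritten into
# a Romance-style noun-adjective order by swapping one node.
tree = (0, "the", (1, "red", "car"))
```

Applying the permutation to the source sentence before translation lets a phrase-based decoder work with mostly monotone alignments.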