Rico Sennrich
Researcher at University of Zurich
Publications - 200
Citations - 18997
Rico Sennrich is an academic researcher at the University of Zurich. He has contributed to research in the topics of machine translation and computer science, has an h-index of 48, and has co-authored 185 publications receiving 14,563 citations. His previous affiliations include the University of Edinburgh.
Papers
Proceedings ArticleDOI
Zero-Shot Crosslingual Sentence Simplification
TL;DR: A zero-shot modeling framework which transfers simplification knowledge from English to another language (for which no parallel simplification corpus exists) while generalizing across languages and tasks is proposed.
Proceedings ArticleDOI
NusaX: Multilingual Parallel Sentiment Dataset for 10 Indonesian Local Languages
Genta Indra Winata,Alham Fikri Aji,Samuel Cahyawijaya,Rahmad Mahendra,Fajri Koto,Ade Romadhony,Kemal Kurniawan,David Moeljadi,Radityo Eko Prasojo,Pascale Fung,T. E. Baldwin,Jey Han Lau,Rico Sennrich,Sebastian Ruder +13 more
TL;DR: This work develops the first-ever parallel resource for 10 low-resource languages in Indonesia, which includes datasets, a multi-task benchmark, and lexicons, as well as a parallel Indonesian-English dataset.
Proceedings ArticleDOI
Sentence Compression for Arbitrary Languages via Multilingual Pivoting
TL;DR: This work advocates the use of bilingual corpora, which are abundantly available, for training sentence compression models, and releases a new parallel Multilingual Compression dataset that can be used to evaluate compression models across languages and genres.
Posted Content
Evaluating Machine Translation Performance on Chinese Idioms with a Blacklist Method
TL;DR: A new evaluation method is introduced that uses an idiom-specific blacklist of literal translations, building on the insight that the occurrence of any blacklisted word in the translation output indicates a likely translation error.
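The blacklist idea can be sketched in a few lines: scan the MT output for words that would only appear if an idiom had been translated literally, word by word. This is a minimal illustration of the evaluation concept only; the function name, tokenisation, and data format are assumptions, not the paper's implementation.

```python
def blacklist_errors(translation, blacklist):
    """Return blacklisted words found in a translation (likely literal-translation errors)."""
    tokens = set(translation.lower().split())
    return sorted(tokens & {w.lower() for w in blacklist})

# For example, for the Chinese idiom 胸有成竹 ("to have a well-thought-out plan"),
# a literal rendering would mention "bamboo", so "bamboo" goes on the blacklist:
hits = blacklist_errors("He has bamboo in his chest", ["bamboo"])
# a non-empty result flags a likely literal (erroneous) translation
```

The appeal of this scheme is that it needs no reference translation at evaluation time, only a curated blacklist per idiom.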
Proceedings ArticleDOI
Sparse Attention with Linear Units
TL;DR: This work introduces a novel, simple method for achieving sparsity in attention: it replaces the softmax activation with a ReLU, and shows that sparsity naturally emerges from such a formulation.
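The core substitution is easy to see in code: compute the usual scaled dot-product attention scores, but pass them through a ReLU instead of a softmax, so negative scores become exactly zero and the attention weights are sparse. This is a pure-Python sketch of the idea in the abstract; the exact scaling and any normalisation details are assumptions, not the authors' implementation.

```python
import math

def relu_attention(queries, keys, values):
    """Single-head dot-product attention with ReLU replacing softmax."""
    d_k = len(queries[0])
    outputs = []
    for q in queries:
        # scaled dot-product scores against every key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        # ReLU: negative scores become exactly zero, yielding sparse weights
        weights = [max(s, 0.0) for s in scores]
        # weighted sum of value vectors
        out = [sum(w * v[j] for w, v in zip(weights, values))
               for j in range(len(values[0]))]
        outputs.append(out)
    return outputs
```

Unlike softmax, which assigns every key a strictly positive weight, ReLU lets entire keys drop out of the weighted sum, which is the sparsity the paper observes emerging naturally.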