Maha Elbayad
Researcher at University of Grenoble
Publications - 18
Citations - 518
Maha Elbayad is an academic researcher from the University of Grenoble. The author has contributed to research in the topics of computer science and machine translation. The author has an h-index of 7 and has co-authored 12 publications receiving 182 citations.
Papers
Journal ArticleDOI
No Language Left Behind: Scaling Human-Centered Machine Translation
NLLB Team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Si-Man Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Alison Youngblood, Bapi Akula, Loïc Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang +38 more
TL;DR: A conditional compute model based on a Sparsely Gated Mixture of Experts is developed and trained on data obtained with novel and effective data mining techniques tailored for low-resource languages, laying important groundwork towards realizing a universal translation system.
Proceedings ArticleDOI
Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
TL;DR: This work proposes an alternative approach that instead relies on a single 2D convolutional neural network applied across both sequences; it outperforms state-of-the-art encoder-decoder systems while being conceptually simpler and having fewer parameters.
Proceedings ArticleDOI
Findings of the IWSLT 2022 Evaluation Campaign
Antonios Anastasopoulos, Loïc Barrault, Luisa Bentivogli, Marcely Zanon Boito, Ondřej Bojar, Roldano Cattoni, Anna Currey, Georgiana Dinu, Kevin Duh, Maha Elbayad, Clara Emmanuel, Yannick Estève, Marcello Federico, Christian Federmann, Souhir Gahbiche, Hongyu Gong, Roman Grundkiewicz, Barry Haddow, Benjamin Hsu, Dávid Javorský, Věra Kloudová, Surafel Melaku Lakew, Xutai Ma, Prashant Mathur, Paul McNamee, Kenton Murray, Maria Nadejde, Satoshi Nakamura, Matteo Negri, Jan Niehues, Xing Niu, John Ortega, Juan Pino, Elizabeth Salesky, Jiatong Shi, Matthias Sperber, Sebastian Stüker, Katsuhito Sudoh, Marco Turchi, Yogesh Virkar, Alex Waibel, Changhan Wang, Shinji Watanabe +42 more
TL;DR: For each shared task of the 19th International Conference on Spoken Language Translation, the purpose of the task, the released data, the evaluation metrics applied, the submissions received, and the results achieved are detailed.
Proceedings Article
Depth-Adaptive Transformer
TL;DR: The authors train Transformer models that can make output predictions at different stages of the network and investigate different ways to predict how much computation is required for a particular sequence, adjusting both the amount of computation and the model capacity.
Posted Content
Depth-Adaptive Transformer.
TL;DR: This paper trains Transformer models that can make output predictions at different stages of the network and investigates different ways to predict how much computation is required for a particular sequence.