Armand Joulin
Researcher at Facebook
Publications - 136
Citations - 36652
Armand Joulin is an academic researcher at Facebook. He has contributed to research on topics including computer science and word representations. He has an h-index of 55 and has co-authored 125 publications receiving 25,130 citations. His previous affiliations include Microsoft and École Normale Supérieure.
Papers
Proceedings Article
Advances in Pre-Training Distributed Word Representations
TL;DR: The authors combine several known but rarely co-applied tricks to pre-train word vector representations, achieving state-of-the-art performance on a number of NLP tasks.
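The paper's recipe combines tricks such as position-dependent weighting, phrase features, and subword information. As a rough illustration, the sketch below shows only the subword-aware CBOW training exposed by the open-source fastText Python bindings; the corpus file name is a placeholder and the full recipe is not reproduced.

```python
# Minimal sketch: subword-aware CBOW training with the fastText Python
# bindings (pip install fasttext). "data.txt" is a placeholder corpus with
# one preprocessed sentence per line; position-dependent weighting and
# phrase features from the paper are not shown here.
import fasttext

model = fasttext.train_unsupervised(
    "data.txt",
    model="cbow",    # CBOW objective
    dim=300,         # dimensionality of the word vectors
    minn=3, maxn=6,  # character n-gram range for subword information
)

print(model.get_word_vector("apple")[:5])
print(model.get_nearest_neighbors("apple", k=5))
```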
Posted Content
Bag of Tricks for Efficient Text Classification
TL;DR: A simple and efficient baseline for text classification is explored, showing that the fastText classifier is often on par with deep learning classifiers in accuracy while being many orders of magnitude faster to train and evaluate.
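For context, this is roughly how the released fastText Python bindings are used for supervised classification; the training file name and its contents below are placeholders, not data from the paper.

```python
# Minimal sketch: training and querying a fastText text classifier with the
# official Python bindings (pip install fasttext). "train.txt" is a placeholder
# file where each line looks like "__label__sports Great match last night!".
import fasttext

model = fasttext.train_supervised(
    input="train.txt",
    lr=1.0,          # learning rate
    epoch=25,        # passes over the training data
    wordNgrams=2,    # bigram features, one of the paper's "tricks"
)

labels, probs = model.predict("Which baking dish is best for lasagna?", k=2)
print(labels, probs)
model.save_model("classifier.bin")
```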
Proceedings Article
Unsupervised Joint Object Discovery and Segmentation in Internet Images
TL;DR: This work proposes using dense correspondences between images to capture the sparsity and visual variability of the common object across the entire database, which makes it possible to ignore noise objects that may be salient within their own images but do not occur commonly in others.
Posted Content
Learning Word Vectors for 157 Languages
TL;DR: This paper describes how high-quality word representations for 157 languages were trained on the free online encyclopedia Wikipedia and on data from the Common Crawl project, and introduces three new word analogy datasets to evaluate these word vectors.
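The vectors from this work are publicly released as the "cc" fastText models. A minimal sketch of fetching and querying one of them through the fasttext.util helpers follows; the language code "fi" is just an example of one of the 157 covered languages.

```python
# Minimal sketch: downloading and querying the released fastText word vectors
# trained on Wikipedia and Common Crawl (pip install fasttext).
import fasttext
import fasttext.util

fasttext.util.download_model("fi", if_exists="ignore")  # fetches cc.fi.300.bin
model = fasttext.load_model("cc.fi.300.bin")

print(model.get_dimension())                 # 300
print(model.get_nearest_neighbors("kissa"))  # neighbors of "cat" in Finnish
```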
Posted Content
Beyond English-Centric Multilingual Machine Translation
Angela Fan, Shruti Bhosale, Holger Schwenk, Zhiyi Ma, Ahmed El-Kishky, Siddharth Goyal, Mandeep Baines, Onur Celebi, Guillaume Wenzek, Vishrav Chaudhary, Naman Goyal, Tom Birch, Vitaliy Liptchinsky, Sergey Edunov, Edouard Grave, Michael Auli, Armand Joulin +16 more
TL;DR: This work creates a true many-to-many multilingual translation model that can translate directly between any pair of 100 languages, and explores how to effectively increase model capacity through a combination of dense scaling and language-specific sparse parameters to create high-quality models.
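The released model from this work (M2M-100) can be queried for direct, non-English-centric translation. The sketch below uses the Hugging Face transformers port of the checkpoint rather than the paper's original fairseq codebase, which is an assumption made purely for illustration.

```python
# Minimal sketch: direct French -> Chinese translation with the M2M-100
# checkpoint via transformers (pip install transformers sentencepiece).
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "fr"                     # source language
encoded = tokenizer("La vie est belle.", return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("zh"),  # target language token
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```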