mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
References
...To validate the performance of mT5, we evaluate our models on 6 tasks from the XTREME multilingual benchmark (Hu et al., 2020): the XNLI (Conneau et al., 2018) entailment task covering 14 languages; the XQuAD (Artetxe et al., 2020), MLQA (Lewis et al., 2019), and TyDi QA (Clark et al., 2020) reading comprehension benchmarks...

...Following XLM-R (Conneau et al., 2018), we increase the vocabulary size to 250,000 wordpieces...
...Similar unifying frameworks were proposed by Keskar et al. (2019) and McCann et al. (2018)....
...for the “Text-to-Text Transfer Transformer” (T5) model released by Raffel et al. (2020) have been used to achieve state-of-the-art results on many benchmarks (Khashabi et al., 2020; Roberts et al., 2020; Kale, 2020; Izacard and Grave, 2020; Nogueira et al., 2020; Narang et al., 2020, etc.)...