Open Access · Journal Article
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
TL;DR: This article introduces a unified framework that converts all text-based language problems into a text-to-text format, and compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.
Abstract:
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
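The text-to-text framing described in the abstract can be sketched in a few lines: every task, whether translation, classification, or summarization, is serialized into an (input text, target text) pair, so one model with one objective handles all of them. The task prefixes below follow the convention used by T5; the helper function and its name are illustrative, not part of the released code.

```python
# Minimal sketch of the text-to-text framing: each task instance becomes
# an (input, target) pair of plain strings. Task prefixes ("translate
# English to German:", "cola sentence:", "summarize:") are the conventions
# described in the paper; the function itself is a hypothetical helper.

def to_text_to_text(task: str, **fields) -> tuple[str, str]:
    """Serialize a task instance into an (input, target) text pair."""
    if task == "translation":
        return f"translate English to German: {fields['source']}", fields["target"]
    if task == "classification":
        # Even classification targets are text: the label *word*,
        # not a class index, so the same decoder handles every task.
        return f"cola sentence: {fields['sentence']}", fields["label"]
    if task == "summarization":
        return f"summarize: {fields['document']}", fields["summary"]
    raise ValueError(f"unknown task: {task}")

inp, tgt = to_text_to_text(
    "classification", sentence="The course is jumping well.", label="unacceptable"
)
```

Because both inputs and targets are raw text, the same maximum-likelihood training loop works unchanged across tasks; only the serialization differs.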
Citations
Proceedings Article · DOI
Interpreting text classifiers by learning context-sensitive influence of words
TL;DR: This work proposes MOXIE (MOdeling conteXt-sensitive InfluencE of words), which aims to give users a richer interface for interacting with the model being interpreted and to produce testable predictions. With MOXIE, the authors aim to predict importance scores, counterfactuals, and learned biases.
Proceedings Article · DOI
Could you give me a hint? Generating inference graphs for defeasible reasoning
TL;DR: The authors automatically generate inference graphs for defeasible reasoning through transfer learning from another NLP task that shares the kind of reasoning inference graphs support, and find that human accuracy on the task improves by 20% when consulting the generated graphs.
Posted Content
Optimizing Transformer for Low-Resource Neural Machine Translation
Ali Araabi, Christof Monz +1 more
TL;DR: This paper showed that a Transformer model optimized for low-resource conditions improves translation quality by up to 7.3 BLEU points compared to the Transformer default settings.
Posted Content
Relational World Knowledge Representation in Contextual Language Models: A Review
Tara Safavi, Danai Koutra +1 more
TL;DR: This article proposes a taxonomy for relational knowledge representation in contextual LMs based on the level of knowledge supervision provided, covering both works that probe LMs for implicit relational knowledge acquired during self-supervised pretraining on unstructured text alone, and works that explicitly supervise LMs at the level of entities and relations.
Proceedings Article · DOI
Applying the T5 language model and duration units normalization to address temporal common sense understanding on the MCTACO dataset
TL;DR: The approach, called T5NCSU (T5 Normalization Common Sense Understanding), relies on preprocessing techniques such as duration units normalization together with the recently released T5 text-to-text pre-trained language model, and achieves state-of-the-art results on the MCTACO dataset leaderboard.
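The "duration units normalization" mentioned in the T5NCSU summary can be illustrated with a small sketch: duration mentions in text are rewritten into a single canonical unit before being fed to the model, so the model compares durations on one scale. The rule set and the choice of minutes as the canonical unit are assumptions for illustration, not the authors' exact preprocessing.

```python
import re

# Hypothetical sketch of duration units normalization: rewrite duration
# mentions into a single canonical unit (minutes here). The conversion
# table and regex are illustrative, not taken from the T5NCSU paper.

_MINUTES_PER = {"second": 1 / 60, "minute": 1, "hour": 60, "day": 1440, "week": 10080}
_PATTERN = re.compile(r"(\d+(?:\.\d+)?)\s*(second|minute|hour|day|week)s?\b")

def normalize_durations(text: str) -> str:
    """Replace each duration mention with its equivalent in minutes."""
    def repl(m: re.Match) -> str:
        minutes = float(m.group(1)) * _MINUTES_PER[m.group(2)]
        return f"{minutes:g} minutes"
    return _PATTERN.sub(repl, text)

print(normalize_durations("The meeting lasted 2 hours."))
# -> The meeting lasted 120 minutes.
```

Casting all durations to one unit means a text-to-text model like T5 never has to learn unit conversion implicitly when answering temporal common-sense questions.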