Open Access Journal Article
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
TL;DR: This article introduces a unified framework that converts all text-based language problems into a text-to-text format and compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.

Abstract:
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
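As a concrete illustration of the text-to-text format, the sketch below runs the publicly released T5 checkpoint through the Hugging Face Transformers library. The library choice and the "t5-small" checkpoint name are assumptions made here for illustration; the paper releases its own separate codebase.

# A minimal sketch of the text-to-text framing: every task is handled by
# feeding the model a task-prefixed input string and decoding an output string.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Translation, summarization, and classification all use the same interface;
# only the task prefix in the input text changes.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Classification works the same way: the model literally generates the label as a string, which is what lets a single architecture, loss, and decoding procedure cover every task in the study.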
Citations
Posted Content
A Survey of Transformers
TL;DR: A comprehensive review of various X-formers can be found in this article, where the vanilla Transformer is briefly introduced and then a new taxonomy of X-formers is proposed.
Posted Content
Learning to Evaluate Translation Beyond English: BLEURT Submissions to the WMT Metrics 2020 Shared Task
Thibault Sellam, Amy Pu, Hyung Won Chung, Sebastian Gehrmann, Qijun Tan, Markus Freitag, Dipanjan Das, Ankur P. Parikh
TL;DR: This paper describes the authors' contribution to the WMT 2020 Metrics Shared Task, the main benchmark for automatic evaluation of translation; the submissions are based on BLEURT, a previously published metric that uses transfer learning.
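For context, BLEURT's released Python package exposes a scorer that can be run over candidate translations. A minimal sketch follows; the checkpoint path is a placeholder and a BLEURT checkpoint must be downloaded separately from the google-research/bleurt repository.

# A minimal sketch of scoring candidates with the released BLEURT package
# (github.com/google-research/bleurt).
from bleurt import score

checkpoint = "path/to/bleurt_checkpoint"  # placeholder: downloaded checkpoint
references = ["The house is wonderful."]
candidates = ["The house is lovely."]

scorer = score.BleurtScorer(checkpoint)
scores = scorer.score(references=references, candidates=candidates)
print(scores)  # one learned quality score per candidate; higher is better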
Posted Content
Efficient Meta Lifelong-Learning with Limited Memory
TL;DR: This paper identifies three common principles of lifelong learning methods and proposes an efficient meta-lifelong framework that combines them synergistically, alleviating catastrophic forgetting and negative transfer at the same time.
Proceedings Article
Distilling Knowledge from Reader to Retriever for Question Answering
Gautier Izacard, Edouard Grave
TL;DR: This paper proposes a technique, inspired by knowledge distillation, for learning retriever models for downstream tasks without annotated query-document pairs; it leverages the attention scores of a reader model that solves the task from the retrieved documents.
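The core idea admits a compact formulation: treat the reader's aggregated attention over the retrieved passages as a soft teacher distribution, and train the retriever's passage scores to match it. A minimal PyTorch sketch, with illustrative names and shapes rather than the paper's code:

# Reader-to-retriever distillation objective (illustrative sketch).
import torch
import torch.nn.functional as F

def distillation_loss(retriever_scores, reader_attention):
    # retriever_scores: (n_passages,) similarity scores from the retriever,
    # e.g. dot products between query and passage embeddings (the student).
    # reader_attention: (n_passages,) attention mass the reader assigned to
    # each passage while answering; detached, since the reader is the teacher.
    student_log_probs = F.log_softmax(retriever_scores, dim=-1)
    teacher_probs = F.softmax(reader_attention.detach(), dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="sum")

# Stand-in tensors in place of real model outputs.
scores = torch.randn(10, requires_grad=True)
attention = torch.randn(10)
distillation_loss(scores, attention).backward()

No annotated query-document pairs are needed because the training signal for the retriever comes entirely from the reader.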
Proceedings Article
Self-Supervised Test-Time Learning for Reading Comprehension
TL;DR: This work considers the task of unsupervised reading comprehension and presents a method that performs “test-time learning” (TTL) on a given context (text passage), without requiring training on large-scale human-authored datasets containing context-question-answer triplets.
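One way to picture test-time learning: synthesize cloze-style question-answer pairs from the test passage itself and take a few gradient steps on them before answering real questions. In the sketch below, the one-word masking scheme and the T5 model are illustrative assumptions; the paper's actual question-generation procedure may differ.

# A hedged sketch of test-time learning (TTL) on a single passage.
import random
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

context = ("The Amazon rainforest spans nine countries. "
           "Most of the forest lies within Brazil.")

# Synthesize (question, answer) pairs: blank out one word per sentence.
pairs = []
for sentence in context.split(". "):
    words = sentence.split()
    if len(words) < 4:
        continue
    i = random.randrange(1, len(words) - 1)
    answer = words[i]
    cloze = " ".join(words[:i] + ["<extra_id_0>"] + words[i + 1:])
    pairs.append((f"question: {cloze} context: {context}", answer))

# Fine-tune on the synthetic pairs at test time, then answer real questions.
model.train()
for _ in range(3):
    for source, target in pairs:
        batch = tokenizer(source, return_tensors="pt")
        labels = tokenizer(target, return_tensors="pt").input_ids
        loss = model(**batch, labels=labels).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()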