Open Access Journal Article

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

TLDR
This article introduced a unified framework that converts all text-based language problems into a text-to-text format and compared pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.
Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
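The framework described in the abstract reduces every task to one interface: feed the model text, train it to emit text, and signal the task with a short prefix. Below is a minimal sketch of that text-to-text framing, loading a released T5 checkpoint through the Hugging Face transformers library (an assumption: the authors' own released code is TensorFlow-based, so this is an illustration rather than the paper's implementation). The task prefixes are taken from the paper.

```python
# Minimal sketch of T5's text-to-text framing via Hugging Face transformers.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Every task becomes "input text -> output text", selected by a task prefix.
examples = [
    "translate English to German: The house is wonderful.",   # translation
    "summarize: Transfer learning, where a model is first "
    "pre-trained on a data-rich task, has emerged as a "
    "powerful technique in natural language processing.",     # summarization
    "cola sentence: The course is jumping well.",              # acceptability classification
]

for text in examples:
    input_ids = tokenizer(text, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because inputs and outputs are both plain strings, the same architecture, loss, and decoding procedure cover translation, summarization, and classification alike; only the prefix changes.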


Citations
Journal Article (DOI)

Deep Episodic Memory for Verbalization of Robot Experience

TL;DR: In this paper, an LSTM-based episodic memory auto-encoder is used to verbalize robot experience in natural language, and the model is evaluated on simulated and real data from robot execution examples.
Posted Content

CoRT: Complementary Rankings from Transformers

TL;DR: The authors propose CoRT, a simple neural first-stage ranking model that leverages contextual representations from pretrained language models such as BERT to complement term-based ranking functions while causing no significant delay at query time.
Posted Content

MultiEURLEX -- A multi-lingual and multi-label legal document classification dataset for zero-shot cross-lingual transfer

TL;DR: The MultiEURLEX dataset contains 65k European Union (EU) laws annotated with multiple labels from the EUROVOC taxonomy, and serves as a testbed for zero-shot cross-lingual transfer.
Posted Content

MSG-Transformer: Exchanging Local Spatial Information by Manipulating Messenger Tokens.

TL;DR: Zhang et al. propose a specialized token for each region that serves as a messenger (MSG) token, flexibly exchanging visual information across regions while reducing computational complexity.
Proceedings Article

Learning to Evaluate Translation Beyond English: BLEURT Submissions to the WMT Metrics 2020 Shared Task

TL;DR: The authors extend BLEURT beyond English and evaluate it on 14 language pairs for which fine-tuning data is available, as well as 4 “zero-shot” language pairs for which no labelled examples exist.
Trending Questions (1)
What are the limitations of transfer learning with a unified text-to-text transformer?

The paper does not mention the limitations of transfer learning with a unified text-to-text transformer.