Open Access · Journal Article

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

TLDR
This article introduced a unified framework that converts all text-based language problems into a text-to-text format and compared pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.
Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new ``Colossal Clean Crawled Corpus'', we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
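
The core idea of the framework is that every task, whether classification, regression, translation, or summarization, is cast as mapping an input string to a target string, distinguished only by a task prefix. The prefixes below are those reported in the paper; the helper function itself is a hypothetical sketch for illustration, not part of any released API.

```python
# Minimal sketch of the text-to-text framing: every task becomes an
# (input string, target string) pair, distinguished by a task prefix.
# Prefixes are those used in the paper; to_text_to_text is illustrative.

def to_text_to_text(task: str, text: str, target: str) -> tuple[str, str]:
    """Prepend the task's prefix so a single model can handle all tasks."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        # Classification targets become label words ("acceptable"/"unacceptable");
        # regression targets become rendered numbers (e.g. "3.8").
        "cola": "cola sentence: ",
        "stsb": "stsb sentence1: ",
    }
    return prefixes[task] + text, target

src, tgt = to_text_to_text("translate_en_de", "That is good.", "Das ist gut.")
print(src)  # translate English to German: That is good.
print(tgt)  # Das ist gut.
```

Because inputs and targets are both plain text, the same encoder-decoder model, loss, and decoding procedure can be reused across every task without task-specific output layers.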



Citations

A Comprehensive Assessment of Dialog Evaluation Metrics.

TL;DR: This paper provided a comprehensive assessment of dialog evaluation metrics, evaluating 23 automatic metrics on a number of datasets and in different settings to better characterize their respective strengths and weaknesses and to suggest promising directions for future work.
Posted Content

Discrete and Soft Prompting for Multilingual Models

TL;DR: This paper showed that discrete and soft prompting perform better than finetuning in two multilingual settings: crosslingual transfer and in-language training for multilingual natural language inference. It also demonstrated good performance when prompting with training data in multiple languages other than English.
Posted Content

Entity-Based Knowledge Conflicts in Question Answering

TL;DR: This article proposed a framework to mitigate over-reliance on parametric knowledge, which minimizes hallucination and improves out-of-distribution generalization by 4-7%.
Proceedings Article

Definition Modelling for Appropriate Specificity.

TL;DR: In this paper, a pre-trained encoder-decoder model, namely the Text-to-Text Transfer Transformer (T5), was proposed to model specificity in definitions.
Proceedings Article

Sentence Punctuation for Collaborative Commentary Generation in Esports Live-Streaming

TL;DR: The authors presented two strategies for punctuating the text sequences of game commentary, namely combining two or three text sequences as originally segmented by YouTube to obtain complete commentary sentences.
Trending Questions (1)
What are the limitations of transfer learning with a unified text-to-text transformer?

The paper does not mention the limitations of transfer learning with a unified text-to-text transformer.