Open Access Journal Article

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

TLDR
This article introduced a unified framework that converts all text-based language problems into a text-to-text format and compared pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.
Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
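To make the text-to-text framing concrete, here is a minimal sketch, assuming the Hugging Face transformers library and the publicly released t5-small checkpoint (neither is named on this page): every task is expressed as an input string with a task prefix, and the model always answers with an output string.

```python
# Minimal sketch of the text-to-text framing, assuming the Hugging Face
# transformers library and the released "t5-small" checkpoint.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Each task is expressed as plain text with a task prefix; the model always
# produces text, whether the task is translation, summarization, or
# classification.
examples = [
    "translate English to German: The house is wonderful.",
    "summarize: Transfer learning, where a model is first pre-trained on a "
    "data-rich task before being fine-tuned on a downstream task, has emerged "
    "as a powerful technique in natural language processing.",
    "cola sentence: The book was written by.",  # grammatical acceptability
]

for text in examples:
    input_ids = tokenizer(text, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because inputs and outputs are always plain text, the same model, loss, and decoding procedure serve translation, summarization, and classification alike.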



Citations
Proceedings ArticleDOI

Text-to-SQL in the Wild: A Naturally-Occurring Dataset Based on Stack Exchange Data

TL;DR: SEDE, the dataset introduced in this paper, contains 12,023 pairs of utterances and SQL queries collected from real usage on the Stack Exchange website; it exhibits a variety of real-world challenges that have rarely been reflected in other semantic parsing datasets.
Book ChapterDOI

Paraphrasing Academic Text: A Study of Back-Translating Anatomy and Physiology with Transformers

TL;DR: The authors explored a general approach to paraphrase generation with a pre-trained seq2seq model fine-tuned on a back-translated anatomy and physiology textbook, and found that the resulting paraphrase models generally preserved meaning, grammaticality, and fluency.
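As an illustration of the back-translation step described above, here is a hedged sketch assuming the Hugging Face transformers library and the Helsinki-NLP MarianMT English-German checkpoints; the cited study's actual models and pipeline are not specified on this page.

```python
# Hedged sketch of back-translation for paraphrase generation, assuming the
# Hugging Face transformers library and MarianMT checkpoints; the cited
# study's exact setup is not given here.
from transformers import MarianMTModel, MarianTokenizer

def load(name):
    return MarianTokenizer.from_pretrained(name), MarianMTModel.from_pretrained(name)

en_de_tok, en_de = load("Helsinki-NLP/opus-mt-en-de")  # English -> German
de_en_tok, de_en = load("Helsinki-NLP/opus-mt-de-en")  # German -> English

def paraphrase(sentence: str) -> str:
    # Round-trip translation: translating out of and back into English
    # tends to yield a fluent paraphrase of the original sentence.
    de_ids = en_de.generate(**en_de_tok(sentence, return_tensors="pt"))
    german = en_de_tok.decode(de_ids[0], skip_special_tokens=True)
    en_ids = de_en.generate(**de_en_tok(german, return_tensors="pt"))
    return de_en_tok.decode(en_ids[0], skip_special_tokens=True)

print(paraphrase("The heart pumps oxygenated blood through the arteries."))
```

Pairs of original and round-tripped sentences produced this way can then be used to fine-tune a seq2seq paraphrase model, which is the general recipe the TL;DR describes.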
Journal ArticleDOI

Braid: Weaving Symbolic and Neural Knowledge into Coherent Logical Explanations

TL;DR: Braid, as presented in this paper, is a logical reasoner that supports probabilistic rules and uses custom unification functions and dynamic rule generation to overcome the brittle matching and knowledge-gap problems prevalent in traditional reasoners.
Proceedings ArticleDOI

Evolutionary Hyperparameter Optimisation for Sentence Classification

TL;DR: This article used a simple genetic algorithm to optimise the hyperparameters of three different architectures, evaluated their performance on a suite of sentence classification benchmarks, and found that a single genetic algorithm can optimise a variety of architectures.
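The following is a minimal sketch of the kind of genetic algorithm described above, with a hypothetical evaluate() fitness function standing in for training and validating a sentence classifier; the paper's actual search space and GA settings are not given on this page.

```python
# Minimal sketch of evolutionary hyperparameter optimisation. The search
# space, GA settings, and evaluate() function are illustrative assumptions.
import random

SEARCH_SPACE = {
    "learning_rate": [1e-5, 3e-5, 1e-4, 3e-4],
    "batch_size": [8, 16, 32, 64],
    "dropout": [0.0, 0.1, 0.2, 0.3],
}

def random_individual():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(ind, rate=0.3):
    # With probability `rate`, resample each hyperparameter from its range.
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def crossover(a, b):
    # Uniform crossover: each hyperparameter comes from one of the parents.
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def evaluate(ind):
    # Placeholder fitness; in practice this would train the sentence
    # classifier with these hyperparameters and return validation accuracy.
    return random.random()

def evolve(pop_size=10, generations=5):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]  # simple truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=evaluate)

print(evolve())
```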
Trending Questions (1)
What are the limitations of transfer learning with a unified text-to-text transformer?

The paper does not mention the limitations of transfer learning with a unified text-to-text transformer.