Open Access Journal Article

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

TL;DR
This article introduced a unified framework that converts all text-based language problems into a text-to-text format and compared pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks.
Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts all text-based language problems into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled data sets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our data set, pre-trained models, and code.
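The unifying idea is that every task, from translation to classification, is cast as feeding the model text and training it to generate target text. As a minimal illustration (not part of this page), the sketch below uses the publicly released t5-small checkpoint through the Hugging Face transformers library; the task prefixes follow the conventions described in the paper.

```python
# Minimal sketch of the text-to-text format, assuming the public "t5-small"
# checkpoint and the Hugging Face transformers library (not from this page).
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Different tasks are distinguished only by a textual prefix on the input;
# the model always maps an input string to an output string.
examples = [
    "translate English to German: The house is wonderful.",     # translation
    "summarize: Transfer learning, where a model is first pre-trained on a "
    "data-rich task before being fine-tuned on a downstream task, has emerged "
    "as a powerful technique in natural language processing.",  # summarization
    "cola sentence: The course is jumping well.",               # acceptability classification
]

for text in examples:
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```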



Citations
Posted Content

Generative Context Pair Selection for Multi-hop Question Answering.

TL;DR: This article proposed a generative context selection model for multi-hop question answering that reasons about how the given question could have been generated from a context pair, and reports answering performance better than the previous state of the art.
Proceedings ArticleDOI

Mention Flags (MF): Constraining Transformer-based Text Generators

TL;DR: The authors propose mention flags (MF), which trace whether lexical constraints have been satisfied in the outputs generated by an S2S decoder, and achieve state-of-the-art performance on the Common Sense Generation (CommonGen), End-to-End Restaurant Dialog (E2ENLG), and Novel Object Captioning (nocaps) tasks.
Posted Content

MergeBERT: Program Merge Conflict Resolution via Neural Transformers.

TL;DR: MergeBERT as mentioned in this paper is a neural program merge framework based on token-level three-way differencing and a transformer encoder model; it can perform program merging in a multilingual setting covering the Java, JavaScript, TypeScript, and C# programming languages.

PoinT-5: Pointer Network and T-5 based Financial Narrative Summarisation

TL;DR: In this article, a combination of the Pointer Network and T5 (Text-to-Text Transfer Transformer) algorithms is used: the Pointer Network extracts important narrative sentences from the report, and T5 then paraphrases each extracted sentence into a concise yet informative sentence.
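As a rough illustration of this extract-then-paraphrase idea (this is not the PoinT-5 implementation), the sketch below replaces the Pointer Network with a hypothetical length-based sentence scorer and uses a public T5 checkpoint via the Hugging Face transformers library for the paraphrasing step.

```python
# Hedged sketch of an extract-then-paraphrase pipeline; the extractive step is
# a hypothetical stand-in for the Pointer Network described above.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def extract_sentences(report_sentences, k=2):
    """Hypothetical stand-in for the Pointer Network: keep the k longest sentences."""
    return sorted(report_sentences, key=len, reverse=True)[:k]

def paraphrase(sentence):
    """Compress an extracted sentence using T5's summarization prefix."""
    inputs = tokenizer("summarize: " + sentence, return_tensors="pt")
    ids = model.generate(**inputs, max_new_tokens=30)
    return tokenizer.decode(ids[0], skip_special_tokens=True)

report = [
    "Revenue grew 12% year over year, driven primarily by the retail segment.",
    "The board declared an interim dividend of 10 pence per share.",
    "Operating costs rose due to one-off restructuring charges in the third quarter.",
    "The annual general meeting will be held in London next May.",
]
print([paraphrase(s) for s in extract_sentences(report)])
```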
Posted Content

Logic-Consistency Text Generation from Semantic Parses

TL;DR: This paper proposed a framework for logic-consistent text generation from semantic parses that employs an iterative training procedure, recursively augmenting the training set with quality control, and introduced a novel automatic metric, BLEC, for evaluating the logical consistency between semantic parses and generated texts.
Trending Questions (1)
What are the limitations of transfer learning with a unified text-to-text transformer?

The paper does not mention the limitations of transfer learning with a unified text-to-text transformer.