
Rewon Child

Researcher at OpenAI

Publications: 17
Citations: 17,691

Rewon Child is an academic researcher at OpenAI. His research focuses on topics including language models and recurrent neural networks. He has an h-index of 14 and has co-authored 15 publications receiving 4,792 citations. His previous affiliations include Baidu.

Papers
Posted Content

Scaling Laws for Neural Language Models

TL;DR: Larger models are significantly more sample-efficient, such that optimally compute-efficient training involves training very large models on a relatively modest amount of data and stopping significantly before convergence.
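The scaling-law claim rests on power-law fits of loss against scale, of the form $L(N) \approx (N_c/N)^{\alpha_N}$. A minimal sketch of recovering such an exponent from a log-log fit is below; the constants are illustrative assumptions chosen for the example, not results taken from this listing.

```python
import numpy as np

# Illustrative power-law constants (assumed for this sketch, not quoted results).
alpha_true, N_c = 0.076, 8.8e13

# Synthetic loss curve over a range of model sizes N (parameter counts).
N = np.logspace(6, 10, 20)
loss = (N_c / N) ** alpha_true

# A power law is a straight line in log-log space, so a degree-1
# polynomial fit of log(loss) against log(N) recovers the exponent.
slope, intercept = np.polyfit(np.log(N), np.log(loss), 1)
alpha_fit = -slope
```

On clean synthetic data the fitted exponent matches the generating one exactly; with real training runs the same fit is applied to measured losses.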
Posted Content

Generating Long Sequences with Sparse Transformers

TL;DR: This paper introduces sparse factorizations of the attention matrix that reduce the $O(n^2)$ cost of dense self-attention to $O(n \sqrt{n})$, generates unconditional samples that demonstrate global coherence and great diversity, and shows it is possible in principle to use self-attention to model sequences of length one million or more.
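One of the factorizations described in the paper combines a local window with a strided pattern, so each position attends to roughly $O(\sqrt{n})$ others. A minimal sketch of building such a causal sparse-attention mask is below; the function name and stride choice are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def strided_sparse_mask(n, stride):
    """Boolean mask where mask[i, j] is True if query i may attend to key j.

    Combines a causal local window (the previous `stride` positions) with a
    strided pattern (every stride-th earlier position), so each row has at
    most ~2*stride nonzeros -- O(n*sqrt(n)) total when stride ~ sqrt(n).
    """
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    causal = j <= i                         # never attend to future positions
    local = (i - j) < stride                # local sliding window
    strided = ((i - j) % stride) == 0       # fixed-stride "summary" positions
    return causal & (local | strided)

mask = strided_sparse_mask(64, 8)
```

With `n = 64` and `stride = 8`, every row has at most `2 * stride` allowed keys, versus up to `n` for dense causal attention.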