
Luke Zettlemoyer

Researcher at Facebook

Publications: 344
Citations: 65,369

Luke Zettlemoyer is an academic researcher at Facebook. He has contributed to research in the topics of computer science and parsing, has an h-index of 82, and has co-authored 278 publications receiving 40,896 citations. His previous affiliations include Princeton University and the Massachusetts Institute of Technology.

Papers
Proceedings Article

Whose Language Counts as High Quality? Measuring Language Ideologies in Text Data Selection

TL;DR: This paper argues that privileging any corpus as "high quality" entails a language ideology, and that more care, transparency, and justification are needed when deciding which texts to include in or exclude from language model training corpora.
Proceedings ArticleDOI

E3: Entailment-driven Extracting and Editing for Conversational Machine Reading

TL;DR: This paper proposes the Entailment-driven Extract and Edit network (E3), which extracts decision rules from procedural text while reasoning about which rules are entailed by the conversational history and which still need to be edited into clarification questions for the user.
Journal ArticleDOI

Scaling Laws for Generative Mixed-Modal Language Models

TL;DR: In this article, the synergy and competition between modalities arising from data and model size are modeled as an additive term extending previous uni-modal scaling laws, unifying the contributions of the individual modalities with the interactions between them.
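The additive structure described above can be sketched as follows. This is a hedged reconstruction: the uni-modal term follows the widely used Chinchilla-style form, and the interaction term C_{i,j} and its arguments are assumptions for illustration, not the paper's exact parameterization.

```latex
% Standard uni-modal scaling law for modality $i$ (Chinchilla-style form;
% $N$ = model size, $D_i$ = data for modality $i$):
\mathcal{L}_i(N, D_i) = E_i + \frac{A_i}{N^{\alpha_i}} + \frac{B_i}{D_i^{\beta_i}}

% Mixed-modal loss for modalities $i, j$: the uni-modal terms plus an
% additive interaction term capturing synergy or competition
% (hypothetical form of $C_{i,j}$, assumed for illustration):
\mathcal{L}_{i,j}(N, D_i, D_j) = \mathcal{L}_i(N, D_i) + \mathcal{L}_j(N, D_j)
  + C_{i,j}(N, D_i, D_j)
```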
Posted Content

Intrinsic Dimensionality Explains the Effectiveness of Language Model Fine-Tuning

TL;DR: The authors empirically show that common pre-trained models have a very low intrinsic dimension; in other words, there exists a low-dimensional reparameterization that is as effective for fine-tuning as the full parameter space.
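The reparameterization idea can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the full weight vector, the intrinsic dimension `d`, and the plain Gaussian projection `P` are all stand-in assumptions (the paper works with real pre-trained transformers and structured projections).

```python
import numpy as np

rng = np.random.default_rng(0)

D = 10_000  # full parameter count (toy stand-in for a pre-trained model)
d = 20      # assumed intrinsic dimension; the paper searches over this

theta_0 = rng.normal(size=D)                # frozen pre-trained weights
P = rng.normal(size=(D, d)) / np.sqrt(d)    # fixed random projection

def reparameterized(theta_d):
    # Only theta_d (d parameters) is trained; the full D-dimensional
    # weights are recovered as an offset from the pre-trained point.
    return theta_0 + P @ theta_d

# At initialization (theta_d = 0) the model equals the pre-trained one;
# fine-tuning then optimizes only the d-dimensional theta_d.
theta = reparameterized(np.zeros(d))
```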