
Xi Victoria Lin

Publications -  12
Citations -  881

Xi Victoria Lin is an academic researcher. The author has contributed to research in the topics of Computer science and Paleontology. The author has an h-index of 7 and has co-authored 12 publications receiving 881 citations.

Papers
Journal Article

OPT: Open Pre-trained Transformer Language Models

TL;DR: This work presents Open Pre-trained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters, which they aim to fully and responsibly share with interested researchers.
Proceedings ArticleDOI

Lifting the Curse of Multilinguality by Pre-training Modular Transformers

TL;DR: The authors pre-train the modules of cross-lingual modular models from the start, which not only mitigates the negative interference between languages, but also enables positive transfer.
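The core idea above can be sketched minimally: shared parameters are combined with a per-language module that exists from the beginning of pre-training, so each language's module is updated only on that language's data. This is an illustrative toy in NumPy under assumed names (`shared_W`, `lang_modules`, `forward`); it is not the paper's architecture or code.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden dimension

# Parameters shared across all languages.
shared_W = rng.normal(size=(d, d))

# One module per language, present from the start of pre-training,
# so languages do not compete for the same capacity.
lang_modules = {lang: rng.normal(size=(d, d)) for lang in ["en", "de", "sw"]}

def forward(x: np.ndarray, lang: str) -> np.ndarray:
    """Apply the shared transformation, then route through the
    language-specific module for `lang`."""
    h = np.tanh(x @ shared_W)
    return h @ lang_modules[lang]

x = rng.normal(size=(1, d))
out_en = forward(x, "en")  # same shared weights, different language module
out_de = forward(x, "de")
```

Because only `lang_modules[lang]` receives gradients for a given language's batch, negative interference between languages is confined to the shared weights, while the shared weights still allow positive transfer.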
Proceedings Article

Few-shot Learning with Multilingual Generative Language Models

TL;DR: The authors train multilingual generative language models on a corpus covering a diverse set of languages, and study their few- and zero-shot learning capabilities in a wide range of tasks, including commonsense reasoning and natural language inference.
Journal ArticleDOI

LEVER: Learning to Verify Language-to-Code Generation with Execution

TL;DR: LEVER learns to verify generated programs using their execution results, training verifiers to determine whether a program sampled from an LLM is correct based on the natural language input, the program itself, and its execution result.
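The verification-and-reranking loop described above can be sketched as follows: each candidate program carries its generator log-probability and its execution result, a verifier scores the probability that the candidate is correct, and the two scores are combined to pick a winner. This is a hedged illustration only; `Candidate`, `verifier_prob`, and `rerank` are hypothetical names, and the heuristic verifier stands in for the trained classifier the paper actually uses.

```python
import math
from dataclasses import dataclass

@dataclass
class Candidate:
    program: str       # program sampled from the code LLM
    lm_logprob: float  # log-probability under the generator
    exec_result: str   # result of executing the program (or an error string)

def verifier_prob(nl_query: str, cand: Candidate) -> float:
    """Stand-in for a learned verifier estimating P(correct | query,
    program, execution result). A trivial heuristic replaces the
    trained model here: executions that error are heavily penalized."""
    if cand.exec_result.startswith("Error"):
        return 0.05
    overlap = any(tok in cand.program.lower() for tok in nl_query.lower().split())
    return 0.9 if overlap else 0.5

def rerank(nl_query: str, candidates: list[Candidate]) -> Candidate:
    """Combine generator likelihood with the verifier's log-probability
    and return the highest-scoring candidate."""
    return max(candidates,
               key=lambda c: c.lm_logprob + math.log(verifier_prob(nl_query, c)))

query = "count rows in table t"
cands = [
    Candidate("SELECT COUNT(*) FROM t", lm_logprob=-1.2, exec_result="42"),
    Candidate("SELECT * FROM t", lm_logprob=-0.8, exec_result="Error: wrong shape"),
]
best = rerank(query, cands)  # execution feedback overrides the raw LM score
```

Note that the second candidate has the higher generator probability but fails at execution time; combining the verifier signal lets the correct program win, which is the point of verifying with execution results rather than trusting likelihood alone.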
Journal ArticleDOI

FOLIO: Natural Language Reasoning with First-Order Logic

TL;DR: The results show that one of the most capable publicly available Large Language Models (LLMs), GPT-3 davinci, achieves only slightly better than random performance with few-shot prompting on a subset of FOLIO, and the model is especially poor at predicting the correct truth values for False and Unknown conclusions.