
Peter Shaw

Researcher at Ohio State University

Publications -  22
Citations -  1458

Peter Shaw is an academic researcher from Ohio State University. The author has contributed to research in topics: Parsing & Computer science. The author has an h-index of 8 and has co-authored 17 publications receiving 936 citations. Previous affiliations of Peter Shaw include Google.

Papers
Proceedings ArticleDOI

Self-Attention with Relative Position Representations

TL;DR: This article extended the self-attention mechanism to consider representations of the relative positions, or distances, between sequence elements, and showed that combining relative and absolute position representations yields no further improvement in translation quality.
Posted Content

Self-Attention with Relative Position Representations

TL;DR: This work presents an alternative approach, extending the self-attention mechanism to efficiently consider representations of the relative positions, or distances, between sequence elements, evaluated on the WMT 2014 English-to-German and English-to-French translation tasks.
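The mechanism described in the two entries above (Shaw et al., "Self-Attention with Relative Position Representations") adds learned embeddings for clipped relative distances to the keys and values of standard self-attention. A minimal NumPy sketch of that formulation follows; all variable names here (`rel_k`, `rel_v`, `max_dist`) are illustrative, not from the paper's code.

```python
import numpy as np

def relative_self_attention(x, wq, wk, wv, rel_k, rel_v, max_dist):
    """Single-head self-attention with relative position representations.

    x: (n, d) input sequence; wq, wk, wv: (d, d) projections.
    rel_k, rel_v: (2*max_dist + 1, d) learned embeddings for clipped
    relative distances, added to keys and values respectively.
    """
    n, d = x.shape
    q, k, v = x @ wq, x @ wk, x @ wv
    # Clipped relative distance j - i, shifted to valid embedding indices.
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_dist, max_dist) + max_dist        # (n, n)
    a_k, a_v = rel_k[idx], rel_v[idx]                    # (n, n, d)
    # Logits: q_i . (k_j + a^K_ij) / sqrt(d)
    logits = (q @ k.T + np.einsum('id,ijd->ij', q, a_k)) / np.sqrt(d)
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                   # softmax over j
    # Output: sum_j w_ij * (v_j + a^V_ij)
    return w @ v + np.einsum('ij,ijd->id', w, a_v)
```

Clipping distances to `max_dist` is what keeps the number of relative embeddings fixed regardless of sequence length, which is the efficiency point the TL;DR alludes to.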
Posted Content

Compositional Generalization and Natural Language Variation: Can a Semantic Parsing Approach Handle Both?

TL;DR: NQG-T5 is proposed, a hybrid model that combines a high-precision grammar-based approach with a pre-trained sequence-to-sequence model; it outperforms existing approaches across several compositional generalization challenges on non-synthetic data, while also being competitive with the state of the art on standard evaluations.
Proceedings ArticleDOI

Exploring Unexplored Generalization Challenges for Cross-Database Semantic Parsing

TL;DR: This work re-purposes eight semantic parsing datasets that have been well studied in the setting where in-domain training data is available, using them instead as additional evaluation data for cross-database semantic parsing (XSP) systems, and uncovers several generalization challenges for XSP.
Proceedings ArticleDOI

Generating Logical Forms from Graph Representations of Text and Entities

TL;DR: This paper used a Graph Neural Network (GNN) architecture to incorporate information about relevant entities and their relations during parsing; combined with a decoder copy mechanism, this approach provides a conceptually simple way to generate logical forms with entities.