
Brian DuSell

Researcher at University of Notre Dame

Publications: 8
Citations: 11

Brian DuSell is an academic researcher at the University of Notre Dame. His research topics include nondeterministic algorithms and computer science. He has an h-index of 1 and has co-authored 5 publications receiving 2 citations.

Papers
Proceedings Article

Learning Context-free Languages with Nondeterministic Stack RNNs

TL;DR: This paper proposes a differentiable stack data structure that simultaneously encodes an exponential number of stack configurations, based on Lang's algorithm for simulating nondeterministic pushdown automata.
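
To make the contrast concrete, here is a minimal, hypothetical sketch (not the authors' code) of the naive alternative: simulating a real-time nondeterministic PDA by tracking every stack configuration explicitly. The set of live configurations can branch at each step and grow exponentially with input length, which is exactly the blow-up that the paper's Lang-style dynamic program avoids while still covering all runs. The transition table and the a^n b^n example are illustrative assumptions.

```python
# Hypothetical sketch: explicit simulation of a real-time nondeterministic PDA
# (one push/replace/pop action per input symbol). Every reachable stack is
# stored as a tuple; with genuinely nondeterministic transitions the set of
# stacks can grow exponentially, which the paper's compact encoding avoids.

# Transitions: (stack_top, input_symbol) -> set of actions.
# An action is ("push", y), ("replace", y), or ("pop",).
TRANSITIONS = {
    ("$", "a"): {("push", "A")},
    ("A", "a"): {("push", "A")},
    ("A", "b"): {("pop",)},  # match each b against a previously pushed A
}

def reachable_stacks(input_string, bottom="$"):
    # Stacks only; PDA states are omitted to keep the sketch short.
    configs = {(bottom,)}
    for symbol in input_string:
        next_configs = set()
        for stack in configs:
            for action in TRANSITIONS.get((stack[-1], symbol), ()):
                if action[0] == "push":
                    next_configs.add(stack + (action[1],))
                elif action[0] == "replace":
                    next_configs.add(stack[:-1] + (action[1],))
                elif action[0] == "pop" and len(stack) > 1:
                    next_configs.add(stack[:-1])
        configs = next_configs
    return configs

# Accept a^n b^n by requiring that some run ends with only the bottom symbol left.
print(("$",) in reachable_stacks("aaabbb"))  # True
print(("$",) in reachable_stacks("aabbb"))   # False
```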
Proceedings Article

Algorithms for Weighted Pushdown Automata

TL;DR: Novel algorithms are developed that operate directly on WPDAs. They are inspired by Lang's algorithm but use a more general definition of pushdown automaton, and they either reduce the space requirements by a factor of |Γ| (the size of the stack alphabet) or reduce the runtime by a factor of more than |Q| (the number of states).
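
As a point of reference for what such algorithms compute, the boolean simulator sketched above can be extended with weights: each run of a weighted PDA multiplies its transition weights, and the quantity of interest is the total weight of all runs on a string. The hypothetical sketch below carries a weight per explicit stack configuration; the transition weights are made up, and the point of the paper's algorithms is to compute the same totals without enumerating configurations.

```python
# Hypothetical sketch, extending the boolean simulator above with weights:
# the total weight of all runs on a string that end with an empty stack.
from collections import defaultdict

# Weighted transitions: (stack_top, input_symbol) -> list of (action, weight).
WEIGHTED_TRANSITIONS = {
    ("$", "a"): [(("push", "A"), 1.0)],
    ("A", "a"): [(("push", "A"), 0.5), (("pop",), 0.5)],  # a nondeterministic choice
    ("A", "b"): [(("pop",), 1.0)],
}

def stringsum(input_string, bottom="$"):
    weights = defaultdict(float)
    weights[(bottom,)] = 1.0
    for symbol in input_string:
        next_weights = defaultdict(float)
        for stack, w in weights.items():
            for action, tw in WEIGHTED_TRANSITIONS.get((stack[-1], symbol), []):
                if action[0] == "push":
                    next_weights[stack + (action[1],)] += w * tw
                elif action[0] == "replace":
                    next_weights[stack[:-1] + (action[1],)] += w * tw
                elif action[0] == "pop" and len(stack) > 1:
                    next_weights[stack[:-1]] += w * tw
        weights = next_weights
    return weights[(bottom,)]

print(stringsum("aabb"))  # 0.5: the weight of the single run that empties the stack
```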
Proceedings Article

Efficiency through Auto-Sizing: Notre Dame NLP’s Submission to the WNGT 2019 Efficiency Task

TL;DR: This paper investigated the impact of auto-sizing on the Transformer network with the goal of substantially reducing the number of parameters in the model, and was able to eliminate more than 25% of the model's parameters while suffering a decrease of only 1.1 BLEU.
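
For readers unfamiliar with auto-sizing, the mechanism behind the parameter reduction is a group regularizer applied via a proximal step, so that entire rows (neurons) of a weight matrix are driven exactly to zero and can then be deleted. The numpy sketch below is a hypothetical illustration of the l2,1 (group-lasso) variant of that step, not the authors' implementation; the layer shape and regularizer strength are invented for the example.

```python
# Hypothetical numpy sketch of an auto-sizing-style proximal step (l2,1 /
# group-lasso variant): after an ordinary gradient update, each row of a
# weight matrix is shrunk toward zero and set exactly to zero when its norm
# is small enough; zero rows are neurons that can be pruned from the model.
import numpy as np

def prox_group_l2(W, strength):
    """Row-wise block soft-thresholding: r <- r * max(0, 1 - strength / ||r||)."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - strength / np.maximum(norms, 1e-12))
    return W * scale

rng = np.random.default_rng(0)
# Stand-in for a Transformer feed-forward layer; rows vary in magnitude.
W = rng.normal(size=(512, 2048)) * rng.uniform(0.0, 0.2, size=(512, 1))
W = prox_group_l2(W, strength=4.0)
dead = int(np.all(W == 0.0, axis=1).sum())
print(f"{dead} of {W.shape[0]} rows can be pruned")
```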
Proceedings Article

The Surprising Computational Power of Nondeterministic Stack RNNs

Brian DuSell
TL;DR: This paper shows that nondeterminism and the neural controller interact to produce two more unexpected abilities of the nondeterministic stack RNN, which can recognize not only context-free languages (CFLs) but also many non-context-free languages.
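
Claims like this are usually checked on formal-language probes. Purely as an illustration (the specific language here is an assumption, not necessarily one of the paper's test sets), a^n b^n c^n is a textbook non-context-free language, and a tiny generator for such a probe looks like this:

```python
# Hypothetical probe data for a textbook non-context-free language,
# a^n b^n c^n: membership requires counting across three blocks,
# which no context-free grammar can do.
import random

def sample_anbncn(max_n=20):
    n = random.randint(1, max_n)
    return "a" * n + "b" * n + "c" * n

def is_anbncn(s):
    n = len(s) // 3
    return len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

examples = [sample_anbncn() for _ in range(5)]
print(all(is_anbncn(s) for s in examples))  # True
```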
Posted Content

Efficiency through Auto-Sizing: Notre Dame NLP's Submission to the WNGT 2019 Efficiency Task.

TL;DR: This paper investigated the impact of auto-sizing on the Transformer network and was able to eliminate more than 25% of the model's parameters while suffering a decrease of only 1.1 BLEU.