Kevin Duh

Researcher at Johns Hopkins University

Publications - 205
Citations - 6391

Kevin Duh is an academic researcher from Johns Hopkins University. He has contributed to research on topics including machine translation and parsing. He has an h-index of 38 and has co-authored 205 publications receiving 5369 citations. His previous affiliations include the University of Washington and the Nara Institute of Science and Technology.

Papers
Proceedings Article

Representation Learning Using Multi-Task Deep Neural Networks for Semantic Classification and Information Retrieval

TL;DR: This work develops a multi-task DNN for learning representations across multiple tasks, not only leveraging large amounts of cross-task data, but also benefiting from a regularization effect that leads to more general representations to help tasks in new domains.
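The central idea is a network whose lower layers are shared across tasks while each task keeps its own output head. Below is a minimal sketch of that shared-representation structure, assuming a PyTorch setup; the layer sizes, task names, and head shapes are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

class MultiTaskDNN(nn.Module):
    """Shared lower layers learn one representation; each task keeps its own head."""
    def __init__(self, vocab_size=30000, hidden=300, n_classes=2):
        super().__init__()
        # Shared layers: trained on data from every task, acting as a regularizer.
        self.shared = nn.Sequential(
            nn.EmbeddingBag(vocab_size, hidden),
            nn.Linear(hidden, hidden),
            nn.Tanh(),
        )
        # Task-specific heads, e.g. a semantic classifier and a retrieval scorer.
        self.classify_head = nn.Linear(hidden, n_classes)
        self.retrieval_head = nn.Linear(hidden, hidden)

    def forward(self, token_ids, task):
        h = self.shared(token_ids)          # shared representation
        if task == "classification":
            return self.classify_head(h)
        return self.retrieval_head(h)       # representation for similarity scoring
```

Because the shared layers receive gradients from every task, they are pushed toward representations that are useful beyond any single task, which is the regularization effect the summary refers to.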
Posted Content

DyNet: The Dynamic Neural Network Toolkit

TL;DR: DyNet is a toolkit for implementing neural network models based on dynamic declaration of network structure that has an optimized C++ backend and lightweight graph representation and is designed to allow users to implement their models in a way that is idiomatic in their preferred programming language.
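"Dynamic declaration" means the computation graph is rebuilt for every example rather than compiled once up front, which makes variable-sized or structured inputs natural to express. A small sketch of that usage style with the Python bindings, using a toy linear model and made-up data:

```python
import dynet as dy

pc = dy.ParameterCollection()
W = pc.add_parameters((2, 4))       # weights for a toy 4-dim -> 2-class model
b = pc.add_parameters((2,))
trainer = dy.SimpleSGDTrainer(pc)

toy_data = [([0.1, 0.2, 0.3, 0.4], 0), ([0.4, 0.3, 0.2, 0.1], 1)]
for features, label in toy_data:
    dy.renew_cg()                   # fresh computation graph per example:
                                    # structure is declared dynamically
    x = dy.inputVector(features)
    loss = dy.pickneglogsoftmax(W * x + b, label)
    loss.forward()
    loss.backward()
    trainer.update()
```

Because the graph is re-declared inside the loop, each example could in principle have a different structure (a different sentence length or tree shape, for instance) without padding or static-graph workarounds.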
Proceedings Article

Automatic Evaluation of Translation Quality for Distant Language Pairs

TL;DR: An automatic evaluation metric based on rank correlation coefficients modified with precision is proposed, and meta-evaluation on the NTCIR-7 PATMT JE task data shows that this metric outperforms conventional metrics.
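As a rough illustration of scoring word-order agreement with a rank correlation and tempering it with precision, here is a simplified Python sketch; the metric in the paper uses a more careful alignment and weighting scheme, so treat the function below as a toy approximation rather than the published definition.

```python
from itertools import combinations

def normalized_kendall_tau(ranks):
    """Fraction of concordant pairs in a sequence of reference positions."""
    pairs = list(combinations(range(len(ranks)), 2))
    if not pairs:
        return 0.0
    concordant = sum(1 for i, j in pairs if ranks[i] < ranks[j])
    return concordant / len(pairs)

def order_precision_score(hypothesis, reference, alpha=0.25):
    """Toy metric: rank correlation of matched word positions, scaled by unigram precision."""
    # Map each hypothesis word to its position in the reference (first match only).
    positions = [reference.index(w) for w in hypothesis if w in reference]
    if not positions:
        return 0.0
    precision = len(positions) / len(hypothesis)
    return normalized_kendall_tau(positions) * (precision ** alpha)

# Example: a hypothesis whose matched words appear in reference order scores highly.
ref = "he ate an apple in the morning".split()
hyp = "he ate apple in morning".split()
print(order_precision_score(hyp, ref))
```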
Posted Content

ReCoRD: Bridging the Gap between Human and Machine Commonsense Reading Comprehension.

TL;DR: This work presents ReCoRD, a large-scale dataset for machine reading comprehension requiring commonsense reasoning, and demonstrates that the performance of state-of-the-art MRC systems falls far behind human performance.
Proceedings Article

Compressing BERT: Studying the Effects of Weight Pruning on Transfer Learning

TL;DR: The authors explore weight pruning for BERT and find that low levels of pruning (30-40%) do not affect pre-training loss or transfer to downstream tasks at all.
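For concreteness, magnitude-based weight pruning simply zeroes the smallest-magnitude weights in a matrix. A minimal sketch in PyTorch, with the tensor size and the 30% sparsity level chosen only for illustration (the paper studies a range of pruning levels and their effect on transfer):

```python
import torch

def magnitude_prune(weight, sparsity=0.3):
    """Zero out the lowest-magnitude fraction of weights (e.g. 30%)."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return weight
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = (weight.abs() > threshold).float()
    return weight * mask

w = torch.randn(768, 768)           # e.g. one attention projection matrix
pruned = magnitude_prune(w, sparsity=0.3)
print(f"sparsity: {(pruned == 0).float().mean().item():.2f}")
```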