
Kyunghyun Cho

Researcher at New York University

Publications: 351
Citations: 116,609

Kyunghyun Cho is an academic researcher from New York University. He has contributed to research on topics including machine translation and recurrent neural networks. He has an h-index of 77 and has co-authored 316 publications receiving 94,919 citations. Previous affiliations of Kyunghyun Cho include Facebook and Université de Montréal.

Papers
Proceedings ArticleDOI

Augmentation for small object detection

TL;DR: This work analyzes why small objects are disproportionately hard to detect on MS COCO and proposes augmentation strategies to address this: oversampling images that contain small objects, and copy-pasting small objects multiple times within an image to increase the number of matched anchors during training.
Posted Content

Multi-Stage Document Ranking with BERT.

TL;DR: This work proposes two variants of BERT, called monoBERT and duoBERT, that formulate the ranking problem as pointwise and pairwise classification, respectively, arranged in a multi-stage ranking architecture to form an end-to-end search system.
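The two-stage architecture described above can be sketched without the actual BERT models. In this minimal, hypothetical sketch, `mono_score` stands in for a pointwise relevance model (monoBERT's role) and `duo_score` stands in for a pairwise preference model (duoBERT's role); both are toy term-overlap functions, not trained models, and the pairwise scores are combined with a simple sum aggregation.

```python
# Toy multi-stage ranking pipeline in the spirit of monoBERT/duoBERT.
# The scoring functions are hypothetical stand-ins, not real BERT models.

def mono_score(query, doc):
    # Stand-in pointwise relevance score: fraction of query terms in doc.
    q, d = set(query.split()), set(doc.split())
    return len(q & d) / max(len(q), 1)

def duo_score(query, doc_i, doc_j):
    # Stand-in pairwise preference: probability doc_i beats doc_j.
    s_i, s_j = mono_score(query, doc_i), mono_score(query, doc_j)
    return 0.5 if s_i == s_j else float(s_i > s_j)

def multi_stage_rank(query, candidates, k=3):
    # Stage 1 (pointwise): score every candidate, keep the top k.
    top_k = sorted(candidates, key=lambda d: mono_score(query, d),
                   reverse=True)[:k]
    # Stage 2 (pairwise): aggregate each doc's preferences over the rest.
    agg = {d: sum(duo_score(query, d, other)
                  for other in top_k if other != d)
           for d in top_k}
    return sorted(top_k, key=agg.get, reverse=True)

docs = [
    "neural machine translation with attention",
    "document ranking with neural models",
    "cooking recipes for pasta",
    "ranking documents using neural machine models",
]
ranked = multi_stage_rank("neural document ranking", docs, k=3)
```

The key design point carried over from the paper's architecture is that the expensive pairwise stage only sees the small candidate set produced by the cheaper pointwise stage, which is what makes the pipeline practical end to end.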
Posted Content

Efficient Character-level Document Classification by Combining Convolution and Recurrent Layers

TL;DR: A neural network architecture that utilizes both convolution and recurrent layers to efficiently encode character inputs is proposed and validated on eight large scale document classification tasks and compared with character-level convolution-only models.
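The combination described above can be illustrated with a dependency-free sketch: a convolution with pooling first shortens the long character sequence into fewer, wider features, and a recurrent layer then reads those features into a fixed-size document encoding. All weights here are fixed toy values (hypothetical), not learned parameters.

```python
# Minimal sketch of a character-level conv + recurrent encoder.
# Toy fixed "weights" throughout; illustrates the data flow only.
import math

def char_embed(text, dim=4):
    # Deterministic stand-in character embedding (no learned table).
    return [[math.sin(ord(c) * (j + 1)) for j in range(dim)] for c in text]

def conv1d_maxpool(seq, width=3, stride=2):
    # Sliding-window average as a stand-in convolution, then a strided
    # subsample: this plays the "shorten the sequence cheaply" role.
    dim = len(seq[0])
    windows = [
        [sum(seq[i + k][j] for k in range(width)) / width for j in range(dim)]
        for i in range(len(seq) - width + 1)
    ]
    return windows[::stride]

def simple_rnn(seq, dim=4):
    # Minimal recurrent layer: h_t = tanh(x_t + h_{t-1}), elementwise.
    h = [0.0] * dim
    for x in seq:
        h = [math.tanh(x_j + h_j) for x_j, h_j in zip(x, h)]
    return h  # final hidden state = fixed-size document encoding

doc = "character-level models read raw text"
features = conv1d_maxpool(char_embed(doc))
encoding = simple_rnn(features)
```

Because the recurrent layer runs over the pooled features rather than every character, it processes a sequence several times shorter than the raw input, which is the efficiency argument behind combining the two layer types.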
Journal ArticleDOI

Evaluation of Combined Artificial Intelligence and Radiologist Assessment to Interpret Screening Mammograms

Thomas Schaffter, +74 more
TL;DR: This diagnostic accuracy study evaluates whether artificial intelligence can overcome human mammography interpretation limits with a rigorous, unbiased evaluation of machine learning algorithms.
Proceedings ArticleDOI

Zero-Resource Translation with Multi-Lingual Neural Machine Translation

TL;DR: The authors propose a finetuning algorithm for the recently introduced multi-way, multilingual NMT model that enables zero-resource machine translation, and empirically show that the finetuned model can translate a zero-resource language pair as well as a single-pair neural translation model trained with up to 1M direct parallel sentences.