Joey Liu
Researcher at Philips
Publications - 34
Citations - 1104
Joey Liu is an academic researcher at Philips whose work focuses on the topics of deep learning and question answering. The author has an h-index of 14 and has co-authored 34 publications receiving 917 citations.
Papers
Proceedings Article
Neural Paraphrase Generation with Stacked Residual LSTM Networks
Aaditya Prakash,Sadid A. Hasan,Kathy Lee,Vivek V. Datla,Ashequl Qadir,Joey Liu,Oladimeji Farri +6 more
TL;DR: The authors proposed a stacked residual LSTM network for paraphrase generation, which adds residual connections between LSTM layers for efficient training, and achieved state-of-the-art performance on three different datasets: PPDB, WikiAnswers and MSCOCO.
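A minimal PyTorch sketch of the stacking idea described above, assuming hypothetical layer sizes; this is a plain re-implementation of the residual-between-layers pattern, not the authors' released code:

```python
# Sketch of a stacked LSTM with residual connections between layers
# (hypothetical sizes; not the exact configuration from the paper).
import torch
import torch.nn as nn

class StackedResidualLSTM(nn.Module):
    def __init__(self, input_size=256, hidden_size=256, num_layers=4):
        super().__init__()
        # One single-layer LSTM per stack level so we can add the
        # residual (skip) connections between levels ourselves.
        self.layers = nn.ModuleList([
            nn.LSTM(input_size if i == 0 else hidden_size,
                    hidden_size, batch_first=True)
            for i in range(num_layers)
        ])

    def forward(self, x):
        for i, lstm in enumerate(self.layers):
            out, _ = lstm(x)
            # Residual connection: add the layer's input to its output
            # (shapes match for every layer after the first).
            x = out + x if i > 0 else out
        return x

seq = torch.randn(8, 20, 256)            # (batch, time, features)
print(StackedResidualLSTM()(seq).shape)  # torch.Size([8, 20, 256])
```

Adding the input back to each layer's output gives gradients a shortcut around every LSTM level, which is what makes training deeper stacks tractable.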
Proceedings ArticleDOI
Adverse Drug Event Detection in Tweets with Semi-Supervised Convolutional Neural Networks
Kathy Lee,Ashequl Qadir,Sadid A. Hasan,Vivek V. Datla,Aaditya Prakash,Joey Liu,Oladimeji Farri +6 more
TL;DR: This work builds several semi-supervised convolutional neural network models for ADE classification in tweets, specifically leveraging different types of unlabeled data in developing the models to address the problem.
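A minimal sketch of a convolutional text classifier in the spirit of this work; the "semi-supervised" aspect is hedged here as initializing the embedding layer from vectors pretrained on unlabeled tweets, one common way to leverage unlabeled data, and all sizes are hypothetical:

```python
# Kim-style CNN for tweet classification; embeddings are assumed to be
# pretrained on unlabeled tweets (e.g., with word2vec) and fine-tuned
# on the small labeled ADE set -- an assumption, not the paper's exact setup.
import torch
import torch.nn as nn

class TweetCNN(nn.Module):
    def __init__(self, pretrained_emb, n_classes=2, n_filters=100,
                 kernel_sizes=(3, 4, 5)):
        super().__init__()
        # Embedding matrix learned from unlabeled data, kept trainable.
        self.emb = nn.Embedding.from_pretrained(pretrained_emb, freeze=False)
        emb_dim = pretrained_emb.size(1)
        self.convs = nn.ModuleList([
            nn.Conv1d(emb_dim, n_filters, k) for k in kernel_sizes
        ])
        self.fc = nn.Linear(n_filters * len(kernel_sizes), n_classes)

    def forward(self, token_ids):                # (batch, seq_len)
        x = self.emb(token_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
        # Max-pool each convolution's feature map over the time axis.
        feats = [torch.relu(c(x)).max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(feats, dim=1))  # class logits

emb = torch.randn(5000, 300)                     # toy 5k-word vocabulary
logits = TweetCNN(emb)(torch.randint(0, 5000, (8, 40)))
print(logits.shape)                              # torch.Size([8, 2])
```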
VQA-Med: Overview of the Medical Visual Question Answering Task at ImageCLEF 2019.
TL;DR: This paper presents an overview of the Medical Visual Question Answering task (VQA-Med) at ImageCLEF 2019; the task focuses on four categories of clinical questions: Modality, Plane, Organ System, and Abnormality.
Posted Content
DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference
Reza Ghaeini,Sadid A. Hasan,Vivek V. Datla,Joey Liu,Kathy Lee,Ashequl Qadir,Yuan Ling,Aaditya Prakash,Xiaoli Z. Fern,Oladimeji Farri +9 more
TL;DR: A novel dependent reading bidirectional LSTM network (DR-BiLSTM) is proposed to efficiently model the relationship between a premise and a hypothesis during encoding and inference in the natural language inference (NLI) task.
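A minimal sketch of the "dependent reading" idea: encode one sentence first, then seed the encoder's initial states with that result when reading the other sentence, so each sentence is read conditioned on its counterpart. Sizes are hypothetical and this omits the attention and inference stages of the full DR-BiLSTM:

```python
# Dependent reading with a shared BiLSTM (assumed sharing; the full
# model in the paper has additional attention and inference components).
import torch
import torch.nn as nn

class DependentReader(nn.Module):
    def __init__(self, emb_dim=300, hidden=128):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)

    def forward(self, premise, hypothesis):  # both (batch, len, emb_dim)
        # First pass: read the hypothesis independently.
        _, h_state = self.bilstm(hypothesis)
        # Second pass: read the premise *dependent* on the hypothesis by
        # seeding the LSTM with the hypothesis's final (h, c) states.
        p_dep, _ = self.bilstm(premise, h_state)
        # Symmetric direction: read the hypothesis dependent on the premise.
        _, p_state = self.bilstm(premise)
        h_dep, _ = self.bilstm(hypothesis, p_state)
        return p_dep, h_dep                  # conditioned encodings

p = torch.randn(4, 15, 300)
h = torch.randn(4, 9, 300)
p_enc, h_enc = DependentReader()(p, h)
print(p_enc.shape, h_enc.shape)  # (4, 15, 256) (4, 9, 256)
```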
Posted Content
Neural Paraphrase Generation with Stacked Residual LSTM Networks
Aaditya Prakash,Sadid A. Hasan,Kathy Lee,Vivek V. Datla,Ashequl Qadir,Joey Liu,Oladimeji Farri +6 more
TL;DR: This work is the first to explore deep learning models for paraphrase generation with a stacked residual LSTM network, adding residual connections between LSTM layers for efficient training of deep LSTMs.