scispace - formally typeset

Oladimeji Farri

Researcher at Philips

Publications: 69
Citations: 1423

Oladimeji Farri is an academic researcher at Philips. The author has contributed to research in the topics of deep learning and artificial neural networks, has an h-index of 18, and has co-authored 69 publications receiving 1,163 citations. Previous affiliations of Oladimeji Farri include the University of Minnesota.

Papers
Proceedings Article

Neural Paraphrase Generation with Stacked Residual LSTM Networks

TL;DR: The authors propose a stacked residual LSTM network for paraphrase generation, which adds residual connections between LSTM layers for efficient training, and achieve state-of-the-art performance on three different datasets: PPDB, WikiAnswers, and MSCOCO.
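The core idea in the TL;DR above — stacking LSTM layers with residual (identity) connections between them — can be sketched in plain NumPy. This is a toy, hedged illustration with random untrained weights, not the authors' implementation; the names `lstm_layer` and `stacked_residual_lstm` are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_layer(xs, seed=0):
    """One minimal LSTM pass over a (T, dim) sequence with toy random weights."""
    dim = xs.shape[1]
    rng = np.random.default_rng(seed)
    # One weight matrix per gate, applied to the concatenation [x; h].
    W = {g: rng.standard_normal((dim, 2 * dim)) * 0.1 for g in "ifoc"}
    h = np.zeros(dim)
    c = np.zeros(dim)
    outs = []
    for x in xs:
        z = np.concatenate([x, h])
        i = sigmoid(W["i"] @ z)              # input gate
        f = sigmoid(W["f"] @ z)              # forget gate
        o = sigmoid(W["o"] @ z)              # output gate
        c = f * c + i * np.tanh(W["c"] @ z)  # cell state update
        h = o * np.tanh(c)
        outs.append(h)
    return np.stack(outs)

def stacked_residual_lstm(xs, n_layers=4):
    """Stack LSTM layers, adding each layer's input to its output."""
    h = xs
    for layer in range(n_layers):
        # The residual (identity) connection lets the signal bypass each layer,
        # which is what makes training deeper LSTM stacks tractable.
        h = lstm_layer(h, seed=layer) + h
    return h
```

Because the residual path requires matching shapes, every layer here keeps the same hidden dimension as its input.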
Journal ArticleDOI

Comparative effectiveness of convolutional neural network (CNN) and recurrent neural network (RNN) architectures for radiology text report classification.

TL;DR: Investigates cutting-edge deep learning methods for information extraction from free-text medical imaging reports at a multi-institutional scale, comparing them to a state-of-the-art domain-specific rule-based system (PEFinder) and traditional machine learning methods (SVM and AdaBoost). The results suggest the feasibility of broader use of neural network models for automated classification of multi-institutional imaging text reports.
Proceedings ArticleDOI

Adverse Drug Event Detection in Tweets with Semi-Supervised Convolutional Neural Networks

TL;DR: This work builds several semi-supervised convolutional neural network models for ADE classification in tweets, leveraging different types of unlabeled data in developing the models.
Posted Content

DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference

TL;DR: A novel dependent reading bidirectional LSTM network (DR-BiLSTM) is proposed to efficiently model the relationship between a premise and a hypothesis during encoding and inference in the natural language inference (NLI) task.
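The dependent-reading idea can be illustrated with a deliberately simplified sketch: a toy unidirectional tanh RNN stands in for the paper's BiLSTM, and the final state from reading one sentence initializes the encoder of the other. All names (`rnn_read`, `dependent_reading`) and the weight setup are hypothetical, not the authors' code.

```python
import numpy as np

def rnn_read(xs, h0, W, U):
    """Simple tanh RNN pass over a (T, in_dim) sequence; returns all hidden
    states and the final state."""
    h = h0
    hs = []
    for x in xs:
        h = np.tanh(W @ x + U @ h)
        hs.append(h)
    return np.stack(hs), h

def dependent_reading(premise, hypothesis, dim=8, seed=0):
    """Encode each sentence conditioned on a first read of the other,
    following the dependent-reading idea (toy random weights)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((dim, premise.shape[1])) * 0.1
    U = rng.standard_normal((dim, dim)) * 0.1
    zero = np.zeros(dim)
    # Read the hypothesis first; its final state seeds the premise encoder.
    _, h_hyp = rnn_read(hypothesis, zero, W, U)
    p_enc, _ = rnn_read(premise, h_hyp, W, U)
    # Symmetrically, a first read of the premise seeds the hypothesis encoder.
    _, h_pre = rnn_read(premise, zero, W, U)
    h_enc, _ = rnn_read(hypothesis, h_pre, W, U)
    return p_enc, h_enc
```

The two encodings produced this way are each conditioned on the other sentence, which is the dependency the DR-BiLSTM abstract describes; the paper's bidirectionality and inference layers are omitted here for brevity.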
Posted Content

Neural Paraphrase Generation with Stacked Residual LSTM Networks

TL;DR: This work is the first to explore deep learning models for paraphrase generation, using a stacked residual LSTM network that adds residual connections between LSTM layers for efficient training of deep LSTMs.