Book Chapter (DOI)
Using Deep Learning Based Natural Language Processing Techniques for Clinical Decision-Making with EHRs
Runjie Zhu, Xinhui Tu, Jimmy Xiangji Huang, +2 more
pp. 257–295
TL;DR: The authors find that the road to revolutionizing the existing healthcare sector with deep learning methods remains long, but the recent progress made by the reviewed methods is already a promising start.
Abstract:
Natural language processing (NLP) is an interdisciplinary research domain that focuses on the interactions between human languages and computers. There has been a recent trend of solving NLP problems with deep learning approaches. Applications of deep learning in the healthcare sector are mostly associated with the canonical examples of applying image processing and computer vision techniques to medical scans for disease diagnosis. The Electronic Health Record (EHR) is another source of data, often neglected yet equally if not more important than medical scans, that can change the way we learn useful features and information from patients' medical records. The text-based information stored within EHRs is data-rich by nature, but is often not well understood due to its high volume, variety, velocity and complexity. These characteristics, however, fit the nature of deep learning well. We therefore believe it is the right time to summarize the current status and to review and learn from state-of-the-art medical NLP techniques. Unlike existing reviews, we examine and categorize current deep learning-based NLP techniques in the medical domain by three major purposes: representation learning, information extraction and clinical prediction. Meanwhile, we discuss whether the application of deep learning methods has tackled these tasks differently and transformed them in a revolutionary way. Based on the results, we find that the distance to revolutionizing the existing healthcare sector with deep learning methods remains long, but the recent progress made by the proposed methods is already a promising start. Furthermore, we state some of the legal and ethical considerations, present the status quo of healthcare industry applications, and suggest several possible directions for future research.
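As a minimal illustration of the representation-learning theme above, a bag-of-words encoding maps a free-text clinical note onto a fixed-length vector. This is a sketch with a hypothetical note and vocabulary (not drawn from the chapter); the deep learning methods surveyed learn dense embeddings instead of raw counts.

```python
from collections import Counter

def bag_of_words(note, vocabulary):
    """Map a free-text clinical note onto a fixed-length count vector."""
    counts = Counter(note.lower().split())
    return [counts[term] for term in vocabulary]

vocab = ["fever", "cough", "diabetes", "hypertension"]
note = "Patient presents with fever and persistent cough ; history of diabetes"
print(bag_of_words(note, vocab))  # [1, 1, 1, 0]
```

The sparsity and vocabulary-dependence visible even in this toy case are exactly what learned representations are meant to overcome.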
Citations
Journal Article
Medical semantic similarity with a neural language model
TL;DR: The demonstrated superiority of this model for providing an effective semantic similarity measure is promising, as it may translate into effectiveness gains for techniques in medical information retrieval and medical informatics (e.g., query expansion and literature-based discovery).
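Embedding-based semantic similarity of this kind typically reduces to cosine similarity between learned vectors. A minimal sketch with made-up 3-dimensional vectors (real clinical embeddings are learned from corpora and are much higher-dimensional):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d embeddings (illustrative values, not from any trained model).
emb = {
    "myocardial_infarction": [0.9, 0.1, 0.3],
    "heart_attack":          [0.8, 0.2, 0.35],
    "fracture":              [0.1, 0.9, 0.2],
}
print(cosine_similarity(emb["myocardial_infarction"], emb["heart_attack"]))
print(cosine_similarity(emb["myocardial_infarction"], emb["fracture"]))
```

A good medical embedding should score the synonym pair far above the unrelated pair, which is the property the cited evaluation measures.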
Journal Article (DOI)
A multibranch CNN-BiLSTM model for human activity recognition using wearable sensor data
TL;DR: A hybrid of a convolutional neural network (CNN) and bidirectional long short-term memory (BiLSTM) is used, which performs automatic feature extraction from raw sensor data with minimal pre-processing and outperforms the other compared approaches.
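The CNN branch of such a hybrid slides learned filters over the raw sensor stream; a minimal pure-Python 1-D convolution over a toy accelerometer trace shows the operation (the kernel here is hand-picked for illustration, whereas the model learns its filters):

```python
def conv1d(signal, kernel, stride=1):
    """Valid 1-D convolution (cross-correlation) over a raw sensor stream."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(0, len(signal) - k + 1, stride)
    ]

accel_x = [0.0, 0.1, 0.9, 1.0, 0.2, 0.0]  # toy accelerometer trace
edge = [-1.0, 0.0, 1.0]                   # responds to rising/falling activity
print(conv1d(accel_x, edge))
```

In the cited architecture, many such filter responses are stacked and then fed to the BiLSTM, which models their temporal dependencies in both directions.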
Journal Article
Realizing the full potential of electronic health records
TL;DR: This issue of the journal presents several solutions to this problem based on natural language processing (NLP) techniques, and highlights the need to steer current NLP research efforts so that new developments can be accelerated and research products can become readily usable in healthcare applications.
Journal Article (DOI)
Defining Patient-Oriented Natural Language Processing: A New Paradigm for Research and Development to Facilitate Adoption and Use by Medical Experts
Abeed Sarker, Mohammed Ali Al-Garadi, Yuan-Chi Yang, Jinho Choi, Arshed A. Quyyumi, Greg S. Martin, +5 more
TL;DR: In this paper, the authors present a viewpoint on four interrelated characteristics that can increase the suitability of NLP systems for POCRC (three that represent NLP system properties and one associated with the R&D process): (1) interpretability (the ability to explain system decisions), (2) patient centeredness (the capability to characterize diverse patients), and (4) multitask evaluation.
Journal Article (DOI)
Use of AI/ML-enabled state-of-the-art method in electronic medical records: A systematic review
Elias Hossain, Rajib Rana, Niall Higgins, Jeffrey Soar, Prabal Datta Barua, Anthony R. Pisani, Kathryn Turner, +6 more
TL;DR: In this article, the authors compared Machine Learning (ML), Deep Learning (DL) and NLP techniques to comprehensively understand the limitations and opportunities in this space, and found that the adopted ML models were not adequately assessed.
References
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: State-of-the-art performance was achieved by a deep convolutional neural network (DCNN), as discussed by the authors, which consists of five convolutional layers, some followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax.
Journal Article (DOI)
Long short-term memory
TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
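The "constant error carousel" refers to the additive cell-state update c = f * c_prev + i * g, which lets gradients flow across long time lags. A single scalar LSTM step can sketch this, with made-up weights (a real model learns them by gradient descent):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell.

    w maps each gate name to (input weight, recurrent weight, bias).
    """
    def gate(name, squash):
        wi, wh, b = w[name]
        return squash(wi * x + wh * h_prev + b)

    i = gate("input", sigmoid)        # how much new information to write
    f = gate("forget", sigmoid)       # how much old cell state to keep
    o = gate("output", sigmoid)       # how much of the cell state to expose
    g = gate("candidate", math.tanh)  # proposed new content
    c = f * c_prev + i * g            # the "carousel": additive state update
    h = o * math.tanh(c)
    return h, c

# Illustrative weights, identical for every gate purely for brevity.
w = {k: (0.5, 0.5, 0.0) for k in ("input", "forget", "output", "candidate")}
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0, w=w)
```

Because the cell state is updated by addition rather than repeated multiplication, error signals are not forced to vanish or explode over the 1000+ step lags mentioned above.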
Journal Article (DOI)
ImageNet classification with deep convolutional neural networks
TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
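The spatial sizes in such a network follow the standard formula floor((n + 2p - k) / s) + 1. As a sketch, applying it to the first convolution and pooling stages described above (using the 227x227 input size under which the published kernel sizes and strides are arithmetically consistent):

```python
def conv_out(size, kernel, stride, pad):
    """Spatial output size of a conv/pool layer: floor((n + 2p - k)/s) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

s = conv_out(227, kernel=11, stride=4, pad=0)  # first conv layer -> 55
print(s)
s = conv_out(s, kernel=3, stride=2, pad=0)     # overlapping max-pool -> 27
print(s)
```

Chaining this calculation through all five convolutional layers is how one verifies the feature-map sizes feeding the three fully-connected layers.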
Proceedings Article (DOI)
Glove: Global Vectors for Word Representation
TL;DR: A new global log-bilinear regression model that combines the advantages of the two major model families in the literature (global matrix factorization and local context-window methods) and produces a vector space with meaningful substructure.
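The model fits word-vector dot products to log co-occurrence counts via a weighted least-squares objective. A toy evaluation of that loss with made-up 2-dimensional parameters (the x_max and alpha defaults follow the paper's weighting function):

```python
import math

def glove_loss(w, w_tilde, b, b_tilde, X, x_max=100.0, alpha=0.75):
    """Weighted least-squares GloVe objective over nonzero co-occurrences."""
    def f(x):  # weighting that damps very frequent pairs
        return (x / x_max) ** alpha if x < x_max else 1.0

    loss = 0.0
    for (i, j), x_ij in X.items():
        dot = sum(a * b_ for a, b_ in zip(w[i], w_tilde[j]))
        loss += f(x_ij) * (dot + b[i] + b_tilde[j] - math.log(x_ij)) ** 2
    return loss

# Toy 2-word vocabulary with made-up vectors, biases and counts.
X = {(0, 1): 10.0, (1, 0): 10.0}
w = [[0.1, 0.2], [0.3, 0.4]]
w_tilde = [[0.2, 0.1], [0.4, 0.3]]
b = [0.0, 0.0]
b_tilde = [0.0, 0.0]
print(glove_loss(w, w_tilde, b, b_tilde, X))
```

Training minimizes this quantity over all nonzero entries of the corpus-wide co-occurrence matrix, which is what makes the statistics "global" rather than window-local.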
Posted Content
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
TL;DR: A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
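The "one additional output layer" used in fine-tuning is, for classification tasks, a linear layer plus softmax over the encoder's pooled representation. A sketch with a toy 4-dimensional pooled vector and made-up head weights (a real head is learned jointly with the pre-trained encoder):

```python
import math

def classify(pooled, weights, biases):
    """Linear layer + softmax over a pooled [CLS]-style representation."""
    logits = [
        sum(w * h for w, h in zip(row, pooled)) + b
        for row, b in zip(weights, biases)
    ]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy 4-d "encoder output" and a 2-class head with made-up parameters.
pooled = [0.5, -0.2, 0.1, 0.9]
weights = [[0.3, 0.1, -0.2, 0.4], [-0.3, 0.2, 0.5, -0.1]]
biases = [0.0, 0.0]
probs = classify(pooled, weights, biases)
print(probs)
```

Swapping in a different head of this shape per task is what lets a single pre-trained model serve the "wide range of tasks" the summary mentions.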
Related Papers (5)
Deep learning for electronic health records: A comparative review of multiple deep neural architectures
Jose Roberto Ayala Solares, Francesca Raimondi, Yajie Zhu, Fatemeh Rahimian, Dexter Canoy, Jenny Tran, Ana Catarina Pinho Gomes, Amir H. Payberah, Mariagrazia Zottoli, Milad Nazarzadeh, Nathalie Conrad, Kazem Rahimi, Gholamreza Salimi-Khorshidi, +16 more
Opportunities and obstacles for deep learning in biology and medicine.
Travers Ching, Daniel Himmelstein, Brett K. Beaulieu-Jones, Alexandr A. Kalinin, Brian T. Do, Gregory P. Way, Enrico Ferrero, Paul-Michael Agapow, Michael Zietz, Michael M. Hoffman, Wei Xie, Gail L. Rosen, Benjamin J. Lengerich, Johnny Israeli, Jack Lanchantin, Stephen Woloszynek, Anne E. Carpenter, Avanti Shrikumar, Jinbo Xu, Evan M. Cofer, Christopher A. Lavender, Srinivas C. Turaga, Amr Alexandari, Zhiyong Lu, David J. Harris, Dave DeCaprio, Yanjun Qi, Anshul Kundaje, Yifan Peng, Laura K. Wiley, Marwin H. S. Segler, Simina M. Boca, S. Joshua Swamidass, Austin Huang, Anthony Gitter, Casey S. Greene, +38 more