Open Access Posted Content
A C-LSTM Neural Network for Text Classification
TL;DR: C-LSTM is a novel and unified model for sentence representation and text classification that outperforms both CNN and LSTM and achieves excellent performance on these tasks.
Abstract:
Neural network models have been demonstrated to be capable of achieving remarkable performance in sentence and document modeling. Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are two mainstream architectures for such modeling tasks, which adopt entirely different ways of understanding natural language. In this work, we combine the strengths of both architectures and propose a novel and unified model called C-LSTM for sentence representation and text classification. C-LSTM utilizes a CNN to extract a sequence of higher-level phrase representations, which are fed into a long short-term memory recurrent neural network (LSTM) to obtain the sentence representation. C-LSTM is able to capture both local features of phrases and global, temporal sentence semantics. We evaluate the proposed architecture on sentiment classification and question classification tasks. The experimental results show that C-LSTM outperforms both CNN and LSTM and achieves excellent performance on these tasks.
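The abstract's pipeline (CNN extracts phrase-level features, a recurrent network then consumes them to build a sentence vector) can be sketched in plain Python. This is only an illustrative data-flow sketch with assumed toy dimensions and random, untrained weights; for simplicity it uses a ReLU recurrent update in place of real LSTM gates, so it shows the shape of the C-LSTM computation, not the paper's exact model.

```python
import random

random.seed(0)

# Toy dimensions (assumptions for illustration, not from the paper).
EMBED_DIM = 4     # word embedding size
FILTER_WIDTH = 3  # n-gram window of the convolution
NUM_FILTERS = 5   # number of CNN feature maps
HIDDEN_DIM = 6    # recurrent hidden size
NUM_CLASSES = 2   # e.g. binary sentiment

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def conv1d(embeddings, filters):
    """Slide each filter over consecutive FILTER_WIDTH-word windows,
    producing one higher-level 'phrase' feature vector per window."""
    phrase_seq = []
    for i in range(len(embeddings) - FILTER_WIDTH + 1):
        window = [x for word in embeddings[i:i + FILTER_WIDTH] for x in word]
        phrase_seq.append([max(0.0, dot(window, f)) for f in filters])  # ReLU
    return phrase_seq

def recurrent_encode(phrase_seq, w_in, w_rec):
    """Consume the phrase sequence step by step; the final hidden state
    serves as the sentence representation (a real C-LSTM uses LSTM gates here)."""
    h = [0.0] * HIDDEN_DIM
    for phrase in phrase_seq:
        h = [max(0.0, dot(phrase, w_in[j]) + dot(h, w_rec[j]))
             for j in range(HIDDEN_DIM)]
    return h

# Random (untrained) parameters, just to show the data flow.
filters = rand_matrix(NUM_FILTERS, FILTER_WIDTH * EMBED_DIM)
w_in = rand_matrix(HIDDEN_DIM, NUM_FILTERS)
w_rec = rand_matrix(HIDDEN_DIM, HIDDEN_DIM)
w_out = rand_matrix(NUM_CLASSES, HIDDEN_DIM)

sentence = rand_matrix(7, EMBED_DIM)                   # 7 words, one embedding each
phrases = conv1d(sentence, filters)                    # CNN: local phrase features
sentence_vec = recurrent_encode(phrases, w_in, w_rec)  # RNN: global/temporal semantics
logits = [dot(sentence_vec, row) for row in w_out]     # classifier head
print(len(phrases), len(sentence_vec), len(logits))    # 5 6 2
```

Note how the convolution shortens the sequence (7 words become 5 phrase windows) before the recurrent encoder reduces it to a single fixed-size sentence vector; that ordering is the core idea the abstract describes.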
Citations
Proceedings ArticleDOI
Hierarchical Attention Networks for Document Classification
TL;DR: Experiments conducted on six large-scale text classification tasks demonstrate that the proposed architecture outperforms previous methods by a substantial margin.
Journal ArticleDOI
A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures
TL;DR: The LSTM cell and its variants are reviewed to explore the learning capacity of the LSTM cell, and LSTM networks are divided into two broad categories: LSTM-dominated networks and integrated LSTM networks.
Journal ArticleDOI
Predicting residential energy consumption using CNN-LSTM neural networks
Tae Young Kim, Sung-Bae Cho, et al.
TL;DR: This paper proposes a CNN-LSTM neural network that extracts spatial and temporal features to effectively predict housing energy consumption, achieving almost perfect prediction performance for electric energy consumption that was previously difficult to predict.
Journal ArticleDOI
Text Classification Algorithms: A Survey
Kamran Kowsari, Kiana Jafari Meimandi, Mojtaba Heidarysafa, Sanjana Mendu, Laura E. Barnes, Donald E. Brown, et al.
TL;DR: An overview of text classification algorithms is discussed, covering different text feature extraction methods, dimensionality reduction methods, existing algorithms and techniques, and evaluation methods.
Journal ArticleDOI
Bidirectional LSTM with attention mechanism and convolutional layer for text classification
Gang Liu, Jiabao Guo, et al.
TL;DR: A novel and unified architecture combining a bidirectional LSTM (BiLSTM), an attention mechanism, and a convolutional layer is proposed in this paper, which outperforms other state-of-the-art text classification methods in terms of classification accuracy.
References
Posted Content
When Are Tree Structures Necessary for Deep Learning of Representations
TL;DR: The authors show that recursive neural models can outperform simple recurrent neural networks (LSTMs) on several tasks, such as sentiment classification at the sentence and phrase level, matching questions to answer phrases, discourse parsing, and semantic relation extraction.
Proceedings ArticleDOI
Molding CNNs for text: non-linear, non-consecutive convolutions
TL;DR: This work revises the temporal convolution operation in CNNs to better adapt it to text processing, appealing to tensor algebra and using low-rank n-gram tensors to directly exploit interactions between words already at the convolution stage.
Posted Content
Self-Adaptive Hierarchical Sentence Model
TL;DR: Both qualitative and quantitative analyses show that AdaSent can automatically form and select representations suitable for the task at hand during training, yielding superior classification performance over competitor models on five benchmark data sets.
Proceedings ArticleDOI
Discriminative Neural Sentence Modeling by Tree-Based Convolution
TL;DR: This paper proposes a tree-based convolutional neural network (TBCNN) that leverages constituency or dependency trees of sentences to extract structural features, which are then aggregated by max pooling.
Posted Content
Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network
TL;DR: A model is introduced that represents the meaning of documents by embedding them in a low-dimensional vector space, while preserving the distinctions of word and sentence order crucial for capturing nuanced semantics.