Open Access · Posted Content

A C-LSTM Neural Network for Text Classification

TL;DR
C-LSTM is a novel and unified model for sentence representation and text classification; it outperforms both CNN and LSTM and achieves excellent performance on sentiment classification and question classification tasks.
Abstract
Neural network models have been demonstrated to be capable of achieving remarkable performance in sentence and document modeling. Convolutional neural networks (CNN) and recurrent neural networks (RNN) are two mainstream architectures for such modeling tasks, which adopt totally different ways of understanding natural languages. In this work, we combine the strengths of both architectures and propose a novel and unified model called C-LSTM for sentence representation and text classification. C-LSTM utilizes a CNN to extract a sequence of higher-level phrase representations, which are fed into a long short-term memory recurrent neural network (LSTM) to obtain the sentence representation. C-LSTM is thus able to capture both local features of phrases and global and temporal sentence semantics. We evaluate the proposed architecture on sentiment classification and question classification tasks. The experimental results show that C-LSTM outperforms both CNN and LSTM and can achieve excellent performance on these tasks.
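The pipeline the abstract describes (a convolution over word embeddings to extract phrase features, an LSTM over that feature sequence, and a classifier on the final hidden state) can be sketched compactly. Below is a minimal, illustrative PyTorch sketch; all hyperparameters (embedding size, filter count, window size, class count) are assumptions for the example, not the paper's reported settings.

import torch
import torch.nn as nn

class CLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_filters=100,
                 kernel_size=3, hidden_dim=100, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One convolutional filter bank over word windows: each output
        # position is a "phrase" feature vector for kernel_size words.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size)
        self.lstm = nn.LSTM(num_filters, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):               # tokens: (batch, seq_len)
        x = self.embed(tokens)               # (batch, seq_len, embed_dim)
        x = self.conv(x.transpose(1, 2))     # (batch, filters, seq_len - k + 1)
        x = torch.relu(x).transpose(1, 2)    # (batch, windows, filters)
        _, (h_n, _) = self.lstm(x)           # final hidden state = sentence vector
        return self.fc(h_n[-1])              # class logits

logits = CLSTM(vocab_size=10_000)(torch.randint(0, 10_000, (4, 20)))
print(logits.shape)  # torch.Size([4, 2])

Note that the convolution shortens the sequence from seq_len words to seq_len - kernel_size + 1 phrase windows, and the LSTM reads that shorter sequence; this is what lets the model capture local phrase features and global, temporal sentence semantics at once.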


Citations
Book Chapter

Attention-Based CNN-BLSTM Networks for Joint Intent Detection and Slot Filling

TL;DR: An attention-based encoder-decoder neural network model for joint intent detection and slot filling is presented; it encodes the sentence representation with a hybrid Convolutional Neural Network and Bidirectional Long Short-Term Memory network (CNN-BLSTM) and decodes it with an attention-based recurrent neural network with aligned inputs.
Proceedings Article

A Temporal Sequence Learning for Action Recognition and Prediction

TL;DR: In this paper, each frame is converted into a word represented as a vector using the Bag of Visual Words (BoW) encoding method, and the words are then combined into a sentence that represents the video.
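As a rough illustration of that frame-to-word idea, one can cluster frame descriptors with k-means so that each frame maps to a discrete visual word, and the video becomes the sequence of those words. A hedged sketch, assuming generic 64-dimensional frame descriptors and a 50-word visual vocabulary (both invented for the example, not taken from the cited paper):

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
frame_descriptors = rng.normal(size=(500, 64))   # stand-in for pooled frame features

# Learn the visual vocabulary: each cluster centre is one "word".
codebook = KMeans(n_clusters=50, n_init=10, random_state=0)
codebook.fit(frame_descriptors)

video = rng.normal(size=(30, 64))                # 30 frames of one video
sentence = codebook.predict(video)               # one visual word per frame
print(sentence)                                  # e.g. [12  7  7 31 ...]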
Journal Article

Text Classification Using Long Short-Term Memory With GloVe Features

TL;DR: By tuning parameters and comparing eight proposed Long Short-Term Memory models on a large-scale dataset, it is shown that LSTM with GloVe features can achieve good performance in text classification.
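For illustration, wiring pretrained GloVe vectors into an LSTM classifier amounts to initializing the embedding layer from a GloVe-format file ("word v1 v2 ... vd" per line). A minimal sketch, with a two-dimensional toy vocabulary standing in for a real GloVe file; none of this reflects the cited paper's exact configuration:

import io
import numpy as np
import torch
import torch.nn as nn

def load_glove(handle, vocab, dim):
    """Fill an embedding matrix for `vocab` from GloVe-format lines."""
    weights = np.zeros((len(vocab), dim), dtype=np.float32)
    for line in handle:
        word, *vec = line.split()
        if word in vocab:
            weights[vocab[word]] = np.asarray(vec, dtype=np.float32)
    return torch.from_numpy(weights)

vocab = {"<pad>": 0, "good": 1, "bad": 2}
toy_glove = io.StringIO("good 0.1 0.2\nbad -0.3 0.4\n")   # stands in for glove.*.txt
embed = nn.Embedding.from_pretrained(load_glove(toy_glove, vocab, dim=2),
                                     freeze=False, padding_idx=0)
lstm = nn.LSTM(input_size=2, hidden_size=8, batch_first=True)
_, (h_n, _) = lstm(embed(torch.tensor([[1, 2, 0]])))
print(h_n.shape)  # torch.Size([1, 1, 8])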
Journal Article

A text classification method based on a convolutional and bidirectional long short-term memory model

TL;DR: In this paper, a text classification method based on CBM (a Convolutional and Bi-LSTM Model), which can extract both shallow local semantic features and deep global semantic features, is proposed.
Proceedings Article

A C-LSTM with Word Embedding Model for News Text Classification

TL;DR: A C-LSTM with word embedding model is used to deal with news text classification, and experiments show that the proposed model has clear advantages on Chinese news text classification.
References
Journal Article

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
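For reference, here is the commonly used modern formulation of the LSTM update (notation differs from the original 1997 paper; in particular, the forget gate was a later addition to the architecture):

\[
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i), &
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f), \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o), &
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c), \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, &
h_t &= o_t \odot \tanh(c_t).
\end{aligned}
\]

The additive cell update for \(c_t\) is the "constant error carousel": because \(c_t\) depends linearly on \(c_{t-1}\), error signals can flow backward through many time steps without vanishing.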
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
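For concreteness, the negative-sampling objective from that paper replaces the full softmax with k binary discriminations per training pair: for input word \(w_I\) and observed context word \(w_O\), it maximizes

\[
\log \sigma\!\left({v'_{w_O}}^{\top} v_{w_I}\right)
+ \sum_{i=1}^{k} \mathbb{E}_{w_i \sim P_n(w)}\!\left[\log \sigma\!\left(-{v'_{w_i}}^{\top} v_{w_I}\right)\right],
\]

where \(\sigma\) is the logistic function, \(P_n(w)\) is the noise distribution, and k negative words are sampled per positive pair.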
Proceedings ArticleDOI

Learning Phrase Representations using RNN Encoder--Decoder for Statistical Machine Translation

TL;DR: In this paper, the encoder and decoder of the RNN Encoder-Decoder model are jointly trained to maximize the conditional probability of a target sequence given a source sequence.
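In equation form, the jointly trained objective is the conditional log-likelihood of target sequences given source sequences, with the decoder conditioning each output token on the previous tokens and the encoder's fixed-length summary vector \(c\):

\[
\max_{\theta}\; \frac{1}{N}\sum_{n=1}^{N} \log p_{\theta}\!\left(\mathbf{y}_n \mid \mathbf{x}_n\right),
\qquad
p_{\theta}(\mathbf{y} \mid \mathbf{x}) = \prod_{t=1}^{T'} p_{\theta}\!\left(y_t \mid y_1,\dots,y_{t-1},\, c\right).
\]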
Proceedings Article

Rectified Linear Units Improve Restricted Boltzmann Machines

TL;DR: Replacing the binary stochastic hidden units of restricted Boltzmann machines with noisy rectified linear units is shown to learn features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset.
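The rectifier itself is simply \(f(x) = \max(0, x)\); the noisy variant used for the RBM hidden units (NReLU) adds Gaussian noise whose variance is the logistic sigmoid of the input:

\[
\mathrm{ReLU}(x) = \max(0,\, x),
\qquad
\mathrm{NReLU}(x) = \max\!\left(0,\; x + \epsilon\right),
\quad \epsilon \sim \mathcal{N}\!\left(0,\; \sigma(x)\right),
\]

with \(\sigma(x) = 1/(1+e^{-x})\).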
Posted Content

Sequence to Sequence Learning with Neural Networks

TL;DR: This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short-term dependencies between the source and the target sentence which made the optimization problem easier.
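The reversal trick amounts to a one-line preprocessing step on the source side only; a trivial illustration (toy tokens, invented for the example):

# The target sequence keeps its natural order; only the source
# sequence is reversed before being fed to the encoder, so the first
# source token ends up adjacent to the first target token.
src = ["a", "b", "c"]                 # source sentence tokens
tgt = ["x", "y", "z"]                 # target sentence tokens (unchanged)
encoder_input = list(reversed(src))   # ["c", "b", "a"]: "a" is now closest to "x"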