Proceedings ArticleDOI

Deep Sentiment Representation Based on CNN and LSTM

TLDR
The model uses pre-trained word vectors as input and employs a CNN to extract significant local features of the text; these features are then fed to a two-layer LSTM, which captures context-dependent features and generates a sentence representation for sentiment classification.
Abstract
Traditional machine learning techniques, including the support vector machine (SVM), random walk, and others, have been applied to various text sentiment analysis tasks, but they generalize poorly on complex classification problems. In recent years, deep learning has made breakthroughs in Natural Language Processing research. Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are the two mainstream deep learning approaches to document and sentence modeling. In this paper, a model that captures deep sentiment representations based on a CNN and a long short-term memory (LSTM) recurrent neural network is proposed. The model uses pre-trained word vectors as input and employs a CNN to extract significant local features of the text; these features are then fed to a two-layer LSTM, which extracts context-dependent features and generates a sentence representation for sentiment classification. We evaluate the proposed model through a series of experiments on a dataset. The experimental results show that our model outperforms existing CNN, LSTM, CNN-LSTM (our implementation of a one-layer LSTM stacked directly on a one-layer CNN), and SVM (support vector machine) baselines.
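
As a rough illustration of the pipeline the abstract describes (pre-trained word vectors, a CNN for local features, then a two-layer LSTM that produces the sentence representation for classification), here is a minimal sketch in tf.keras. The embedding size, filter count, kernel width, LSTM widths, and the binary output are illustrative assumptions, not values reported in the paper.

# Sketch of a CNN + two-layer LSTM sentiment classifier (hyperparameters assumed).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, initializers

VOCAB_SIZE = 20000   # assumed vocabulary size
EMBED_DIM = 300      # typical word2vec dimensionality
SEQ_LEN = 100        # assumed padded sentence length

# Pre-trained word vectors would be loaded here; random values stand in.
pretrained = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

tokens = layers.Input(shape=(SEQ_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM, trainable=False,
                     embeddings_initializer=initializers.Constant(pretrained))(tokens)
x = layers.Conv1D(filters=128, kernel_size=3, activation="relu",
                  padding="same")(x)             # local n-gram features
x = layers.MaxPooling1D(pool_size=2)(x)
x = layers.LSTM(128, return_sequences=True)(x)   # first LSTM layer
x = layers.LSTM(128)(x)                          # second LSTM layer -> sentence representation
outputs = layers.Dense(1, activation="sigmoid")(x)   # binary sentiment prediction

model = models.Model(tokens, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])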

Citations
Journal ArticleDOI

I and i

Kevin Barraclough
08 Dec 2001
TL;DR: There is, I think, something ethereal about i, the square root of minus one, which seemed an odd beast at the time: an intruder hovering on the edge of reality.
Journal ArticleDOI

Sentiment analysis using deep learning architectures: a review

TL;DR: This paper provides a detailed survey of popular deep learning models that are increasingly applied in sentiment analysis and presents a taxonomy of sentiment analysis, which highlights the power of deep learning architectures for solving sentiment analysis problems.
Journal ArticleDOI

Deep learning CNN–LSTM framework for Arabic sentiment analysis using textual information shared in social networks

TL;DR: A novel deep learning model is proposed for Arabic-language sentiment analysis, based on a one-layer CNN architecture for local feature extraction and a two-layer LSTM to maintain long-term dependencies; the model outperforms state-of-the-art approaches on relevant corpora.
Journal ArticleDOI

Network text sentiment analysis method combining LDA text representation and GRU-CNN

TL;DR: A text sentiment analysis method is proposed that combines Latent Dirichlet Allocation (LDA) text representation with a convolutional neural network (CNN) and can effectively improve the accuracy of text sentiment classification.
Journal ArticleDOI

A Hybrid CNN-LSTM Model for Aircraft 4D Trajectory Prediction

TL;DR: A novel 4D trajectory prediction hybrid architecture based on deep learning is proposed, combining a Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM); the results show that the trajectory prediction accuracy of the CNN-LSTM hybrid model is superior to that of a single model.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
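
For readers unfamiliar with the gating mechanism this TL;DR refers to, below is a minimal numpy sketch of a single LSTM step. It uses the now-standard formulation with a forget gate (a later extension of the original 1997 architecture), and all names and sizes are illustrative assumptions.

# One LSTM step in plain numpy: the cell state c is updated additively and
# gated multiplicatively, acting as the "constant error carousel".
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """x: input vector; h_prev, c_prev: previous hidden and cell states.
    W (4H x D), U (4H x H), b (4H): stacked parameters for the input,
    forget, and output gates plus the candidate update."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # additive cell-state update
    h = o * np.tanh(c)         # gated hidden state
    return h, c

# Illustrative call with random parameters.
rng = np.random.default_rng(0)
D, H = 8, 4
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H), W, U, b)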
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition; gradient-based learning is used to synthesize a complex decision surface that can classify high-dimensional patterns such as handwritten characters.
Journal ArticleDOI

I and i

Kevin Barraclough
08 Dec 2001
TL;DR: There is, I think, something ethereal about i, the square root of minus one, which seemed an odd beast at the time: an intruder hovering on the edge of reality.
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
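
As a rough sketch of the negative-sampling objective this TL;DR mentions, the snippet below computes the skip-gram loss for one (word, context) pair against k sampled negative words; the vector dimensionality and the way negatives are drawn are assumptions made for illustration.

# Skip-gram with negative sampling: maximize log sigma(v_c . v_w) for the true
# context word and log sigma(-v_n . v_w) for each sampled negative word.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def negative_sampling_loss(v_w, v_c, v_negs):
    """v_w: input vector of the centre word; v_c: output vector of the true
    context word; v_negs: (k, d) output vectors of k sampled negatives."""
    positive = np.log(sigmoid(v_c @ v_w))
    negative = np.sum(np.log(sigmoid(-(v_negs @ v_w))))
    return -(positive + negative)   # negative log-likelihood to minimize

# Illustrative call with random vectors.
rng = np.random.default_rng(0)
d, k = 100, 5
loss = negative_sampling_loss(rng.normal(size=d), rng.normal(size=d),
                              rng.normal(size=(k, d)))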
Proceedings ArticleDOI

Convolutional Neural Networks for Sentence Classification

TL;DR: The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, including sentiment analysis and question classification; a simple modification to the architecture is proposed to allow the use of both task-specific and static vectors.
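
To make the last point concrete, here is a minimal sketch of a sentence-classification CNN with both a static and a trainable (task-specific) embedding channel, several filter widths, and max-over-time pooling. Concatenating the two channels along the embedding dimension, the filter widths 3/4/5, and the other sizes are simplifying assumptions, not the paper's exact configuration.

# Sketch: CNN sentence classifier with static and trainable embedding channels.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, initializers

VOCAB, DIM, SEQ, NUM_CLASSES = 20000, 300, 50, 2   # assumed sizes
pretrained = np.random.normal(size=(VOCAB, DIM)).astype("float32")

tokens = layers.Input(shape=(SEQ,), dtype="int32")
static = layers.Embedding(VOCAB, DIM, trainable=False,
                          embeddings_initializer=initializers.Constant(pretrained))(tokens)
tuned = layers.Embedding(VOCAB, DIM, trainable=True,
                         embeddings_initializer=initializers.Constant(pretrained))(tokens)
x = layers.Concatenate()([static, tuned])          # static + task-specific vectors

pooled = []
for width in (3, 4, 5):                            # several filter widths
    conv = layers.Conv1D(100, width, activation="relu")(x)
    pooled.append(layers.GlobalMaxPooling1D()(conv))   # max-over-time pooling

features = layers.Dropout(0.5)(layers.Concatenate()(pooled))
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(features)
model = models.Model(tokens, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])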