Open Access Proceedings Article

Coupled Multi-Layer Attentions for Co-Extraction of Aspect and Opinion Terms.

TLDR
A novel deep learning model, named coupled multi-layer attentions, is proposed, in which each layer consists of a couple of attentions with tensor operators that are learned interactively to dually propagate information between aspect terms and opinion terms.
Abstract
The task of aspect and opinion terms co-extraction aims to explicitly extract aspect terms describing features of an entity and opinion terms expressing emotions from user-generated texts. To achieve this task, one effective approach is to exploit relations between aspect terms and opinion terms by parsing the syntactic structure of each sentence. However, this approach requires expensive parsing effort and depends heavily on the quality of the parsing results. In this paper, we propose a novel deep learning model, named coupled multi-layer attentions. The proposed model provides an end-to-end solution and does not require any parsers or other linguistic resources for preprocessing. Specifically, the proposed model is a multi-layer attention network, where each layer consists of a couple of attentions with tensor operators. One attention is for extracting aspect terms, while the other is for extracting opinion terms. They are learned interactively to dually propagate information between aspect terms and opinion terms. Through multiple layers, the model can further exploit indirect relations between terms for more precise information extraction. Experimental results on three benchmark datasets in SemEval Challenge 2014 and 2015 show that our model achieves state-of-the-art performance compared with several baselines.
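To make the layer structure concrete, below is a minimal NumPy sketch of one coupled attention layer. It is an illustrative simplification, not the authors' exact formulation: the function name coupled_attention_layer, the tensor operators G_a, G_p, D_a, D_p, and the reuse of attended summaries as next-layer prototypes are assumptions made for clarity; the actual model additionally predicts per-token labels and uses learned update functions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def coupled_attention_layer(H, u_a, u_p, G_a, G_p, D_a, D_p):
    """One coupled attention layer (simplified, illustrative sketch).

    H        : (n, d) token representations for a sentence of n tokens
    u_a, u_p : (d,)   aspect / opinion prototype vectors
    G_a, G_p : (K, d, d) tensor operators scoring tokens against the
               prototype of the same sub-task
    D_a, D_p : (K, d, d) tensor operators scoring tokens against the
               prototype of the other sub-task (the coupling that
               dually propagates information between the two attentions)
    Returns per-token aspect and opinion attention scores and the
    updated prototypes passed to the next layer.
    """
    n, _ = H.shape

    def scores(G, D, u_self, u_other):
        s = np.empty(n)
        for i in range(n):
            # tensor operator: r_k = h_i^T G_k u  for each slice k
            r_self = np.einsum('kde,d,e->k', G, H[i], u_self)
            r_other = np.einsum('kde,d,e->k', D, H[i], u_other)
            # collapse the composition vector into a scalar attention score
            s[i] = np.tanh(np.concatenate([r_self, r_other])).sum()
        return softmax(s)

    a_scores = scores(G_a, D_a, u_a, u_p)   # aspect attention
    p_scores = scores(G_p, D_p, u_p, u_a)   # opinion attention

    # attended summaries become the prototypes of the next layer, so deeper
    # layers can exploit indirect aspect-opinion relations
    u_a_next = a_scores @ H
    u_p_next = p_scores @ H
    return a_scores, p_scores, u_a_next, u_p_next

# toy usage with random parameters (shapes for illustration only)
rng = np.random.default_rng(0)
n, d, K = 6, 8, 4
H = rng.standard_normal((n, d))
u_a, u_p = rng.standard_normal(d), rng.standard_normal(d)
G_a, G_p, D_a, D_p = (rng.standard_normal((K, d, d)) for _ in range(4))
a_scores, p_scores, u_a2, u_p2 = coupled_attention_layer(H, u_a, u_p, G_a, G_p, D_a, D_p)
```

Stacking several such layers, with the prototypes updated between them, is what lets the model pick up indirect aspect-opinion relations without any syntactic parsing.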


Citations
Journal ArticleDOI

Deep learning for sentiment analysis: A survey

TL;DR: Deep learning has emerged as a powerful machine learning technique that learns multiple layers of representations or features of the data and produces state-of-the-art prediction results; it has also become widely used in sentiment analysis in recent years.
Journal ArticleDOI

A review on the attention mechanism of deep learning

TL;DR: An overview of the state-of-the-art attention models proposed in recent years is given and a unified model that is suitable for most attention structures is defined.
Journal ArticleDOI

Deep Learning for Aspect-Based Sentiment Analysis: A Comparative Review

TL;DR: This article aims to provide a comparative review of deep learning for aspect-based sentiment analysis to place different approaches in context.
Posted Content

An Attentive Survey of Attention Models

TL;DR: A taxonomy that groups existing attention techniques into coherent categories is proposed, and how attention has been used to improve the interpretability of neural networks is described.
Posted Content

BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis

TL;DR: A novel post-training approach on the popular language model BERT is proposed to enhance the performance of fine-tuning BERT for review reading comprehension (RRC); it is also applied to other review-based tasks such as aspect extraction and aspect sentiment classification in aspect-based sentiment analysis.
References
Posted Content

DRAW: A Recurrent Neural Network For Image Generation

TL;DR: The Deep Recurrent Attentive Writer neural network architecture for image generation substantially improves on the state of the art for generative models on MNIST, and, when trained on the Street View House Numbers dataset, it generates images that cannot be distinguished from real data with the naked eye.
Journal ArticleDOI

Opinion word expansion and target extraction through double propagation

TL;DR: This article studies two important problems, namely opinion lexicon expansion and opinion target extraction, and proposes a bootstrapping-based method that significantly outperforms existing methods.
Proceedings Article

Ask me anything: dynamic memory networks for natural language processing

TL;DR: This paper introduced the dynamic memory network (DMN), a neural network architecture which processes input sequences and questions, forms episodic memories, and generates relevant answers, which can be trained end-to-end and obtains state-of-the-art results on several types of tasks and datasets.
Proceedings ArticleDOI

SemEval-2015 Task 12: Aspect Based Sentiment Analysis

TL;DR: The task provided manually annotated reviews in three domains (restaurants, laptops and hotels), and a common evaluation procedure, to foster research beyond sentence- or text-level sentiment classification towards Aspect Based Sentiment Analysis.
Proceedings Article

Jointly Modeling Aspects and Opinions with a MaxEnt-LDA Hybrid

TL;DR: This paper proposes a MaxEnt-LDA hybrid model to jointly discover both aspects and aspect-specific opinion words and shows that with a relatively small amount of training data, this model can effectively identify aspect and opinion words simultaneously.