Open Access Proceedings Article
Coupled Multi-Layer Attentions for Co-Extraction of Aspect and Opinion Terms
Wenya Wang, Sinno Jialin Pan, Daniel Dahlmeier, Xiaokui Xiao
pp. 3316-3322
TL;DR: A novel deep learning model, named coupled multi-layer attentions, where each layer consists of a couple of attentions with tensor operators that are learned interactively to dually propagate information between aspect terms and opinion terms.
Abstract: The task of aspect and opinion term co-extraction aims to explicitly extract aspect terms describing features of an entity and opinion terms expressing emotions from user-generated texts. One effective approach to this task is to exploit relations between aspect terms and opinion terms by parsing the syntactic structure of each sentence. However, parsing requires expensive effort, and the approach depends heavily on the quality of the parsing results. In this paper, we propose a novel deep learning model, named coupled multi-layer attentions. The proposed model provides an end-to-end solution and does not require any parsers or other linguistic resources for preprocessing. Specifically, the model is a multi-layer attention network, where each layer consists of a couple of attentions with tensor operators. One attention extracts aspect terms, while the other extracts opinion terms. The two attentions are learned interactively to dually propagate information between aspect terms and opinion terms. Through multiple layers, the model can further exploit indirect relations between terms for more precise extraction. Experimental results on three benchmark datasets from the SemEval 2014 and 2015 challenges show that our model achieves state-of-the-art performance compared with several baselines.
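The coupled attention described in the abstract can be illustrated with a small numerical sketch. This is a simplified reading, not the authors' exact formulation: each of the two attentions scores every token against a prototype vector through a tensor operator, and the resulting attention summaries are mixed so that aspect and opinion evidence propagate into each other across layers. The names (`tensor_attention`, `coupled_layer`), the mixing weight `lam`, and the random dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def tensor_attention(H, u, G):
    """Score each token against prototype u through tensor operator G.
    H: (n, d) token hidden states; u: (d,) prototype; G: (k, d, d) tensor.
    Returns (weights, context): softmax attention weights over tokens
    and the attention-weighted summary vector."""
    # Bilinear composition: beta[i, j] = tanh(H[i] @ G[j] @ u) for each tensor slice j
    beta = np.tanh(np.einsum('nd,kde,e->nk', H, G, u))
    scores = beta.sum(axis=1)          # collapse the k slices into one scalar score per token
    w = np.exp(scores - scores.max())
    w /= w.sum()                       # softmax over the n tokens
    return w, w @ H                    # attended summary of the sentence

def coupled_layer(H, u_a, u_o, G_a, G_o, lam=0.5):
    """One coupled layer: aspect and opinion attentions each attend over H,
    then each prototype is updated with a mix of both summaries,
    so information propagates dually between the two extractors."""
    w_a, c_a = tensor_attention(H, u_a, G_a)
    w_o, c_o = tensor_attention(H, u_o, G_o)
    u_a_new = lam * c_a + (1 - lam) * c_o   # aspect prototype absorbs opinion evidence
    u_o_new = lam * c_o + (1 - lam) * c_a   # and vice versa
    return (w_a, w_o), (u_a_new, u_o_new)

n, d, k = 6, 8, 3   # tokens, hidden size, tensor slices
H = rng.standard_normal((n, d))
u_a, u_o = rng.standard_normal(d), rng.standard_normal(d)
G_a, G_o = rng.standard_normal((k, d, d)), rng.standard_normal((k, d, d))

(w_a, w_o), (u_a, u_o) = coupled_layer(H, u_a, u_o, G_a, G_o)   # layer 1
(w_a, w_o), _ = coupled_layer(H, u_a, u_o, G_a, G_o)            # layer 2 reuses updated prototypes
print(w_a.round(3), w_o.round(3))
```

Stacking the layer twice mimics the multi-layer design: the second layer attends with prototypes already informed by both attentions, which is how indirect aspect-opinion relations can be picked up.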
Citations
Journal Article
CASA: Conversational Aspect Sentiment Analysis for Dialogue Understanding
TL;DR: Introduces the task of conversational aspect sentiment analysis (CASA), which can provide useful fine-grained sentiment information for dialogue understanding and planning, and extends standard aspect-based sentiment analysis to the conversational scenario with several major adaptations.
Posted Content
Disentangling Overlapping Beliefs by Structured Matrix Factorization
Chaoqi Yang, Jinyang Li, Ruijie Wang, Shuochao Yao, Huajie Shao, Dongxin Liu, Shengzhong Liu, Tianshi Wang, Tarek Abdelzaher et al.
TL;DR: A new class of non-negative matrix factorization algorithms that allows identification of both agreement and disagreement points when the beliefs of different communities partially overlap, showing that social beliefs overlap even in polarized scenarios.
Posted Content
Neural ranking models for document retrieval
TL;DR: A survey of the variety of deep learning models proposed for document retrieval: each model presents a set of neural network components that extract features used for ranking, and the models are compared along different dimensions to understand the major contributions and limitations of each.
Journal Article
Dependency graph enhanced interactive attention network for aspect sentiment triplet extraction
TL;DR: This article proposes an interactive attention mechanism that jointly considers both the contextual features learned by a bidirectional long short-term memory network and the syntactic dependencies learned from the corresponding dependency graph, in an iterative interaction manner.
Journal Article
PyABSA: Open Framework for Aspect-based Sentiment Analysis
Heng Yang, Ke-qiang Li et al.
TL;DR: An open-source ABSA framework, namely PyABSA, which includes the features of aspect term extraction, aspect sentiment classification, and text classification, and facilitates the deployment of ABSA applications and services.
References
Proceedings Article
Distributed Representations of Words and Phrases and their Compositionality
TL;DR: This paper presents a simple method for finding phrases in text, and shows that learning good vector representations for millions of phrases is possible and describes a simple alternative to the hierarchical softmax called negative sampling.
Proceedings Article
Neural Machine Translation by Jointly Learning to Align and Translate
TL;DR: It is conjecture that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and it is proposed to extend this by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
Proceedings Article
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Kyunghyun Cho, Bart van Merriënboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio
TL;DR: In this paper, the encoder and decoder of the RNN Encoder-Decoder model are jointly trained to maximize the conditional probability of a target sequence given a source sequence.
Posted Content
Neural Machine Translation by Jointly Learning to Align and Translate
TL;DR: In this paper, the authors propose to use a soft-searching model to find the parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
Book
Opinion Mining and Sentiment Analysis
Bo Pang, Lillian Lee
TL;DR: This survey covers techniques and approaches that promise to directly enable opinion-oriented information-seeking systems and focuses on methods that seek to address the new challenges raised by sentiment-aware applications, as compared to those that are already present in more traditional fact-based analysis.