Author

Heng Yang

Bio: Heng Yang is an academic researcher from South China Normal University. The author has contributed to research in the topics of computer science and sentiment analysis, has an h-index of 3, and has co-authored 10 publications receiving 79 citations.

Papers
Journal ArticleDOI
Biqing Zeng, Heng Yang, Ruyang Xu, Wu Zhou, Xuli Han 
TL;DR: A Local Context Focus (LCF) mechanism is proposed for aspect-based sentiment classification based on Multi-head Self-Attention (MHSA); it utilizes the Context features Dynamic Mask (CDM) and Context features Dynamic Weighted (CDW) layers to pay more attention to local context words.
Abstract: Aspect-based sentiment classification (ABSC) aims to predict the sentiment polarities of different aspects within sentences or documents. Many previous studies have addressed this problem, but they fail to notice the correlation between an aspect's sentiment polarity and its local context. In this paper, a Local Context Focus (LCF) mechanism is proposed for aspect-based sentiment classification based on Multi-head Self-Attention (MHSA). This mechanism, called the LCF design, utilizes the Context features Dynamic Mask (CDM) and Context features Dynamic Weighted (CDW) layers to pay more attention to local context words. Moreover, a BERT-shared layer is adopted in the LCF design to capture internal long-term dependencies of the local and global context. Experiments are conducted on three common ABSC datasets: the laptop and restaurant datasets of SemEval-2014 and the ACL Twitter dataset. The results demonstrate that the LCF baseline model achieves considerable performance, and ablation experiments confirm the significance and effectiveness of the LCF design. In particular, by incorporating the BERT-shared layer, the LCF-BERT model refreshes the state-of-the-art performance on all three benchmark datasets.
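The CDM/CDW idea can be illustrated with a minimal sketch (not the authors' code): tokens whose distance to the aspect term exceeds a threshold are either zeroed out (CDM) or linearly down-weighted (CDW). The function names, the token-index definition of semantic-relative distance (SRD), and the linear weighting scheme here are simplifying assumptions for illustration.

```python
import numpy as np

def cdm(features, aspect_start, aspect_end, srd_threshold=3):
    # Context-features Dynamic Mask: zero the features of tokens whose
    # semantic-relative distance (SRD) to the aspect exceeds the threshold.
    seq_len, _ = features.shape
    masked = features.copy()
    for i in range(seq_len):
        srd = 0 if aspect_start <= i <= aspect_end else \
            min(abs(i - aspect_start), abs(i - aspect_end))
        if srd > srd_threshold:
            masked[i] = 0.0
    return masked

def cdw(features, aspect_start, aspect_end, srd_threshold=3):
    # Context-features Dynamic Weighting: instead of masking, linearly
    # down-weight tokens the further they lie beyond the threshold.
    seq_len, _ = features.shape
    weighted = features.copy()
    for i in range(seq_len):
        srd = 0 if aspect_start <= i <= aspect_end else \
            min(abs(i - aspect_start), abs(i - aspect_end))
        if srd > srd_threshold:
            weighted[i] *= 1.0 - (srd - srd_threshold) / seq_len
    return weighted
```

CDM discards distant context entirely, while CDW keeps it with reduced influence; the paper's ablations compare the two variants.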

127 citations

Journal ArticleDOI
Heng Yang, Biqing Zeng, Jianhao Yang, Youwei Song, Ruyang Xu
TL;DR: A multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC, which extracts aspect terms and infers their sentiment polarities synchronously and can analyze both Chinese and English comments simultaneously.

51 citations

Posted Content
Heng Yang, Biqing Zeng, Jianhao Yang, Youwei Song, Ruyang Xu
TL;DR: Based on the local context focus (LCF) mechanism, this article proposed a multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC.
Abstract: Aspect-based sentiment analysis (ABSA) is a multi-grained natural language processing task consisting of two subtasks: aspect term extraction (ATE) and aspect polarity classification (APC). Most existing work focuses on the aspect polarity inference subtask and overlooks the significance of aspect term extraction; moreover, existing research pays little attention to the Chinese-oriented ABSA task. Based on the local context focus (LCF) mechanism, this paper proposes a multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC. Compared with existing models, it extracts aspect terms and infers their sentiment polarities synchronously, and it can analyze both Chinese and English comments simultaneously; an experiment on a multilingual mixed dataset demonstrated this capability. By integrating a domain-adapted BERT model, LCF-ATEPC achieves state-of-the-art performance on aspect term extraction and aspect polarity classification on four Chinese review datasets. The experimental results on the widely used SemEval-2014 Task 4 Restaurant and Laptop datasets also surpass the previous state of the art on the ATE and APC subtasks.

39 citations

Journal ArticleDOI
TL;DR: This article proposes the concept of a dependency cluster and designs two modules, Dynamic Local Context Focus (DLCF) and Dependency Cluster Attention (DCA): DLCF dynamically captures the range of local context based on the varying maximum distance from the target aspect term to its context words, while DCA lets the model pay more attention to the cluster that is most critical for sentiment classification.

13 citations

Journal ArticleDOI
Biqing Zeng, Xuli Han, Feng Zeng, Ruyang Xu, Heng Yang
TL;DR: This work proposes a multifeature interactive fusion model for aspect-based sentiment analysis that applies an attention mechanism to compute the fusion weights of features, so that key feature information plays a more significant role in the sentiment analysis; it outperforms the baseline models.
Abstract: Aspect-based sentiment analysis (ABSA) is a fine-grained sentiment analysis technology. In recent years, neural networks have been widely used to extract features of aspects and contexts and have proven to yield dramatic improvements in retrieving sentiment features from comments. However, due to the increasing complexity of comment information, considering only sentence or word features may cause the loss of key textual information. Moreover, characters carry more fine-grained features, so the fusion of features at three different levels (sentences, words, and characters) should be considered in order to explore their internal relationships across granularities. Based on this analysis, we propose a multifeature interactive fusion model for aspect-based sentiment analysis. First, the text is divided into two parts, contexts and aspects, and word embeddings and character embeddings are combined to further explore potential features. Second, to establish a close relationship between contexts and aspects, our model exploits feature fusion of both. Moreover, we apply an attention mechanism to compute the fusion weights of features, so that key feature information plays a more significant role in the sentiment analysis. Finally, we experimented on the three SemEval-2014 datasets; the results show that our model performs better than the baseline models.
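The attention-weighted fusion step can be sketched in a few lines (an illustrative simplification, not the paper's implementation): each feature view (e.g. sentence-, word-, and character-level vectors) receives a softmax-normalized score that determines its contribution to the fused representation. The scoring function here, a mean over each view, is a placeholder assumption; the model would learn such a function.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fusion(feature_views):
    # feature_views: list of equal-length vectors from different
    # granularities. One attention score per view decides how much
    # each view contributes to the fused feature vector.
    stacked = np.stack(feature_views)        # (n_views, dim)
    scores = softmax(stacked.mean(axis=1))   # one scalar score per view
    return (scores[:, None] * stacked).sum(axis=0)
```

In the paper, higher-weighted views correspond to the "key features" the TL;DR mentions; less informative views are down-weighted rather than discarded.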

8 citations


Cited by
Proceedings ArticleDOI
01 Jul 2020
TL;DR: This work explores the grammatical aspects of the sentence, employs the self-attention mechanism for syntactic learning, and proposes a syntactic relative distance to de-emphasize the adverse effects of unrelated words that have weak syntactic connections with the aspect terms.
Abstract: Aspect-based sentiment analysis (ABSA) consists of two conceptual tasks, namely aspect extraction and aspect sentiment classification. Rather than treating the tasks separately, we build an end-to-end ABSA solution. Previous work on ABSA did not fully leverage the importance of syntactic information: the aspect extraction model often failed to detect the boundaries of multi-word aspect terms, and the aspect sentiment classifier could not account for the syntactic correlation between aspect terms and context words. This paper explores the grammatical aspects of the sentence and employs the self-attention mechanism for syntactic learning. We combine part-of-speech embeddings, dependency-based embeddings, and contextualized embeddings (e.g., BERT, RoBERTa) to enhance the performance of the aspect extractor. We also propose the syntactic relative distance to de-emphasize the adverse effects of unrelated words that have weak syntactic connections with the aspect terms, which increases the accuracy of the aspect sentiment classifier. Our solutions outperform the state-of-the-art models on the SemEval-2014 dataset in both subtasks.
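A plausible reading of "syntactic relative distance" is the number of edges on the dependency-tree path between a word and the aspect term, as opposed to surface token distance. This sketch (names and edge-list format are illustrative assumptions, not the paper's code) computes it with a breadth-first search over an undirected view of the dependency tree.

```python
from collections import deque

def syntactic_relative_distance(edges, n_tokens, aspect_idx):
    # edges: list of (head, dependent) token-index pairs from a
    # dependency parse. Returns, for each token, the shortest path
    # length in tree edges to the aspect token (None if unreachable).
    adj = [[] for _ in range(n_tokens)]
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)   # treat the tree as undirected
    dist = [None] * n_tokens
    dist[aspect_idx] = 0
    queue = deque([aspect_idx])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] is None:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist
```

Words beyond a distance threshold would then be down-weighted before sentiment classification, so that syntactically remote words contribute less even when they are adjacent in the surface string.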

96 citations

Journal ArticleDOI
Heng Yang, Biqing Zeng, Jianhao Yang, Youwei Song, Ruyang Xu
TL;DR: A multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC, which extracts aspect terms and infers their sentiment polarities synchronously and can analyze both Chinese and English comments simultaneously.

51 citations

Journal ArticleDOI
TL;DR: The proposed knowledge-guided capsule network (KGCapsAN) implements the routing method via an attention mechanism, and the results show that the proposed method yields state-of-the-art performance.
Abstract: Aspect-based (aspect-level) sentiment analysis is an important task in fine-grained sentiment analysis, which aims to automatically infer the sentiment towards an aspect in its context. Previous studies have shown that attention-based methods can effectively improve the accuracy of aspect-based sentiment analysis. Despite this outstanding progress, real-world aspect-based sentiment analysis still faces several challenges. (1) Current attention-based methods may cause a given aspect to incorrectly focus on syntactically unrelated words. (2) Conventional methods fail to identify sentiment in special sentence structures, such as double negatives. (3) Most studies leverage only one vector to represent context and target; however, a single vector is limited, as natural language is delicate and complex. In this paper, we propose a knowledge-guided capsule network (KGCapsAN), which can address the above deficiencies. Our method is composed of two parts: a Bi-LSTM network and a capsule attention network. The capsule attention network implements the routing method via an attention mechanism. Moreover, we utilize two kinds of prior knowledge, syntactic and n-gram structures, to guide the capsule attention process. Extensive experiments are conducted on six datasets, and the results show that the proposed method yields state-of-the-art results.

51 citations

Journal ArticleDOI
TL;DR: A novel relational graph attention network that integrates typed syntactic dependency information is investigated and significantly outperforms state-of-the-art syntax-based approaches on targeted sentiment classification.
Abstract: Targeted sentiment classification predicts the sentiment polarity of given target mentions in input texts. Dominant methods employ neural networks to encode the input sentence and extract relations between target mentions and their contexts. Recently, graph neural networks have been investigated for integrating dependency syntax into the task, achieving state-of-the-art results. However, existing methods do not consider dependency label information, which is intuitively useful. To solve this problem, we investigate a novel relational graph attention network that integrates typed syntactic dependency information. Results on standard benchmarks show that our method can effectively leverage label information to improve targeted sentiment classification performance. Our final model significantly outperforms state-of-the-art syntax-based approaches.
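The core idea, attention scores that depend on the dependency label ("type") of each edge rather than on connectivity alone, can be sketched as follows. This is an illustrative simplification under assumed names and shapes, not the paper's architecture: each edge carries a relation-id, and the relation's embedding is added to the neighbour representation before scoring.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def relational_graph_attention(h, edges, rel_emb):
    # h: (n, d) node features; edges: list of (src, dst, rel_id) from a
    # dependency parse; rel_emb: (n_rels, d) dependency-label embeddings.
    # Each node aggregates its in-neighbours with attention scores that
    # depend on the label embedding of the connecting edge.
    n, d = h.shape
    out = np.zeros_like(h)
    for i in range(n):
        nbrs = [(s, r) for s, dst, r in edges if dst == i]
        if not nbrs:
            out[i] = h[i]          # no incoming edges: pass through
            continue
        scores = np.array([h[i] @ (h[s] + rel_emb[r]) / np.sqrt(d)
                           for s, r in nbrs])
        alpha = softmax(scores)
        out[i] = sum(a * h[s] for a, (s, _) in zip(alpha, nbrs))
    return out
```

With untyped attention, `rel_emb` would be absent and a `nsubj` edge would score the same as a `punct` edge; adding the label embedding lets the model learn to weight them differently.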

48 citations

Posted Content
Heng Yang, Biqing Zeng, Jianhao Yang, Youwei Song, Ruyang Xu
TL;DR: Based on the local context focus (LCF) mechanism, this article proposed a multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC.
Abstract: Aspect-based sentiment analysis (ABSA) is a multi-grained natural language processing task consisting of two subtasks: aspect term extraction (ATE) and aspect polarity classification (APC). Most existing work focuses on the aspect polarity inference subtask and overlooks the significance of aspect term extraction; moreover, existing research pays little attention to the Chinese-oriented ABSA task. Based on the local context focus (LCF) mechanism, this paper proposes a multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC. Compared with existing models, it extracts aspect terms and infers their sentiment polarities synchronously, and it can analyze both Chinese and English comments simultaneously; an experiment on a multilingual mixed dataset demonstrated this capability. By integrating a domain-adapted BERT model, LCF-ATEPC achieves state-of-the-art performance on aspect term extraction and aspect polarity classification on four Chinese review datasets. The experimental results on the widely used SemEval-2014 Task 4 Restaurant and Laptop datasets also surpass the previous state of the art on the ATE and APC subtasks.

39 citations