
Ruyang Xu

Researcher at South China Normal University

Publications - 6
Citations - 228

Ruyang Xu is an academic researcher from South China Normal University. The author has contributed to research on the topics of sentiment analysis and context (language use). The author has an h-index of 3, having co-authored 6 publications receiving 77 citations.

Papers
Journal ArticleDOI

LCF: A Local Context Focus Mechanism for Aspect-Based Sentiment Classification

TL;DR: A Local Context Focus (LCF) mechanism is proposed for aspect-based sentiment classification based on Multi-head Self-Attention (MHSA); it utilizes Context-features Dynamic Mask (CDM) and Context-features Dynamic Weighted (CDW) layers to pay more attention to the local context words.
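To make the CDM/CDW idea concrete, below is a minimal PyTorch sketch, assuming token features have already been produced by an MHSA encoder. The distance function, the SRD threshold, and all variable names are illustrative assumptions, not the paper's exact implementation.

```python
import torch

def semantic_relative_distance(seq_len: int, aspect_start: int, aspect_end: int) -> torch.Tensor:
    """Token-wise distance to the aspect span (0 inside the span)."""
    positions = torch.arange(seq_len)
    dist_left = aspect_start - positions
    dist_right = positions - aspect_end
    return torch.clamp(torch.maximum(dist_left, dist_right), min=0).float()

def cdm(features: torch.Tensor, srd: torch.Tensor, threshold: int = 3) -> torch.Tensor:
    """Context-features Dynamic Mask: zero out features of tokens far from the aspect."""
    mask = (srd <= threshold).float().unsqueeze(-1)       # [seq_len, 1]
    return features * mask                                # [seq_len, hidden]

def cdw(features: torch.Tensor, srd: torch.Tensor, threshold: int = 3) -> torch.Tensor:
    """Context-features Dynamic Weighting: linearly decay features of distant tokens."""
    seq_len = features.size(0)
    weight = torch.ones(seq_len)
    far = srd > threshold
    weight[far] = 1.0 - (srd[far] - threshold) / seq_len  # decays with distance from the aspect
    return features * weight.unsqueeze(-1)

# Example: 10-token sentence, aspect spanning tokens 4..5
feats = torch.randn(10, 768)
srd = semantic_relative_distance(10, aspect_start=4, aspect_end=5)
local_feats_masked = cdm(feats, srd)
local_feats_weighted = cdw(feats, srd)
```

Either variant keeps the local-context features near the aspect intact while suppressing distant context, which is the core of the LCF mechanism as summarized above.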
Journal ArticleDOI

A multi-task learning model for Chinese-oriented aspect polarity classification and aspect term extraction

TL;DR: A multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC, which can extract aspect terms and infer their polarities synchronously, and is effective for analyzing both Chinese and English comments.
Posted Content

A Multi-task Learning Model for Chinese-oriented Aspect Polarity Classification and Aspect Term Extraction

TL;DR: Based on the local context focus (LCF) mechanism, this article proposes a multi-task learning model for Chinese-oriented aspect-based sentiment analysis, namely LCF-ATEPC.
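The two entries above describe the same joint model. Below is a minimal sketch of the general multi-task pattern they summarize, a shared encoder feeding one head for aspect term extraction (token tagging) and one for aspect polarity classification. The encoder choice, label sets, pooling, and class names are assumptions for illustration, not LCF-ATEPC's actual architecture.

```python
import torch
import torch.nn as nn

class JointATEAPC(nn.Module):
    def __init__(self, hidden: int = 768, num_bio_tags: int = 3, num_polarities: int = 3):
        super().__init__()
        self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True, bidirectional=True)
        self.ate_head = nn.Linear(hidden, num_bio_tags)    # B/I/O tags for aspect terms
        self.apc_head = nn.Linear(hidden, num_polarities)  # negative / neutral / positive

    def forward(self, token_embeddings: torch.Tensor):
        shared, _ = self.encoder(token_embeddings)         # [batch, seq, hidden]
        ate_logits = self.ate_head(shared)                 # per-token tag scores
        apc_logits = self.apc_head(shared.mean(dim=1))     # pooled sentence-level polarity
        return ate_logits, apc_logits

# Usage with dummy data: both tasks are trained from a single forward pass.
model = JointATEAPC()
emb = torch.randn(2, 16, 768)                              # e.g. pretrained token embeddings
ate_logits, apc_logits = model(emb)
loss = nn.CrossEntropyLoss()(ate_logits.view(-1, 3), torch.zeros(2 * 16, dtype=torch.long)) \
     + nn.CrossEntropyLoss()(apc_logits, torch.zeros(2, dtype=torch.long))
```

Sharing the encoder is what lets the two subtasks be solved synchronously, which is the property the summaries emphasize.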
Journal ArticleDOI

Multifeature Interactive Fusion Model for Aspect-Based Sentiment Analysis

TL;DR: This work proposes a multifeature interactive fusion model for aspect-based sentiment analysis that applies an attention mechanism to calculate the fusion weights of features, so that key feature information plays a more significant role in sentiment analysis; it achieves better performance than the baseline models.
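As a rough illustration of attention-weighted feature fusion, here is a minimal PyTorch sketch. The number of feature sources, the scoring function, and the dimensions are illustrative assumptions rather than the paper's exact design.

```python
import torch
import torch.nn as nn

class AttentiveFusion(nn.Module):
    """Fuse several feature vectors with learned attention weights so key features dominate."""
    def __init__(self, hidden: int = 768):
        super().__init__()
        self.scorer = nn.Linear(hidden, 1)                  # scores each feature source

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: [batch, num_sources, hidden], e.g. aspect, local-context, global-context features
        scores = self.scorer(features).squeeze(-1)          # [batch, num_sources]
        weights = torch.softmax(scores, dim=-1)             # fusion weights sum to 1
        fused = (weights.unsqueeze(-1) * features).sum(dim=1)  # [batch, hidden]
        return fused

fusion = AttentiveFusion()
feature_stack = torch.randn(4, 3, 768)    # 3 feature sources per example
fused_repr = fusion(feature_stack)        # fed to a sentiment classifier downstream
```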
Journal ArticleDOI

Comprehensive Document Summarization with Refined Self-Matching Mechanism

TL;DR: In this work, a self-matching mechanism is incorporated into the extractive summarization system at the encoder side, which allows the encoder to refine the encoded information at the global level and effectively compensates for the limited memory capacity of a conventional LSTM.
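A minimal sketch of what an encoder-side self-matching layer can look like, assuming a PyTorch setting: each encoder state attends to all others to pull in document-level context that a plain LSTM tends to lose over long inputs. The shapes, the multi-head configuration, and the gated blending are assumptions for illustration, not the paper's exact refinement mechanism.

```python
import torch
import torch.nn as nn

class SelfMatching(nn.Module):
    def __init__(self, hidden: int = 512):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.gate = nn.Linear(2 * hidden, hidden)

    def forward(self, encoder_states: torch.Tensor) -> torch.Tensor:
        # encoder_states: [batch, seq, hidden] from a (Bi)LSTM encoder
        matched, _ = self.attn(encoder_states, encoder_states, encoder_states)  # attend over the whole document
        gated = torch.sigmoid(self.gate(torch.cat([encoder_states, matched], dim=-1)))
        return gated * matched + (1 - gated) * encoder_states  # blend local and global information

refiner = SelfMatching()
states = torch.randn(2, 30, 512)          # e.g. 30 sentence representations per document
refined = refiner(states)                 # scored for extractive sentence selection downstream
```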