
Fuji Ren

Researcher at University of Tokushima

Publications: 622
Citations: 6519

Fuji Ren is an academic researcher from the University of Tokushima. The author has contributed to research in topics: Sentence & Machine translation. The author has an h-index of 30 and has co-authored 579 publications receiving 4,966 citations. Previous affiliations of Fuji Ren include Hiroshima City University and Beijing University of Posts and Telecommunications.

Papers
Proceedings ArticleDOI

Estimating human emotions using wording and sentence patterns

TL;DR: An emotion estimation module is proposed for recognizing and generating human emotions, intended for application to a robot used in welfare services. The module extracts "emotion occurrence conditions" from the emotions carried by words and sentence meanings, and estimates the emotion of a speaker or writer.
Proceedings ArticleDOI

Semi-automatic emotion recognition from textual input based on the constructed emotion thesaurus

TL;DR: This paper presents a system that senses affect in a sentence or text, built on a constructed Chinese-language emotion thesaurus, with functions for syntax analysis, accidence (morphological) analysis, emotion sensing, and emotion computing.
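The thesaurus-driven emotion sensing described above can be illustrated with a minimal sketch. This is not the authors' system: the tiny lexicon, the weights, and the one-word negation rule are all hypothetical stand-ins for a full emotion thesaurus.

```python
# Hypothetical mini-lexicon: word -> (emotion category, weight).
EMOTION_LEXICON = {
    "happy": ("joy", 1.0),
    "delighted": ("joy", 0.9),
    "sad": ("sadness", 1.0),
    "angry": ("anger", 1.0),
    "afraid": ("fear", 0.8),
}

NEGATIONS = {"not", "never", "no"}

def estimate_emotion(sentence: str) -> dict:
    """Accumulate per-category scores from lexicon hits,
    flipping the sign when the previous token is a negation."""
    scores: dict[str, float] = {}
    negate = False
    for token in sentence.lower().split():
        word = token.strip(".,!?")
        if word in NEGATIONS:
            negate = True
            continue
        if word in EMOTION_LEXICON:
            emotion, weight = EMOTION_LEXICON[word]
            scores[emotion] = scores.get(emotion, 0.0) + (-weight if negate else weight)
        negate = False  # negation only scopes over the next token here
    return scores

print(estimate_emotion("I am so happy today"))  # {'joy': 1.0}
print(estimate_emotion("I am not happy"))       # {'joy': -1.0}
```

A real thesaurus-based system would add the syntactic and morphological analysis the paper mentions; this sketch only shows the word-level scoring idea.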
Proceedings ArticleDOI

Reduce the dimensions of emotional features by principal component analysis for speech emotion recognition

TL;DR: The principal component analysis (PCA) is applied to speech emotion recognition for improving the accuracy of the system and the recognition accuracies among different emotions in both databases are presented.
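PCA-based dimensionality reduction of the kind described above can be sketched as follows. The feature matrix here is random, and the dimensions (100 utterances by 40 acoustic features, reduced to 8 components) are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def pca_reduce(X: np.ndarray, n_components: int) -> np.ndarray:
    """Project rows of X onto the top principal components via SVD."""
    X_centered = X - X.mean(axis=0)          # PCA requires mean-centered data
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T  # scores in component space

rng = np.random.default_rng(0)
features = rng.normal(size=(100, 40))  # e.g. 100 utterances x 40 acoustic features
reduced = pca_reduce(features, n_components=8)
print(reduced.shape)  # (100, 8)
```

In a speech-emotion pipeline the reduced features would then feed a classifier; the paper's contribution is showing that this reduction can improve recognition accuracy.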
Posted Content

EmoSense: Computational Intelligence Driven Emotion Sensing via Wireless Channel Data.

TL;DR: EmoSense leverages only low-cost, prevalent WiFi infrastructure and thus constitutes a tempting solution for emotion sensing; it achieves performance comparable to vision-based and sensor-based rivals under different scenarios.
Proceedings ArticleDOI

EEG Emotion Recognition Based on Granger Causality and CapsNet Neural Network

TL;DR: The experimental results show that, by adjusting the model parameters and network structure, the constructed CapsNet neural network classifies emotions from EEG signals, obtaining 88.09% and 87.37% average classification accuracy under the valence and arousal emotion dimensions respectively, compared with SVM and CNN baselines.
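The Granger-causality features mentioned above can be sketched for the pairwise case. This is not the paper's pipeline: the lag order, the synthetic signals, and the simple log variance-ratio score are illustrative simplifications of standard Granger analysis.

```python
import numpy as np

def _resid_var(A: np.ndarray, b: np.ndarray) -> float:
    """Residual variance of a least-squares fit of b on A (with intercept)."""
    A1 = np.column_stack([np.ones(len(A)), A])
    coef, *_ = np.linalg.lstsq(A1, b, rcond=None)
    r = b - A1 @ coef
    return float(r @ r / len(r))

def granger_score(x: np.ndarray, y: np.ndarray, lags: int = 2) -> float:
    """log(restricted var / full var): larger means y's past helps predict x."""
    n = len(x)
    target = x[lags:]
    own = np.column_stack([x[lags - k: n - k] for k in range(1, lags + 1)])
    both = np.column_stack([own] + [y[lags - k: n - k] for k in range(1, lags + 1)])
    return float(np.log(_resid_var(own, target) / _resid_var(both, target)))

# Synthetic example: y drives x, so y should Granger-cause x but not vice versa.
rng = np.random.default_rng(1)
y = rng.normal(size=500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()

cause_score = granger_score(x, y)   # y -> x: large
effect_score = granger_score(y, x)  # x -> y: near zero
print(cause_score > effect_score)   # True
```

In an EEG setting, such pairwise scores computed between channels form a connectivity matrix, which a classifier such as a CapsNet can then consume.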