Author

Christos Gatzoulis

Bio: Christos Gatzoulis is an academic researcher from Bahrain Polytechnic. The author has contributed to research in the topics of Reinforcement learning and Instructional simulation, has an h-index of 4, and has co-authored 13 publications receiving 82 citations. Previous affiliations of Christos Gatzoulis include Teesside University and Cyprus University of Technology.

Papers
Journal ArticleDOI
TL;DR: This survey describes emerging techniques and modalities for emotion recognition based on body movement, as well as recent advances in automatic emotion recognition; it also covers application areas and notation systems and explains the importance of movement segmentation.
Abstract: Humans are emotional beings, and their feelings influence how they perform and interact with computers. One of the most expressive modalities for humans is body posture and movement, which researchers have recently started exploiting for emotion recognition. This survey describes emerging techniques and modalities related to emotion recognition based on body movement, as well as recent advances in automatic emotion recognition. It also describes application areas and notation systems and explains the importance of movement segmentation. It then discusses unsolved problems and provides promising directions for future research. The Web extra (a PDF file) contains tables with additional information related to the article.

47 citations

Proceedings ArticleDOI
06 Nov 2013
TL;DR: A set of body motion features based on the Effort component of Laban Movement Analysis is proposed and used to build classifiers for emotion recognition in a game scenario covering four emotional states: concentration, meditation, excitement and frustration.
Abstract: Exergames do not have the capacity to detect whether the players are really enjoying the game-play. The games are not intelligent enough to detect significant emotional states and adapt accordingly in order to offer a better user experience for the players. We propose a set of body motion features, based on the Effort component of Laban Movement Analysis (LMA), that are used to provide sets of classifiers for emotion recognition in a game scenario for four emotional states: concentration, meditation, excitement and frustration. Experimental results show that the system is capable of successfully recognizing the four different emotional states at a very high rate.
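To make the feature idea concrete, here is a minimal Python sketch, not the paper's actual implementation, of the kind of pipeline the abstract describes: coarse Effort-like statistics computed from skeleton joint trajectories and used to train an off-the-shelf classifier. The array layout, the velocity/acceleration proxies for the Weight and Time Effort factors, and the choice of scikit-learn's RandomForestClassifier are all assumptions for illustration.

```python
# Illustrative sketch only: the paper's exact Effort features and classifiers
# are not specified here, so the quantities below are rough stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier  # stand-in classifier, not the paper's

def effort_like_features(skeleton, fps=30.0):
    """Coarse Effort-style statistics from a (frames, joints, 3) position array."""
    vel = np.diff(skeleton, axis=0) * fps        # per-joint velocity
    acc = np.diff(vel, axis=0) * fps             # per-joint acceleration
    speed = np.linalg.norm(vel, axis=2)          # speed magnitude per joint and frame
    jerkiness = np.linalg.norm(acc, axis=2)      # suddenness per joint and frame
    return np.array([
        speed.mean(), speed.max(),               # Weight/energy proxies
        jerkiness.mean(), jerkiness.max(),       # Time/suddenness proxies
    ])

def train_emotion_classifier(clips, labels):
    """clips: list of skeleton arrays; labels: e.g. 'concentration', 'meditation',
    'excitement', 'frustration' (hypothetical usage)."""
    X = np.stack([effort_like_features(c) for c in clips])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```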

38 citations

Journal ArticleDOI
TL;DR: This empirical study sheds light on the temporality of UX and the attractiveness of serious games, exploring how pragmatic and hedonic UX quality affects attractiveness in a serious game and investigating differences between anticipated and episodic UX so as to capture how the UX develops over time.
Abstract: The concept and methods of user experience (UX) are gaining momentum in the game industry. Designers and educational practitioners aim to provide rich and effective user experience through serious educational games. Nevertheless, several phenomena that delineate the complex issue of UX in serious gaming remain unexplored. This empirical study sheds light on temporality of UX and attractiveness of serious games. More specifically, it (a) explores how pragmatic and hedonic UX quality affects attractiveness in a serious game and (b) investigates differences between anticipated and episodic UX so as to capture how the UX develops over time. Key findings are presented and discussed.

6 citations

01 Jan 2013
TL;DR: This paper presents work-in-progress towards novel recognition of players' emotions using posture skeleton data from non-intrusive interfaces as input, and indicates that the compiled database of postures annotated with emotion labels supports recognition of emotions considerably above chance level.
Abstract: The affective state of a player during game playing has a significant effect on the player's motivation and engagement. Recognising players' emotions during games can help game designers improve the user experience by providing sophisticated behaviours to the game characters and the system itself. This paper presents work-in-progress towards novel recognition of players' emotions using posture skeleton data as input from non-intrusive interfaces. A database of samples of non-acted posture skeleton data was captured during active game playing using Microsoft Kinect's sensors. Four observers were asked to annotate the selected postures with an emotion label from a given emotion set. Based on Cohen's kappa, the agreement level of the observers was above or equal to 'good', with overall agreement levels that outperform existing benchmarks. The data was used in a series of experiments for training the system to recognise emotions. The results indicate that the compiled database of postures annotated with emotion labels supports recognition of emotions considerably above chance level and offers interesting research questions for improvements and future directions in the area.
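The observer-agreement step maps naturally onto Cohen's kappa as implemented in scikit-learn. The sketch below is illustrative only: the observer names, postures, and labels are hypothetical placeholders, not the study's data or emotion set.

```python
# Minimal sketch of inter-observer agreement via Cohen's kappa (hypothetical data).
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

def pairwise_kappa(annotations):
    """annotations: dict of observer name -> list of labels, one per posture,
    in the same posture order for every observer."""
    return {
        (a, b): cohen_kappa_score(annotations[a], annotations[b])
        for a, b in combinations(annotations, 2)
    }

# Hypothetical example with four observers annotating four postures:
example = {
    "obs1": ["excited", "frustrated", "concentrating", "relaxed"],
    "obs2": ["excited", "frustrated", "concentrating", "relaxed"],
    "obs3": ["excited", "concentrating", "concentrating", "relaxed"],
    "obs4": ["excited", "frustrated", "relaxed", "relaxed"],
}
print(pairwise_kappa(example))
```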

5 citations

Journal ArticleDOI
22 Aug 2016
TL;DR: Results from a study on the feasibility of creating both a computer games industry and a research sector are presented, including categories of essential requirements identified by the stakeholders.
Abstract: A games industry can help improve a country's financial situation by increasing its GDP through the creation of many new job opportunities. Cyprus may be a suitable location for such an industry, mainly because of its high availability of graduates and its strong service industry. In this work, results from a study on the feasibility of creating both a computer games industry and a research sector are presented. Key stakeholders, including students and representatives from government ministries, academia, incubator companies and other companies, were brought together to evaluate the current Cypriot economic situation and to create a roadmap of actions required to create a successful computer games industry. The methodology included a Structured Dialogue Design Process that was used to identify requirements. The findings from this study include categories of essential requirements identified by the stakeholders. The data in this study could be used to plan for and devise a comprehensive roadmap for creating a computer games sector that includes games companies and research centers.

4 citations


Cited by
Journal ArticleDOI
TL;DR: The emotion recognition methods based on multi-channel EEG signals as well as multi-modal physiological signals are reviewed and the correlation between different brain areas and emotions is discussed.

281 citations

Journal ArticleDOI
TL;DR: It is argued that ICT mediated or generated social touch can (a) intensify the perceived social presence of remote communication partners and (b) enable computer systems to more effectively convey affective information.
Abstract: Touch is our primary non-verbal communication channel for conveying intimate emotions and as such is essential for our physical and emotional wellbeing. In our digital age, human social interaction is often mediated. However, even though there is increasing evidence that mediated touch affords affective communication, current communication systems (such as videoconferencing) still do not support communication through the sense of touch. As a result, mediated communication does not provide the intense affective experience of co-located communication. The need for ICT mediated or generated touch as an intuitive way of social communication is even further emphasized by the growing interest in the use of touch-enabled agents and robots for healthcare, teaching, and telepresence applications. Here, we review the important role of social touch in our daily life and the available evidence that affective touch can be mediated reliably between humans and between humans and digital agents. We base our observations on evidence from psychology, computer science, sociology, and neuroscience, with a focus on the first two. Our review shows that mediated affective touch can modulate physiological responses, increase trust and affection, help to establish bonds between humans and avatars or robots, and initiate pro-social behavior. We argue that ICT mediated or generated social touch can (a) intensify the perceived social presence of remote communication partners and (b) enable computer systems to more effectively convey affective information. However, this research field at the crossroads of ICT and psychology is still embryonic, and we identify several topics that can help to mature the field in the following areas: establishing an overarching theoretical framework, employing better research methodologies, developing basic social touch building blocks, and solving specific ICT challenges.

136 citations

Journal ArticleDOI
TL;DR: This study aimed to identify, using Laban Movement Analysis (LMA), the Laban motor elements that characterize movements whose execution enhances each of the basic emotions: anger, fear, happiness, and sadness.
Abstract: We have recently demonstrated that motor execution, observation, and imagery of movements expressing certain emotions can enhance corresponding affective states and therefore could be used for emotion regulation. But which specific movement(s) should one use in order to enhance each emotion? This study aimed to identify, using Laban Movement Analysis (LMA), the Laban motor elements (motor characteristics) that characterize movements whose execution enhances each of the basic emotions: anger, fear, happiness, and sadness. LMA provides a system of symbols describing its motor elements, which gives a written instruction (motif) for the execution of a movement or movement-sequence over time. Six senior LMA experts analyzed a validated set of video clips showing whole body dynamic expressions of anger, fear, happiness and sadness, and identified the motor elements that were common to (appeared in) all clips expressing the same emotion. For each emotion, we created motifs of different combinations of the motor elements common to all clips of the same emotion. Eighty subjects from around the world read and moved those motifs, to identify the emotion evoked when moving each motif and to rate the intensity of the evoked emotion. All subjects together moved and rated 1241 motifs, which were produced from 29 different motor elements. Using logistic regression, we found a set of motor elements associated with each emotion which, when moved, predicted the feeling of that emotion. Each emotion was predicted by a unique set of motor elements and each motor element predicted only one emotion. Knowledge of which specific motor elements enhance specific emotions can enable emotional self-regulation through adding some desired motor qualities to one's personal everyday movements (rather than mimicking others' specific movements) and through decreasing motor behaviors which include elements that enhance negative emotions.
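A schematic re-creation of the reported analysis (not the authors' code) could encode each motif as a binary presence/absence vector of motor elements and fit a multinomial logistic regression with the evoked emotion as the target; the coefficients then indicate which elements predict which emotion. The element names and training rows below are hypothetical placeholders.

```python
# Schematic sketch: binary motor-element vectors per motif -> evoked emotion.
import numpy as np
from sklearn.linear_model import LogisticRegression

motor_elements = ["punch", "float", "sink", "spread", "condense", "retreat"]  # hypothetical

# Hypothetical data: each row marks which elements appear in one motif,
# and y holds the emotion subjects reported feeling when moving it.
X = np.array([
    [1, 0, 0, 0, 0, 0],   # punch-dominated motif
    [0, 1, 0, 1, 0, 0],   # float + spread
    [0, 0, 1, 0, 1, 0],   # sink + condense
    [0, 0, 0, 0, 1, 1],   # condense + retreat
])
y = ["anger", "happiness", "sadness", "fear"]

model = LogisticRegression(max_iter=1000)  # handles the multi-class case directly
model.fit(X, y)

# Coefficients show which elements push a motif toward which emotion.
for emotion, coefs in zip(model.classes_, model.coef_):
    print(emotion, dict(zip(motor_elements, np.round(coefs, 2))))
```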

78 citations

Book ChapterDOI
26 Mar 2019
TL;DR: This paper investigates the possibility of automated emotion representation, recognition and prediction, its state of the art, and main directions for further research, focusing on the impact of emotion analysis and the state of the art of multimodal emotion detection.
Abstract: Automatic emotion recognition constitutes one of the great challenges, providing new tools for more objective and quicker diagnosis, communication and research. Quick and accurate emotion recognition may increase the ability of computers, robots, and integrated environments to recognize human emotions and respond to them in accordance with social rules. The purpose of this paper is to investigate the possibility of automated emotion representation, recognition and prediction, its state of the art, and main directions for further research. We focus on the impact of emotion analysis and the state of the art of multimodal emotion detection. We present existing works, possibilities and methods for analyzing emotion in text, sound, image, video and physiological signals. We also emphasize the most important features for all available emotion recognition modes. Finally, we present the available platforms and outline the existing projects which deal with multimodal emotion analysis.

78 citations