
Showing papers by "Georgios Triantafyllidis published in 2014"


Journal ArticleDOI
TL;DR: By creating an educational computer game that provides physical interaction through a natural user interface (NUI), this work aims to support early intervention and to enhance the emotion recognition skills of preschoolers with autism.
Abstract: Emotion recognition is essential in human communication and social interaction. Children with autism have been reported to exhibit deficits in understanding and expressing emotions. Since those deficits seem to be rather permanent, intervention tools for improving them are desirable, and educational interventions for teaching emotion recognition should begin as early as possible. It is argued that Serious Games can be very effective in the areas of therapy and education for children with autism. However, such computer interventions require considerable interaction skills. Before the age of 6, most children with autism lack the basic motor skills needed to manipulate a mouse or a keyboard. Our approach takes into account the specific characteristics of preschoolers with autism and their physical limitations. By creating an educational computer game that provides physical interaction through a natural user interface (NUI), we aim to support early intervention and to enhance emotion recognition skills.

25 citations


Book ChapterDOI
22 Jun 2014
TL;DR: The IOLAOS project, a general open authorable framework for educational games for children, features an editor in which the game narrative can be created or edited according to specific needs; the framework takes into account the specific characteristics of preschoolers with an autism diagnosis and their physical abilities in order to customize the game narrative accordingly.
Abstract: This paper presents the initial findings and on-going work of the IOLAOS project, a general open authorable framework for educational games for children. The framework features an editor in which the game narrative can be created or edited according to specific needs. A ludic approach is used both for the interface and for the game design: by employing a physical, natural user interface (NUI) we aim to achieve ludic interfaces, and by designing the educational game with playful elements we follow a ludic design. The framework is then applied to the scenario of teaching preschoolers with an autism diagnosis. Children with autism have been reported to exhibit deficits in the recognition of affective expressions and the perception of emotions. With appropriate intervention, elimination of those deficits can be achieved, and interventions are proposed to start as early as possible. Computer-based programs have been widely and successfully used to teach people with autism to recognize emotions. However, such computer interventions require considerable interaction skills, which very young children with autism most probably do not yet have. In this context, our approach with the suggested framework employs a ludic NUI-based interface and a ludic game design, and takes into account the specific characteristics of preschoolers with an autism diagnosis and their physical abilities in order to customize the game narrative accordingly.

16 citations


Proceedings ArticleDOI
23 Oct 2014
TL;DR: Player experience can be estimated in a non-invasive fashion during the game, and based on this information the game content could be adapted accordingly.
Abstract: In this paper, we investigate the relationship between player experience and body movements in a non-physical 3D computer game. In an experiment, participants played a series of short game sessions and rated their experience while their body movements were tracked using a depth camera. The collected data were analysed, and a neural network was trained to find the mapping between player body movements, player in-game behaviour and player experience. The results reveal that some aspects of player experience, such as anxiety or challenge, can be detected with high accuracy (up to 81%); taking the playing context into account raises the accuracy to 86%. With such a multi-modal approach, the player experience can be estimated in a non-invasive fashion during the game and, based on this information, the game content could be adapted accordingly.
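The abstract describes learning a mapping from body-movement data to rated player experience. The paper's actual network architecture and features are not given here, so the following is only a minimal sketch of the idea: synthetic per-session movement features (the names and data are invented for illustration) are mapped to a binary experience label, using plain logistic regression trained by gradient descent in place of the paper's neural network.

```python
import numpy as np

# Hypothetical features per game session, e.g. mean joint speed,
# posture variance, forward-lean angle. All data here is synthetic.
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -2.0, 0.5])          # hidden "ground truth" weights
y = (X @ true_w + 0.3 * rng.normal(size=n) > 0).astype(float)  # e.g. high anxiety

# Logistic regression trained with full-batch gradient descent on log-loss.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
    w -= lr * (X.T @ (p - y) / n)            # gradient of log-loss w.r.t. w
    b -= lr * np.mean(p - y)                 # gradient w.r.t. bias

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
print(f"training accuracy: {accuracy:.2f}")
```

A real pipeline would replace the synthetic matrix with features extracted from depth-camera skeleton tracks and use held-out sessions for evaluation, as the reported 81%/86% figures imply.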

6 citations


Proceedings ArticleDOI
28 Jul 2014
TL;DR: By removing the gesture recognition step, this work creates a generic and lightweight framework with a clear interface to the user, using the inputs from Microsoft Kinect as a controller interface targeting multimedia content creation.
Abstract: Applications for real-time multimedia content production, because of their delay-sensitive nature, require fast and precise control by the user. This is commonly achieved with specialized, application-specific physical controllers that have steep learning curves. In our work, we propose using the inputs from Microsoft Kinect as a controller interface. Originally introduced as an Xbox peripheral, Kinect is a multimodal device equipped with an RGB camera, a depth sensor and a microphone array. We use those inputs to provide a non-tactile controller abstraction to the user, targeting multimedia content creation. Existing Kinect-based solutions try to recognize natural gestures of the user and classify them as controller actions. The novelty of our implementation is that, instead of extracting gesture features, we directly map the inputs from the Kinect to a suitable set of values for the multimedia application. By removing the gesture recognition step, we are able to create a generic and lightweight framework with a clear interface to the user. We examine the usability of the framework through the development and evaluation of a Kinect-controlled real-time multimedia application.
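The core idea above, mapping raw sensor inputs directly to application values rather than classifying gestures, can be sketched in a few lines. This is only an illustration under assumed parameters: the joint name, coordinate range, and MIDI-style output range are inventions for the example, not values from the paper.

```python
def map_axis(value, src_min, src_max, dst_min=0, dst_max=127):
    """Clamp a raw sensor coordinate and rescale it linearly to a control range."""
    value = min(max(value, src_min), src_max)      # clamp to the source range
    t = (value - src_min) / (src_max - src_min)    # normalize to [0, 1]
    return round(dst_min + t * (dst_max - dst_min))

# Example: right-hand height (hypothetical skeleton y-coordinate in metres)
# mapped to a MIDI-like volume value, no gesture classification involved.
frame = {"right_hand_y": 0.75}                     # simulated depth-sensor reading
volume = map_axis(frame["right_hand_y"], src_min=0.0, src_max=1.5)
print(volume)  # 64: a hand at mid-height maps to a mid-range value
```

Because the mapping is stateless and per-frame, it adds essentially no latency, which matches the abstract's motivation of delay-sensitive control; a gesture recognizer, by contrast, must buffer frames before emitting an action.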

6 citations



Book ChapterDOI
22 Jun 2014
TL;DR: An approach for teaching and designing embodied interaction based on interactive sketches is presented, combining the mover perspective and felt experiences of movement with advanced technologies in a generative design session.
Abstract: We present an approach for teaching and designing embodied interaction based on interactive sketches. We have combined the mover perspective and felt experiences of movement with advanced technologies (multi-agents, physical simulations) in a generative design session. We report our activities and provide a simple example as a design outcome. The variety and quality of the initial ideas indicate that this approach might provide a better foundation for our participants compared to approaches that focus only on technology. The interactive sketches will be demonstrated at the conference.

3 citations