
Showing papers by "Hans Gellersen published in 2018"


Proceedings ArticleDOI
14 Jun 2018
TL;DR: Smooth-i, as discussed by the authors, matches smooth pursuit eye movements against known motion on the display to determine when calibration accuracy has drifted, and uses this as input for re-calibration.
Abstract: Eye gaze for interaction is dependent on calibration. However, gaze calibration can deteriorate over time affecting the usability of the system. We propose to use motion matching of smooth pursuit eye movements and known motion on the display to determine when there is a drift in accuracy and use it as input for re-calibration. To explore this idea we developed Smooth-i, an algorithm that stores calibration points and updates them incrementally when inaccuracies are identified. To validate the accuracy of Smooth-i, we conducted a study with five participants and a remote eye tracker. A baseline calibration profile was used by all participants to test the accuracy of the Smooth-i re-calibration following interaction with moving targets. Results show that Smooth-i is able to manage re-calibration efficiently, updating the calibration profile only when inaccurate data samples are detected.
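The core idea lends itself to a compact illustration. Below is a minimal sketch, not the authors' implementation, of Pursuits-style motion matching for drift detection: gaze and target trajectories are correlated axis-wise, and a calibration point is stored only when the user demonstrably follows the target yet is systematically offset from it. The threshold values and function names are assumptions.

```python
import numpy as np

# Assumed thresholds; the paper does not publish its exact parameters.
MATCH_THRESHOLD = 0.8   # min. correlation to accept that the eye pursues the target
DRIFT_THRESHOLD = 40.0  # gaze-target offset (px) beyond which drift is suspected

def trajectory_match(gaze, target):
    """Pearson-correlate gaze and target trajectories on each axis;
    a pursuit is accepted only if both axes correlate strongly."""
    rx = np.corrcoef(gaze[:, 0], target[:, 0])[0, 1]
    ry = np.corrcoef(gaze[:, 1], target[:, 1])[0, 1]
    return min(rx, ry)

def maybe_recalibrate(calibration_points, gaze, target):
    """Incrementally update stored calibration points, but only when the
    user follows the target and the offset indicates drift."""
    if trajectory_match(gaze, target) < MATCH_THRESHOLD:
        return calibration_points               # no pursuit: ignore these samples
    offset = np.mean(target - gaze, axis=0)     # systematic gaze error
    if np.linalg.norm(offset) > DRIFT_THRESHOLD:
        calibration_points.append((gaze.mean(axis=0), target.mean(axis=0)))
    return calibration_points
```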

14 citations


Journal ArticleDOI
TL;DR: The results show that changes in muscle strength as an effect of aging might affect, directly or indirectly, post-saccadic oscillations, and suggest that aging should be considered an important factor when studying post-saccadic eye movements.

13 citations


Proceedings ArticleDOI
23 Oct 2018
TL;DR: This workshop will share experiences in the development of gaze-enabled games, discuss best practices and tools, and explore future challenges for research at the intersection of eye tracking and games.
Abstract: What current and future challenges does incorporating eye tracking into game design and development create? The Second EyePlay workshop brings together academic researchers and industry practitioners from the fields of eye tracking and games to explore these questions. In recent years, gaming has been at the forefront of the commercial popularization of eye tracking. In this workshop, we will share experiences in the development of gaze-enabled games, discuss best practices and tools, and explore future challenges for research. Topics of interest lie at the intersection of eye tracking and games and include, but are not limited to, novel interaction techniques and game mechanics, development processes and tools, accessible games, evaluation, and future visions.

7 citations


Proceedings ArticleDOI
14 Jun 2018
TL;DR: Gaze behaviour during hovers is studied; the distance between gaze and hand is found to depend on the target's location on the screen, and indecision can be deduced from this distance.
Abstract: Taps constitute only a small part of manual input when interacting with touch-enabled surfaces; how the hand behaves in the hovering space is informative of what the user intends to do. In this article, we present a data collection of hand and eye motion. We tailored a kiosk-like system to record participants' gaze and hand movements, and we specifically designed a memory game to detect the decision-making processes users may face. Our data collection comprises 177 trials from 71 participants. Based on a hand movement classification, we extracted 16588 hovers. We study gaze behaviour during hovers and find that the distance between gaze and hand depends on the target's location on the screen. We also show how indecision can be deduced from this distance.
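To make the analysed quantity concrete, the sketch below computes the gaze-to-hand distance over a single hover segment. The coordinate samples are illustrative values, not data from the paper's collection, and the pixel coordinate frame is an assumption.

```python
import numpy as np

def gaze_hand_distance(gaze_xy, hand_xy):
    """Mean Euclidean distance (screen pixels) between synchronized
    gaze and hand samples over one hover segment."""
    return float(np.mean(np.linalg.norm(gaze_xy - hand_xy, axis=1)))

# Illustrative samples: gaze leads the hand toward a target.
gaze = np.array([[420.0, 310.0], [430.0, 305.0], [435.0, 300.0]])
hand = np.array([[380.0, 360.0], [395.0, 345.0], [410.0, 330.0]])
print(gaze_hand_distance(gaze, hand))  # per the paper, larger distances can signal indecision
```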

6 citations


Journal ArticleDOI
18 Sep 2018
TL;DR: It is concluded that visual attention-based access can be a useful addition to groupware to flexibly facilitate awareness and prevent conflicts.
Abstract: During collaboration, individual users' capacity to maintain awareness, avoid duplicate work and prevent conflicts depends on the extent to which they are able to monitor the workspace. Existing access control models disregard this contextual information by managing access strictly based on who performs the action. As an alternative approach, we propose managing access by taking the visual attention of collaborators into account. For example, actions that require consensus can be limited to collaborators' joint attention, editing another user's personal document can require her visual supervision, and private information can become unavailable when another user is looking. We prototyped visual attention-based access for 3 collaboration scenarios on a large vertical display, using head orientation input as a proxy for attention. The prototype was deployed in an exploratory user study, where participants in pairs were tasked to assign visual attention-based access to various actions. The results reveal distinct motivations for their use, such as preventing accidents, maintaining individual control and facilitating group awareness. Visual attention-based access was perceived as more convenient but also less certain compared to traditional access control. We conclude that visual attention-based access can be a useful addition to groupware to flexibly facilitate awareness and prevent conflicts.
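The three example policies map naturally onto a small rule set. The sketch below is a hypothetical rendering of that idea, assuming an is_looking_at(user, region) predicate backed by head-orientation tracking; it is not the prototype's actual code, and all names are illustrative.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Policy(Enum):
    JOINT_ATTENTION = auto()    # consensus actions: every collaborator must attend
    OWNER_SUPERVISION = auto()  # editing someone's document: the owner must watch
    PRIVATE = auto()            # private info: unavailable while anyone else looks

@dataclass
class Action:
    name: str
    policy: Policy
    owner: str = ""             # relevant only for OWNER_SUPERVISION

def is_allowed(action, actor, users, region, is_looking_at):
    """Decide access from collaborators' visual attention on a workspace
    region, rather than from the actor's identity alone."""
    others = [u for u in users if u != actor]
    if action.policy is Policy.JOINT_ATTENTION:
        return all(is_looking_at(u, region) for u in users)
    if action.policy is Policy.OWNER_SUPERVISION:
        return is_looking_at(action.owner, region)
    if action.policy is Policy.PRIVATE:
        return not any(is_looking_at(u, region) for u in others)
    return False
```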

3 citations