
Hans Gellersen

Researcher at Lancaster University

Publications: 228
Citations: 10176

Hans Gellersen is an academic researcher from Lancaster University. The author has contributed to research topics including Eye tracking and Gaze. The author has an h-index of 51 and has co-authored 214 publications receiving 9010 citations. Previous affiliations of Hans Gellersen include Karlsruhe Institute of Technology and Aarhus University.

Papers

Ubiquitous Computing for Sustainable Energy (UCSE2010) Ubicomp 2010 Workshop

TL;DR: This workshop aims to bring together people from different disciplines to share their visions and insights on how to conserve, efficiently produce, use, and distribute energy.

Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input

TL;DR: In this article, the authors developed HeadBoost, a novel classifier that achieves high accuracy in distinguishing gaze-driven from gestural head movement (F1-score: 0.89), and demonstrated the classifier's utility with three applications, including target selection with Head-Gaze while avoiding the Midas Touch through head gestures.
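To make the classification task concrete, here is a minimal, hypothetical sketch: it trains a gradient-boosting classifier on simple head-rotation-velocity statistics to separate slow, gaze-driven head movement from faster gestural movement. The features, window length, synthetic data, and use of scikit-learn are illustrative assumptions, not the method behind HeadBoost.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

def window_features(angular_velocity):
    # Summarise a window of (yaw, pitch) angular velocities (deg/s)
    # as a few simple statistics of head-rotation speed.
    speed = np.linalg.norm(angular_velocity, axis=1)
    return [speed.mean(), speed.std(), speed.max(), np.abs(np.diff(speed)).mean()]

# Synthetic stand-in data: slow windows labelled 0 ("gaze-driven"),
# fast windows labelled 1 ("gestural"). Real data would come from
# logged head-tracking sessions.
rng = np.random.default_rng(0)
def make_window(gestural):
    scale = 40.0 if gestural else 5.0
    return rng.normal(0.0, scale, size=(60, 2))  # 60 samples of (yaw, pitch) velocity

labels = np.array([0, 1] * 200)
X = np.array([window_features(make_window(g)) for g in labels])

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("F1-score:", f1_score(y_test, clf.predict(X_test)))
```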

Space-multiplexed input on mouse-extended notebooks

TL;DR: An empirical study evaluates the practical applicability of space-multiplexed input on mouse-extended notebooks, a common configuration that integrates both a touchpad and an external mouse. The results show that subjects can perform two-handed input immediately, without a significant loss in performance.

Integrating spatial information in a user interface

TL;DR: This poster introduces an approach to effectively integrating spatial information into the user interface for cross-device interaction and presents preliminary results.

GE-Simulator: An Open-Source Tool for Simulating Real-Time Errors for HMD-based Eye Trackers

TL;DR: GE-Simulator is an open-source Unity toolkit that allows accuracy, precision, and data loss errors to be simulated during real-time use by injecting errors into the gaze vector reported by a head-mounted AR/VR eye tracker.
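Purely as an illustration of that error model, the sketch below perturbs a unit gaze direction with a constant angular offset (accuracy error), Gaussian angular jitter (precision error), and random sample dropping (data loss). It is a minimal Python sketch under those assumptions, not GE-Simulator's Unity API; all names and default values are hypothetical.

```python
import numpy as np

def inject_gaze_errors(gaze_dir, offset_deg=1.0, jitter_deg=0.5, drop_prob=0.05,
                       rng=np.random.default_rng()):
    # Perturb a unit gaze direction with three error types:
    # data loss - occasionally drop the sample entirely.
    if rng.random() < drop_prob:
        return None

    # Accuracy - a constant angular offset; precision - zero-mean Gaussian jitter.
    angle = np.radians(offset_deg + rng.normal(0.0, jitter_deg))

    # Rotation axis perpendicular to the gaze (assumes the gaze is not
    # parallel to the world up axis).
    axis = np.cross(gaze_dir, [0.0, 1.0, 0.0])
    axis = axis / np.linalg.norm(axis)

    # Rodrigues' rotation formula: rotate gaze_dir by `angle` about `axis`.
    rotated = (gaze_dir * np.cos(angle)
               + np.cross(axis, gaze_dir) * np.sin(angle)
               + axis * np.dot(axis, gaze_dir) * (1.0 - np.cos(angle)))
    return rotated / np.linalg.norm(rotated)

# Example: perturb a forward-looking gaze sample.
noisy = inject_gaze_errors(np.array([0.0, 0.0, 1.0]))
```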