
Sasa Bodiroza

Researcher at Humboldt University of Berlin

Publications: 8
Citations: 79

Sasa Bodiroza is an academic researcher at Humboldt University of Berlin. His research focuses on gesture recognition and social robotics. He has an h-index of 5 and has co-authored 8 publications receiving 77 citations. Previous affiliations of Sasa Bodiroza include Humboldt State University.

Papers
Proceedings Article

Position-invariant, real-time gesture recognition based on dynamic time warping

TL;DR: A gesture recognition algorithm based on dynamic time warping was implemented for a use-case scenario of natural interaction with a mobile robot; the experimental results show that the proposed modifications to the standard gesture recognition algorithm improve the robustness of recognition.
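
A minimal, hypothetical Python sketch of the general technique named in the summary, not the authors' implementation: gestures are treated as 2-D hand trajectories, position invariance is approximated by shifting each trajectory so it starts at the origin, and an unknown gesture receives the label of the template with the smallest dynamic time warping distance.

import numpy as np

def dtw_distance(a, b):
    # Classic dynamic time warping cost between trajectories of shape (n, 2) and (m, 2).
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def normalize(traj):
    # Translate the trajectory so its first point lies at the origin (position invariance).
    traj = np.asarray(traj, dtype=float)
    return traj - traj[0]

def classify(query, templates):
    # Return the label of the template with the smallest DTW distance to the query.
    q = normalize(query)
    return min(templates, key=lambda label: dtw_distance(q, normalize(templates[label])))

# Example usage with two toy templates; the query has the same shape as "point",
# only translated, and is still recognized thanks to the normalization step.
templates = {
    "wave":  [(0, 0), (1, 1), (2, 0), (3, 1)],
    "point": [(0, 0), (1, 0), (2, 0), (3, 0)],
}
print(classify([(5, 5.0), (6, 5.1), (7, 5.0), (8, 4.9)], templates))  # -> "point"
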
Proceedings Article

Dynamic gesture vocabulary design for intuitive human-robot dialog

TL;DR: This paper presents a generalized method for designing a gesture vocabulary (GV) for intuitive and natural two-way human-robot dialog; preliminary experimental results indicate the unique nature of the obtained HGV.
Proceedings Article

Spatially unconstrained, gesture-based human-robot interaction

TL;DR: This work presents a robotic platform capable of autonomously tracking a person, estimating their position and following them, while recognizing their gestures and navigating through the environment.
Proceedings Article

Robot ego-sphere: An approach for saliency detection and attention manipulation in humanoid robots for intuitive interaction

TL;DR: By creating a saliency-based attentional model and combining it with a robot ego-sphere, the robot can engage in interaction with a human, starting an interaction game involving objects as a first step towards joint attention.
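
A minimal sketch of how such an attentional ego-sphere could look, under assumptions not taken from the paper: the robot's surroundings are discretized into an azimuth-by-elevation grid, detected stimuli add saliency to the corresponding cell, saliency decays over time, and the gaze target is the most salient cell.

import numpy as np

class EgoSphere:
    def __init__(self, az_bins=36, el_bins=18, decay=0.95):
        # Saliency accumulated per (azimuth, elevation) cell of the ego-sphere.
        self.saliency = np.zeros((az_bins, el_bins))
        self.az_bins, self.el_bins, self.decay = az_bins, el_bins, decay

    def _cell(self, azimuth_deg, elevation_deg):
        # Map angles (azimuth 0..360, elevation -90..90) to grid indices.
        az = int((azimuth_deg % 360) / 360 * self.az_bins) % self.az_bins
        el = int((elevation_deg + 90) / 180 * (self.el_bins - 1))
        return az, el

    def add_stimulus(self, azimuth_deg, elevation_deg, strength=1.0):
        # Accumulate saliency from a detected stimulus (face, object, motion, ...).
        self.saliency[self._cell(azimuth_deg, elevation_deg)] += strength

    def step(self):
        # Decay old saliency and return the (azimuth, elevation) the robot should attend to.
        self.saliency *= self.decay
        az, el = np.unravel_index(np.argmax(self.saliency), self.saliency.shape)
        return az / self.az_bins * 360, el / (self.el_bins - 1) * 180 - 90
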
Proceedings Article

Learning hand-eye coordination for a humanoid robot using SOMs

TL;DR: This work attempts to explain how pointing emerges from sensorimotor learning of hand-eye coordination in a humanoid robot, and shows that a model implemented on a robotic platform accounts for pointing behavior when humans present objects out of reach of the robot's hand.
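
A minimal, hypothetical sketch of the general idea, not the paper's model: a single self-organizing map is trained on concatenated (visual target position, arm joint angles) samples; after training, the visual part of a target selects the best-matching unit and the motor part of that unit's weight vector is read out as the joint configuration for pointing at the target.

import numpy as np

rng = np.random.default_rng(0)

def train_som(samples, grid=(10, 10), epochs=100, lr0=0.5, sigma0=3.0):
    # samples: (N, 4) rows of [visual x, visual y, joint q1, joint q2].
    h, w = grid
    weights = rng.uniform(samples.min(0), samples.max(0), size=(h, w, samples.shape[1]))
    ys, xs = np.mgrid[0:h, 0:w]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.5
        for s in rng.permutation(samples):
            # Best-matching unit over the full (visual + motor) vector.
            bmu = np.unravel_index(np.argmin(((weights - s) ** 2).sum(-1)), (h, w))
            nbh = np.exp(-((ys - bmu[0]) ** 2 + (xs - bmu[1]) ** 2) / (2 * sigma ** 2))
            weights += lr * nbh[..., None] * (s - weights)
    return weights

def joints_for_target(weights, target_xy):
    # Find the best-matching unit using only the visual part, read out the motor part.
    bmu = np.unravel_index(np.argmin(((weights[..., :2] - target_xy) ** 2).sum(-1)),
                           weights.shape[:2])
    return weights[bmu][2:]

# Example usage: synthetic training data from a planar two-link arm with unit link lengths.
q = rng.uniform(0, np.pi / 2, size=(300, 2))
x = np.cos(q[:, 0]) + np.cos(q[:, 0] + q[:, 1])
y = np.sin(q[:, 0]) + np.sin(q[:, 0] + q[:, 1])
som = train_som(np.column_stack([x, y, q]))
print(joints_for_target(som, np.array([1.2, 1.2])))  # joint angles associated with that target
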