Topic

Humanoid robot

About: Humanoid robot is a research topic. Over its lifetime, 14,387 publications have been published within this topic, receiving 243,674 citations.


Papers
Journal Article•DOI•
TL;DR: The results indicate that physical embodiment plays a significant role in improving children's performance, engagement, and motivation.
Abstract: This paper presents interactive games for sign language tutoring assisted by humanoid robots. The games are specially designed for children with communication impairments. In this study, different robot platforms, the Nao H25 and Robovie R3 humanoid robots, are used to express a set of chosen signs in Turkish Sign Language using hand and arm movements. Two games involving physically and virtually embodied robots are designed. In the game involving the physically embodied robot, the robot communicates with the participant by recognizing colored flashcards through a camera-based system and, in return, generating a selected subset of signs together with motivational facial gestures. A mobile version of the game is also implemented for use in children's education and therapy for the purpose of teaching signs. The humanoid robot acts as a social peer and assistant in the games to motivate the child, teach a selected set of signs, evaluate the child's effort, and give appropriate feedback to improve children's learning and recognition rates. This paper presents results from a preliminary study with different test groups, in which children played with the physical robot platform, R3, and with a mobile game incorporating videos of the robot performing the signs; the effect of the assistive robot's embodiment is thus analyzed within these games. The results indicate that physical embodiment plays a significant role in improving the children's performance, engagement, and motivation.

60 citations

Journal Article•DOI•
TL;DR: A shared sensorimotor map of the environment, built on a radial basis function framework, is configured and trained by the coordinated control of eye and arm movements; the approach is suitable both for the problem at hand and for implementation on a real humanoid robot.
Abstract: Primates often perform coordinated eye and arm movements, contextually fixating and reaching towards nearby objects. This combination of looking and reaching to the same target is used by infants to establish an implicit visuomotor representation of the peripersonal space, useful for both oculomotor and arm motor control. In this work, taking inspiration from such behavior and from primate visuomotor mechanisms, a shared sensorimotor map of the environment, built on a radial basis function framework, is configured and trained by the coordinated control of eye and arm movements. Computational results suggest that the approach is especially suitable for the problem at hand, and for implementation on a real humanoid robot. By exploratory gazing and reaching actions, either free or goal-based, the artificial agent learns to perform direct and inverse transformations between stereo vision, oculomotor, and joint-space representations. The integrated sensorimotor map that allows the agent to contextually represent the peripersonal space through different vision and motor parameters is never made explicit, but rather emerges through the interaction of the agent with the environment.
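The core idea of the abstract above, learning an inverse map between gaze and joint coordinates from exploratory movement samples via a radial basis function network, can be sketched as follows. This is an illustrative toy, not the authors' implementation: the forward model `gaze_of_joints`, the dimensions, the number of centers, and the Gaussian width are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the robot's kinematics: maps two arm joint
# angles to the gaze direction (pan, tilt) that fixates the hand.
def gaze_of_joints(q):
    return np.array([np.cos(q[0]) + 0.5 * q[1], np.sin(q[0]) - 0.3 * q[1]])

# "Exploratory" phase: random arm configurations paired with the gaze
# directions that look at the resulting hand positions.
Q = rng.uniform(-1.0, 1.0, size=(200, 2))      # joint-space samples
G = np.array([gaze_of_joints(q) for q in Q])   # matching gaze samples

# Radial basis function layer: Gaussian activations around fixed centers.
centers = rng.uniform(-1.5, 1.5, size=(40, 2))
width = 0.5
def rbf(x):
    d2 = ((x[None, :] - centers) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * width ** 2))

# Linear readout fitted by least squares: inverse map gaze -> joints.
Phi = np.stack([rbf(g) for g in G])
W, *_ = np.linalg.lstsq(Phi, Q, rcond=None)

# Predict the joint configuration that reaches a fixated gaze target.
g_test = gaze_of_joints(np.array([0.3, -0.2]))
q_pred = rbf(g_test) @ W
```

The direct transformation (joints to gaze) can be trained the same way by swapping inputs and outputs, which is how a single shared map can serve both oculomotor and arm motor control.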

60 citations

Journal Article•DOI•
TL;DR: The study tested children’s ability to recognize the emotional body language displayed by a humanoid robot and suggested that body postures and head position can be used to convey emotions during child-robot interaction.
Abstract: The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults are able to interpret different key poses displayed by a humanoid robot, and also that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy) and valence (positive or negative emotion), whereas moving the head up produces an increase along these dimensions. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children's ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.

60 citations

Proceedings Article•DOI•
24 Dec 2012
TL;DR: From the study of how human dyads achieve such a task, a control law for physical interaction is developed that unifies standalone and collaborative modes for trajectory-based tasks.
Abstract: In this paper, we propose a control scheme that allows a humanoid robot to perform a transportation task jointly with a human partner. From the study of how human dyads achieve such a task, we have developed a control law for physical interaction that unifies standalone and collaborative (leader and follower) modes for trajectory-based tasks. We present it in the case of a linear impedance controller but it can be generalized to more complex impedances. Desired trajectories are decomposed into sequences of elementary motion primitives. We implemented this model with a Finite State Machine associated with a reactive pattern generator. First experiments conducted on a real HRP-2 humanoid robot assess the overall approach.
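A linear impedance controller of the kind mentioned above can be blended between standalone/leader and follower behavior with a single weighting parameter. The sketch below is a 1-D toy under stated assumptions: the blending parameter alpha, the gains, and the constant partner force are illustrative choices, not values from the paper.

```python
# Illustrative 1-D linear impedance controller. alpha = 1 -> leader
# (track the robot's own desired trajectory); alpha = 0 -> follower
# (comply with the partner's measured force). All numbers are assumptions.
M, B, K = 1.0, 20.0, 50.0      # virtual mass, damping, stiffness
dt = 0.001                     # integration step [s]

def impedance_step(x, v, x_des, v_des, f_partner, alpha):
    # M * a = alpha*K*(x_des - x) + B*(alpha*v_des - v) + f_partner
    a = (alpha * K * (x_des - x) + B * (alpha * v_des - v) + f_partner) / M
    return x + v * dt, v + a * dt

# Follower mode: no own goal; a constant partner force drags the hand.
x, v = 0.0, 0.0
for _ in range(2000):          # 2 s of simulated time
    x, v = impedance_step(x, v, 0.0, 0.0, f_partner=5.0, alpha=0.0)
# With alpha = 0 the controller is a damped mass: v settles near f/B.

# Leader mode: track a desired setpoint with no external force.
x2, v2 = 0.0, 0.0
for _ in range(5000):          # 5 s of simulated time
    x2, v2 = impedance_step(x2, v2, 1.0, 0.0, f_partner=0.0, alpha=1.0)
# The overdamped virtual spring pulls x2 toward the setpoint 1.0.
```

Intermediate values of alpha give the shared leader/follower behavior; in the paper's framework the desired trajectory itself would come from sequenced motion primitives selected by the finite state machine.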

60 citations

Proceedings Article•DOI•
20 Jul 2003
TL;DR: A new small humanoid robot for entertainment, SDR-4X, is described; it further expands the robot's capacity for human interaction and its robust adaptability to the home environment.
Abstract: In this paper we describe a new small humanoid robot, SDR-4X, designed for entertainment. Compared with the previous version, SDR-3X, we further expand its capacity for human interaction and its robust adaptability to the home environment. The main technical features of SDR-4X are fourfold: (1) Real-time Integrated Adaptive Motion Control, (2) the "SDR Motion Creator" for developing SDR's attractive motion performances, (3) "Real-time Real-world Space Perception", and (4) "Multimodal Human-Robot Interaction". An overview of these technologies and their performance is presented in this paper.

60 citations


Network Information
Related Topics (5)
Mobile robot
66.7K papers, 1.1M citations
96% related
Robot
103.8K papers, 1.3M citations
95% related
Adaptive control
60.1K papers, 1.2M citations
84% related
Control theory
299.6K papers, 3.1M citations
83% related
Object detection
46.1K papers, 1.3M citations
81% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  253
2022  759
2021  573
2020  647
2019  801
2018  921