scispace - formally typeset

Humanoid robot

About: Humanoid robot is a research topic. Over the lifetime, 14387 publications have been published within this topic receiving 243674 citations. The topic is also known as: 🤖.


Papers
Proceedings ArticleDOI
24 Jun 2012
TL;DR: Describes HEFES (Hybrid Engine for Facial Expressions Synthesis), an engine for generating and controlling facial expressions on both physical androids and 3D avatars.
Abstract: Advances in robotics and computer science have made possible the development of sociable and attractive robots. A challenging objective in humanoid robotics is to make robots able to interact with people in a believable way. Recent studies have demonstrated that androids with a high degree of similarity to human beings do not necessarily generate the sense of unease typically associated with human-like robots. For this reason, designing aesthetically appealing and socially attractive robots becomes necessary for realistic human-robot interaction. This paper describes HEFES (Hybrid Engine for Facial Expressions Synthesis), an engine for generating and controlling facial expressions on both physical androids and 3D avatars. HEFES is part of a software library that controls a humanoid robot called FACE (Facial Automaton for Conveying Emotions). HEFES was designed to allow users to create facial expressions without requiring artistic or animatronic skills, and it can animate both FACE and its 3D replica. The system was tested in human-robot interaction studies aimed at helping children with autism interpret their interlocutors' mood through understanding of facial expressions.
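The abstract does not detail how HEFES synthesizes expressions, but an engine of this kind is commonly built around interpolation between basis expressions. The sketch below is a hypothetical illustration (the actuator channels, basis vectors, and `blend` function are all assumptions, not the published HEFES design): each expression is a vector of normalized actuator targets, and intermediate expressions are produced by linear blending, which is what lets a user create expressions without animation skills.

```python
# Hypothetical sketch of expression synthesis by interpolation between
# basis expressions; channel layout and values are invented for illustration.
import numpy as np

# Basis expressions as normalized actuator activations (assumed servo channels).
BASIS = {
    "neutral":   np.array([0.0, 0.0, 0.0, 0.0]),
    "happiness": np.array([0.9, 0.2, 0.0, 0.7]),
    "sadness":   np.array([0.1, 0.8, 0.6, 0.0]),
}

def blend(expr_a: str, expr_b: str, t: float) -> np.ndarray:
    """Linearly interpolate actuator targets between two basis expressions.

    t = 0.0 yields expr_a, t = 1.0 yields expr_b; intermediate values
    produce mixed expressions. t is clamped to [0, 1].
    """
    t = min(max(t, 0.0), 1.0)
    return (1.0 - t) * BASIS[expr_a] + t * BASIS[expr_b]

half_smile = blend("neutral", "happiness", 0.5)
```

The same target vector could drive either the physical android's servos or the corresponding 3D avatar rig, which is one plausible way a single engine animates both.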

66 citations

Proceedings ArticleDOI
01 Dec 2008
TL;DR: Presents KOBIAN, a new bipedal walking robot that is also capable of expressing human-like emotions; the presence of a full body clearly enhances the robot's emotion expression capability, demonstrating the effectiveness of the proposed approach.
Abstract: Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, participating actively in joint work and community life with humans as partners and friends. In particular, these robots are expected to be fundamental in helping and assisting elderly and disabled people in their activities of daily living (ADLs). To achieve this, personal robots should be capable of human-like emotion expression; in addition, human-like bipedal walking is the best solution for robots that must operate in the human living environment. Although several bipedal robots and several emotion-expression robots have been developed in recent years, until now no robot has integrated all of these functions. We therefore developed a new bipedal walking robot, named KOBIAN, which is also capable of expressing human-like emotions. In this paper, we present the design and preliminary evaluation of the new emotion-expression head. The preliminary results showed that emotions expressed by the head alone are not easily understood by users; however, the presence of a full body clearly enhances the robot's emotion expression capability, demonstrating the effectiveness of the proposed approach.

66 citations

Journal ArticleDOI
TL;DR: This paper presents a stroke rehabilitation (SR) system for the upper limbs, developed as an interactive virtual environment (IVE) based on a commercial 3D vision system, a humanoid robot, and devices producing ergonometric signals.

66 citations

Journal ArticleDOI
TL;DR: Proposes an architecture based on deep networks, used by the humanoid robot iCub to learn a task from multiple perceptual modalities; the architecture performs a substantial dimensionality reduction, providing both a symbolic representation of the data and fine discrimination between two similar stimuli.

66 citations

Proceedings ArticleDOI
28 Sep 2004
TL;DR: The results demonstrate the effectiveness of using the delayed temporal contingency in the action-perception loop as a basis for simple self-other discrimination and suggest potential applications in social robotics and in generating forward models of motion.
Abstract: We present a method for allowing a humanoid robot to recognize its own motion in its visual field, thus enabling it to distinguish itself from other agents in the vicinity. Our approach consists of learning a characteristic time window between the initiation of motor movement and the perception of arm motions. The method has been implemented and evaluated on an infant humanoid platform. Our results demonstrate the effectiveness of using the delayed temporal contingency in the action-perception loop as a basis for simple self-other discrimination. We conclude by suggesting potential applications in social robotics and in generating forward models of motion.
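The method above reduces to a simple test: motion in the visual field is attributed to the robot itself when the delay between issuing a motor command and perceiving the resulting arm motion falls inside a learned characteristic time window. The sketch below is a minimal illustration under assumed parameters (the delay values, margin, and function names are invented, not taken from the paper):

```python
# Minimal sketch of self-other discrimination via delayed temporal
# contingency: delays and margin are assumed values for illustration.

def learn_window(delays, margin=0.05):
    """Learn [lo, hi] bounds (seconds) from delays observed during
    the robot's own motor babbling."""
    return min(delays) - margin, max(delays) + margin

def is_self(motor_t, percept_t, window):
    """Classify perceived motion as self-generated if the delay between
    motor command and perception falls inside the learned window."""
    lo, hi = window
    return lo <= (percept_t - motor_t) <= hi

window = learn_window([0.18, 0.22, 0.20])  # delays from training phase
print(is_self(10.0, 10.21, window))        # delay inside window: self
print(is_self(10.0, 11.50, window))        # delay far too long: other agent
```

Motion whose onset is uncorrelated with any recent motor command, or whose delay falls outside the window, is attributed to another agent, which is the "simple self-other discrimination" the authors describe.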

66 citations


Network Information
Related Topics (5)

Topic              Papers    Citations   Relatedness
Mobile robot       66.7K     1.1M        96%
Robot              103.8K    1.3M        95%
Adaptive control   60.1K     1.2M        84%
Control theory     299.6K    3.1M        83%
Object detection   46.1K     1.3M        81%
Performance Metrics
No. of papers in the topic in previous years:

Year   Papers
2023   259
2022   775
2021   576
2020   648
2019   802
2018   922