Topic

Humanoid robot

About: Humanoid robot is a research topic. Over its lifetime, 14,387 publications have been published within this topic, receiving 243,674 citations.


Papers
Proceedings ArticleDOI
10 Dec 2007
TL;DR: A constraint-based contact force solver and virtual spring-damper joints are proposed to simulate a shock absorbing mechanism that many biped humanoid robots have in their feet to increase the stability of walking motion.
Abstract: We propose a simulation system that achieves realistic and efficient simulations of humanoid robots. This paper focuses on a constraint-based contact force solver and virtual spring-damper joints from among the components of the system. The contact force solver can accurately simulate contacts between rigid bodies, including articulated rigid bodies. An LCP-like formulation of the constraint conditions is solved by an iterative method that extends the Gauss-Seidel method. This paper clarifies how to integrate existing methods to implement a robust and efficient solver. Virtual spring-damper joints are proposed to simulate the shock-absorbing mechanism that many biped humanoid robots have in their feet to increase the stability of walking motion. The combination of the rigid contact model and the elastic virtual joints improves the accuracy of the simulation. The simulation system was verified by experiments using the humanoid robot HRP-2, and the results show the validity of the system.
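To make the contact-solver idea concrete, here is a minimal sketch of a projected Gauss-Seidel sweep for a frictionless contact LCP of the form w = A*lambda + b, lambda >= 0, w >= 0, lambda'w = 0. The function name, matrix values, and iteration count are illustrative assumptions; the paper's solver additionally handles friction, articulated bodies, and the virtual spring-damper joints.

```python
import numpy as np

def projected_gauss_seidel(A, b, iterations=50):
    """Frictionless contact LCP  w = A @ lam + b, lam >= 0, w >= 0, lam.T @ w = 0,
    solved by a projected Gauss-Seidel sweep (illustrative sketch only)."""
    n = len(b)
    lam = np.zeros(n)
    for _ in range(iterations):
        for i in range(n):
            # Residual using the current estimates of all other contact impulses.
            r = b[i] + A[i] @ lam - A[i, i] * lam[i]
            # Gauss-Seidel update, projected onto lam[i] >= 0.
            lam[i] = max(0.0, -r / A[i, i])
    return lam

# Toy example with two coupled contact points (hypothetical values).
A = np.array([[2.0, 0.5],
              [0.5, 1.5]])   # effective-mass (Delassus) matrix
b = np.array([-1.0, -0.2])   # pre-impulse normal relative velocities
print(projected_gauss_seidel(A, b))  # non-negative contact impulses
```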

88 citations

Proceedings ArticleDOI
01 Dec 2006
TL;DR: An empathic anthropomorphic robot (torso) is presented that mirrors the emotions happiness, fear, and neutral, as recognised from the speech signal, through facial expressions.
Abstract: Current research has identified the need to equip robots with perceptual capabilities that not only recognise objective entities such as visual or auditory objects, but that are also capable of assessing the affective evaluations of the human communication partner, in order to render the communication situation more natural and social. In line with Watzlawick's statement that "one cannot not communicate" (1968), it has been found that in human-robot interaction, too, one cannot be not emotional. It is therefore crucial for a robot to understand these affective signals of its communication partner and react to them. However, up to now, online emotion recognition in real-time, interactive systems has scarcely been attempted, as the demands concerning robustness and time constraints are very high. In this paper we present an empathic anthropomorphic robot (torso) that mirrors the emotions happiness, fear, and neutral, as recognised from the speech signal, through facial expressions. The recognition component as well as the development of the facial expression generation are described in detail. We report on results from experiments with humans interacting with the empathic robot.
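As a rough illustration of the pipeline such a system needs (feature extraction, emotion classification, expression mapping), the sketch below uses crude prosodic features and hypothetical threshold rules in place of the paper's trained recogniser, and maps the result to placeholder facial-expression targets. None of the thresholds, feature choices, or actuator names come from the paper.

```python
import numpy as np

def prosodic_features(frame, sr=16000):
    """Very rough prosodic features for one audio frame: RMS energy and a
    zero-crossing-based pitch proxy (real systems use richer feature sets)."""
    energy = float(np.sqrt(np.mean(frame ** 2)))
    sign_changes = int(np.sum(frame[:-1] * frame[1:] < 0))
    pitch_proxy = sign_changes * sr / (2 * len(frame))  # Hz, crude estimate
    return energy, pitch_proxy

def classify_emotion(energy, pitch_proxy):
    """Hypothetical threshold rules standing in for a trained classifier."""
    if energy > 0.1 and pitch_proxy > 220:
        return "happiness"
    if energy > 0.1 and pitch_proxy < 150:
        return "fear"
    return "neutral"

# Placeholder mapping from recognised emotion to facial-expression commands.
EXPRESSION = {
    "happiness": {"mouth_corners": +1.0, "eyebrows": +0.3},
    "fear":      {"mouth_corners": -0.5, "eyebrows": +1.0},
    "neutral":   {"mouth_corners":  0.0, "eyebrows":  0.0},
}

frame = np.random.randn(1024) * 0.05          # stand-in for a captured audio frame
emotion = classify_emotion(*prosodic_features(frame))
print(emotion, EXPRESSION[emotion])
```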

88 citations

Journal ArticleDOI
TL;DR: It is shown that frequently updating the motion pattern contributes to maintaining long-term balance during online walking control, and an extension of the short-cycle pattern generation method that can accommodate external forces measured online is presented.
Abstract: This paper presents an online walking control system that frequently generates and updates dynamically stable motion patterns with a cycle time of 20 ms. We show that frequently updating the motion pattern contributes to maintaining long-term balance during online walking control. In addition, the system enables a robot to respond quickly to changes in the commanded walking direction. Using preview control theory, we generate dynamically stable walking patterns. We propose a method to adjust the future desired zero moment point (ZMP) by modifying the foot landing position in order to maintain the dynamic balance of the generated motion pattern. This technique can be used to filter input commands that would cause sudden changes to the foot landing position and hence dynamic instability. The method is also used to compensate for errors between the actual and desired ZMP caused by disturbances encountered while walking. We also present an extension of the short-cycle pattern generation method that can accommodate external forces measured online. Experimental results for activities such as pushing a table are demonstrated on the full-size humanoid HRP-2 to evaluate the performance of the proposed walking control system.
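The cart-table model behind such pattern generators can be sketched briefly: the ZMP implied by a CoM trajectory x(t) is p = x - (z_c / g) * x_ddot, and the desired ZMP is built from the planned foot positions, so shifting a future landing position shifts the future reference ZMP. The footstep values, smoothing, and function names below are illustrative assumptions; the paper's preview controller that tracks the reference is not reproduced here.

```python
import numpy as np

Z_C, G, DT = 0.8, 9.81, 0.02   # assumed CoM height [m], gravity, 20 ms cycle

def reference_zmp(foot_positions, steps_per_footstep=25):
    """Piecewise-constant desired ZMP that dwells on each planned foot position.
    Editing a future entry models the landing-position adjustment idea."""
    return np.repeat(np.asarray(foot_positions, dtype=float), steps_per_footstep)

def zmp_from_com(x, dt=DT, z_c=Z_C, g=G):
    """Cart-table model: p = x - (z_c / g) * x_ddot, with x_ddot taken from
    finite differences of the CoM trajectory x."""
    x_ddot = np.gradient(np.gradient(x, dt), dt)
    return x - (z_c / g) * x_ddot

# Toy check with a three-step plan (values illustrative).
p_ref = reference_zmp([0.0, 0.2, 0.4])
x_com = np.convolve(p_ref, np.ones(15) / 15, mode="same")  # smoothed CoM stand-in
p_act = zmp_from_com(x_com)
print(float(np.max(np.abs(p_act - p_ref))))  # ZMP tracking error of the stand-in
```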

88 citations

Proceedings ArticleDOI
05 Mar 2012
TL;DR: A model for generating head tilting and nodding is proposed, and it is found that an upward motion of a robot's face can be used by robots that do not have a mouth to give the appearance that an utterance is taking place.
Abstract: Head motion occurs naturally and in synchrony with speech during human dialogue communication, and may carry paralinguistic information such as intentions, attitudes, and emotions. Therefore, natural-looking head motion by a robot is important for smooth human-robot interaction. Based on rules inferred from analyses of the relationship between head motion and dialogue acts, this paper proposes a model for generating head tilting and nodding, and evaluates the model using three types of humanoid robot (a very human-like android, "Geminoid F"; a typical humanoid robot with fewer facial degrees of freedom, "Robovie R2"; and a robot with a 3-axis rotatable neck and movable lips, "Telenoid R2"). Analysis of subjective scores shows that the proposed model including head tilting and nodding generates head motion with increased naturalness compared to nodding only or to directly mapping people's original motions without gaze information. We also find that an upward motion of a robot's face can be used by robots that do not have a mouth to give the appearance that an utterance is taking place. Finally, we conduct an experiment in which participants act as visitors to an information desk attended by robots. As a result, we verify that our generation model performs comparably to directly mapping people's original motions with gaze information in terms of perceived naturalness.
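A minimal sketch of a rule-based generator of this kind is given below: each dialogue act selects nod and tilt amplitudes, and a smooth raised-cosine profile produces the joint-angle trajectory. The dialogue-act labels, angle values, and timing are assumptions for illustration, not the rules inferred in the paper.

```python
import numpy as np

# Hypothetical dialogue-act to head-motion rules (illustrative values only).
HEAD_MOTION_RULES = {
    "affirmation": {"nod_deg": 15, "tilt_deg": 0},
    "question":    {"nod_deg": 0,  "tilt_deg": 10},
    "backchannel": {"nod_deg": 8,  "tilt_deg": 0},
    "statement":   {"nod_deg": 0,  "tilt_deg": 0},
}

def motion_profile(amplitude_deg, duration_s=0.6, rate_hz=50):
    """One smooth excursion and return to neutral (raised-cosine profile)."""
    t = np.linspace(0.0, duration_s, int(duration_s * rate_hz))
    return amplitude_deg * 0.5 * (1.0 - np.cos(2.0 * np.pi * t / duration_s))

def head_motion_for(dialogue_act):
    rule = HEAD_MOTION_RULES.get(dialogue_act, HEAD_MOTION_RULES["statement"])
    return {"pitch_deg": motion_profile(rule["nod_deg"]),   # nodding
            "roll_deg":  motion_profile(rule["tilt_deg"])}  # head tilting

motion = head_motion_for("affirmation")
print(round(float(motion["pitch_deg"].max()), 1))  # peak nod angle in degrees
```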

88 citations

Proceedings ArticleDOI
10 Apr 2007
TL;DR: The developed hand has four fingers with 17 joints, consisting of 13 active joints and 4 linked joints; a miniaturized 6-axis force sensor is newly developed and mounted on each fingertip to improve manipulability.
Abstract: This paper presents the development of a multi-fingered hand that is modularized and can be attached to life-size humanoid robots. The developed hand has four fingers with 17 joints, consisting of 13 active joints and 4 linked joints. A miniaturized 6-axis force sensor is newly developed and mounted on each fingertip to improve manipulability. A main node controller with I/O, motor drivers, and amplifiers for the 6-axis force sensors are also newly developed. These components are built into the hand for modularization. The hand is designed to realize a force of about 8 N at the pad of an outstretched finger, assuming a drive-system transmission efficiency of 55%. The mechanisms of the hand module, its specifications, and its electrical system are also introduced.
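As a back-of-the-envelope check of that force target, the figures below combine the 55% transmission efficiency and the roughly 8 N goal from the abstract with hypothetical values for motor torque, gear ratio, and fingertip lever arm; the drive-train numbers are assumptions, not the hand's actual specifications.

```python
# Rough fingertip-force estimate; only the 55 % efficiency and the ~8 N target
# come from the abstract, the rest are assumed for illustration.
MOTOR_TORQUE_NM = 0.015   # assumed continuous motor torque [N*m]
GEAR_RATIO = 100          # assumed overall reduction to the finger joint
EFFICIENCY = 0.55         # transmission efficiency stated in the abstract
FINGER_LENGTH_M = 0.10    # assumed lever arm from joint to fingertip pad [m]

joint_torque = MOTOR_TORQUE_NM * GEAR_RATIO * EFFICIENCY   # [N*m]
fingertip_force = joint_torque / FINGER_LENGTH_M           # [N]
print(f"{fingertip_force:.2f} N at the fingertip")         # ~8.25 N
```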

88 citations


Network Information
Related Topics (5)
Mobile robot: 66.7K papers, 1.1M citations, 96% related
Robot: 103.8K papers, 1.3M citations, 95% related
Adaptive control: 60.1K papers, 1.2M citations, 84% related
Control theory: 299.6K papers, 3.1M citations, 83% related
Object detection: 46.1K papers, 1.3M citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  253
2022  759
2021  573
2020  647
2019  801
2018  921