
Showing papers on "Humanoid robot published in 1995"


Proceedings ArticleDOI
05 Aug 1995
TL;DR: It is proposed that for natural tasks, zero motion force bandwidth isn't everything, and incorporating series elasticity as a purposeful element within the actuator is a good idea.
Abstract: It is traditional to make the interface between an actuator and its load as stiff as possible. Despite this tradition, reducing interface stiffness offers a number of advantages, including greater shock tolerance, lower reflected inertia, more accurate and stable force control, less inadvertent damage to the environment, and the capacity for energy storage. As a trade-off, reducing interface stiffness also lowers zero motion force bandwidth. In this paper, the authors propose that for natural tasks, zero motion force bandwidth isn't everything, and incorporating series elasticity as a purposeful element within the actuator is a good idea. The authors use the term elasticity instead of compliance to indicate the presence of a passive mechanical spring in the actuator. After a discussion of the trade-offs inherent in series elastic actuators, the authors present a control system for their use under general force or impedance control. The authors conclude with test results from a revolute series-elastic actuator meant for the arms of the MIT humanoid robot Cog and for a small planetary rover.
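The control idea in the abstract, using the spring's deflection itself as the force sensor, can be sketched in a toy loop. This is an illustrative sketch, not the paper's controller; the stiffness, gains, and sample period are hypothetical values chosen only to make the loop converge.

```python
# A toy sketch of force control through series elasticity (values are
# illustrative, not Cog's): the spring deflection doubles as the force
# sensor, so force control reduces to position control of the deflection.

K_SPRING = 100.0   # spring stiffness [N/m], hypothetical
KP, KD = 8.0, 0.5  # PD gains on deflection error, hypothetical tuning
DT = 0.001         # control period [s]

def sea_force_step(f_desired, motor_pos, load_pos, prev_err):
    """One control tick: return a motor velocity command and the new error."""
    deflection = motor_pos - load_pos
    f_measured = K_SPRING * deflection          # the spring reads out force
    err = (f_desired - f_measured) / K_SPRING   # error expressed as deflection
    d_err = (err - prev_err) / DT
    return KP * err + KD * d_err, err           # velocity command, new error

# Closed loop against a rigid, stationary load: wind the spring up to 5 N.
motor, load, err = 0.0, 0.0, 0.0
for _ in range(5000):
    vel, err = sea_force_step(5.0, motor, load, err)
    motor += vel * DT
print(round(K_SPRING * (motor - load), 2))   # prints 5.0
```

Because the motor only has to position one end of a soft spring, force errors from friction and reflected inertia are filtered out by the elasticity, which is the trade-off the paper argues for.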

2,309 citations


Proceedings ArticleDOI
05 Aug 1995
TL;DR: Research on a two-armed bipedal, apelike robot that can perform biped walking, rolling over, and standing up, designed on the remote-brained approach, in which the robot does not carry its brain within its body but talks with it over radio links.
Abstract: Focusing attention on flexibility and intelligent reactivity in the real world, it is more important to build, not a robot that won't fall down, but a robot that can get up if it does fall down. This paper presents research on a two-armed bipedal robot, an apelike robot, which can perform biped walking, rolling over and standing up. The robot consists of a head, two arms, and two legs. The control system of the biped robot is designed based on the remote-brained approach, in which the robot does not carry its own brain within the body but talks with it over radio links. This remote-brained approach enables a robot to have both a heavy brain with powerful computation and a lightweight body with multiple joints. The robot can keep its balance while standing using tracking vision, detect whether it has fallen down by a set of vertical sensors, and perform a getting-up motion by coordinating its two arms and two legs. The developed system and experimental results are described with illustrated real examples.
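The remote-brained split can be caricatured in a few lines. This is a hypothetical sketch, not the paper's system: the latency, gain, and toy "balance" state are invented for illustration. The point it shows is the architecture: the lightweight body runs a fast loop applying whatever command has arrived, while the heavy off-board brain computes commands that return over a delayed radio link.

```python
# Hypothetical sketch of a remote-brained control split (latency and
# gain invented for illustration): the body applies stale commands that
# the off-board brain computed a few ticks ago.

from collections import deque

LATENCY_TICKS = 2        # assumed radio round trip, in body ticks

def brain(sensor):
    """Off-board policy: the expensive computation lives here."""
    return -0.2 * sensor  # toy stabilizing command

def run(ticks=60):
    state = 1.0                              # toy body state to regulate to 0
    radio = deque([0.0] * LATENCY_TICKS)     # commands in flight over radio
    for _ in range(ticks):
        radio.append(brain(state))           # brain sees the current sensor
        cmd = radio.popleft()                # body receives a stale command
        state += cmd
    return state

print(round(run(), 4))
```

The delay buffer is the interesting part: gains that are safe with an on-board brain can destabilize the loop once the command path runs over a radio link, which is why the body keeps only light, fast reflexes local.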

65 citations


01 Jun 1995
TL;DR: An integrated auditory system for a humanoid robot that will, among other things, learn to localize normal, everyday sounds in a realistic environment, together with a neural network developed off-line that learns to integrate the various auditory cues.
Abstract: Localizing sounds with different frequency and time domain characteristics in a dynamic listening environment is a challenging task that has not been explored in the field of robotics as much as other perceptual tasks. This thesis presents an integrated auditory system for a humanoid robot, currently under development, that will, among other things, learn to localize normal, everyday sounds in a realistic environment. The hardware and software have been designed and developed to take full advantage of the features and capabilities of the humanoid robot of which they will be an integral component. Sounds with different frequency components and time domain characteristics have to be localized using different cues; a neural network is also presented that has been developed off-line to learn to integrate the various auditory cues, using primarily visual data to perform self-supervised training.
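One classic cue of the kind such a system must integrate is the interaural time difference (ITD). The sketch below is illustrative only, not the thesis's method: it estimates the ITD by cross-correlating left and right microphone signals and picking the best-aligning lag.

```python
# Illustrative binaural cue (not the thesis's actual system): estimate
# the interaural time difference by cross-correlation over candidate lags.

import random

def itd_samples(left, right, max_lag=20):
    """Delay (in samples) of `right` relative to `left` that best aligns them."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(max_lag, len(left) - max_lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic check: the right ear hears the same signal 5 samples later,
# as if the source sat off to the robot's left.
random.seed(0)
sig = [random.uniform(-1.0, 1.0) for _ in range(200)]
left = sig
right = [0.0] * 5 + sig[:-5]
print(itd_samples(left, right))   # prints 5
```

A broadband signal is used deliberately: for a pure tone the correlation peak repeats every period, which is exactly the frequency-dependent ambiguity that forces a system like this one to combine several cues rather than rely on ITD alone.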

57 citations


Proceedings ArticleDOI
05 Aug 1995
TL;DR: This paper focuses on a hand-over motion as an example of cooperative work between a human and a robot, and proposes an algorithm which enables a robot to perform a human-like motion.
Abstract: In the future, robots may perform cooperative tasks with humans in daily life. In this paper, the authors focus on a hand-over motion as an example of cooperative work between a human and a robot, and propose an algorithm which enables a robot to perform a human-like motion. First, the authors analyze trajectories and velocity patterns of a hand-over motion performed by two humans. The experimental results show that a receiver's motion during hand-over has some typical characteristics. The authors then confirm that a human-like motion can be generated using these characteristics. Finally, the authors plan the robot's motion considering these results. Initially, two kinds of potential fields are used to generate a motion command which leads the robot along a trajectory similar to that followed by the human. In addition, more precise motion is considered at the end of the hand-over operation to guarantee accurate positioning and to soften the shock of contact. Simulation results show the validity of the proposed method.
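The potential-field step that drives such a motion can be sketched minimally. The paper combines two kinds of fields; this toy shows only a quadratic attractive well, with a hypothetical gain: the hand follows the negative gradient toward the hand-over point.

```python
# Minimal potential-field sketch (one attractive well only; the paper's
# actual fields and gains differ): gradient descent toward the goal.

def attract_grad(pos, goal, k=1.0):
    """Gradient of the quadratic well 0.5*k*||pos - goal||^2."""
    return [k * (p - g) for p, g in zip(pos, goal)]

def step(pos, goal, gain=0.1):
    """One motion command: move a small way down the gradient."""
    return [p - gain * gi for p, gi in zip(pos, attract_grad(pos, goal))]

pos, goal = [0.0, 0.0], [0.4, 0.2]   # start and hand-over point [m]
for _ in range(100):
    pos = step(pos, goal)
print([round(p, 3) for p in pos])    # prints [0.4, 0.2]
```

Note that the gradient shrinks near the goal, so the commanded velocity tapers off on approach; the paper layers a separate, more precise terminal motion on top of this to position accurately and soften the shock of contact.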

43 citations


Proceedings ArticleDOI
22 Oct 1995
TL;DR: The performance of facial expression recognition by neural network and the expressionability of facial messages on a face robot are investigated; both the NN recognition of facial expressions and the face robot's generation of facial expressions are found to be at almost the same level as in humans.
Abstract: We are attempting to introduce a 3-dimensional, realistic human-like face robot as a human-computer communication modality. The face robot can recognize human facial expressions as well as produce realistic facial expressions of its own. We propose a new concept of an "Active Human Interface", and as the first step, we investigate the performance of facial expression recognition by neural network (NN) and the expressionability of facial messages on the face robot. We find that the NN recognition of facial expressions and the face robot's performance in generating facial expressions are at almost the same level as in humans. This implies a high potential for the use of the face robot as a human-computer communication medium.

21 citations


Proceedings ArticleDOI
05 Jul 1995
TL;DR: It is confirmed that coordinated motions of the eye and head system are possible with both motion and velocity equivalent to those of a human.
Abstract: In the Humanoid Project at Waseda University, we are developing a 'campus information assistant' Hadaly, which provides campus information services. This paper describes the anthropomorphic head-eye system, an eye and head mechanism that comprises a subsystem of the campus information assistant Hadaly. The head-eye system consists of an eyeball mechanism and a head. Since the camera drive mechanism needs to be lightweight and immune to backlash, the eyeball part uses a tendon-driven gimbal mechanism. Experiments were performed in which the system looked at a target to the side, placed in the visual field of the CCD camera, and pursued a moving target within that visual field. From these experiments, we confirm that coordinated motions of the eye-head system are possible with motion and velocity equivalent to those of a human.
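The coordination being tested can be sketched with a textbook eye-head model. The gains below are illustrative, not Hadaly's: the fast eye takes up gaze error at once while the slower head catches up, and a VOR-like term counter-rotates the eye so gaze stays on target (gaze = head angle + eye-in-head angle).

```python
# Toy eye-head gaze controller (illustrative gains, not Hadaly's):
# fast eye + slow head + vestibulo-ocular-style counter-rotation.

DT = 0.01                        # control period [s]
EYE_GAIN, HEAD_GAIN = 20.0, 4.0  # eye responds much faster than head

def simulate(target=30.0, steps=400):
    """Step response to a 30-degree target off to one side."""
    head, eye = 0.0, 0.0
    for _ in range(steps):
        gaze = head + eye
        head_vel = HEAD_GAIN * (target - head)           # head recenters
        eye_vel = EYE_GAIN * (target - gaze) - head_vel  # saccade + VOR
        head += head_vel * DT
        eye += eye_vel * DT
    return round(head + eye, 1), round(eye, 1)

print(simulate())   # gaze reaches the target while the eye recenters
```

Subtracting the head velocity from the eye command is what keeps gaze stable while the head is still moving, which is the behavior the paper's tracking and pursuit experiments verify on hardware.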

12 citations


Proceedings ArticleDOI
05 Aug 1995
TL;DR: To clarify the mechanism of rapid jaw motion, the authors focus on the nonlinearity of human muscle known from physiology and biomechanics, and propose a feasible mathematical model.
Abstract: Describes mathematical models that simulate the nonlinearity of human muscle, and the results of a real food-chewing experiment by a mastication robot. When the lower jaw closes rapidly, it may come into hard contact with the upper jaw if the food is crushable. To clarify the mechanism of rapid jaw motion, the authors focused on the nonlinearity of human muscle known from the fields of physiology and biomechanics. The authors propose a feasible mathematical model for the muscle and its nonlinearity. A nonlinear spring mechanism is then designed based on the mathematical model. As a result of the chewing experiments, the authors confirmed control of the rapid closing motion of the robot jaw using the nonlinear spring mechanism. This work was done as part of the "Humanoid Project" at HUREL (Humanoid Research Laboratory).
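One common way to write down such a muscle-like nonlinearity, shown here purely for illustration and not as the paper's actual equations, is an exponentially stiffening spring: soft for small deflections, but firming up sharply so that a fast-closing jaw is cushioned on contact.

```python
# Illustrative stiffening-spring form of muscle nonlinearity (assumed
# parameters, not the paper's model): ~linear near rest, then sharply
# stiffening as deflection grows.

import math

def nonlinear_force(x, k=50.0, a=200.0):
    """F = k*(exp(a*x) - 1)/a: behaves like k*x near x = 0."""
    return k * (math.exp(a * x) - 1.0) / a

# Compare against a linear spring with the same small-deflection stiffness:
# at 10x the deflection, the nonlinear spring pushes back about 3x harder.
for x in (0.001, 0.01):                      # deflections [m]
    print(round(50.0 * x, 3), round(nonlinear_force(x), 3))
```

This kind of deflection-dependent stiffness is what the paper's physical nonlinear spring mechanism realizes mechanically, letting the jaw close quickly yet meet hard food without a harsh impact.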

11 citations


Proceedings ArticleDOI
20 Mar 1995
TL;DR: A fuzzy-neural controller is introduced for robot manipulator force control in an unknown environment: the controller is designed using fuzzy logic to realize human-like control, then modeled as a neural network whose membership functions and rules are adjusted to achieve the desired force control.
Abstract: Fuzzy-neural control, the combination of neural network control (which can learn from experiments) and fuzzy control (which can handle human knowledge), has recently been studied so that each approach makes up for the other's weak points. In this paper, a fuzzy-neural controller is introduced for robot manipulator force control in an unknown environment. The robot manipulator force controller is designed using fuzzy logic in order to realize human-like control, and is then modeled as a neural network that adjusts membership functions and rules to achieve the desired force control. The errors between the desired and measured forces, together with the momentum of the robot manipulator, are used as input signals to the controller. Simulations were performed to confirm the effectiveness of the controller.
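The core trick, viewing a fuzzy controller as a trainable network, can be sketched minimally. The membership shapes, rule count, and learning rate below are all assumptions for illustration: a one-input fuzzy system with Gaussian membership functions whose rule consequents are tuned by gradient descent, exactly as one would train a small network.

```python
# Minimal fuzzy-neural sketch (assumed structure, not the paper's):
# normalized-Gaussian fuzzy inference with consequents learned by the
# delta rule.

import math

CENTERS = [-1.0, 0.0, 1.0]   # "negative", "zero", "positive" force error
SIGMA = 0.6
weights = [0.0, 0.0, 0.0]    # rule consequents, learned

def fuzzy_out(e):
    """Fuzzy inference; also returns normalized firing strengths."""
    mu = [math.exp(-((e - c) / SIGMA) ** 2) for c in CENTERS]
    s = sum(mu)
    phi = [m / s for m in mu]
    return sum(p * w for p, w in zip(phi, weights)), phi

def train_step(e, target, lr=0.1):
    """Delta rule: nudge each rule's consequent by its firing strength."""
    y, phi = fuzzy_out(e)
    for i, p in enumerate(phi):
        weights[i] += lr * (target - y) * p

# Teach the controller a simple proportional law u = 2*e from samples.
for _ in range(3000):
    for e in (-1.0, -0.5, 0.0, 0.5, 1.0):
        train_step(e, 2.0 * e)
print(round(fuzzy_out(0.5)[0], 2))   # close to the taught value 1.0
```

After training, the rules remain human-readable ("if error is positive, push positive"), which is the appeal of the hybrid: fuzzy structure carries the operator's knowledge while the network machinery fits it to data.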

11 citations


Dissertation
01 Jan 1995
TL;DR: This thesis describes the instantiation of one such model on the humanoid robot Cog, developed at the MIT Artificial Intelligence Laboratory, in the form of an adaptive mapping of head movements into anticipated motion in the visual field.
Abstract: The cerebellum has long been known to be associated with the coordination of the human motor system. Contemporary research indicates that this is one product of the cerebellum's true function: the generation of dynamic models of systems both within and without the body. This thesis describes the instantiation of one such model on the humanoid robot Cog, developed at the MIT Artificial Intelligence Laboratory. The model takes the form of an adaptive mapping of head movements into anticipated motion in the visual field. This model is part of a visual subsystem which allows Cog to detect motion in the environment without being confused by movement of its own head. The author hopes that this work will be the first step in creating a generalized system for generating models between sensorimotor systems, and that such a system will be the first step in the development of a fully-functional artificial cerebellum for Cog.

Thesis supervisor: Rodney A. Brooks, Professor of Electrical Engineering and Computer Science
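A toy version of that adaptive mapping can be written in a few lines. The scalar map and learning rule here are assumptions for illustration, not Cog's implementation: learn how head velocity maps to the image motion it induces, then subtract the prediction so that only externally caused motion remains.

```python
# Toy forward model of self-induced visual motion (assumed scalar map
# and LMS rule, not Cog's code): predict, then subtract.

gain = 0.0         # learned map: image motion per unit head velocity
TRUE_GAIN = -2.0   # simulated camera geometry: head right -> image left

def observe(head_vel, external_motion):
    """What the camera's motion detector reports."""
    return TRUE_GAIN * head_vel + external_motion

def update(head_vel, observed, lr=0.05):
    """LMS update of the forward model from the prediction error."""
    global gain
    gain += lr * (observed - gain * head_vel) * head_vel

# Training: the robot wiggles its head while the scene is static.
for t in range(500):
    hv = (-1) ** t * (1 + t % 3)          # varied self-generated motion
    update(hv, observe(hv, 0.0))

# Afterward, self-motion cancels and external motion pops out.
residual = observe(2.0, 0.7) - gain * 2.0
print(round(gain, 2), round(residual, 2))   # prints -2.0 0.7
```

The residual is the thesis's payoff in miniature: once the model has absorbed the consequences of the robot's own head movements, whatever motion is left over can be attributed to the environment.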