Topic

GRASP

About: GRASP is a research topic. Over its lifetime, 5,457 publications have been published within this topic, receiving 112,708 citations.


Papers
Journal Article
TL;DR: The resulting taxonomy incorporates all grasps found in the reviewed taxonomies that complied with the grasp definition. It is shown that, due to the nature of the classification, the 33 grasp types can be reduced to a set of 17 more general grasps if only the hand configuration is considered, without the object shape/size.
Abstract: In this paper, we analyze and compare existing human grasp taxonomies and synthesize them into a single new taxonomy (dubbed “The GRASP Taxonomy” after the GRASP project funded by the European Commission). We consider only static and stable grasps performed by one hand. The goal is to extract the largest set of different grasps that were referenced in the literature and arrange them in a systematic way. The taxonomy provides a common terminology to define human hand configurations and is important in many domains such as human–computer interaction and tangible user interfaces, where an understanding of the human is the basis for a proper interface. Overall, 33 different grasp types are found and arranged into the GRASP taxonomy. Within the taxonomy, grasps are arranged according to 1) opposition type, 2) the virtual finger assignments, 3) type in terms of power, precision, or intermediate grasp, and 4) the position of the thumb. The resulting taxonomy incorporates all grasps found in the reviewed taxonomies that complied with the grasp definition. We also show that, due to the nature of the classification, the 33 grasp types can be reduced to a set of 17 more general grasps if only the hand configuration is considered without the object shape/size.
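The four classification dimensions listed in the abstract map naturally onto a small data structure. Below is a minimal sketch in Python: the palm/pad/side opposition values and the power/precision/intermediate classes come from the grasp-taxonomy literature the abstract summarizes, but the field encoding, the example entry, and the hand_configuration reduction key are illustrative assumptions, not the paper's actual notation.

```python
from dataclasses import dataclass
from enum import Enum

class Opposition(Enum):           # dimension 1: opposition type
    PALM = "palm"
    PAD = "pad"
    SIDE = "side"

class GraspClass(Enum):           # dimension 3: power / precision / intermediate
    POWER = "power"
    PRECISION = "precision"
    INTERMEDIATE = "intermediate"

@dataclass(frozen=True)
class GraspType:
    name: str
    opposition: Opposition        # 1) opposition type
    virtual_fingers: int          # 2) virtual finger assignment (illustrative encoding)
    grasp_class: GraspClass       # 3) power / precision / intermediate
    thumb_abducted: bool          # 4) thumb position (abducted vs. adducted)

def hand_configuration(g: GraspType):
    """Key used to merge grasps that share a hand configuration.

    Grouping the 33 grasp types by a key like this one (i.e.,
    ignoring object shape/size) is the kind of reduction that
    yields the smaller set of 17 general grasps described above.
    """
    return (g.opposition, g.virtual_fingers, g.grasp_class, g.thumb_abducted)

# Illustrative entry only; values are not taken from the paper's tables.
example = GraspType("large diameter", Opposition.PALM, 2,
                    GraspClass.POWER, thumb_abducted=True)
```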

636 citations

Journal Article
01 May 2019 - Nature
TL;DR: Tactile patterns obtained from a scalable sensor-embedded glove and deep convolutional neural networks help to explain how the human hand can identify and grasp individual objects and estimate their weights.
Abstract: Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material properties while applying the right amount of force, a challenging set of tasks for a modern robot [1]. Mechanoreceptor networks that provide sensory feedback and enable the dexterity of the human grasp [2] remain difficult to replicate in robots. Whereas computer-vision-based robot grasping strategies [3-5] have progressed substantially with the abundance of visual data and emerging machine-learning tools, there are as yet no equivalent sensing platforms and large-scale datasets with which to probe the use of the tactile information that humans rely on when grasping objects. Studying the mechanics of how humans grasp objects will complement vision-based robotic object handling. Importantly, the inability to record and analyse tactile signals currently limits our understanding of the role of tactile information in the human grasp itself; for example, how tactile maps are used to identify objects and infer their properties is unknown [6]. Here we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight and explore the typical tactile patterns that emerge while grasping objects. The sensor array (548 sensors) is assembled on a knitted glove, and consists of a piezoresistive film connected by a network of conductive thread electrodes that are passively probed. Using a low-cost (about US$10) scalable tactile glove sensor array, we record a large-scale tactile dataset with 135,000 frames, each covering the full hand, while interacting with 26 different objects. This set of interactions with different objects reveals the key correspondences between different regions of a human hand while it is manipulating objects. Insights from the tactile signatures of the human grasp, through the lens of an artificial analogue of the natural mechanoreceptor network, can thus aid the future design of prosthetics [7], robot grasping tools and human-robot interactions [1,8-10].
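The classification pipeline the abstract describes (tactile frame in, object identity out) can be sketched compactly. The following is a minimal PyTorch sketch under stated assumptions: the 548-sensor count and the 26 object classes come from the abstract, while the 32x32 rasterization of the glove readings, the network depth, and all layer widths are illustrative choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

# Assumption: each 548-sensor glove reading is rasterized onto a
# 32x32 pressure map; the paper's exact layout may differ.
FRAME_SIZE = 32
NUM_OBJECTS = 26  # objects in the recorded dataset, per the abstract

class TactileNet(nn.Module):
    """Small CNN mapping one tactile frame to an object label.

    Illustrative stand-in for the deep convolutional network in the
    paper; depth and channel widths here are arbitrary.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, NUM_OBJECTS)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) pressure maps
        return self.classifier(self.features(x).flatten(1))

# One random frame through the network yields (1, 26) class logits.
logits = TactileNet()(torch.randn(1, 1, FRAME_SIZE, FRAME_SIZE))
```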

623 citations

Posted Content
TL;DR: In this article, a grasp quality convolutional neural network (GQ-CNN) is trained from a synthetic dataset of 6.7 million point clouds, grasps and analytic grasp metrics generated from thousands of 3D models from Dex-Net 1.0 in randomized poses on a table.
Abstract: To reduce data collection time for deep learning of robust robotic grasp plans, we explore training from a synthetic dataset of 6.7 million point clouds, grasps, and analytic grasp metrics generated from thousands of 3D models from Dex-Net 1.0 in randomized poses on a table. We use the resulting dataset, Dex-Net 2.0, to train a Grasp Quality Convolutional Neural Network (GQ-CNN) model that rapidly predicts the probability of success of grasps from depth images, where grasps are specified as the planar position, angle, and depth of a gripper relative to an RGB-D sensor. Experiments with over 1,000 trials on an ABB YuMi comparing grasp planning methods on singulated objects suggest that a GQ-CNN trained with only synthetic data from Dex-Net 2.0 can be used to plan grasps in 0.8 s with a success rate of 93% on eight known objects with adversarial geometry, and is 3x faster than registering point clouds to a precomputed dataset of objects and indexing grasps. The Dex-Net 2.0 grasp planner also has the highest success rate on a dataset of 10 novel rigid objects and achieves 99% precision (one false positive out of 69 grasps classified as robust) on a dataset of 40 novel household objects, some of which are articulated or deformable. Code, datasets, videos, and supplementary material are available at this http URL.
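The planning loop implied by the abstract (sample planar grasp candidates, score each with the trained network, execute the best) can be illustrated with a plain sample-and-rank sketch. Everything below is an assumption except the (u, v, theta, depth) grasp parameterization, which the abstract states: score_grasp is a placeholder for the trained GQ-CNN, the candidate counts and depth bounds are invented, and the released Dex-Net planner is more sophisticated than this loop.

```python
import numpy as np

def score_grasp(depth_image: np.ndarray, grasp: np.ndarray) -> float:
    """Stand-in for a trained GQ-CNN: returns P(success) for one grasp.

    A real implementation would crop the depth image around the grasp
    center, align it with the grasp axis, and run the CNN; here we
    return a random placeholder score purely for illustration.
    """
    u, v, theta, depth = grasp
    return float(np.random.rand())

def plan_grasp(depth_image: np.ndarray, num_candidates: int = 100) -> np.ndarray:
    """Sample planar grasp candidates and keep the highest-scoring one.

    Each candidate is (u, v, theta, depth): pixel position, gripper
    angle in the image plane, and gripper depth relative to the
    camera, matching the parameterization in the abstract.
    """
    h, w = depth_image.shape
    candidates = np.column_stack([
        np.random.uniform(0, w, num_candidates),      # u (pixels)
        np.random.uniform(0, h, num_candidates),      # v (pixels)
        np.random.uniform(0, np.pi, num_candidates),  # theta (radians)
        np.random.uniform(0.5, 0.7, num_candidates),  # depth (m, invented bounds)
    ])
    scores = [score_grasp(depth_image, g) for g in candidates]
    return candidates[int(np.argmax(scores))]

best = plan_grasp(np.random.rand(480, 640))  # toy depth image
```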

582 citations

Journal Article
TL;DR: The relationship between the relative motion of two fingers grasping an object and the motion of the points of contact over the object surface is derived, and the following applications are explored.
Abstract: The kinematics of contact describe the motion of a point of contact over the surfaces of two contacting objects in response to a relative motion of these objects. Using concepts from differential geometry, I derive a set of equations, called the contact equations, that embody this relationship. I employ the contact equations to design the following applications to be executed by an end-effector with tactile sensing capability: (1) determining the curvature form of an unknown object at a point of contact; and (2) following the surface of an unknown object. The contact equations also serve as a basis for an investigation of the kinematics of grasp. I derive the relationship between the relative motion of two fingers grasping an object and the motion of the points of contact over the object surface. Based on this analysis, I explore the following applications: (1) rolling a sphere between two arbitrarily shaped fingers; (2) fine grip adjustment (i.e., having two fingers that grasp an unknown object loca...
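The rolling application has a simple closed form in its most elementary instance, which gives a feel for what the contact equations describe. A minimal sketch, assuming a sphere rolling without slipping between two flat fingers (the paper derives the general curved-surface equations; only standard rolling kinematics is used here):

```python
import numpy as np

def roll_sphere_between_plates(r: float, v_top: float, dt: float, steps: int):
    """Sphere of radius r rolling between two flat fingers (plates).

    Bottom plate fixed, top plate translating at speed v_top. No-slip
    at both contacts forces the sphere center to move at v_top / 2
    and the sphere to spin at omega = v_top / (2 r), so each contact
    point migrates over the sphere surface at angular rate omega.
    This is the planar special case of contact kinematics.
    """
    omega = v_top / (2.0 * r)      # angular speed of the sphere
    v_center = v_top / 2.0         # translation speed of the sphere center
    t = np.arange(steps) * dt
    center_x = v_center * t        # sphere center position over time
    contact_angle = omega * t      # contact point location on the sphere (rad)
    return center_x, contact_angle

x, phi = roll_sphere_between_plates(r=0.02, v_top=0.05, dt=0.01, steps=100)
# After 1 s: center has moved 2.5 cm; the contact point swept 1.25 rad.
```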

545 citations

Proceedings Article
01 Dec 1989
TL;DR: Algorithms and methods are presented for animating the hands of a synthetic actor that allow the hand to move and grasp objects; they also compute deformations of the hand as it moves.
Abstract: Algorithms and methods are presented for animating the hands of a synthetic actor. The algorithms allow the hand to move and grasp objects; they also compute deformations of the hand as it moves. The mapping of surfaces onto the skeleton is based on the concept of joint-dependent local deformation (JLD) operators, which are specific local deformation operators depending on the nature of the joints. The major problem in the hand covering process is the calculation of the coordinate bases. The key to the method was to find a model for calculating the bases which was sophisticated enough for the simulation of complex motions. Several examples are shown, and the user interface introduced into the HUMAN FACTORY system is also described.
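The core idea of joint-dependent deformation, skin vertices following a joint's rotation with a locally decaying influence, can be sketched in a few lines. A heavily simplified illustration, assuming a single hinge joint and an exponential falloff weight; the paper's actual JLD operators are defined per joint type and are considerably more elaborate:

```python
import numpy as np

def rotation_z(theta: float) -> np.ndarray:
    """3x3 rotation about the z-axis (taken as the hinge joint axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def deform_vertices(verts, joint_pos, theta, falloff=0.01):
    """Joint-dependent local deformation, heavily simplified.

    Each skin vertex follows the joint rotation with a weight that
    decays with distance from the joint, so flexing a finger bends
    nearby skin smoothly instead of rigidly. The paper's JLD
    operators choose such weights per joint type; the exponential
    falloff here is only an illustrative stand-in.
    """
    R = rotation_z(theta)
    out = np.empty_like(verts)
    for i, v in enumerate(verts):
        d = np.linalg.norm(v - joint_pos)
        w = np.exp(-d / falloff)                # 1 at the joint, -> 0 far away
        rotated = joint_pos + R @ (v - joint_pos)
        out[i] = w * rotated + (1.0 - w) * v    # blend rigid motion with rest pose
    return out

skin = np.random.rand(200, 3) * 0.05            # toy skin patch near the joint
bent = deform_vertices(skin, joint_pos=np.zeros(3), theta=np.pi / 6)
```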

516 citations


Network Information
Related Topics (5)

Topic                            Papers    Citations   Relatedness
Robot                            103.8K    1.3M        77%
Software                         130.5K    2M          77%
Robustness (computer science)    94.7K     1.6M        76%
Artificial neural network        207K      4.5M        76%
Control system                   129K      1.5M        75%
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2024    1
2023    1,054
2022    2,259
2021    366
2020    399
2019    401