Proceedings ArticleDOI
Planning optimal grasps
Carlo Ferrari, John Canny +1 more
- pp 2290-2295
TL;DR: Two general optimality criteria that consider the total finger force and the maximum finger force are introduced and discussed; the geometric interpretation of the two criteria leads to an efficient planning algorithm.
Abstract: The authors address the problem of planning optimal grasps. Two general optimality criteria that consider the total finger force and the maximum finger force are introduced and discussed. Their formalization using various metrics on a space of generalized forces is detailed. The geometric interpretation of the two criteria leads to an efficient planning algorithm. An example of its use in a robotic environment equipped with two-jaw and three-jaw grippers is described.
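The geometric interpretation mentioned in the abstract is commonly realized as a grasp quality value: the radius of the largest ball, centered at the wrench-space origin, that fits inside the convex hull of the contact wrenches. A minimal pure-Python sketch for the 2-D case (the function name `epsilon_quality` and the restriction to 2-D points are illustrative assumptions, not the paper's own code):

```python
import math

def epsilon_quality(wrenches):
    """Radius of the largest origin-centered ball inside conv(wrenches).

    Illustrative 2-D sketch: wrenches are (x, y) points whose convex
    hull is assumed to contain the origin in its interior.
    """
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    # Andrew's monotone-chain convex hull (counter-clockwise order).
    pts = sorted(set(wrenches))
    def chain(points):
        h = []
        for p in points:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    hull = chain(pts)[:-1] + chain(pts[::-1])[:-1]

    # Quality = minimum distance from the origin to a hull edge's line.
    q = float("inf")
    for i in range(len(hull)):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % len(hull)]
        dist = abs(x1 * y2 - x2 * y1) / math.hypot(x2 - x1, y2 - y1)
        q = min(q, dist)
    return q

# A square of unit half-width: the inscribed ball has radius 1.
print(epsilon_quality([(1, 1), (1, -1), (-1, 1), (-1, -1)]))  # 1.0
```

A quality of zero means the origin lies on the hull boundary, i.e. the grasp is not force-closure; larger values indicate a larger worst-case disturbance wrench the grasp can resist with bounded finger forces.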
Citations
Journal ArticleDOI
Deep learning for detecting robotic grasps
TL;DR: This work presents a two-step cascaded system with two deep networks, where the top detections from the first are re-evaluated by the second, and shows that this method improves performance on an RGBD robotic grasping dataset, and can be used to successfully execute grasps on two different robotic platforms.
Proceedings ArticleDOI
Robotic grasping and contact: a review
Antonio Bicchi, Vijay Kumar +1 more
TL;DR: This paper surveys the field of robotic grasping and the work that has been done in this area over the last two decades, with a slight bias toward the development of the theoretical framework and analytical results.
Journal ArticleDOI
Graspit! A versatile simulator for robotic grasping
A.T. Miller, Peter K. Allen +1 more
TL;DR: The different types of world elements and the general robot definition are discussed, the robot library is presented, and the grasp analysis and visualization methods are described.
Journal ArticleDOI
Data-Driven Grasp Synthesis—A Survey
TL;DR: A review of the work on data-driven grasp synthesis is presented, with an overview of the different methodologies for sampling and ranking candidate grasps, drawing a parallel to the classical approaches that rely on analytic formulations.
Proceedings Article
Deep Learning for Detecting Robotic Grasps
TL;DR: In this paper, a two-step cascaded system with two deep networks is proposed to detect robotic grasps in an RGB-D view of a scene containing objects, where the top detections from the first are re-evaluated by the second.
References
Book
On the existence and synthesis of multifinger positive grips
TL;DR: The criteria under which an object can be gripped by a multifingered dexterous hand, assuming no static friction between the object and the fingers, are studied, and efficient algorithms to synthesize positive grips for bounded polyhedral/polygonal objects are presented.
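The existence condition for a frictionless positive grip amounts to asking whether the contact wrenches positively span the wrench space (equivalently, whether the origin lies in the interior of their convex hull). A tiny sketch for the planar, force-only case, ignoring torque components (the function name `positively_spans_plane` is an illustrative assumption): 2-D contact normals positively span the plane exactly when no closed half-plane contains all of them, i.e. the largest angular gap between consecutive directions is strictly less than pi.

```python
import math

def positively_spans_plane(normals):
    """True iff the 2-D vectors positively span the plane.

    Illustrative force-only check (torques ignored): the vectors
    positively span R^2 exactly when the largest angular gap between
    consecutive directions is strictly less than pi.
    """
    angles = sorted(math.atan2(y, x) for x, y in normals)
    gaps = [b - a for a, b in zip(angles, angles[1:])]
    gaps.append(2 * math.pi - (angles[-1] - angles[0]))  # wrap-around gap
    return max(gaps) < math.pi

print(positively_spans_plane([(1, 0), (-1, 1), (-1, -1)]))  # True
print(positively_spans_plane([(1, 0), (0, 1)]))             # False
```

Two antipodal normals fail the test, as they only span a line; at least three directions are needed in the plane, which is consistent with the lower bounds on finger counts studied in this line of work.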
Proceedings ArticleDOI
Constructing force-closure grasps
TL;DR: The force-closure constraint is addressed from three different points of view: mathematics, physics, and computational geometry. The last formulation results in fast and simple polynomial-time algorithms for directly constructing force-closure grasps.
Journal ArticleDOI
Automatic grasp planning in the presence of uncertainty
TL;DR: An algorithm for automatic planning of robot grasping motions that are insensitive to bounded uncertainties in the object's location is presented. It is shown that simple squeeze-grasp operations are not sufficient for grasping all possible objects, and offset-grasp and push-grasp operations are added to increase the scope of the planner.
Journal ArticleDOI
Optimum grip of a polygon
TL;DR: It is shown that form closure of a polygonal object can be achieved by four fingers (previous proofs were not complete), and the problem of finding the optimum stable grip or form closure of any given polygon is solved.
Journal ArticleDOI
Quantitative Steinitz's theorems with applications to multifingered grasping
TL;DR: An upper bound on the achievable radius is shown: the residual radius must be less than a bound that depends on the dimension of the space.