Learning to Grasp Familiar Objects Based on Experience and Objects’ Shape Affordance
- Source :
- IEEE Transactions on Systems, Man, and Cybernetics: Systems. 49:2710-2723
- Publication Year :
- 2019
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2019.
Abstract
- Stably grasping objects for a specific task is an active research topic in robotics, owing to the many degrees of freedom of hand kinematics, the wide variety of object shapes, and incomplete visual sensing of objects (partial point clouds). This paper proposes an effective grasp planning method that integrates crucial grasp cues (the positions and orientations of the thumb fingertip and the wrist) drawn from human grasping experience. The approach offers several advantages: it greatly reduces the search space of the hand kinematics, requires no object reconstruction or registration, and operates directly on partial point clouds of objects. For objects of various shapes that are only partially observable from single-view sensing, the approach learns the “thumb” grasp point using the Signature of Histograms of Orientations (SHOT) shape descriptor at the object category level. The grasp point is recognized from the shape affordance at each point on the object, which generalizes grasp points to familiar objects. Finally, we verify the developed methods in both simulations and experiments by grasping objects of various shapes.
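- The abstract's per-point, descriptor-based recognition of the "thumb" grasp point can be illustrated with a short sketch. The sketch below is not the authors' implementation: it uses PCL's standard SHOT estimation (pcl::SHOTEstimationOMP) on a partial point cloud, but the scoring function `scoreThumbGraspPoint`, the prototype descriptor, and the support radii are placeholders standing in for the learned category-level model described in the paper.

```cpp
// Sketch: per-point SHOT descriptors on a partial point cloud, scored against a
// prototype descriptor to pick a candidate "thumb" grasp point. The scoring is a
// deliberate placeholder for the learned category-level model in the paper.
#include <cmath>
#include <cstdio>
#include <limits>
#include <pcl/io/pcd_io.h>
#include <pcl/features/normal_3d_omp.h>
#include <pcl/features/shot_omp.h>

// Placeholder for the learned model: similarity (negative squared L2 distance)
// between a point's SHOT descriptor and a prototype. A real system would replace
// this with a classifier trained on labelled thumb-contact points.
static float scoreThumbGraspPoint(const pcl::SHOT352 &d, const pcl::SHOT352 &proto)
{
  float dist = 0.0f;
  for (int k = 0; k < 352; ++k)
  {
    const float diff = d.descriptor[k] - proto.descriptor[k];
    dist += diff * diff;
  }
  return -dist;
}

int main(int argc, char **argv)
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  if (argc < 2 || pcl::io::loadPCDFile(argv[1], *cloud) < 0)
    return -1;                                   // expects a single-view (partial) PCD

  // Surface normals are required input for the SHOT descriptor.
  pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);
  pcl::NormalEstimationOMP<pcl::PointXYZ, pcl::Normal> ne;
  ne.setInputCloud(cloud);
  ne.setRadiusSearch(0.01);                      // 1 cm normal support (assumed scale)
  ne.compute(*normals);

  // SHOT (Signature of Histograms of OrienTations) descriptor at every point.
  pcl::PointCloud<pcl::SHOT352>::Ptr desc(new pcl::PointCloud<pcl::SHOT352>);
  pcl::SHOTEstimationOMP<pcl::PointXYZ, pcl::Normal, pcl::SHOT352> shot;
  shot.setInputCloud(cloud);
  shot.setInputNormals(normals);
  shot.setRadiusSearch(0.03);                    // 3 cm descriptor support (assumed)
  shot.compute(*desc);

  // Stand-in prototype: first descriptor in the cloud. A real system would load a
  // prototype or model learned from human grasp demonstrations for the category.
  const pcl::SHOT352 &proto = desc->points[0];

  std::size_t best = 0;
  float bestScore = -std::numeric_limits<float>::max();
  for (std::size_t i = 0; i < desc->size(); ++i)
  {
    const float s = scoreThumbGraspPoint(desc->points[i], proto);
    if (std::isfinite(s) && s > bestScore) { bestScore = s; best = i; }
  }
  const pcl::PointXYZ &p = cloud->points[best];
  std::printf("candidate thumb grasp point: %.4f %.4f %.4f\n", p.x, p.y, p.z);
  return 0;
}
```

- Built against PCL and run on a single-view .pcd file, the selected point would serve as the thumb contact from which a hand pose search (thumb fingertip and wrist cues) could proceed, in the spirit of the planning pipeline the abstract outlines.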
- Subjects :
- Computer science
GRASP
Point cloud
Robotics
Kinematics
Object (computer science)
Computer Science Applications
Human-Computer Interaction
Control and Systems Engineering
Robot
Point (geometry)
Computer vision
Artificial intelligence
Electrical and Electronic Engineering
Affordance
Software
Details
- ISSN :
- 2168-2232 and 2168-2216
- Volume :
- 49
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Systems, Man, and Cybernetics: Systems
- Accession number :
- edsair.doi...........2b862450908b3edcff9dd8b82003d8a0
- Full Text :
- https://doi.org/10.1109/tsmc.2019.2901955