1. Learning suction graspability considering grasp quality and robot reachability for bin-picking
- Authors
Jiang, Ping; Oaki, Junji; Ishihara, Yoshiyuki; Ooga, Junichiro; Han, Haifeng; Sugahara, Atsushi; Tokura, Seiji; Eto, Haruna; Komoda, Kazuma; Ogawa, Akihito
- Subjects
Computer Science - Robotics; Computer Science - Machine Learning
- Abstract
Deep learning has been widely used for inferring robust grasps. Although human-labeled RGB-D datasets were initially used to learn grasp configurations, preparing such large datasets is expensive. To address this problem, images were generated by a physics simulator, and a physically inspired model (e.g., a contact model between a suction vacuum cup and an object) was used as a grasp quality evaluation metric to annotate the synthesized images. However, this kind of contact model is complicated and requires parameter identification through experiments to ensure real-world performance. In addition, previous studies have not considered manipulator reachability, e.g., when a grasp configuration with high grasp quality cannot be reached because of collisions or the physical limits of the robot. In this study, we propose an intuitive, geometric-analytic grasp quality evaluation metric and further incorporate a reachability evaluation metric. We annotate pixel-wise grasp quality and reachability, computed by the proposed metrics on images synthesized in a simulator, to train an auto-encoder-decoder called suction graspability U-Net++ (SG-U-Net++). Experimental results show that our intuitive grasp quality evaluation metric is competitive with a physically inspired metric. Learning reachability helps reduce motion planning computation time by removing obviously unreachable candidates. The system achieves an overall picking speed of 560 PPH (pieces per hour).
- Comment
18 pages, 2 tables, 7 figures
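The abstract's pixel-wise prediction setup (a depth image in, per-pixel grasp quality and reachability maps out) can be illustrated with a minimal sketch. This is not the authors' SG-U-Net++; it is a toy PyTorch encoder-decoder with two sigmoid output heads, and the class and variable names (e.g., ToySuctionGraspNet) are hypothetical.

```python
# Minimal illustrative sketch (assumption, not the paper's architecture):
# a small fully-convolutional encoder-decoder that maps a single-channel
# depth image to two pixel-wise maps, grasp quality and reachability.
import torch
import torch.nn as nn


class ToySuctionGraspNet(nn.Module):
    def __init__(self, in_channels: int = 1, base: int = 16):
        super().__init__()
        # Encoder: one convolution stage, then one downsampling stage.
        self.enc1 = nn.Sequential(
            nn.Conv2d(in_channels, base, 3, padding=1), nn.ReLU(inplace=True))
        self.enc2 = nn.Sequential(
            nn.MaxPool2d(2),
            nn.Conv2d(base, base * 2, 3, padding=1), nn.ReLU(inplace=True))
        # Decoder: upsample back to the input resolution.
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 2, stride=2),
            nn.Conv2d(base, base, 3, padding=1), nn.ReLU(inplace=True))
        # Two pixel-wise heads producing values in [0, 1].
        self.quality_head = nn.Conv2d(base, 1, 1)
        self.reachability_head = nn.Conv2d(base, 1, 1)

    def forward(self, depth: torch.Tensor):
        x = self.enc2(self.enc1(depth))
        x = self.dec(x)
        quality = torch.sigmoid(self.quality_head(x))
        reachability = torch.sigmoid(self.reachability_head(x))
        return quality, reachability


if __name__ == "__main__":
    net = ToySuctionGraspNet()
    depth = torch.rand(1, 1, 64, 64)   # placeholder depth image
    quality, reach = net(depth)
    print(quality.shape, reach.shape)  # both torch.Size([1, 1, 64, 64])
```

In this kind of setup, the two maps would be supervised with the simulator-generated pixel-wise annotations, and grasp candidates with high predicted quality but low predicted reachability could be filtered out before motion planning.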
- Published
2021