
Visual-tactile Fusion for Transparent Object Grasping in Complex Backgrounds

Authors:
Li, Shoujie
Yu, Haixin
Ding, Wenbo
Liu, Houde
Ye, Linqi
Xia, Chongkun
Wang, Xueqian
Zhang, Xiao-Ping
Publication Year:
2022

Abstract

The accurate detection and grasping of transparent objects are challenging but of great significance to robots. Here, a visual-tactile fusion framework for transparent object grasping under complex backgrounds and varying light conditions is proposed, comprising grasping position detection, tactile calibration, and visual-tactile fusion-based classification. First, a multi-scene synthetic grasping dataset generation method with Gaussian-distribution-based data annotation is proposed. In addition, a novel grasping network, TGCNN, is proposed for grasping position detection and shows good results in both synthetic and real scenes. For tactile calibration, inspired by human grasping, a fully convolutional network-based tactile feature extraction method and a central-location-based adaptive grasping strategy are designed, improving the grasping success rate by 36.7% compared with direct grasping. Furthermore, a visual-tactile fusion method is proposed for transparent object classification, improving classification accuracy by 34%. The proposed framework synergizes the advantages of vision and touch and greatly improves the grasping efficiency for transparent objects.
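
The abstract describes the pipeline only at a high level; the two Python sketches below illustrate the ideas under stated assumptions, not the authors' implementations. First, the Gaussian-distribution-based data annotation: assuming it renders a 2D Gaussian heatmap peaking at each labeled grasp point (the kernel width and isotropy are assumptions, not taken from the paper), a minimal annotation helper might look like this:

import numpy as np

def gaussian_grasp_heatmap(height, width, center, sigma=8.0):
    # Render a 2D Gaussian label map that peaks (value 1.0) at the grasp
    # point and decays outward. Hypothetical helper: the abstract says only
    # that annotation is "Gaussian distribution based"; sigma is assumed.
    ys = np.arange(height, dtype=np.float64)[:, None]
    xs = np.arange(width, dtype=np.float64)[None, :]
    cy, cx = center
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

# Example: a 480x640 label map for a grasp point at row 240, column 320.
label = gaussian_grasp_heatmap(480, 640, center=(240, 320))

Second, the visual-tactile fusion classification: the abstract does not specify the architecture, so the following is a generic late-fusion sketch in PyTorch (branch sizes, layer choices, and the concatenation strategy are all illustrative assumptions):

import torch
import torch.nn as nn

class VisualTactileFusionClassifier(nn.Module):
    # Late fusion: encode each modality separately, concatenate the
    # feature vectors, and classify. All dimensions are illustrative.
    def __init__(self, num_classes, vis_dim=128, tac_dim=32):
        super().__init__()
        self.vis_branch = nn.Sequential(        # visual branch (assumed)
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, vis_dim))
        self.tac_branch = nn.Sequential(        # tactile branch (assumed)
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, tac_dim))
        self.head = nn.Linear(vis_dim + tac_dim, num_classes)

    def forward(self, image, tactile):
        fused = torch.cat([self.vis_branch(image),
                           self.tac_branch(tactile)], dim=1)
        return self.head(fused)

# Example: a batch of RGB images and single-channel tactile maps.
model = VisualTactileFusionClassifier(num_classes=10)
logits = model(torch.randn(2, 3, 224, 224), torch.randn(2, 1, 64, 64))

The point of late fusion here is that each modality keeps its own encoder, so the tactile branch can stay much smaller than the visual one; whether the paper fuses features this way is not stated in the abstract.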

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1381586534
Document Type:
Electronic Resource