
Visual–Tactile Fusion for Object Recognition

Authors:
Liu, Huaping
Yu, Yuanlong
Sun, Fuchun
Gu, Jason
Source:
IEEE Transactions on Automation Science & Engineering; Apr. 2017, Vol. 14, Issue 2, p996-1008, 13p
Publication Year:
2017

Abstract

The camera provides rich visual information about objects and has become one of the most widely used sensors in the automation community. However, it is often inapplicable when objects are not visually distinguishable. Tactile sensors, on the other hand, can capture multiple object properties, such as texture, roughness, spatial features, compliance, and friction, and therefore provide another important modality for perception. Nevertheless, effectively combining the visual and tactile modalities remains a challenging problem. In this paper, we develop a visual–tactile fusion framework for object recognition tasks. We use a multivariate time-series model to represent the tactile sequence and a covariance descriptor to characterize the image. Further, we design a joint group kernel sparse coding (JGKSC) method to tackle the intrinsically weak pairing problem in visual–tactile data samples. Finally, we develop a visual–tactile data set composed of 18 household objects for validation. The experimental results show that considering both visual and tactile inputs is beneficial and that the proposed method provides an effective fusion strategy. [ABSTRACT FROM AUTHOR]
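For intuition, the abstract names two fixed-size representations: a covariance descriptor for the image and a multivariate time-series model for the tactile sequence. Below is a minimal NumPy sketch of covariance-style features for both modalities, written as an illustration under our own assumptions: the function names, the choice of per-pixel image features, and the channel-covariance summary of the tactile series are hypothetical, and the paper's actual MTS model and JGKSC fusion step are not reproduced here.

```python
import numpy as np

def image_covariance_descriptor(gray):
    """Region covariance descriptor for a grayscale image: the covariance
    of per-pixel feature vectors [x, y, intensity, |dI/dx|, |dI/dy|].
    (Illustrative feature choice, not the paper's exact feature set.)"""
    gray = gray.astype(float)
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(gray)  # derivatives along rows and columns
    feats = np.stack([xs.ravel(), ys.ravel(), gray.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()])
    return np.cov(feats)  # 5 x 5 symmetric positive semidefinite matrix

def tactile_covariance(sequence):
    """Summarize a multivariate tactile time series (T steps x C channels)
    by the covariance of its channels, a simple fixed-size stand-in for
    the multivariate-time-series model used in the paper."""
    return np.cov(np.asarray(sequence, dtype=float), rowvar=False)  # C x C

# Toy usage: random stand-ins for one weakly paired visual-tactile sample.
rng = np.random.default_rng(0)
visual_desc = image_covariance_descriptor(rng.random((64, 64)))
tactile_desc = tactile_covariance(rng.random((200, 6)))  # 200 steps, 6 taxels
print(visual_desc.shape, tactile_desc.shape)  # (5, 5) (6, 6)
```

Since both descriptors are symmetric positive (semi)definite matrices, one natural design choice under these assumptions is to define kernels on them (for example via a log-Euclidean map) and feed those kernels into a kernel sparse-coding stage, which is the role the paper's JGKSC method plays for the weakly paired modalities.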

Details

Language:
English
ISSN:
1545-5955
Volume:
14
Issue:
2
Database:
Complementary Index
Journal:
IEEE Transactions on Automation Science & Engineering
Publication Type:
Academic Journal
Accession number:
122420377
Full Text:
https://doi.org/10.1109/TASE.2016.2549552