Touch and Go: Learning from Human-Collected Vision and Touch

Authors :
Yang, Fengyu
Ma, Chenyang
Zhang, Jiacheng
Zhu, Jing
Yuan, Wenzhen
Owens, Andrew
Publication Year :
2022

Abstract

The ability to associate touch with sight is essential for tasks that require physically interacting with objects in the world. We propose a dataset with paired visual and tactile data called Touch and Go, in which human data collectors probe objects in natural environments using tactile sensors, while simultaneously recording egocentric video. In contrast to previous efforts, which have largely been confined to lab settings or simulated environments, our dataset spans a large number of "in the wild" objects and scenes. To demonstrate our dataset's effectiveness, we successfully apply it to a variety of tasks: 1) self-supervised visuo-tactile feature learning, 2) tactile-driven image stylization, i.e., making the visual appearance of an object more consistent with a given tactile signal, and 3) predicting future frames of a tactile signal from visuo-tactile inputs.

Comment: Accepted by NeurIPS 2022 Track of Datasets and Benchmarks
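To make the first task concrete, below is a minimal, hypothetical sketch of self-supervised visuo-tactile feature learning on paired image/tactile frames such as those in Touch and Go. The two ResNet-18 encoders, the symmetric InfoNCE objective, and all names here are illustrative assumptions, not the paper's exact architecture or loss.

```python
# Hypothetical sketch: contrastive learning on paired visual/tactile frames.
# Assumes time-aligned RGB and tactile-sensor images as two "views" of the
# same contact; matching pairs are positives, other pairs in the batch are
# negatives. Not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models


class VisuoTactileEncoder(nn.Module):
    """Two ResNet-18 branches mapping RGB and tactile images into a shared
    embedding space."""

    def __init__(self, dim=128):
        super().__init__()
        self.vision = models.resnet18(weights=None)
        self.touch = models.resnet18(weights=None)
        self.vision.fc = nn.Linear(512, dim)
        self.touch.fc = nn.Linear(512, dim)

    def forward(self, rgb, tactile):
        v = F.normalize(self.vision(rgb), dim=-1)
        t = F.normalize(self.touch(tactile), dim=-1)
        return v, t


def infonce_loss(v, t, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired embeddings."""
    logits = v @ t.T / temperature
    labels = torch.arange(v.size(0), device=v.device)
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.T, labels))


# Toy usage with random tensors standing in for paired frames.
model = VisuoTactileEncoder()
rgb = torch.randn(8, 3, 224, 224)       # egocentric video frames
tactile = torch.randn(8, 3, 224, 224)   # time-aligned tactile sensor images
v, t = model(rgb, tactile)
loss = infonce_loss(v, t)
loss.backward()
```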

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2211.12498
Document Type :
Working Paper