Learning Precise, Contact-Rich Manipulation through Uncalibrated Tactile Skins

Authors: Venkatesh Pattabiraman, Yifeng Cao, Siddhant Haldar, Lerrel Pinto, Raunaq Bhirangi

Abstract: While visuomotor policy learning has advanced robotic manipulation, precisely
executing contact-rich tasks remains challenging due to the limitations of
vision in reasoning about physical interactions. To address this, recent work
has sought to integrate tactile sensing into policy learning. However, many
existing approaches rely on optical tactile sensors that are either restricted
to recognition tasks or require complex dimensionality reduction steps for
policy learning. In this work, we explore learning policies with magnetic skin
sensors, which are inherently low-dimensional, highly sensitive, and
inexpensive to integrate with robotic platforms. To leverage these sensors
effectively, we present the Visuo-Skin (ViSk) framework, a simple approach that
uses a transformer-based policy and treats skin sensor data as additional
tokens alongside visual information. Evaluated on four complex real-world tasks
involving credit card swiping, plug insertion, USB insertion, and bookshelf
retrieval, ViSk significantly outperforms both vision-only policies and policies
based on optical tactile sensing. Further analysis reveals that combining tactile and
visual modalities enhances policy performance and spatial generalization,
achieving an average improvement of 27.5% across tasks.
Project page: https://visuoskin.github.io/

Source: http://arxiv.org/abs/2410.17246v1
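
The core idea of the ViSk framework, feeding low-dimensional skin readings to a transformer policy as extra tokens alongside visual features, lends itself to a compact sketch. The following PyTorch snippet is a hypothetical illustration only, not the authors' released implementation: all module names, token dimensions, the skin signal size, and the readout-token design are assumptions made for clarity.

```python
import torch
import torch.nn as nn

class ViSkStylePolicy(nn.Module):
    """Hypothetical sketch: skin readings as extra transformer tokens.

    Dimensions and design choices here are illustrative assumptions,
    not taken from the paper's code.
    """
    def __init__(self, image_dim=512, skin_dim=45, d_model=256,
                 action_dim=7, n_layers=4, n_heads=8):
        super().__init__()
        # Project each modality into a shared token embedding space.
        self.image_proj = nn.Linear(image_dim, d_model)  # per-camera visual features
        self.skin_proj = nn.Linear(skin_dim, d_model)    # raw low-dimensional skin signal
        # Learnable readout token whose output embedding predicts the action.
        self.action_token = nn.Parameter(torch.zeros(1, 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, image_feats, skin_reading):
        # image_feats: (B, n_cams, image_dim); skin_reading: (B, skin_dim)
        img_tokens = self.image_proj(image_feats)               # (B, n_cams, d_model)
        skin_token = self.skin_proj(skin_reading).unsqueeze(1)  # (B, 1, d_model)
        act_token = self.action_token.expand(img_tokens.size(0), -1, -1)
        # Skin data enters the policy simply as additional tokens.
        tokens = torch.cat([act_token, img_tokens, skin_token], dim=1)
        out = self.encoder(tokens)
        return self.action_head(out[:, 0])  # decode action from the readout token

policy = ViSkStylePolicy()
action = policy(torch.randn(2, 2, 512), torch.randn(2, 45))
print(action.shape)  # torch.Size([2, 7])
```

Because the skin signal is already low-dimensional, a single linear projection suffices to tokenize it; no dimensionality-reduction pipeline of the kind optical tactile sensors often require is needed.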
