Authors: Hongyu Li, Snehal Dikhale, Jinda Cui, Soshi Iba, Nawid Jamali
Abstract: To achieve dexterity comparable to that of humans, robots must intelligently
process tactile sensor data. Taxel-based tactile signals often have low
spatial resolution and lack standardized representations. In this paper, we
propose a novel framework, HyperTaxel, for learning a geometrically-informed
representation of taxel-based tactile signals to address challenges associated
with their spatial resolution. We use this representation and a contrastive
learning objective to encode and map sparse low-resolution taxel signals to
high-resolution contact surfaces. To address the uncertainty inherent in these
signals, we leverage joint probability distributions across multiple
simultaneous contacts to improve taxel hyper-resolution. We evaluate our
representation against two baselines, and the results suggest that it
outperforms both. Qualitative results further demonstrate that the learned
representation captures geometric features of the contact surface, such as
flatness, curvature, and edges, and generalizes across different objects and
sensor configurations. Finally, our results suggest the representation improves
performance on downstream tasks such as surface classification, 6D in-hand
pose estimation, and sim-to-real transfer.
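The abstract does not specify the exact contrastive objective used to align taxel embeddings with contact-surface embeddings. A minimal InfoNCE-style sketch, assuming matched (taxel, surface) pairs within a batch serve as positives and all other pairings as negatives, could look like the following; the function name, embedding shapes, and temperature value are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def info_nce_loss(taxel_emb, surface_emb, temperature=0.07):
    """Symmetric InfoNCE-style contrastive loss (illustrative sketch).

    Matched (taxel, surface) rows are treated as positive pairs; every
    other pairing in the batch acts as a negative.
    """
    # L2-normalize so dot products are cosine similarities
    t = taxel_emb / np.linalg.norm(taxel_emb, axis=1, keepdims=True)
    s = surface_emb / np.linalg.norm(surface_emb, axis=1, keepdims=True)
    logits = t @ s.T / temperature      # (B, B) similarity matrix
    labels = np.arange(logits.shape[0])  # positive pairs on the diagonal

    def xent(lg):
        lg = lg - lg.max(axis=1, keepdims=True)  # numerical stability
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()      # -log p(positive)

    # average the taxel->surface and surface->taxel directions
    return 0.5 * (xent(logits) + xent(logits.T))
```

As expected for a contrastive objective, correctly matched embedding batches yield a lower loss than mismatched ones.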
Source: http://arxiv.org/abs/2408.08312v1
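One simple way to realize the idea of leveraging joint probability distributions across multiple simultaneous contacts, sketched here under the assumption of conditionally independent contacts (the paper's actual fusion scheme may differ), is to sum per-contact log-likelihoods and renormalize:

```python
import numpy as np

def fuse_contact_posteriors(log_likelihoods, prior=None):
    """Fuse per-contact class log-likelihoods into one posterior.

    Assumes contacts are conditionally independent given the class, so
    the joint likelihood is the product over contacts (a sum in log
    space). `prior` is an optional class prior; names are illustrative.
    """
    ll = np.asarray(log_likelihoods)  # (num_contacts, num_classes)
    joint = ll.sum(axis=0)            # log of the product over contacts
    if prior is not None:
        joint = joint + np.log(prior)
    joint -= joint.max()              # numerical stability before exp
    p = np.exp(joint)
    return p / p.sum()                # normalized posterior
```

Under this independence assumption, contacts that individually favor the same hypothesis only weakly produce a fused posterior that favors it more strongly, which is the intuition behind using multiple simultaneous contacts to reduce per-taxel uncertainty.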