Teaser Image

Humans seemingly incorporate anticipated touch signals into their perception. Our goal is to equip robots with a similar capability, which we term PseudoTouch. PseudoTouch predicts the expected touch signal from a visual patch of the area to be touched. We frame this problem as learning a low-dimensional visual-tactile embedding, wherein we encode a depth patch and decode the corresponding tactile signal. To accomplish this, we employ ReSkin, an inexpensive and replaceable magnetic-based tactile sensor. Using ReSkin, we collect a dataset of aligned tactile and visual data pairs obtained by randomly touching eight basic geometric shapes, and train PseudoTouch on it. We demonstrate the efficacy of PseudoTouch on two downstream tasks: object recognition and grasp stability prediction. In the object recognition task, we evaluate the learned embedding on a set of five basic geometric shapes and five household objects. Using PseudoTouch, we achieve an object recognition accuracy of 84% after just ten touches, surpassing a proprioception baseline. For the grasp stability task, we use ACRONYM labels to train and evaluate a grasp success predictor using PseudoTouch's predictions derived from virtual depth information. Our approach yields a 32% absolute improvement in accuracy compared to a baseline relying on partial point cloud data.
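To make the encode-decode idea above concrete, here is a minimal, hypothetical PyTorch sketch of mapping a depth patch to a predicted tactile reading. All layer sizes, the patch size, and the tactile dimensionality (assumed here to be 15 values, i.e. five 3-axis magnetometers of a ReSkin-like sensor) are illustrative assumptions and do not reflect the actual PseudoTouch architecture.

```python
import torch
import torch.nn as nn

class PseudoTouchSketch(nn.Module):
    """Illustrative depth-patch -> tactile-signal encoder-decoder (not the paper's model)."""

    def __init__(self, patch_size: int = 32, latent_dim: int = 16, tactile_dim: int = 15):
        super().__init__()
        # Encode a single-channel depth patch into a low-dimensional visual-tactile embedding.
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(patch_size * patch_size, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decode the embedding into a ReSkin-like tactile reading
        # (dimensionality is an assumption for this sketch).
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64),
            nn.ReLU(),
            nn.Linear(64, tactile_dim),
        )

    def forward(self, depth_patch: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(depth_patch))

if __name__ == "__main__":
    model = PseudoTouchSketch()
    dummy_patches = torch.rand(4, 1, 32, 32)   # batch of depth patches
    predicted_touch = model(dummy_patches)
    print(predicted_touch.shape)               # torch.Size([4, 15])
```

In the paper, such a predicted tactile signal (or the embedding it is decoded from) is what the downstream object recognition and grasp stability modules consume; the sketch above only illustrates the input/output structure of that prediction step.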

Video

Code

This work is released under the GPLv3 license. For any commercial purpose, please contact the authors. A software implementation of this project will soon be available on GitHub.

Publications

If you find our work useful, please consider citing our full paper:

Adrian Röfer*, Nick Heppert*, Abdallah Ayad, Eugenio Chisari, Abhinav Valada
PseudoTouch: Efficiently Imaging the Surface Feel of Objects for Robotic Manipulation.
Under review, 2024.

(PDF) (BibTeX)

This work is based on Abdallah Ayad's master's thesis:

Abdallah Ayad, Adrian Röfer, Nick Heppert, Abhinav Valada
Imagine2Touch: Predictive Tactile Sensing for Robotic Manipulation using Efficient Low-Dimensional Signals.
ViTac 2024: Robot Embodiment through Visuo-Tactile Perception, ICRA Workshop, 2024.

(PDF) (BibTeX)

Authors

Adrian Röfer*

University of Freiburg

Nick Heppert*

University of Freiburg

Abdallah Ayad

University of Freiburg

Eugenio Chisari

University of Freiburg

Abhinav Valada

University of Freiburg

*equal contribution

Acknowledgment

This work was funded by the Carl Zeiss Foundation through the ReScaLe project and by the BrainLinks-BrainTools center of the University of Freiburg.