Tactile-Based Self-supervised Pose Estimation for Robust Grasping

Book Chapter (2021)
Author(s)

Padmaja Kulkarni (TU Delft - Learning & Autonomous Control)

J. Kober (TU Delft - Learning & Autonomous Control)

Robert Babuska (TU Delft - Learning & Autonomous Control)

Research Group
Learning & Autonomous Control
DOI (related publication)
https://doi.org/10.1007/978-3-030-71151-1_25
Publication Year
2021
Language
English
Pages (from-to)
277-284
ISBN (print)
978-3-030-71150-4
ISBN (electronic)
978-3-030-71151-1

Abstract

We consider the problem of estimating an object’s pose in the absence of visual feedback once the robotic fingers have made contact with the object during grasping. Information about the object’s pose facilitates precise placement of the object after a successful grasp. If the grasp fails, knowing the object’s pose after the attempt can also help in re-grasping the object. We develop a data-driven approach that uses tactile data to compute the object pose in a self-supervised manner once object–finger contact is established. Additionally, we evaluate the effects of various feature representations, machine learning algorithms, and object properties on the pose-estimation accuracy. Unlike existing approaches, our method requires no prior knowledge about the object and makes no assumptions about grasp stability. In experiments, we show that our approach estimates object poses with at least 2 cm translational and 20° rotational accuracy despite changes in object properties and unsuccessful grasps.
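To make the idea of tactile, self-supervised pose regression concrete, the following is a minimal Python sketch. It is not the authors' implementation: the taxel count, the synthetic data, the pose parameterization (planar translation plus one rotation angle), and the choice of a random-forest regressor are all assumptions for illustration; the chapter itself compares several feature representations and learning algorithms.

```python
# Hypothetical sketch: regress object pose from tactile readings at contact.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data standing in for logged grasp attempts:
#   X: flattened tactile readings from the fingers at contact (e.g. taxel
#      pressures), one row per grasp attempt.
#   y: object pose relative to the gripper, here 2D translation (m) plus an
#      in-plane rotation (deg); in a self-supervised setup these labels would
#      come from a source available only during data collection, not from
#      manual annotation.
n_grasps, n_taxels = 500, 64
X = rng.normal(size=(n_grasps, n_taxels))
y = np.column_stack([
    rng.uniform(-0.05, 0.05, n_grasps),   # x translation (m)
    rng.uniform(-0.05, 0.05, n_grasps),   # y translation (m)
    rng.uniform(-45.0, 45.0, n_grasps),   # rotation about z (deg)
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Multi-output regression from tactile features to pose.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

# Report errors in the units used in the abstract: centimeters and degrees.
trans_err_cm = np.abs(pred[:, :2] - y_test[:, :2]).mean() * 100.0
rot_err_deg = np.abs(pred[:, 2] - y_test[:, 2]).mean()
print(f"mean translational error: {trans_err_cm:.2f} cm")
print(f"mean rotational error:    {rot_err_deg:.1f} deg")
```

With real tactile logs in place of the synthetic arrays, the same pipeline structure (features at contact, pose targets, a regressor, and translational/rotational error metrics) mirrors the kind of evaluation the abstract describes.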

Metadata only record. There are no files for this record.