Learning Tactile Models for Factor Graph-Based Estimation


We address the problem of estimating object poses from touch measurements during manipulation tasks such as planar pushing. Vision-based tactile sensors such as DIGIT and GelSight provide rich, local image measurements at the point of contact. Existing approaches interpret these measurements either with learning-only methods or with hand-engineered functions inside optimization pipelines, each of which has limitations. We propose a two-stage approach: first, we learn local tactile observation models supervised with ground-truth data; then, we integrate these models alongside physics and geometric factors within a factor graph optimizer. We demonstrate reliable object tracking using only tactile feedback on ∼150 real-world planar pushing sequences with varying trajectories across three object shapes.
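To make the factor graph structure concrete, below is a minimal, hypothetical sketch of the three factor types the abstract describes: a prior on the initial pose, "physics" factors linking consecutive poses, and "tactile" factors standing in for the learned observation model. For simplicity it treats each planar pose (x, y, θ) as a vector in R³ and solves a single linearized least-squares problem with NumPy; the actual method optimizes on the SE(2) manifold inside a factor graph solver, and all function names and noise values here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def solve_toy_factor_graph(T, prior, motions, tactile_meas,
                           s_prior=0.01, s_phys=0.05, s_tac=0.02):
    """Assemble and solve a tiny linear factor graph over T+1 planar poses.

    Factor types (mirroring the paper's structure, heavily simplified):
      - a prior factor on pose 0,
      - 'physics' between-factors: x_{i+1} - x_i should match motions[i],
      - 'tactile' unary factors: x_i should match a (stubbed) learned
        tactile model's pose estimate.
    Poses are linearized as vectors in R^3 = (x, y, theta).
    """
    D = 3
    A_blocks, b_blocks = [], []

    def add_factor(idxs, jacs, meas, sigma):
        # One weighted residual block: rows of the stacked system A z = b.
        row = np.zeros((D, (T + 1) * D))
        for i, J in zip(idxs, jacs):
            row[:, i * D:(i + 1) * D] = J / sigma
        A_blocks.append(row)
        b_blocks.append(np.asarray(meas) / sigma)

    I = np.eye(D)
    add_factor([0], [I], prior, s_prior)            # prior factor
    for i, m in enumerate(motions):                 # physics factors
        add_factor([i, i + 1], [-I, I], m, s_phys)
    for i, z in tactile_meas:                       # tactile factors
        add_factor([i], [I], z, s_tac)

    A = np.vstack(A_blocks)
    b = np.concatenate(b_blocks)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)     # MAP estimate (linear)
    return sol.reshape(T + 1, D)
```

With consistent measurements, the least-squares solution recovers the ground-truth trajectory exactly; with noisy measurements, the sigmas control how strongly each factor type (prior, physics, tactile) pulls on the estimate.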


@inproceedings{SodhiTactileICRA,
  title={Learning Tactile Models for Factor Graph-based Estimation},
  author={Sodhi, Paloma and Kaess, Michael and Mukadam, Mustafa and Anderson, Stuart},
  booktitle={IEEE Intl. Conf. on Robotics and Automation (ICRA)}
}