Realtime State Estimation with Tactile and Visual Sensing
We propose a realtime state estimation framework based on iSAM.
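For intuition on the incremental flavor of iSAM: instead of re-solving the whole estimation problem whenever a new measurement arrives, the estimate is updated in place. The toy sketch below illustrates only that idea with a scalar state fused in information form; it is a hypothetical illustration, not the paper's implementation (iSAM itself incrementally updates a factorized nonlinear factor graph).

```python
# Toy 1D illustration of incremental estimation in information form.
# NOT the paper's iSAM implementation -- just the idea of folding in
# one measurement at a time without re-solving from scratch.

class Incremental1D:
    def __init__(self):
        self.info = 0.0   # accumulated information (sum of 1/variance)
        self.vec = 0.0    # information vector (weighted sum of measurements)

    def update(self, z, var):
        """Fuse one new scalar measurement z with variance var."""
        w = 1.0 / var
        self.info += w
        self.vec += w * z

    @property
    def mean(self):
        # Current estimate: information-weighted average of all measurements.
        return self.vec / self.info

est = Incremental1D()
for z, var in [(1.0, 0.5), (1.2, 0.25), (0.9, 1.0)]:
    est.update(z, var)
print(round(est.mean, 3))  # -> 1.1
```

Each `update` is constant-time here; iSAM achieves an analogous effect on a full factor graph by incrementally updating a matrix factorization.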
Paper: Kuan-Ting Yu and Alberto Rodriguez. "Realtime State Estimation with Tactile and Visual Sensing: Application to Planar Manipulation." Submitted to ICRA 2018 (preprint).
Here are the ROS bag files recorded for the experiments. Each is about 13 GB and 50 seconds long, and contains the video streams from the viewer camera (the view shown in the experiments above) and the observer camera (used for AprilTag detection):
Note: we share this code to provide implementation details, to help readers understand the method and reproduce the results. We have not focused on making it user-friendly; please let us know if you run into any issues.
Related publications
K.T. Yu, M. Bauza, N. Fazeli, and A. Rodriguez. "More than a Million Ways to Be Pushed: A High-Fidelity Experimental Dataset of Planar Pushing", IROS 2016. [PDF]
K.T. Yu, J. Leonard, and A. Rodriguez. "Shape and Pose Recovery from Planar Pushing", IROS 2015. [PDF]