We are interested in the process by which robots sense contact information, aggregate it over time, and infer object properties such as friction, mass, shape, texture, and pose, along with changes in them. Vision provides global but noisy information, while tactile sensing provides accurate but local information; our work aims to fuse the two.
Real-time integration of vision and touch. We develop estimation algorithms that integrate visual and tactile information in real time to track the pose of a manipulated object and its interactions with the environment. Inspired by tools from the SLAM community, we are building a framework for online contact-aware state estimation, and have demonstrated it on a pushing task and an insertion task.
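To make the flavor of such a SLAM-style formulation concrete, the sketch below fuses loose per-frame visual pose measurements with tight tactile motion constraints in a factor graph and smooths over the whole trajectory. This is a minimal illustration under our own assumptions: the GTSAM library, the planar Pose2 state, the constant-push motion model, and all noise magnitudes are illustrative choices, not the framework described above.

```python
# Hedged sketch: fusing vision (global, noisy) and touch (local, accurate)
# as factors in a pose graph. GTSAM, the planar state, and all sigmas are
# illustrative assumptions, not the actual estimation framework.
import numpy as np
import gtsam

X = gtsam.symbol_shorthand.X  # object pose variable at each timestep

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Vision: a pose measurement every frame, but with loose (noisy) sigmas.
vision_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.10]))
# Touch: a local contact/motion constraint with tight (accurate) sigmas.
touch_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.002, 0.002, 0.01]))

# Simulated ground truth: an object pushed 0.1 m per step along +x.
true_poses = [gtsam.Pose2(0.1 * t, 0.0, 0.0) for t in range(5)]
rng = np.random.default_rng(0)

for t, pose in enumerate(true_poses):
    # Unary factor: the noisy visual estimate of the object pose.
    noisy = gtsam.Pose2(pose.x() + rng.normal(0.0, 0.05),
                        pose.y() + rng.normal(0.0, 0.05),
                        pose.theta() + rng.normal(0.0, 0.10))
    graph.add(gtsam.PriorFactorPose2(X(t), noisy, vision_noise))
    initial.insert(X(t), noisy)
    if t > 0:
        # Binary factor: a tactile / quasi-static pushing model constrains
        # the relative motion between consecutive poses far more tightly.
        graph.add(gtsam.BetweenFactorPose2(X(t - 1), X(t),
                                           gtsam.Pose2(0.1, 0.0, 0.0),
                                           touch_noise))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose2(X(4)))  # smoothed pose, dominated by the tight touch factors
```

Because the tactile factors are orders of magnitude tighter than the visual ones, the optimizer uses vision to anchor the trajectory globally while touch shapes the local motion, which is the complementarity the fusion exploits.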