Project: State Estimation and Inference for Manipulation
We are interested in the process by which robots sense contact information, aggregate it over time, and infer object properties such as friction, mass, shape, texture, and pose, as well as changes in those properties. Vision provides global but noisy information, while tactile sensing gives accurate but local information. Our work aims to fuse the two.
Real-time integration of vision and touch. We develop estimation algorithms that integrate visual and tactile information in real time to track the pose of a manipulated object and its interaction with the environment. Our framework for online, contact-aware state estimation is inspired by tools from the SLAM community, and we have demonstrated it in a pushing task and an insertion task.
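As a rough illustration of the underlying fusion idea (a single-step toy example, not the sliding-window estimator used in the papers), the sketch below combines a noisy global visual pose measurement with an accurate but partial tactile measurement of a planar object pose via inverse-covariance weighting. All names (`fuse`, `H_tac`, the noise levels) are hypothetical.

```python
# Minimal sketch, not the lab's implementation: fuse a noisy global visual pose
# measurement with a precise but partial tactile measurement of a planar pose
# q = (x, y, theta) by weighted least squares (inverse-covariance weighting).
import numpy as np

def fuse(z_vis, R_vis, z_tac, H_tac, R_tac):
    """Weighted least-squares fusion of two pose measurements.

    z_vis : (3,)   full pose from vision, noisy.
    R_vis : (3,3)  vision measurement covariance.
    z_tac : (m,)   tactile measurement, accurate but partial.
    H_tac : (m,3)  selects the pose components that touch constrains.
    R_tac : (m,m)  tactile measurement covariance (small).
    """
    m = H_tac.shape[0]
    H = np.vstack([np.eye(3), H_tac])            # stacked measurement model
    z = np.concatenate([z_vis, z_tac])
    W = np.linalg.inv(np.block([                 # inverse of block-diagonal covariance
        [R_vis, np.zeros((3, m))],
        [np.zeros((m, 3)), R_tac],
    ]))
    P = np.linalg.inv(H.T @ W @ H)               # fused covariance
    q = P @ (H.T @ W @ z)                        # fused pose estimate
    return q, P

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    q_true = np.array([0.40, 0.10, 0.30])             # ground-truth planar pose
    z_vis = q_true + rng.normal(0, 0.02, 3)            # global but noisy
    H_tac = np.array([[0.0, 1.0, 0.0],                 # touch constrains y and theta only
                      [0.0, 0.0, 1.0]])
    z_tac = H_tac @ q_true + rng.normal(0, 0.001, 2)   # local but accurate
    q_hat, _ = fuse(z_vis, np.diag([0.02**2] * 3), z_tac, H_tac, np.diag([0.001**2] * 2))
    print("fused pose estimate:", q_hat)
```

The fused estimate inherits the tactile precision on the constrained components and falls back on vision for the rest, which is the basic behavior the full contact-aware estimator exploits over a whole trajectory.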
Identification of physical properties from contact. We have studied the identifiability of physical properties (e.g., inertia, coefficient of friction, contact forces) directly from observed interaction trajectories.
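To illustrate the identifiability question with a deliberately simple toy example (not the published estimation pipeline), the sketch below recovers the friction coefficient of a block sliding to rest from noisy position observations. For pure sliding, m*a = -mu*m*g, so mu is identifiable from the trajectory alone while the mass m cancels and is not. The helpers `simulate` and `identify_mu` are hypothetical.

```python
# Toy identifiability sketch (assumed example): estimate the friction coefficient
# of a decelerating block from its observed positions. The mass never appears in
# the recoverable quantity, illustrating which parameters a trajectory can reveal.
import numpy as np

g = 9.81   # gravity [m/s^2]
dt = 0.01  # sample period [s]

def simulate(mu, v0=1.0, noise=1e-3, rng=None):
    """Simulate noisy position observations of a block sliding to rest."""
    rng = rng or np.random.default_rng(0)
    x, v, xs = 0.0, v0, []
    while v > 0:
        xs.append(x + rng.normal(0, noise))  # noisy position measurement
        v = max(0.0, v - mu * g * dt)        # Coulomb friction decelerates the block
        x += v * dt
    return np.array(xs)

def identify_mu(xs):
    """Fit x(t) = x0 + v0*t - 0.5*mu*g*t^2 by least squares and read off mu."""
    t = dt * np.arange(len(xs))
    c2, _, _ = np.polyfit(t, xs, 2)          # leading coefficient = -0.5 * mu * g
    return -2.0 * c2 / g

print(f"identified mu = {identify_mu(simulate(mu=0.3)):.3f}")  # close to the true 0.3
```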
Contact: Nima Fazeli
Related Publications
2018 IROS "Realtime State Estimation with Tactile and Visual Sensing for Inserting a Suction-held Object", K.T. Yu and A. Rodriguez.
2018 ICRA "Realtime State Estimation with Tactile and Visual Sensing. Application to Planar Manipulation", K.T. Yu and A. Rodriguez.
2018 PhD "Realtime State Estimation for Contact Manipulation", K.T. Yu.
2017 IJRR "Parameter and Contact Force Estimation of Planar Rigid-Bodies Undergoing Frictional Contact", N. Fazeli, R. Kolbert, R. Tedrake, and A. Rodriguez.
2015 ISRR "Identifiability Analysis of Rigid Body Frictional Contact", N. Fazeli, R. Tedrake, and A. Rodriguez.
2015 IROS "Shape and Pose Recovery from Planar Pushing", K.T. Yu, J. Leonard, and A. Rodriguez.
Related Videos
IROS 2018 - State Estimation for Inserting a Suction-held Object
ICRA 2018 - Realtime State Estimation with Tactile and Visual Sensing
IROS 2015 - Shape and Pose Recovery from Planar Pushing