Feedback models for real-time robotic cloth manipulation
Oriol Barbany (Ph.D. Student)
Object rigidity is still one of the most common assumptions in robotic grasping and manipulation. Nevertheless, many everyday objects like cables, plastics, and clothes present non-negligible deformations. Manipulating garments remains a challenging topic due to the high flexibility of textiles and their nearly infinite number of degrees of freedom. Because of the intricate dynamics of clothes, existing deformation models are still far from accurate when predicting future states after forces are applied, especially over long time horizons. For this reason, robots can leverage real-time feedback to update their belief about the state of the manipulated garment and adapt their actions accordingly. To extract cloth properties useful for understanding the object's behaviour, we propose to combine vision and proprioceptive sensory systems to capture the state of the cloth at each instant. While cameras provide global information about shapes, force and tactile sensors can provide local information about material properties like stiffness. To allow for real-world applications, the obtained cloth model should adapt to task and environment variations, be sample- and computationally efficient, and deal with uncertainty.
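The feedback idea above can be illustrated with a minimal sketch: the robot keeps an uncertain belief over cloth keypoint positions (informed globally by a camera) and over a stiffness parameter (informed locally by a force sensor), and refines both with a simple scalar-gain, Kalman-style update each time a measurement arrives. All variable names, values, and the update rule here are illustrative assumptions, not the project's actual model.

```python
import numpy as np

def update_belief(mean, var, measurement, meas_var):
    """One scalar-gain (Kalman-style) belief update.

    The gain weights the measurement more heavily when the current
    belief is uncertain (large var) relative to the sensor noise.
    """
    gain = var / (var + meas_var)
    new_mean = mean + gain * (measurement - mean)
    new_var = (1.0 - gain) * var
    return new_mean, new_var

# Prior belief: three cloth keypoints (x, y, z) at unknown positions,
# and a rough stiffness guess; both with high uncertainty. (Hypothetical
# numbers for illustration only.)
keypoints = np.zeros((3, 3))
kp_var = 1.0
stiffness, stiff_var = 50.0, 100.0  # [N/m]

# Simulated real-time feedback at one instant:
# the camera provides global shape information...
camera_obs = np.array([[0.1, 0.0, 0.3],
                       [0.2, 0.1, 0.3],
                       [0.3, 0.0, 0.3]])
# ...while the force sensor provides a local material-property estimate.
force_obs = 80.0

# Fuse both modalities into the updated belief.
keypoints, kp_var = update_belief(keypoints, kp_var, camera_obs, meas_var=0.01)
stiffness, stiff_var = update_belief(stiffness, stiff_var, force_obs, meas_var=25.0)
```

After the update, the keypoint belief has moved close to the camera observation (the vision sensor is modelled as precise), while the stiffness estimate has shifted toward the force reading but retains some prior influence, reflecting the noisier local measurement.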
Primary Host: Carme Torras (Universitat Politècnica de Catalunya)
Exchange Host: Amir Zamir (EPFL)
PhD Duration: 01 July 2021 - 31 May 2024
Exchange Duration: 01 January 2023 - 31 May 2023 (ongoing)