
Qian Feng: Centre-of-Mass-based Robust Grasp Planning for Unknown Objects, Using Tactile-Visual Sensors

Qian Feng is an external PhD student at the Technical University of Munich, working at project partner Agile Robots and contributing to ULTRACEPT’s Work Package 4.

The IEEE International Conference on Robotics and Automation (ICRA) is an annual academic conference covering advances in robotics. It is one of the premier conferences in its field, with an ‘A’ rating from the Australian Ranking of ICT Conferences (awarded in 2010) and an ‘A1’ rating from the Brazilian Ministry of Education (awarded in 2012).

Qian Feng attended the IEEE International Conference on Robotics and Automation (ICRA) 2020. The conference was originally scheduled to take place in Paris, France, but due to COVID-19, the conference was held virtually from 31 May 2020 until 31 August 2020.

Qian Feng presenting online at ICRA 2020

Qian presented his conference paper:

Q. Feng, Z. Chen, J. Deng, C. Gao, J. Zhang and A. Knoll, “Center-of-Mass-based Robust Grasp Planning for Unknown Objects Using Tactile-Visual Sensors,” 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020, pp. 610-617, doi: 10.1109/ICRA40945.2020.9196815.

Abstract

An unstable grasp pose can lead to slip, so slip detection can be used to predict an unstable grasp pose. A re-grasp is then required to correct the grasp pose and finish the task. In this work, we propose a novel re-grasp planner with multi-sensor modules that plans grasp adjustments using feedback from a slip detector. The re-grasp planner is trained to estimate the location of the centre of mass, which helps the robot find an optimal grasp pose. The dataset in this work consists of 1,025 slip experiments and 1,347 re-grasps collected with a pair of tactile sensors, an RGB-D camera, and a Franka Emika robot arm equipped with joint force/torque sensors. We show that our algorithm can successfully detect and classify slip for 5 unknown test objects with an accuracy of 76.88%, and that the re-grasp planner increases the grasp success rate by 31.0% compared to a state-of-the-art vision-based grasping algorithm.

Qian Feng: Slip Detector
Qian Feng: Grasp Success Rate on Test Objects
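To give a feel for how the pipeline described in the abstract fits together, the snippet below is a minimal Python sketch of the slip-detection and re-grasp loop: a slip detector running on tactile and RGB-D input, and a re-grasp planner that shifts the grasp toward the estimated centre of mass. All class, method, and robot-interface names (SlipDetector, RegraspPlanner, execute_grasp, read_sensors, release) are hypothetical placeholders for illustration, not the interfaces used in the paper.

```python
# Minimal sketch of the slip-detection / re-grasp feedback loop described in the
# abstract. All module and method names here are hypothetical placeholders,
# not the authors' actual API.

from dataclasses import dataclass


@dataclass
class GraspPose:
    position: tuple       # (x, y, z) in the robot base frame
    orientation: tuple    # quaternion (x, y, z, w)


class SlipDetector:
    """Classifies slip from tactile readings and RGB-D frames (assumed interface)."""
    def detect(self, tactile_frame, rgbd_frame) -> bool:
        raise NotImplementedError  # e.g. a learned classifier over sensor features


class RegraspPlanner:
    """Estimates the object's centre of mass and proposes an adjusted grasp pose."""
    def plan(self, current_pose: GraspPose, tactile_frame, rgbd_frame) -> GraspPose:
        raise NotImplementedError  # e.g. shift the grasp toward the estimated CoM


def grasp_with_regrasp(robot, detector: SlipDetector, planner: RegraspPlanner,
                       initial_pose: GraspPose, max_attempts: int = 3) -> bool:
    """Try a grasp; if slip is detected while lifting, re-grasp closer to the CoM."""
    pose = initial_pose
    for _ in range(max_attempts):
        robot.execute_grasp(pose)                 # close the gripper and lift
        tactile, rgbd = robot.read_sensors()      # tactile pair + RGB-D camera
        if not detector.detect(tactile, rgbd):
            return True                           # stable grasp, continue the task
        robot.release()                           # slip detected: set the object down
        pose = planner.plan(pose, tactile, rgbd)  # adjust grasp toward the CoM
    return False
```

The key design point this sketch tries to capture is that slip detection acts as the feedback signal: the planner is only invoked when the current grasp proves unstable, rather than on every attempt.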


When asked about his experience presenting and attending ICRA 2020, Qian said:

“Thanks to the virtual conference we were still able to present our work. It also meant that more people were able to join the conference to learn about and discuss our research. Everyone was able to access the presentation and get involved in the discussion in the virtual conference for 2 months, instead of the originally scheduled 5 minutes of discussion for the on-site conference. During this conference I shared my work with many researchers from the same field and exchanged ideas. I really enjoyed the conference and learnt a lot from the other attendees.”

One thought on “Qian Feng: Centre-of-Mass-based Robust Grasp Planning for Unknown Objects, Using Tactile-Visual Sensors”

  1. This is a very nice research result. I’ve been following the development of grasping methods in deep learning. I hope to get a chance to test my work on a real robot application someday.
