Abstract
Robots in the real world frequently come across identical objects in dense clutter. When evaluating grasp poses in these scenarios, a target-driven grasping system requires knowledge of spatial relations between scene objects (e.g., proximity, adjacency, and occlusions). To efficiently complete this task, we propose a target-driven grasping system that simultaneously considers object relations and predicts 6-DoF grasp poses. A densely cluttered scene is first formulated as a grasp graph with nodes representing object geometries in the grasp coordinate frame and edges indicating spatial relations between the objects. We design a Grasp Graph Neural Network (G2N2) that evaluates the grasp graph and finds the most feasible 6-DoF grasp pose for a target object. Additionally, we develop a shape completion-assisted grasp pose sampling method that improves sample quality and consequently grasping efficiency. We compare our method against several baselines in both simulated and real settings. In real-world experiments with novel objects, our approach achieves a 77.78% grasping accuracy in densely cluttered scenarios, surpassing the best-performing baseline by more than 15%. Supplementary material is available at https://sites.google.com/umn.edu/graph-grasping.
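To make the abstract's pipeline more concrete, the sketch below shows one plausible way to represent a densely cluttered scene as a grasp graph (nodes = per-object geometry features expressed in the candidate grasp's coordinate frame, edges = pairwise spatial relations such as proximity, adjacency, and occlusion) and to read out a feasibility score for a grasp on the target object with a small message-passing network. This is a minimal illustrative sketch under assumed names (`GraspGraphScorer`, `node_feats`, `edge_feats`, feature sizes); it is not the authors' G2N2 architecture and omits the shape completion-assisted grasp sampler entirely.

```python
# Illustrative sketch only: scoring one candidate 6-DoF grasp from a graph of
# scene objects. Names and dimensions are assumptions, not the paper's G2N2.
import torch
import torch.nn as nn


class GraspGraphScorer(nn.Module):
    """Scores a candidate grasp from a grasp graph.

    Nodes: per-object geometry encodings in the grasp coordinate frame
           (e.g., features of each object's point cloud after transforming
           it into the candidate grasp's frame).
    Edges: spatial-relation features between object pairs
           (e.g., proximity / adjacency / occlusion indicators).
    """

    def __init__(self, node_dim=128, edge_dim=3, hidden_dim=128):
        super().__init__()
        # Message function: sender node + receiver node + edge features.
        self.msg = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        # Node update after message aggregation.
        self.update = nn.Sequential(
            nn.Linear(node_dim + hidden_dim, node_dim), nn.ReLU())
        # Readout: grasp-quality score for the target node.
        self.readout = nn.Linear(node_dim, 1)

    def forward(self, node_feats, edge_index, edge_feats, target_idx):
        # node_feats: (N, node_dim), edge_index: (2, E), edge_feats: (E, edge_dim)
        src, dst = edge_index[0], edge_index[1]
        messages = self.msg(
            torch.cat([node_feats[src], node_feats[dst], edge_feats], dim=-1))
        # Sum incoming messages at each receiving node.
        agg = torch.zeros(node_feats.size(0), messages.size(-1))
        agg.index_add_(0, dst, messages)
        nodes = self.update(torch.cat([node_feats, agg], dim=-1))
        # Feasibility score for grasping the target object with this pose.
        return torch.sigmoid(self.readout(nodes[target_idx]))


# Toy usage: 4 objects, fully connected relations, score one sampled grasp.
n_objects = 4
node_feats = torch.randn(n_objects, 128)            # per-object geometry encodings
pairs = [(i, j) for i in range(n_objects) for j in range(n_objects) if i != j]
edge_index = torch.tensor(pairs).t()                # (2, E)
edge_feats = torch.rand(edge_index.size(1), 3)      # proximity/adjacency/occlusion
score = GraspGraphScorer()(node_feats, edge_index, edge_feats, target_idx=0)
print(score.item())  # higher = more feasible grasp for the target
```

In such a setup, each sampled 6-DoF grasp pose would yield its own graph (because node geometries are re-expressed in that grasp's frame), and the pose with the highest target-node score would be executed.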
Original language | English (US) |
---|---|
Title of host publication | 2022 IEEE International Conference on Robotics and Automation, ICRA 2022 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 742-748 |
Number of pages | 7 |
ISBN (Electronic) | 9781728196817 |
DOIs | |
State | Published - 2022 |
Event | 39th IEEE International Conference on Robotics and Automation, ICRA 2022 - Philadelphia, United States |
Duration | May 23, 2022 → May 27, 2022 |
Publication series
Name | 2022 International Conference on Robotics and Automation (ICRA) |
---|---|
Conference
Conference | 39th IEEE International Conference on Robotics and Automation, ICRA 2022 |
---|---|
Country/Territory | United States |
City | Philadelphia |
Period | 5/23/22 → 5/27/22 |
Bibliographical note
Funding Information: *This work was in part supported by the MnDRIVE Initiative on Robotics, Sensors, and Advanced Manufacturing. 1 X. Lou and C. Choi are with the Department of Electrical and Computer Engineering, Univ. of Minnesota, Minneapolis, USA ({lou00015, cchoi}@umn.edu). 2 Y. Yang is with the Department of Computer Science and Engineering, Univ. of Minnesota, Minneapolis, USA ([email protected]).
Publisher Copyright:
© 2022 IEEE.
Keywords
- Deep Learning in Grasping and Manipulation
- Grasping
- Perception for Grasping and Manipulation