Hierarchical Information-guided robotic grasp detection.

Journal: Scientific Reports
Published Date:

Abstract

With the advancement of deep learning, robotic grasping has seen widespread application across many fields and has become a critical component in advancing automation. Accurate and efficient grasping not only significantly boosts productivity but also ensures safety and reliability in complex and dynamic environments. However, current approaches, particularly those based on convolutional neural networks (CNNs), often neglect the hierarchical information inherent in the data, which leads to difficulties in complex environments with abundant background information. Moreover, these methods struggle to capture long-range dependencies and non-local self-similarity, both of which are critical for accurate grasp detection. To address these issues, we propose GraspFormer, a novel method for robotic grasp detection. GraspFormer features an encoder-decoder framework that incorporates a Grasp Transformer Block designed to model long-range dependencies while avoiding background interference. Our approach also introduces hierarchical information-guided self-attention (HIGSA) and an adaptive deep channel modulator (DCM) to strengthen feature interaction and competition. Extensive experiments demonstrate that GraspFormer achieves performance comparable to state-of-the-art methods. The code is available at https://github.com/shine793/Hierarchical-Information-guided-Robotic-Grasp-Detection.
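The abstract describes the architecture only at a high level, so the following is a minimal, hypothetical PyTorch sketch of how an encoder-decoder grasp detector with a transformer block, a HIGSA-style attention layer, and a DCM-style channel modulator might be wired together. None of the module internals below are taken from the paper; the class names reuse the abstract's terminology, but every layer, dimension, and output format is an illustrative assumption, not the authors' implementation (see the linked repository for the actual code).

```python
# Illustrative sketch only: all module internals are hypothetical stand-ins
# chosen to mirror the components named in the abstract, not the paper's code.
import torch
import torch.nn as nn


class HIGSA(nn.Module):
    """Hypothetical stand-in for hierarchical information-guided self-attention."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                      # x: (B, N, C) token sequence
        out, _ = self.attn(x, x, x)            # self-attention captures long-range dependencies
        return self.norm(x + out)


class DCM(nn.Module):
    """Hypothetical stand-in for the adaptive deep channel modulator (channel gating)."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, x):                      # reweight channels to sharpen feature competition
        return x * self.gate(x.mean(dim=1, keepdim=True))


class GraspTransformerBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.higsa = HIGSA(dim)
        self.dcm = DCM(dim)
        self.mlp = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, dim), nn.GELU())

    def forward(self, x):
        x = self.dcm(self.higsa(x))
        return x + self.mlp(x)


class GraspFormerSketch(nn.Module):
    """Toy encoder-decoder predicting dense grasp maps (e.g. quality, angle, width)."""
    def __init__(self, in_ch=3, dim=64):
        super().__init__()
        self.encoder = nn.Conv2d(in_ch, dim, kernel_size=4, stride=4)       # patch embedding
        self.block = GraspTransformerBlock(dim)
        self.decoder = nn.ConvTranspose2d(dim, 3, kernel_size=4, stride=4)  # 3 grasp maps

    def forward(self, img):                              # img: (B, 3, H, W)
        feat = self.encoder(img)                         # (B, C, H/4, W/4)
        b, c, h, w = feat.shape
        tokens = feat.flatten(2).transpose(1, 2)         # (B, N, C)
        tokens = self.block(tokens)
        feat = tokens.transpose(1, 2).reshape(b, c, h, w)
        return self.decoder(feat)                        # (B, 3, H, W) dense grasp maps


if __name__ == "__main__":
    maps = GraspFormerSketch()(torch.randn(1, 3, 224, 224))
    print(maps.shape)  # torch.Size([1, 3, 224, 224])
```

The dense-map output format (per-pixel quality, angle, and width) is a common convention in CNN-based grasp detection and is assumed here for illustration; the paper's actual prediction heads may differ.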

Authors

  • Zeyao Hou
    Tianjin University of Science and Technology, Tianjin, 300222, China.
  • Yueran Zhao
    Tianjin Bonus Robotics Technology Co., Ltd., Tianjin, China.
  • Yutao Jin
    Tianjin University of Science and Technology, Tianjin, 300222, China.
  • Chao Yang
    Translational Institute for Cancer Pain, Chongming Hospital Affiliated to Shanghai University of Health & Medicine Sciences (Xinhua Hospital Chongming Branch), Shanghai 202155, P. R. China.
  • Zongyu He
    Tianjin College, University of Science and Technology Beijing, Tianjin, 301830, China.
  • Xiaoyan Chen

Keywords

No keywords available for this article.