NG-Net: No-Grasp annotation grasp detection network for stacked scenes
Min Shi,
Jingzhao Hou,
Zhaoxin Li and
Dengming Zhu
Additional contact information
Min Shi: North China Electric Power University
Jingzhao Hou: North China Electric Power University
Zhaoxin Li: Chinese Academy of Agricultural Sciences
Dengming Zhu: Chinese Academy of Sciences
Journal of Intelligent Manufacturing, 2025, vol. 36, issue 2, No 35, 1477-1490
Abstract:
Achieving a high grasping success rate in a stacked environment is the core of a robot's grasping task. Most methods achieve a high success rate by training the network on a dataset containing a large number of grasp annotations, which requires substantial manpower and material resources. Achieving a high grasping success rate in stacked scenes without grasp annotations is therefore a challenging task. To address this, we propose a No-Grasp annotation grasp detection network for stacked scenes (NG-Net). Our network consists of two modules: an object selection module and a grasp generation module. Specifically, the object selection module performs instance segmentation on the raw point cloud and selects the object with the highest score as the object to be grasped, while the grasp generation module analyzes the geometric features of the point cloud surface with mathematical methods to generate grasp poses without grasp annotations. Experiments show that on the modified IPA-Binpicking dataset G, NG-Net achieves an average grasp success rate of 97% in stacked-scene grasping experiments, 14–22% higher than PointNetGPD.
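The abstract outlines a two-stage pipeline: instance segmentation picks the highest-scoring object, and grasp poses are then derived analytically from the surface geometry of that object rather than learned from grasp annotations. The sketch below illustrates that idea only in broad strokes, with generic placeholders (height-based object scoring, PCA normals, an antipodal pair search); every function name and heuristic here is an assumption for illustration, not the authors' implementation.

```python
# Illustrative sketch of a segmentation-then-analytic-grasp pipeline.
# All heuristics below (height-based scoring, PCA normals, antipodal search)
# are placeholder assumptions, not the method described in the paper.
import numpy as np

def select_object(points, labels):
    """Object selection stage: given per-point instance labels (e.g. from an
    instance-segmentation network), score each segment and return the points
    of the highest-scoring one. Mean height stands in for the learned score."""
    best_label, best_score = None, -np.inf
    for label in np.unique(labels):
        seg = points[labels == label]
        score = seg[:, 2].mean()            # assumption: prefer the topmost object
        if score > best_score:
            best_label, best_score = label, score
    return points[labels == best_label]

def estimate_normals(points, k=16):
    """Rough per-point surface normals via PCA over k nearest neighbours."""
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k]]
        cov = np.cov(nbrs.T)
        # eigenvector of the smallest eigenvalue approximates the normal
        normals[i] = np.linalg.eigh(cov)[1][:, 0]
    return normals

def antipodal_grasps(points, normals, max_width=0.08, angle_tol=np.deg2rad(20)):
    """Annotation-free grasp generation stage: search for point pairs that are
    closer than the gripper width and whose normals align with the grasp axis."""
    grasps = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            axis = points[j] - points[i]
            width = np.linalg.norm(axis)
            if width == 0 or width > max_width:
                continue
            axis /= width
            # PCA normals have ambiguous sign, so only the unsigned alignment
            # of each contact normal with the grasp axis is checked
            if (abs(normals[i] @ axis) > np.cos(angle_tol) and
                    abs(normals[j] @ axis) > np.cos(angle_tol)):
                grasps.append((points[i], points[j], width))
    return grasps

# Toy usage: two small point clusters standing in for stacked objects
pts = np.vstack([np.random.rand(40, 3), np.random.rand(40, 3) + [0, 0, 0.1]])
lbl = np.repeat([0, 1], 40)
obj = select_object(pts, lbl)
print(len(antipodal_grasps(obj, estimate_normals(obj, k=8))))
```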
Keywords: Grasp detection; No-Grasp annotation; Stacked scenes; Robotic grasping
Date: 2025
Downloads: http://link.springer.com/10.1007/s10845-024-02321-6 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joinma:v:36:y:2025:i:2:d:10.1007_s10845-024-02321-6
Ordering information: This journal article can be ordered from
http://www.springer.com/journal/10845
DOI: 10.1007/s10845-024-02321-6
Journal of Intelligent Manufacturing is currently edited by Andrew Kusiak