Dynamic scenario-enhanced diverse human motion prediction network for proactive human–robot collaboration in customized assembly tasks
Pengfei Ding,
Jie Zhang,
Pai Zheng,
Peng Zhang,
Bo Fei and
Ziqi Xu
Additional contact information
Pengfei Ding: Donghua University
Jie Zhang: Donghua University
Pai Zheng: The Hong Kong Polytechnic University
Peng Zhang: Donghua University
Bo Fei: Donghua University
Ziqi Xu: Donghua University
Journal of Intelligent Manufacturing, 2025, vol. 36, issue 7, No 7, 4593-4612
Abstract:
Human motion prediction is crucial for facilitating human–robot collaboration in customized assembly tasks. However, existing research primarily focuses on predicting limited human motions using static global information, which fails to address the highly stochastic nature of customized assembly operations in a given region. To address this, we propose a dynamic scenario-enhanced diverse human motion prediction network that extracts dynamic collaborative features to predict highly stochastic customized assembly operations. In this paper, we present a multi-level feature adaptation network that generates information for dynamically manipulating objects. This is accomplished by extracting multi-attribute features at different levels, including multi-channel gaze tracking, multi-scale object affordance detection, and multi-modal 6-degree-of-freedom object pose estimation. Notably, we employ gaze tracking to locate the collaborative space accurately. Furthermore, we introduce a multi-step feedback-refined diffusion sampling network specifically designed for predicting highly stochastic customized assembly operations. This network refines the outcomes of our proposed multi-weight diffusion sampling strategy to better align with the target distribution. Additionally, we develop a feedback regulatory mechanism that incorporates ground-truth information in each prediction step to ensure the reliability of the results. Finally, the effectiveness of the proposed method is demonstrated through comparative experiments and validation of assembly tasks in a laboratory environment.
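To make the sampling idea in the abstract concrete, the sketch below illustrates one plausible reading of a feedback-refined reverse-diffusion loop for motion prediction: at every denoising step, the already-observed motion prefix is pulled back toward its ground-truth values. This is not the authors' implementation; the `denoiser` stand-in, the `feedback_weight` blending rule, and all tensor shapes are hypothetical assumptions for illustration only.

```python
import numpy as np

# Hypothetical stand-in for a learned denoiser eps_theta(x_t, t, c):
# a real model would be a trained network conditioned on dynamic-scene features c.
def denoiser(x_t, t, cond):
    return 0.1 * (x_t - cond)

def feedback_refined_sampling(x_T, cond, observed_prefix, betas, feedback_weight=0.2):
    """Reverse-diffusion sampling with a per-step feedback correction (assumed form).

    x_T:             initial Gaussian noise, shape (frames, joints, 3)
    cond:            conditioning features, broadcastable to x_T's shape
    observed_prefix: ground-truth observed frames used as feedback each step
    betas:           diffusion noise schedule, length = number of steps
    """
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = x_T
    n_obs = observed_prefix.shape[0]
    for t in reversed(range(len(betas))):
        eps = denoiser(x, t, cond)
        # Standard DDPM-style mean estimate for the previous step
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:
            x += np.sqrt(betas[t]) * np.random.randn(*x.shape)
        # Feedback regulation (assumed): blend the observed prefix back
        # toward ground truth at every prediction step.
        x[:n_obs] = (1 - feedback_weight) * x[:n_obs] + feedback_weight * observed_prefix
    return x

# Toy usage with illustrative shapes: 30 frames, 17 joints, 10 observed frames.
betas = np.linspace(1e-4, 0.02, 50)
x_T = np.random.randn(30, 17, 3)
prefix = np.random.randn(10, 17, 3)
pred = feedback_refined_sampling(x_T, 0.0, prefix, betas)
```

The per-step blend is one simple way to "incorporate ground-truth information in each prediction step" as the abstract describes; the paper's actual refinement and multi-weight sampling strategy may differ.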
Keywords: Dynamic collaborative information; Diverse human motion prediction; Human–robot collaboration; Customized assembly
Date: 2025
Downloads: http://link.springer.com/10.1007/s10845-024-02462-8 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joinma:v:36:y:2025:i:7:d:10.1007_s10845-024-02462-8
Ordering information: This journal article can be ordered from http://www.springer.com/journal/10845
DOI: 10.1007/s10845-024-02462-8
Journal of Intelligent Manufacturing is currently edited by Andrew Kusiak