Hybrid Convolutional Neural Network Approaches for Recognizing Collaborative Actions in Human–Robot Assembly Tasks
Zenggui Gao,
Ruining Yang,
Kai Zhao,
Wenhua Yu,
Zheng Liu and
Lilan Liu
Additional contact information
Zenggui Gao: Shanghai Key Laboratory of Intelligent Manufacturing and Robotics, Shanghai University, Shanghai 200444, China
Ruining Yang: Shanghai Key Laboratory of Intelligent Manufacturing and Robotics, Shanghai University, Shanghai 200444, China
Kai Zhao: Beijing Huahang Radio Measurement Research Institute, Beijing 102445, China
Wenhua Yu: Beijing Huahang Radio Measurement Research Institute, Beijing 102445, China
Zheng Liu: Design Innovation Center of CAA, China Academy of Art, Hangzhou 310024, China
Lilan Liu: Shanghai Key Laboratory of Intelligent Manufacturing and Robotics, Shanghai University, Shanghai 200444, China
Sustainability, 2023, vol. 16, issue 1, 1-18
Abstract:
In the context of sustainable manufacturing, efficient collaboration between humans and machines is crucial for improving assembly quality and efficiency. However, traditional methods for action recognition and human–robot collaborative assembly often suffer from low efficiency, low accuracy, and poor robustness. To address these problems, this paper proposes an assembly action-recognition method based on a hybrid convolutional neural network. First, an assembly action-recognition model is built on skeletal sequences using a hybrid network that combines Spatial Temporal Graph Convolutional Networks (ST-GCNs) and One-Dimensional Convolutional Neural Networks (1DCNNs) to recognize human actions during the assembly process. The hybrid architecture pairs the ST-GCN's ability to extract spatial relationships among joints together with temporal information against the 1DCNN's ability to extract temporal features, and Batch Normalization (BN) and Dropout layers are incorporated to improve the model's generalization. Second, the model is validated on a self-constructed dataset of assembly actions, achieving a recognition accuracy of 91.7% and demonstrating its superiority. Finally, a digital workshop application system based on digital twins is developed, and three sets of control experiments covering both objective and subjective aspects verify the feasibility of the proposed method. Compared with traditional assembly systems, the proposed method optimizes the recognition of human–robot collaborative assembly actions and applies the results to an intelligent control system through digital-twin technology. This intelligent assembly method improves assembly efficiency and reduces assembly time, enabling efficient and sustainable human–robot collaboration in assembly, with a positive impact on the manufacturing industry.
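As a rough illustration of the architecture the abstract describes — ST-GCN blocks that model spatial relationships among skeletal joints, a 1DCNN stage over the time axis, and BN/Dropout for generalization — the following is a minimal PyTorch sketch. The layer widths, kernel sizes, joint count (25, as in common Kinect-style skeletons), the identity-adjacency placeholder, the pooling-based fusion, and the names STGCNBlock and HybridActionNet are all illustrative assumptions, not the authors' published design.

import torch
import torch.nn as nn

class STGCNBlock(nn.Module):
    """One spatial-temporal graph convolution block over skeleton sequences."""
    def __init__(self, in_ch, out_ch, A):
        super().__init__()
        self.register_buffer("A", A)                   # (V, V) normalized joint adjacency
        self.spatial = nn.Conv2d(in_ch, out_ch, kernel_size=1)  # per-joint feature map
        self.temporal = nn.Conv2d(out_ch, out_ch, kernel_size=(9, 1), padding=(4, 0))
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU()

    def forward(self, x):                              # x: (N, C, T, V)
        x = self.spatial(x)
        x = torch.einsum("nctv,vw->nctw", x, self.A)   # aggregate over graph neighbors
        x = self.temporal(x)                           # short-range temporal convolution
        return self.relu(self.bn(x))

class HybridActionNet(nn.Module):
    """ST-GCN blocks for joint-space features, then 1D convs over time, then a classifier."""
    def __init__(self, num_classes, num_joints=25, in_ch=3):
        super().__init__()
        A = torch.eye(num_joints)                      # placeholder; use the real skeleton graph
        self.gcn = nn.Sequential(STGCNBlock(in_ch, 64, A), STGCNBlock(64, 128, A))
        self.tcn = nn.Sequential(                      # 1DCNN branch over the time axis
            nn.Conv1d(128, 256, kernel_size=5, padding=2),
            nn.BatchNorm1d(256),
            nn.ReLU(),
            nn.Dropout(0.5),                           # BN + Dropout for generalization, as in the paper
        )
        self.head = nn.Linear(256, num_classes)

    def forward(self, x):                              # x: (N, 3, T, V) skeleton sequence
        x = self.gcn(x)                                # (N, 128, T, V)
        x = x.mean(dim=3)                              # pool over joints -> (N, 128, T)
        x = self.tcn(x)                                # temporal features
        return self.head(x.mean(dim=2))                # pool over time, then classify

logits = HybridActionNet(num_classes=6)(torch.randn(2, 3, 64, 25))
print(logits.shape)  # torch.Size([2, 6])

The sketch fuses the two branches by average-pooling the joint dimension before the 1D convolutions; the paper may fuse features differently, and the real skeleton adjacency matrix should replace the identity placeholder.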
Keywords: human–robot collaboration; assembly action recognition; ST-GCN; digital twins; intelligent manufacturing
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2023
Downloads:
https://www.mdpi.com/2071-1050/16/1/139/pdf (application/pdf)
https://www.mdpi.com/2071-1050/16/1/139/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:16:y:2023:i:1:p:139-:d:1305761
Sustainability is currently edited by Ms. Alexandra Wu