Wi-FiAG: Fine-Grained Abnormal Gait Recognition via CNN-BiGRU with Attention Mechanism from Wi-Fi CSI

Anming Dong, Jiahao Zhang, Wendong Xu, Jia Jia, Shanshan Yun and Jiguo Yu
Additional contact information
Anming Dong: Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
Jiahao Zhang: School of Mathematics and Statistics, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
Wendong Xu: Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
Jia Jia: Shandong Zhengyun Information Technology Co., Ltd., Jinan 250000, China
Shanshan Yun: Shandong Zhengyun Information Technology Co., Ltd., Jinan 250000, China
Jiguo Yu: School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

Mathematics, 2025, vol. 13, issue 8, 1-17

Abstract: Abnormal gait recognition, which aims to detect and identify deviations from normal walking patterns indicative of various health conditions or impairments, holds promising applications in healthcare and many other related fields. Currently, Wi-Fi-based abnormal gait recognition methods in the literature mainly distinguish between normal and abnormal gaits, which is a coarse-grained classification. In this work, we explore fine-grained gait recognition methods for distinguishing multiple classes of abnormal gaits. Specifically, we propose a deep learning-based framework for multi-class abnormal gait recognition, comprising three key modules: data collection, data preprocessing, and gait classification. For the gait classification module, we design a hybrid deep learning architecture that integrates convolutional neural networks (CNNs), bidirectional gated recurrent units (BiGRUs), and an attention mechanism to enhance performance. Compared to traditional CNNs, which rely solely on spatial features, or recurrent neural networks such as long short-term memory (LSTM) and gated recurrent units (GRUs), which primarily capture temporal dependencies, the proposed CNN-BiGRU network extracts spatial and temporal features concurrently. This dual-feature extraction capability makes the proposed CNN-BiGRU architecture a promising approach for improving classification accuracy in scenarios involving multiple gaits with subtle differences in their characteristics. Moreover, the attention mechanism selectively focuses on critical spatiotemporal features for fine-grained abnormal gait detection, enhancing the model's sensitivity to subtle anomalies. We construct an abnormal gait dataset comprising seven distinct gait classes to train and evaluate the proposed network. Experimental results demonstrate that the proposed method achieves an average recognition accuracy of 95%, surpassing classical baseline models by at least 2%.
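
Note: the abstract's CNN-BiGRU-with-attention description can be illustrated with a minimal PyTorch sketch. The layer sizes, the 90-subcarrier input shape, and the additive attention pooling below are assumptions for illustration, not the authors' exact configuration from the paper.

# Minimal sketch of a CNN-BiGRU classifier with attention pooling for CSI
# sequences, assuming inputs of shape (batch, subcarriers, time) and 7 gait classes.
import torch
import torch.nn as nn

class CNNBiGRUAttention(nn.Module):
    def __init__(self, n_subcarriers=90, n_classes=7, hidden=128):
        super().__init__()
        # 1-D convolutions over the time axis extract per-frame spatial features
        self.cnn = nn.Sequential(
            nn.Conv1d(n_subcarriers, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(128, 128, kernel_size=3, padding=1),
            nn.BatchNorm1d(128),
            nn.ReLU(),
        )
        # Bidirectional GRU captures temporal dependencies in both directions
        self.bigru = nn.GRU(128, hidden, batch_first=True, bidirectional=True)
        # Additive attention pools the GRU outputs into one weighted context vector
        self.att_score = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, subcarriers, time)
        feats = self.cnn(x)                     # (batch, 128, time')
        feats = feats.transpose(1, 2)           # (batch, time', 128)
        out, _ = self.bigru(feats)              # (batch, time', 2*hidden)
        weights = torch.softmax(self.att_score(out), dim=1)  # (batch, time', 1)
        context = (weights * out).sum(dim=1)    # (batch, 2*hidden)
        return self.classifier(context)         # (batch, n_classes)

# Usage with a dummy CSI batch: 8 samples, 90 subcarriers, 200 time steps
if __name__ == "__main__":
    model = CNNBiGRUAttention()
    logits = model(torch.randn(8, 90, 200))
    print(logits.shape)  # torch.Size([8, 7])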

Keywords: deep learning; gait recognition; CNN; BiGRU; time and frequency features (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/8/1227/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/8/1227/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:8:p:1227-:d:1630758

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jmathe:v:13:y:2025:i:8:p:1227-:d:1630758