KDTMD: Knowledge distillation for transportation mode detection based on KAN
Rui Li,
Xueyi Song and
Yongliang Xie
PLOS ONE, 2025, vol. 20, issue 6, 1-24
Abstract:
With progress in sensor technology and the spread of mobile devices, transportation mode detection (TMD) is gaining importance for health monitoring and urban traffic improvement. As mobile devices become more lightweight, they require efficient, low-power models that handle limited resources effectively. Despite extensive research on TMD, challenges remain in capturing non-stationary temporal dynamics and achieving strong nonlinear fitting. Additionally, many existing models have high space complexity, making lightweight deployment on devices with limited computing and memory resources difficult. To address these issues, we propose a novel deep TMD model based on the discrete wavelet transform (DWT) and knowledge distillation (KD), called KDTMD. The model consists of two main modules: DWT and KD. For the DWT module, since non-stationary time variations and event distribution shifts complicate sensor time-series analysis, we use DWT to disentangle the sensor time series into two parts: a low-frequency part that indicates the trend and a high-frequency part that captures events. The separated trend data are less influenced by event distribution shifts, effectively mitigating the impact of non-stationary time variations. The KD module comprises a teacher model and a student model. For the teacher model, to address nonlinearity and interpretability, we introduce T-KAN, composed of multiple linear KAN layers that employ learnable B-spline functions to achieve richer feature representations with fewer parameters. For the student model, we develop S-CNN, which is trained efficiently by T-KAN through KD. KDTMD achieves 97.27% accuracy and a 97.29% F1-score on the SHL dataset, and 96.56% accuracy and a 96.72% F1-score on the HTC dataset. Moreover, KDTMD has only about 10% of the parameters of the smallest baseline.
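The trend/event separation described in the abstract can be illustrated with a one-level DWT. The abstract does not name the wavelet KDTMD uses, so the Haar wavelet is assumed here purely for illustration; the example series and variable names are hypothetical, not from the paper.

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients.

    The approximation (pairwise scaled sums) captures the low-frequency
    trend; the detail (pairwise scaled differences) captures high-frequency
    events, as in the DWT module's trend/event disentanglement.
    """
    assert len(signal) % 2 == 0, "even-length series required"
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# A slowly rising trend with one sharp "event" spike at index 4:
series = [1.0, 1.1, 1.2, 1.3, 5.0, 1.5, 1.6, 1.7]
trend, events = haar_dwt(series)
```

The spike shows up almost entirely in the detail coefficients, while the approximation coefficients follow the smooth trend, which is the property the DWT module exploits to reduce sensitivity to event distribution shifts.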
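The abstract states that S-CNN is trained by T-KAN through KD but does not give the loss. A common choice, sketched here as an assumption, is the temperature-scaled KL divergence between the teacher's and student's softened class distributions; the logit values below are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions.

    The T^2 factor is the standard rescaling that keeps this term's
    gradient magnitude comparable to a hard-label cross-entropy term.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return (temperature ** 2) * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )

teacher_out = [2.0, 0.5, -1.0]   # hypothetical T-KAN logits for 3 modes
student_out = [1.8, 0.7, -0.9]   # hypothetical S-CNN logits
loss = distill_loss(teacher_out, student_out)
```

The loss is zero when the student reproduces the teacher's distribution exactly and positive otherwise, so minimizing it transfers the teacher's soft decision boundaries to the smaller student.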
Date: 2025
Downloads:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0324752 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 24752&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0324752
DOI: 10.1371/journal.pone.0324752
More articles in PLOS ONE from Public Library of Science