Teacher-Assistant Knowledge Distillation Based Indoor Positioning System
Aqilah Binti Mazlan,
Yin Hoe Ng and
Chee Keong Tan
Additional contact information
Aqilah Binti Mazlan: Faculty of Engineering, Multimedia University, Cyberjaya 63100, Malaysia
Yin Hoe Ng: Faculty of Engineering, Multimedia University, Cyberjaya 63100, Malaysia
Chee Keong Tan: School of Information Technology, Monash University Malaysia, Subang Jaya 47500, Malaysia
Sustainability, 2022, vol. 14, issue 21, 1-19
Abstract:
Indoor positioning systems (IPS) have been of great importance, especially for applications that require the precise location of objects and users. Convolutional neural network (CNN)-based indoor positioning systems have garnered much interest in recent years due to their ability to achieve high positioning accuracy and low positioning error, regardless of signal fluctuation. Nevertheless, a powerful CNN framework comes with a high computational cost; hence, deploying such a system on a computationally restricted device is difficult. Knowledge distillation is an excellent solution that allows a smaller network to imitate the performance of a larger network. However, when a far more complex CNN is used to train a small CNN, the student’s positioning performance degrades, because the small CNN cannot fully capture the knowledge passed down to it. In this paper, we implement the teacher-assistant framework to allow a simple CNN indoor positioning system to closely imitate a superior indoor positioning scheme. The framework transfers knowledge from a large pre-trained network to a small network through an intermediate network. Based on our observations, the positioning error of a small network can be reduced by up to 38.79% with the teacher-assistant knowledge distillation framework, whereas a typical knowledge distillation framework reduces the error by only 30.18%.
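The abstract describes distilling knowledge from a large teacher network to a small student through an intermediate assistant. As a minimal sketch of the idea (not the authors' implementation), the following assumes the standard softened-softmax distillation loss of Hinton et al., with the assistant first distilled from the teacher and the student then distilled from the assistant; all logit values below are hypothetical placeholders.

```python
import numpy as np

def softened_probs(logits, T):
    """Softmax with temperature T: higher T yields softer targets."""
    z = logits / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T):
    """KL divergence between the softened teacher and student
    distributions, scaled by T**2 as is conventional in distillation."""
    p = softened_probs(teacher_logits, T)   # soft targets from the larger net
    q = softened_probs(student_logits, T)
    return (T ** 2) * np.sum(p * np.log(p / q))

# Teacher-assistant chain: the assistant distills from the teacher,
# then the student distills from the assistant, never directly from
# the teacher. Logits here are hypothetical location-class scores.
teacher_logits   = np.array([4.0, 1.0, 0.5])
assistant_logits = np.array([3.5, 1.2, 0.6])
student_logits   = np.array([2.8, 1.5, 0.9])

loss_ta = kd_loss(assistant_logits, teacher_logits, T=4.0)    # teacher -> assistant
loss_as = kd_loss(student_logits, assistant_logits, T=4.0)    # assistant -> student
```

In training, each distillation loss would be combined with the usual cross-entropy on the true location labels; the intermediate step exists because a capacity gap that is too wide prevents the small network from absorbing the teacher's knowledge directly.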
Keywords: indoor positioning; received signal strength indicator; convolutional neural networks; knowledge distillation; teacher assistant
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2022
Downloads: (external link)
https://www.mdpi.com/2071-1050/14/21/14652/pdf (application/pdf)
https://www.mdpi.com/2071-1050/14/21/14652/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:14:y:2022:i:21:p:14652-:d:965779