Novel Approach towards a Fully Deep Learning-Based IoT Receiver Architecture: From Estimation to Decoding
Matthew Boeding,
Michael Hempel and
Hamid Sharif
Additional contact information
Matthew Boeding: Department of Electrical and Computer Engineering, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
Michael Hempel: Department of Electrical and Computer Engineering, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
Hamid Sharif: Department of Electrical and Computer Engineering, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
Future Internet, 2024, vol. 16, issue 5, 1-13
Abstract:
As the Internet of Things (IoT) continues to expand, wireless communication is increasingly widespread across diverse industries and remote devices, including domains such as Operational Technology in the Smart Grid. Notably, the number of resource-constrained devices relying on wireless communication is surging, especially with the advances of 5G/6G technology. Nevertheless, wireless transmission demands substantial power and computational resources, presenting a significant challenge to these devices and their operations. In this work, we propose the use of deep learning to improve the Bit Error Rate (BER) performance of Orthogonal Frequency Division Multiplexing (OFDM) wireless receivers. By improving receiver BER performance, devices can transmit with less power, thereby extending the battery life of IoT devices. The architecture presented in this paper utilizes a depthwise Convolutional Neural Network (CNN) for channel estimation and demodulation, while a Graph Neural Network (GNN) is utilized for Low-Density Parity Check (LDPC) decoding, tested against a proposed (1998, 1512) LDPC code. Our results show higher performance than traditional receivers in both isolated tests of the CNN and GNN and in a combined end-to-end test, with lower computational complexity than other proposed deep learning models. For QPSK models, our proposed approach eliminated bit errors at an SNR 1 dB lower than traditional receivers. Additionally, it improved 16-QAM Rician BER by five decades, 16-QAM LOS model BER by four decades, 64-QAM Rician BER by 2.5 decades, and 64-QAM LOS model BER by three decades.
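The abstract's claim of lower computational complexity rests on the choice of a depthwise (separable) CNN, which factors a standard convolution into a per-channel spatial filter followed by a 1x1 pointwise mix. As a minimal sketch of why this reduces cost, the parameter counts of the two layer types can be compared directly; the kernel size and channel counts below are illustrative assumptions, not the paper's actual architecture:

```python
# Hypothetical illustration of the complexity advantage of depthwise-separable
# convolutions over standard convolutions. Layer sizes (3x3 kernel, 64 in/out
# channels) are assumed for illustration and are NOT taken from the paper.

def standard_conv_params(k: int, c_in: int, c_out: int) -> int:
    # Standard conv: each of the c_out filters spans all c_in channels.
    return k * k * c_in * c_out

def depthwise_separable_params(k: int, c_in: int, c_out: int) -> int:
    # Depthwise step: one k x k filter per input channel.
    # Pointwise step: a 1x1 convolution that mixes channels.
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 64
std = standard_conv_params(k, c_in, c_out)
dws = depthwise_separable_params(k, c_in, c_out)
print(std, dws, round(std / dws, 2))  # 36864 4672 7.89
```

For these assumed sizes the depthwise-separable layer uses roughly an eighth of the parameters (and, correspondingly, multiply-accumulate operations) of a standard convolution, which is the kind of saving that matters for the resource-constrained IoT receivers the paper targets.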
Keywords: IoT; 5G; operational technology; OFDM; receiver; deep learning; machine learning
JEL-codes: O3
Date: 2024
Downloads:
https://www.mdpi.com/1999-5903/16/5/155/pdf (application/pdf)
https://www.mdpi.com/1999-5903/16/5/155/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jftint:v:16:y:2024:i:5:p:155-:d:1386129
Future Internet is currently edited by Ms. Grace You