Machine learning for DCO-OFDM based LiFi
Krishna Saha Purnita and
M Rubaiyat Hossain Mondal
PLOS ONE, 2021, vol. 16, issue 11, 1-14
Abstract:
Light fidelity (LiFi) uses different forms of orthogonal frequency division multiplexing (OFDM), including DC biased optical OFDM (DCO-OFDM). In DCO-OFDM, a large DC bias causes optical power inefficiency, while a small bias leads to higher clipping noise. Hence, finding an appropriate DC bias level for DCO-OFDM is important. This paper applies machine learning (ML) algorithms to find the optimum DC bias value for DCO-OFDM based LiFi systems. For this, a dataset is generated for DCO-OFDM using the MATLAB tool. Next, ML algorithms are applied using the Python programming language. ML is used to find the important attributes of DCO-OFDM that influence the optimum DC bias. It is shown here that the optimum DC bias is a function of several factors, including the minimum, the standard deviation, and the maximum value of the bipolar OFDM signal, as well as the constellation size. Next, linear and polynomial regression algorithms are successfully applied to predict the optimum DC bias value. Results show that polynomial regression of order 2 can predict the optimum DC bias value with a coefficient of determination of 96.77%, which confirms the effectiveness of the prediction.
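The prediction pipeline described in the abstract can be sketched in a few lines of Python. The sketch below generates toy bipolar OFDM frames via a Hermitian-symmetric IFFT, extracts the four features the paper identifies (minimum, standard deviation, maximum, constellation size), and fits a degree-2 polynomial regression with scikit-learn. The label `y` is a hypothetical stand-in: it assumes the optimum bias tracks the negative peak plus a small margin, since the paper's actual BER-optimised labels are not reproduced here.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

def bipolar_ofdm_frame(n_sub, M, rng):
    """One real (bipolar) OFDM time-domain frame via Hermitian-symmetric IFFT."""
    side = int(np.sqrt(M))
    levels = 2 * np.arange(side) - (side - 1)
    const = (levels[:, None] + 1j * levels[None, :]).ravel()
    const = const / np.sqrt((np.abs(const) ** 2).mean())  # unit average power
    data = rng.choice(const, n_sub // 2 - 1)
    # Hermitian symmetry (DC and Nyquist bins zero) makes the IFFT output real.
    spec = np.zeros(n_sub, dtype=complex)
    spec[1:n_sub // 2] = data
    spec[n_sub // 2 + 1:] = np.conj(data[::-1])
    return np.fft.ifft(spec).real * np.sqrt(n_sub)

# Toy dataset: features are [min, std, max, M] of each frame, as in the paper.
frames, Ms = [], []
for _ in range(600):
    M = int(rng.choice([4, 16, 64, 256]))
    frames.append(bipolar_ofdm_frame(256, M, rng))
    Ms.append(M)
X = np.array([[f.min(), f.std(), f.max(), M] for f, M in zip(frames, Ms)])

# Hypothetical label (illustrative only, NOT the paper's ground truth):
# bias just large enough to lift the negative peak, plus a small margin.
y = -X[:, 0] + 0.3 * X[:, 1] + 0.01 * np.log2(X[:, 3]) \
    + 0.02 * rng.normal(size=len(X))

# Degree-2 polynomial regression, as reported to perform best in the paper.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X[:500], y[:500])
r2 = r2_score(y[500:], model.predict(X[500:]))
print(f"held-out R^2: {r2:.3f}")
```

On this synthetic data the held-out coefficient of determination is high because the assumed label is a smooth function of the features; the paper's 96.77% figure refers to its own MATLAB-generated dataset, not to this sketch.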
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0259955
DOI: 10.1371/journal.pone.0259955