Pushing the limits of remote RF sensing by reading lips under the face mask
Hira Hameed,
Muhammad Usman,
Ahsen Tahir,
Amir Hussain,
Hasan Abbas,
Tie Jun Cui,
Muhammad Ali Imran and
Qammer H. Abbasi
Additional contact information
Hira Hameed: University of Glasgow, James Watt School of Engineering
Muhammad Usman: University of Glasgow, James Watt School of Engineering
Ahsen Tahir: University of Glasgow, James Watt School of Engineering
Amir Hussain: Edinburgh Napier University
Hasan Abbas: University of Glasgow, James Watt School of Engineering
Tie Jun Cui: Southeast University
Muhammad Ali Imran: University of Glasgow, James Watt School of Engineering
Qammer H. Abbasi: University of Glasgow, James Watt School of Engineering
Nature Communications, 2022, vol. 13, issue 1, 1-9
Abstract:
Lip-reading has become an important research challenge in recent years; the goal is to recognise speech from lip movements. Most lip-reading technologies developed so far are camera-based and require video recording of the target. However, these technologies have well-known limitations regarding occlusion and ambient lighting, along with serious privacy concerns. Furthermore, vision-based technologies are not useful for multi-modal hearing aids in the coronavirus (COVID-19) environment, where face masks have become the norm. This paper aims to overcome the fundamental limitations of camera-based systems by proposing a radio frequency (RF) based lip-reading framework that can read lips under face masks. The framework employs Wi-Fi and radar technologies as enablers of RF sensing based lip-reading. A dataset comprising the vowels A, E, I, O, U and empty (static/closed lips) is collected using both technologies, with the subject wearing a face mask. The collected data are used to train machine learning (ML) and deep learning (DL) models. A high classification accuracy of 95% is achieved on the Wi-Fi data using neural network (NN) models, and a similar accuracy is achieved by the VGG16 deep learning model on the collected radar-based dataset.
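The abstract reports that a VGG16 deep learning model classifies the radar-based dataset into six classes (A, E, I, O, U, empty) with roughly 95% accuracy. As a rough illustration of how such a VGG16-based vowel classifier could be set up, the sketch below fine-tunes an ImageNet-pretrained VGG16 on radar spectrogram images, one sub-directory per class. This is not the authors' code; the directory layout, input size, preprocessing, and training hyperparameters are assumptions for illustration only.

    # Minimal sketch (not the authors' implementation): classifying radar
    # spectrogram images into six classes (A, E, I, O, U, empty) with a
    # pretrained VGG16 backbone. Paths and hyperparameters are hypothetical.
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import VGG16

    NUM_CLASSES = 6          # A, E, I, O, U, empty (static/closed lips)
    IMG_SIZE = (224, 224)    # VGG16's default input resolution

    # Hypothetical dataset folders: one sub-directory per class, each
    # containing radar spectrograms exported as images.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "radar_spectrograms/train", image_size=IMG_SIZE, batch_size=32)
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "radar_spectrograms/val", image_size=IMG_SIZE, batch_size=32)

    # ImageNet-pretrained convolutional base, frozen for feature extraction.
    base = VGG16(weights="imagenet", include_top=False,
                 input_shape=IMG_SIZE + (3,))
    base.trainable = False

    model = models.Sequential([
        layers.Rescaling(1.0 / 255),      # simple [0, 1] scaling for this sketch
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=20)

Accuracy on a held-out test split of the spectrograms would then be read off with model.evaluate; the paper's reported figures come from its own experimental setup, not from this sketch.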
Date: 2022
Downloads: https://www.nature.com/articles/s41467-022-32231-1 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-32231-1
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-022-32231-1
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.