CAttSleepNet: Automatic End-to-End Sleep Staging Using Attention-Based Deep Neural Networks on Single-Channel EEG
Tingting Li,
Bofeng Zhang,
Hehe Lv,
Shengxiang Hu,
Zhikang Xu and
Yierxiati Tuergong
Additional contact information
Tingting Li: School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China
Bofeng Zhang: School of Computer and Communication Engineering, Shanghai Polytechnic University, Shanghai 201209, China
Hehe Lv: School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China
Shengxiang Hu: School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China
Zhikang Xu: School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China
Yierxiati Tuergong: School of Computer Science and Technology, Kashi University, Kashi 844008, China
IJERPH, 2022, vol. 19, issue 9, 1-15
Abstract:
Accurate sleep staging results can be used to measure sleep quality, providing a reliable basis for the prevention and diagnosis of sleep-related diseases. The key to sleep staging is the feature representation of EEG signals. Existing approaches rarely consider local features in feature extraction and fail to distinguish the importance of critical and non-critical local features. We propose an innovative model for automatic sleep staging with single-channel EEG, named CAttSleepNet. We add an attention module to the convolutional neural network (CNN) that learns the weights of local sequences of EEG signals by exploiting intra-epoch contextual information. A two-layer bidirectional Long Short-Term Memory (Bi-LSTM) network is then used to encode the global correlations between successive epochs. The feature representations of EEG signals are therefore enhanced by both local and global contextual correlations. Experimental results on two real-world sleep datasets indicate that CAttSleepNet outperforms existing models. Moreover, ablation experiments demonstrate the effectiveness of the proposed attention module.
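The core idea of the attention module described above is to score each local feature vector produced by the CNN within an epoch and pool them by their learned weights. The following is a minimal NumPy sketch of that kind of additive attention pooling; the shapes, parameter names (`W`, `b`, `u`), and random initialization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, b, u):
    """Score each local feature vector (one per CNN time step) with a
    small additive-attention network, then return the weighted sum as
    a single epoch-level representation, plus the attention weights."""
    scores = np.tanh(H @ W + b) @ u   # (T,) one score per local feature
    alpha = softmax(scores)           # (T,) weights summing to 1
    return alpha @ H, alpha           # (d,) pooled vector, (T,) weights

# Toy example: T local feature vectors of dimension d for one EEG epoch.
rng = np.random.default_rng(0)
T, d, a = 6, 8, 4                     # time steps, feature dim, attention dim
H = rng.normal(size=(T, d))           # hypothetical CNN feature sequence
W = rng.normal(size=(d, a))
b = np.zeros(a)
u = rng.normal(size=a)
z, alpha = attention_pool(H, W, b, u)
print(z.shape, round(float(alpha.sum()), 6))  # (8,) 1.0
```

In a full model along these lines, `z` for each epoch would then be fed to the Bi-LSTM so that inter-epoch context can be encoded on top of the attention-weighted intra-epoch features.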
Keywords: sleep staging; convolutional neural network; attention mechanism; bidirectional long short-term memory; EEG
JEL-codes: I I1 I3 Q Q5
Date: 2022
Downloads:
https://www.mdpi.com/1660-4601/19/9/5199/pdf (application/pdf)
https://www.mdpi.com/1660-4601/19/9/5199/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jijerp:v:19:y:2022:i:9:p:5199-:d:801548
IJERPH is currently edited by Ms. Jenna Liu