A Comprehensive Interaction in Multiscale Multichannel EEG Signals for Emotion Recognition

Yiquan Guo, Bowen Zhang, Xiaomao Fan, Xiaole Shen and Xiaojiang Peng
Additional contact information
Yiquan Guo: College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China
Bowen Zhang: College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China
Xiaomao Fan: College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China
Xiaole Shen: College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China
Xiaojiang Peng: College of Big Data and Internet, Shenzhen Technology University, Shenzhen 518118, China

Mathematics, 2024, vol. 12, issue 8, 1-17

Abstract: Electroencephalogram (EEG) is the most preferred and credible source for emotion recognition, where long- and short-range features and multichannel relationships are crucial for performance, because numerous physiological components function at various time scales and across different channels. We propose a cascaded scale-aware adaptive graph convolutional network and cross-EEG transformer (SAG-CET) to explore the comprehensive interaction between multiscale and multichannel EEG signals, with two novel ideas. First, to model the relationships among multichannel EEG signals and enhance signal representation, the multiscale EEG signals are fed into a scale-aware adaptive graph convolutional network (SAG) before the CET model. Second, the cross-EEG transformer (CET) explicitly captures multiscale features as well as their correlations. The CET consists of two self-attention encoders that gather features from long and short time series, and a cross-attention module that integrates the multiscale class tokens. Our experiments show that the CET significantly outperforms a vanilla unitary transformer, and that the SAG module brings visible gains. Our method also outperforms state-of-the-art methods in subject-dependent tasks, with 98.89%/98.92% accuracy for valence/arousal on DEAP and 99.08%/99.21% on DREAMER.
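The abstract describes a cross-attention module that integrates class tokens produced by long- and short-scale self-attention encoders. As a rough illustration of that idea only (not the authors' implementation; the CrossScaleAttention class, dimensions, and token shapes below are assumptions), a minimal PyTorch sketch of one scale's class token attending to the other scale's token sequence might look like this:

# Hypothetical sketch, not the paper's code: a class token from one temporal
# scale queries the encoded token sequence of the other scale.
import torch
import torch.nn as nn

class CrossScaleAttention(nn.Module):
    """Class token of branch A cross-attends to the token sequence of branch B."""
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_q = nn.LayerNorm(dim)
        self.norm_kv = nn.LayerNorm(dim)

    def forward(self, cls_a: torch.Tensor, tokens_b: torch.Tensor) -> torch.Tensor:
        # cls_a: (B, 1, D) class token from one scale
        # tokens_b: (B, N, D) encoder output from the other scale
        q = self.norm_q(cls_a)
        kv = self.norm_kv(tokens_b)
        fused, _ = self.attn(q, kv, kv)   # (B, 1, D)
        return cls_a + fused              # residual update of the class token

# Toy usage: exchange information between long- and short-scale class tokens,
# then classify (e.g., binary valence or arousal).
if __name__ == "__main__":
    B, N_long, N_short, D = 8, 64, 16, 128
    long_tokens = torch.randn(B, N_long, D)    # long-scale encoder output (assumed shape)
    short_tokens = torch.randn(B, N_short, D)  # short-scale encoder output (assumed shape)
    cls_long, cls_short = long_tokens[:, :1], short_tokens[:, :1]

    cls_long = CrossScaleAttention(D)(cls_long, short_tokens)   # long token gathers short-scale context
    cls_short = CrossScaleAttention(D)(cls_short, long_tokens)  # and vice versa

    head = nn.Linear(2 * D, 2)
    logits = head(torch.cat([cls_long.squeeze(1), cls_short.squeeze(1)], dim=-1))
    print(logits.shape)  # torch.Size([8, 2])

In this pattern only the compact class tokens cross scales, which keeps the cross-attention cheap relative to full token-to-token attention between the two branches; whether the paper uses exactly this formulation is not stated in the abstract.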

Keywords: EEG classification; emotion recognition; multiscale feature; cross attention (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2024

Downloads: (external link)
https://www.mdpi.com/2227-7390/12/8/1180/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/8/1180/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:8:p:1180-:d:1375831


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Page updated 2025-03-19
Handle: RePEc:gam:jmathe:v:12:y:2024:i:8:p:1180-:d:1375831