EconPapers
EEG Emotion Recognition Applied to the Effect Analysis of Music on Emotion Changes in Psychological Healthcare

Tie Hua Zhou, Wenlong Liang, Hangyu Liu, Ling Wang (), Keun Ho Ryu and Kwang Woo Nam
Additional contact information
Tie Hua Zhou: Department of Computer Science and Technology, School of Computer Science, Northeast Electric Power University, Jilin 132000, China
Wenlong Liang: Department of Computer Science and Technology, School of Computer Science, Northeast Electric Power University, Jilin 132000, China
Hangyu Liu: Department of Computer Science and Technology, School of Computer Science, Northeast Electric Power University, Jilin 132000, China
Ling Wang: Department of Computer Science and Technology, School of Computer Science, Northeast Electric Power University, Jilin 132000, China
Keun Ho Ryu: Data Science Laboratory, Faculty of Information Technology, Ton Duc Thang University, Ho Chi Minh City 700000, Vietnam
Kwang Woo Nam: Department of Computer and Information Engineering, Kunsan National University, Gunsan 54150, Republic of Korea

IJERPH, 2022, vol. 20, issue 1, 1-20

Abstract: Music therapy is increasingly used to promote physical health. Emotion recognition based on electroencephalogram (EEG) signals is more objective and reflects the real emotional state directly. We therefore propose a music therapy method that performs emotion semantic matching between EEG signals and music audio signals, which improves the reliability of emotional judgments and, further, mines the potential correlations between music and emotion. Our proposed EEG-based Emotion Recognition (EER) model identifies 20 emotion types from 32 EEG channels, with average recognition accuracies above 90% and 80%, respectively. Our proposed Music-based Emotion Classification (MEC) model classifies eight typical emotion types of music from nine music feature combinations, with an average classification accuracy above 90%. In addition, the semantic mapping between the two models was analyzed to assess how different music types influence emotional change from different perspectives: joy-type music videos shifted fear, disgust, mania, and trust emotions toward surprise or intimacy, while sad-type music videos shifted intimacy toward fear.
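The abstract's EER model works from 32-channel EEG signals. As a rough illustration of what an EEG feature front end for such a model can look like, the sketch below computes per-channel spectral band power with NumPy. The sampling rate, band edges, and synthetic input are illustrative assumptions only; the authors' actual features and classifier are not described here.

```python
import numpy as np

# Hypothetical sketch of EEG band-power feature extraction.
# The 32-channel count follows the abstract; FS and BANDS are assumptions.
FS = 128  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=FS):
    """Return a (n_channels, n_bands) matrix of spectral power per band.

    eeg: array of shape (n_channels, n_samples).
    """
    n = eeg.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2 / n  # crude periodogram
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].sum(axis=1))      # total power in the band
    return np.stack(feats, axis=1)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 4 * FS))  # 32 channels, 4 s of synthetic noise
features = band_powers(eeg)
print(features.shape)  # (32, 4): one power value per channel per band
```

A feature matrix like this (one row per channel, one column per frequency band) could then be flattened and fed to any standard classifier; nothing here should be read as the paper's actual method.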

Keywords: EEG signals; emotion recognition; music therapy; semantic analysis (search for similar items in EconPapers)
JEL-codes: I I1 I3 Q Q5 (search for similar items in EconPapers)
Date: 2022
References: View complete reference list from CitEc

Downloads: (external link)
https://www.mdpi.com/1660-4601/20/1/378/pdf (application/pdf)
https://www.mdpi.com/1660-4601/20/1/378/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jijerp:v:20:y:2022:i:1:p:378-:d:1015678


IJERPH is currently edited by Ms. Jenna Liu

More articles in IJERPH from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

 
Page updated 2025-03-19
Handle: RePEc:gam:jijerp:v:20:y:2022:i:1:p:378-:d:1015678