Multimodal Emotion Cognition Method Based on Multi-Channel Graphic Interaction
Baisheng Zhong (Guangzhou College of Technology and Business, China)
International Journal of Cognitive Informatics and Natural Intelligence (IJCINI), 2024, vol. 18, issue 1, 1-17
Abstract:
The relationship between the emotional components of images and text is a central concern of multimodal emotion analysis. However, most current multimodal affective cognition models simply associate image and text features without thoroughly investigating their interactions, resulting in poor recognition performance. Therefore, a multimodal emotion cognition method based on multi-channel graphic interaction is proposed. Text context features are extracted, scene and image information is encoded, and useful features are obtained. Based on these results, a modal alignment module is applied to obtain information about affective regions and words, and a cross-modal gating module is then applied to combine the multimodal features. In addition, we tested extensively on three open datasets, achieving an accuracy of 0.8122 on the MSA-single dataset, 0.7307 on the MSA-MULTIPLE dataset, and 0.7159 on TumEmo. The results show that this method is effective for multimodal emotion detection.
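The cross-modal gating step described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact architecture; it assumes a common form of gated fusion in which a learned sigmoid gate decides, per feature dimension, how much the (already aligned) text and image representations each contribute. All names, dimensions, and parameter initializations below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8  # shared feature dimension after modal alignment (assumed)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(text_feat, image_feat, W, b):
    """Combine aligned text and image features with a learned gate.

    gate = sigmoid(W @ [text; image] + b) lies in (0, 1) per dimension,
    so each fused coordinate is a convex mix of the two modalities.
    """
    joint = np.concatenate([text_feat, image_feat])
    gate = sigmoid(W @ joint + b)
    return gate * text_feat + (1.0 - gate) * image_feat

# Toy aligned features and randomly initialized gate parameters.
t = rng.standard_normal(d)
v = rng.standard_normal(d)
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

fused = gated_fusion(t, v, W, b)
print(fused.shape)  # (8,)
```

In a trained model, `W` and `b` would be learned jointly with the encoders; the sketch only shows the fusion arithmetic, with each fused coordinate interpolating between the corresponding text and image features.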
Date: 2024
Downloads: http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/IJCINI.349969 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:igg:jcini0:v:18:y:2024:i:1:p:1-17
International Journal of Cognitive Informatics and Natural Intelligence (IJCINI) is currently edited by Kangshun Li
More articles in International Journal of Cognitive Informatics and Natural Intelligence (IJCINI) from IGI Global
Bibliographic data for series maintained by the Journal Editor.