EconPapers    
Economics at your fingertips  
 

EmoMAC: a bias-induced multimodal fusion model for emotional analysis with visualization analytics enabled through super affective computing in emails

C. Pabitha, K. Revathi, W. Gracy Theresa, Pornpimol Chawengsaksopark and Mithileysh Sathiyanarayanan
Additional contact information
C. Pabitha: SRM Valliammai Engineering College
K. Revathi: SRM Valliammai Engineering College
W. Gracy Theresa: Panimalar Engineering College
Pornpimol Chawengsaksopark: Shinawatra University
Mithileysh Sathiyanarayanan: Shinawatra University

Journal of Combinatorial Optimization, 2025, vol. 50, issue 3, No 3, 48 pages

Abstract: Email communication is used by around 87% of business communities for internal and external communication. Emotion detection in email, combined with visual analytics, is a frontier laden with potential, but it still faces challenges: measuring the effect of emotional cues, limited visualization capability, seamless integration into existing platforms, and implementing multimodal analysis. To overcome these challenges, this paper introduces a multimodal architecture, EmoMAC, for analysing emotions in emails. By incorporating the multimodal data of the MELD dataset, it enables a more comprehensive interpretation of emotional dynamics. For textual analysis, the proposed model uses a DelighT transformer with attention scaling and multi-head attention. For extracting dynamic visual features from videos, a Dynamic Spatio-Temporal Feature Pyramid Network with Pyramid Pooling (DSTFP) is used. Contextual features, such as the sender-receiver relationship and timestamps, are incorporated by the Bias-Induced Sparse Hierarchical Attention Module (BiSHAM), which uses a bias-aware attention mechanism for feature fusion. For adaptability to new tasks or data, EmoMAC employs the MAML algorithm. Using a Meaningful Neural Network (MNN), EmoMAC integrates text, image, and contextual data for emotion detection within emails. Rigorous evaluation (accuracy of 90.10%, precision of 95.23%, recall of 91.65%, F1 score of 92.45%, and wa-F1 score of 88.47%) validates EmoMAC's efficacy in capturing emotional nuances and provides insights for visual analytics of emotions within email.
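The bias-aware fusion step described in the abstract can be illustrated with a minimal sketch: standard scaled dot-product attention whose scores are shifted by an additive bias derived from contextual cues (e.g. sender-receiver relationship, timestamps). This is not the paper's BiSHAM implementation; all function names, shapes, and the form of the bias are hypothetical assumptions for illustration only.

```python
# Illustrative sketch of bias-induced attention fusion, in the spirit of the
# BiSHAM idea described above. Names, dimensions, and the additive-bias form
# are assumptions, not the paper's architecture.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bias_aware_attention(query, keys, values, context_bias):
    """Scaled dot-product attention with an additive context bias.

    query:        (d,)   query vector (e.g. from the text branch)
    keys, values: (n, d) per-modality feature vectors (e.g. text, visual)
    context_bias: (n,)   scores derived from contextual cues such as the
                         sender-receiver relationship or timestamps
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)   # standard attention scaling
    scores = scores + context_bias       # bias-induction step
    weights = softmax(scores)
    return weights @ values, weights

rng = np.random.default_rng(0)
d = 8
text_feat, visual_feat = rng.normal(size=(2, d))
keys = np.stack([text_feat, visual_feat])
# A positive bias on the visual branch shifts attention toward it.
fused, w = bias_aware_attention(text_feat, keys, keys, np.array([0.0, 2.0]))
```

A multimodal model would learn `context_bias` from metadata features rather than fix it by hand; the sketch only shows how such a bias enters the attention scores before softmax normalization.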

Keywords: EmoMAC; DelighT; DSTFP; BiSHAM; Emotion analysis (search for similar items in EconPapers)
Date: 2025
References: Add references at CitEc

Downloads: (external link)
http://link.springer.com/10.1007/s10878-025-01356-6 Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:spr:jcomop:v:50:y:2025:i:3:d:10.1007_s10878-025-01356-6

Ordering information: This journal article can be ordered from
https://www.springer.com/journal/10878

DOI: 10.1007/s10878-025-01356-6

Access Statistics for this article

Journal of Combinatorial Optimization is currently edited by My T. Thai

More articles in Journal of Combinatorial Optimization from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-09-28
Handle: RePEc:spr:jcomop:v:50:y:2025:i:3:d:10.1007_s10878-025-01356-6