Churn Prediction via Multimodal Fusion Learning: Integrating Customer Financial Literacy, Voice, and Behavioral Data
David Hason Rudd,
Huan Huo,
Md. Rafiqul Islam and
Guandong Xu
Additional contact information
David Hason Rudd: UTS - University of Technology Sydney
Huan Huo: UTS - University of Technology Sydney
Md. Rafiqul Islam: UTS - University of Technology Sydney
Guandong Xu: UTS - University of Technology Sydney
Post-Print from HAL
Abstract:
In today's competitive landscape, businesses grapple with customer retention. Churn prediction models, although beneficial, often lack accuracy because they rely on a single data source. The intricate nature of human behavior and high-dimensional customer data further complicate these efforts. To address these concerns, this paper proposes a multimodal fusion learning model for identifying customer churn risk levels in financial service providers. Our multimodal approach integrates customer sentiments, financial literacy (FL) level, and financial behavioral data, enabling more accurate and bias-free churn prediction models. The proposed FL model uses a SMOGN-COREG supervised model to gauge customer FL levels from their financial data. The baseline churn model applies an ensemble artificial neural network and oversampling techniques to predict churn propensity in high-dimensional financial data. We also incorporate a speech emotion recognition model that employs a pretrained CNN-VGG16 to recognize customer emotions from pitch, energy, and tone. To integrate these diverse features while retaining each modality's unique insights, we introduce late and hybrid fusion techniques that complementarily boost coordinated multimodal co-learning. The proposed multimodal fusion model was evaluated with robust metrics, including mean average precision and macro-averaged F1 score, to establish the approach's validity. Our novel approach demonstrates a marked improvement in churn prediction, achieving a test accuracy of 91.2%, a Mean Average Precision (MAP) score of 66, and a Macro-Averaged F1 score of 54 with the proposed hybrid fusion learning technique, compared with late fusion and baseline models. Furthermore, the analysis demonstrates a positive correlation between negative emotions, low FL scores, and high-risk customers.
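To make the fusion strategies described in the abstract concrete, the sketch below contrasts decision-level (late) fusion with a hybrid scheme that also concatenates intermediate representations from three unimodal branches (financial behavior, FL level, and a speech-emotion embedding). This is a minimal Keras/TensorFlow sketch under assumed input dimensions, layer sizes, and a binary churn target; it is not the authors' implementation, and all names and shapes are illustrative assumptions.

# Minimal sketch of late vs. hybrid fusion over three unimodal branches
# (behavioral features, financial-literacy score, speech-emotion embedding).
# Layer sizes, feature dimensions, and the binary churn target are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

def unimodal_branch(dim, name):
    """Small dense encoder producing a modality-specific representation."""
    inp = layers.Input(shape=(dim,), name=f"{name}_input")
    h = layers.Dense(64, activation="relu")(inp)
    h = layers.Dense(32, activation="relu")(h)
    return inp, h

# Hypothetical input dimensions for the three modalities.
beh_in, beh_feat = unimodal_branch(40, "behavioral")   # financial behavior
fl_in,  fl_feat  = unimodal_branch(1,  "fl_score")     # predicted FL level
ser_in, ser_feat = unimodal_branch(128, "emotion")     # speech-emotion embedding

# Late fusion: each branch makes its own churn prediction; predictions are averaged.
late_preds = [layers.Dense(1, activation="sigmoid")(f)
              for f in (beh_feat, fl_feat, ser_feat)]
late_out = layers.Average(name="late_fusion")(late_preds)
late_model = Model([beh_in, fl_in, ser_in], late_out)

# Hybrid fusion: concatenate intermediate representations (feature-level fusion)
# and combine the joint prediction with the per-branch decision-level signals.
joint = layers.Concatenate()([beh_feat, fl_feat, ser_feat])
joint = layers.Dense(64, activation="relu")(joint)
joint_pred = layers.Dense(1, activation="sigmoid")(joint)
hybrid_out = layers.Average(name="hybrid_fusion")(late_preds + [joint_pred])
hybrid_model = Model([beh_in, fl_in, ser_in], hybrid_out)

hybrid_model.compile(optimizer="adam", loss="binary_crossentropy",
                     metrics=["accuracy"])

# Toy run on random data to confirm the graph wires up.
X = [np.random.rand(8, 40), np.random.rand(8, 1), np.random.rand(8, 128)]
y = np.random.randint(0, 2, size=(8, 1))
hybrid_model.fit(X, y, epochs=1, verbose=0)

In this toy setup, the hybrid model keeps the per-modality decision signals of late fusion while adding a shared representation, which is the kind of complementary co-learning the abstract attributes to the hybrid fusion technique.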
Keywords: Churn prediction; multimodal learning; feature fusion; financial literacy; speech emotion recognition; customer behavior
Date: 2023-10-30
New Economics Papers: this item is included in nep-big and nep-fle
Note: View the original document on HAL open archive server: https://hal.science/hal-04320145
Published in The International Conference on Behavioural and Social Computing, Cyprus University, Oct 2023, Larnaca, Cyprus
Downloads: https://hal.science/hal-04320145/document (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:hal:journl:hal-04320145