Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus
Ke-Lin Du,
Rengong Zhang,
Bingchun Jiang,
Jie Zeng and
Jiabin Lu
Additional contact information
Ke-Lin Du: School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China
Rengong Zhang: Zhejiang Yugong Information Technology Co., Ltd., Changhe Road 475, Hangzhou 310002, China
Bingchun Jiang: School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China
Jie Zeng: Shenzhen Feng Xing Tai Bao Technology Co., Ltd., Shenzhen 518063, China
Jiabin Lu: Faculty of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China
Mathematics, 2025, vol. 13, issue 4, 1-49
Abstract:
Ensemble learning and data fusion techniques play a crucial role in modern machine learning, enhancing predictive performance, robustness, and generalization. This paper provides a comprehensive survey of ensemble methods, covering foundational techniques such as bagging, boosting, and random forests, as well as advanced topics including multiclass classification, multiview learning, multiple kernel learning, and the Dempster–Shafer theory of evidence. We present a comparative analysis of ensemble learning and deep learning, highlighting their respective strengths, limitations, and synergies. Additionally, we examine the theoretical foundations of ensemble methods, including bias–variance trade-offs, margin theory, and optimization-based frameworks, while analyzing computational trade-offs related to training complexity, inference efficiency, and storage requirements. To enhance accessibility, we provide a structured comparative summary of key ensemble techniques. Furthermore, we discuss emerging research directions, such as adaptive ensemble methods, hybrid deep learning approaches, and multimodal data fusion, as well as challenges related to interpretability, model selection, and handling noisy data in high-stakes applications. By integrating theoretical insights with practical considerations, this survey serves as a valuable resource for researchers and practitioners seeking to understand the evolving landscape of ensemble learning and its future prospects.
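The bagging technique surveyed in the paper can be sketched in a few lines: train each base learner on a bootstrap resample of the data, then reach consensus by majority vote. The sketch below is a toy illustration, not code from the paper; the one-dimensional threshold "stump" base learner and all function names are our own choices for compactness.

```python
import random
from collections import Counter

def fit_stump(sample):
    """Toy base learner: threshold at the midpoint of the two class means."""
    ones = [x for x, y in sample if y == 1]
    zeros = [x for x, y in sample if y == 0]
    mean1 = sum(ones) / len(ones) if ones else 0.0
    mean0 = sum(zeros) / len(zeros) if zeros else 0.0
    thr = (mean0 + mean1) / 2.0
    return lambda x: 1 if x >= thr else 0

def bagging_fit(data, n_models, seed=0):
    """Train n_models stumps, each on a bootstrap resample (with replacement)."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]  # bootstrap sample, same size as data
        models.append(fit_stump(sample))
    return models

def bagging_predict(models, x):
    """Effective consensus: majority vote over the ensemble's predictions."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]
```

For example, with `data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]`, an ensemble from `bagging_fit(data, n_models=25)` votes 1 for inputs near 0.9 and 0 for inputs near 0.1. Averaging over bootstrap resamples is what reduces the variance of the individual learners, the effect the abstract's bias–variance discussion refers to.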
Keywords: ensemble learning; bagging; boosting; random forests; deep learning integration; multimodal data fusion
JEL-codes: C
Date: 2025
Downloads:
https://www.mdpi.com/2227-7390/13/4/587/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/4/587/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:4:p:587-:d:1588218
Mathematics is currently edited by Ms. Emma He