Classification Improvement with Integration of Radial Basis Function and Multilayer Perceptron Network Architectures
László Kovács
Additional contact information
László Kovács: Institute of Informatics, University of Miskolc, H-3515 Miskolc, Hungary
Mathematics, 2025, vol. 13, issue 9, 1-25
Abstract:
The radial basis function architecture and the multilayer perceptron architecture are, in both theory and practice, very different approaches to neural networks. In terms of classification efficiency, each has distinct strengths; thus, the integration of these tools is an interesting but understudied problem domain. This paper presents a novel initialization method based on a distance-weighted homogeneity measure to construct a radial basis function network with fast convergence. The proposed radial basis function network is then used to develop an integrated RBF-MLP architecture. The proposed neural network model was tested on various classification tasks, and the results demonstrate the superiority of the proposed architecture: the RBF-MLP model achieved nearly 40 percent better accuracy in the tests than the baseline MLP or RBF neural network architectures.
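The paper's exact initialization and training procedures are not given in this record. As a hedged sketch only, the integrated architecture described in the abstract can be illustrated as a Gaussian RBF layer whose activations feed a small MLP head; the choice of RBF centers below (plain data points) is a stand-in assumption, not the paper's distance-weighted homogeneity method.

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    """Gaussian RBF activations: one feature per center."""
    # Squared Euclidean distances between each sample and each center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

class RBFMLP:
    """Hybrid classifier: an RBF feature layer feeding a one-hidden-layer MLP."""

    def __init__(self, centers, n_hidden, n_out, gamma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.centers, self.gamma = centers, gamma
        k = centers.shape[0]
        # Small random weights for the MLP head (illustrative initialization)
        self.W1 = rng.normal(0.0, 1.0 / np.sqrt(k), (k, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        phi = rbf_features(X, self.centers, self.gamma)   # RBF layer
        h = np.tanh(phi @ self.W1 + self.b1)              # MLP hidden layer
        logits = h @ self.W2 + self.b2
        # Softmax over classes
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

# Usage sketch: 5 samples, 2 features, 3 classes, centers taken from the data
X = np.random.default_rng(1).normal(size=(5, 2))
model = RBFMLP(centers=X[:2], n_hidden=4, n_out=3)
probs = model.forward(X)   # shape (5, 3), rows sum to 1
```

In the paper's design the RBF stage presumably supplies a well-initialized, localized representation on which the MLP head refines the decision boundary; training (e.g. by gradient descent on cross-entropy) is omitted here.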
Keywords: radial basis function; neural networks; parameter initialization; density-based entropy
JEL-codes: C
Date: 2025
Downloads:
https://www.mdpi.com/2227-7390/13/9/1471/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/9/1471/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:9:p:1471-:d:1646174
Mathematics is currently edited by Ms. Emma He
More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.