Classification Improvement with Integration of Radial Basis Function and Multilayer Perceptron Network Architectures

Bibliographic Details
Main Author: László Kovács
Format: Article
Language: English
Published: MDPI AG 2025-04-01
Series: Mathematics
Subjects:
Online Access: https://www.mdpi.com/2227-7390/13/9/1471
Description
Summary: The radial basis function (RBF) architecture and the multilayer perceptron (MLP) architecture are very different approaches to neural networks, both in theory and in practice. In terms of classification efficiency, each has different strengths; thus, the integration of these tools is an interesting but understudied problem domain. This paper presents a novel initialization method, based on a distance-weighted homogeneity measure, for constructing a radial basis function network with fast convergence. The proposed radial basis function network is then used to develop an integrated RBF-MLP architecture. The proposed neural network model was tested on various classification tasks, and the results show the superiority of the proposed architecture: the RBF-MLP model achieved nearly 40 percent better accuracy than the baseline MLP and RBF neural network architectures.
ISSN:2227-7390
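
The record above does not reproduce the architectural details of the paper. As a rough illustration of the general idea only, the PyTorch sketch below combines a Gaussian RBF feature layer with a small MLP classification head; the layer sizes, the randomly initialized centers, and the learned gamma parameters are placeholder assumptions, and the paper's distance-weighted homogeneity initialization is not implemented here.

```python
# Illustrative sketch only: a generic RBF feature layer feeding an MLP head.
# Centers are random learnable parameters; this does NOT reproduce the paper's
# distance-weighted homogeneity initialization.
import torch
import torch.nn as nn


class RBFLayer(nn.Module):
    """Gaussian RBF layer: phi_j(x) = exp(-gamma_j * ||x - c_j||^2)."""

    def __init__(self, in_features: int, n_centers: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(n_centers, in_features))
        self.log_gamma = nn.Parameter(torch.zeros(n_centers))  # gamma_j = exp(log_gamma_j)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squared Euclidean distance from each input to each center: (batch, n_centers).
        dist2 = torch.cdist(x, self.centers).pow(2)
        return torch.exp(-self.log_gamma.exp() * dist2)


class RBFMLP(nn.Module):
    """RBF feature layer followed by a small MLP classifier head."""

    def __init__(self, in_features: int, n_centers: int, hidden: int, n_classes: int):
        super().__init__()
        self.rbf = RBFLayer(in_features, n_centers)
        self.mlp = nn.Sequential(
            nn.Linear(n_centers, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(self.rbf(x))


if __name__ == "__main__":
    model = RBFMLP(in_features=4, n_centers=10, hidden=16, n_classes=3)
    logits = model(torch.randn(8, 4))   # batch of 8 samples
    print(logits.shape)                 # torch.Size([8, 3])
```

In this kind of hybrid, the RBF layer acts as a local, distance-based feature extractor, while the MLP head learns a global decision function over those features; the paper's contribution concerns how the RBF part is initialized and how the two parts are integrated, which the sketch above does not attempt to replicate.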