A Comparison of the Black Hole Algorithm Against Conventional Training Strategies for Neural Networks
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Mathematics |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2227-7390/13/15/2416 |
| Summary: | Artificial Intelligence continues to demand robust and adaptable training methods for neural networks, particularly in scenarios involving limited computational resources or noisy, complex data. This study presents a comparative analysis of four training algorithms: Backpropagation, the Genetic Algorithm, the Black-hole Algorithm, and Particle Swarm Optimization (PSO). The methods were evaluated across both classification and regression tasks. Each method was implemented from scratch in MATLAB ver. R2024a, avoiding reliance on pre-optimized libraries to isolate algorithmic behavior. Two types of datasets were used, namely a synthetic benchmark dataset and a real-world dataset preprocessed into classification and regression formats. All algorithms were tested in both basic and advanced forms using consistent network architectures and training constraints. Results indicate that while Backpropagation maintained strong performance in smooth regression settings, the Black-hole and PSO algorithms demonstrated more stable and faster initial progress in noisy or discrete classification tasks. These findings highlight the practical viability of the Black-hole Algorithm as a competitive, gradient-free alternative for neural network training, particularly in early-stage learning or hybrid optimization frameworks. |
| ISSN: | 2227-7390 |
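
For context on the Black-hole Algorithm named in the summary: this record does not reproduce the authors' MATLAB implementation, so the sketch below is only a minimal NumPy illustration of the standard Black-hole update (candidate "stars" drift toward the current best solution, and any star that crosses the event horizon is re-seeded at random), applied to a generic loss over a flattened weight vector. The function name `black_hole_optimize`, the population size `n_stars`, and the toy quadratic loss are illustrative assumptions, not details from the paper.

```python
import numpy as np

def black_hole_optimize(loss_fn, dim, n_stars=30, n_iters=200,
                        bounds=(-1.0, 1.0), seed=0):
    """Minimal sketch of the standard Black-hole Algorithm (not the paper's code).

    loss_fn : callable mapping a weight vector of shape (dim,) to a scalar loss.
    dim     : number of parameters, e.g. a flattened neural-network weight vector.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds

    # Population of "stars": each row is one candidate weight vector.
    stars = rng.uniform(lo, hi, size=(n_stars, dim))
    fitness = np.array([loss_fn(s) for s in stars])

    # The best star so far plays the role of the black hole.
    best = int(np.argmin(fitness))
    bh, bh_fit = stars[best].copy(), float(fitness[best])

    for _ in range(n_iters):
        # Every star drifts toward the black hole: x <- x + rand * (x_BH - x)
        rand = rng.random((n_stars, 1))
        stars += rand * (bh - stars)
        fitness = np.array([loss_fn(s) for s in stars])

        # A star that outperforms the black hole takes over that role.
        best = int(np.argmin(fitness))
        if fitness[best] < bh_fit:
            bh, bh_fit = stars[best].copy(), float(fitness[best])

        # Event-horizon radius: stars that cross it are "absorbed" and
        # replaced with fresh random candidates, which keeps exploration alive.
        radius = bh_fit / (fitness.sum() + 1e-12)
        absorbed = np.linalg.norm(stars - bh, axis=1) < radius
        absorbed[best] = False  # never absorb the current best star
        n_abs = int(absorbed.sum())
        if n_abs:
            stars[absorbed] = rng.uniform(lo, hi, size=(n_abs, dim))
            fitness[absorbed] = [loss_fn(s) for s in stars[absorbed]]

    return bh, bh_fit

# Toy usage: a quadratic stands in for a network's training loss over its weights.
if __name__ == "__main__":
    w_best, loss_best = black_hole_optimize(lambda w: float(np.sum(w ** 2)), dim=10)
    print(f"best loss: {loss_best:.6f}")
```

In a neural-network setting such as the one the abstract describes, `loss_fn` would decode the vector into layer weights, run a forward pass on the training data, and return the classification or regression error; the gradient-free update above is what makes the method usable when backpropagation is impractical.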