A recurrent sigma pi sigma neural network
Abstract In this paper, a novel recurrent sigma-pi-sigma neural network (RSPSNN) that combines the advantages of higher-order and recurrent neural networks is proposed. The batch gradient algorithm is used to train the RSPSNN, searching for the optimal weights that minimize the mean squared...
Main Authors: | Fei Deng, Shibin Liang, Kaiguo Qian, Jing Yu, Xuanxuan Li |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2025-01-01 |
Series: | Scientific Reports |
Online Access: | https://doi.org/10.1038/s41598-024-84299-y |
_version_ | 1841559604936310784 |
---|---|
author | Fei Deng; Shibin Liang; Kaiguo Qian; Jing Yu; Xuanxuan Li |
author_facet | Fei Deng; Shibin Liang; Kaiguo Qian; Jing Yu; Xuanxuan Li |
author_sort | Fei Deng |
collection | DOAJ |
description | Abstract In this paper, a novel recurrent sigma-pi-sigma neural network (RSPSNN) that combines the advantages of higher-order and recurrent neural networks is proposed. The batch gradient algorithm is used to train the RSPSNN, searching for the optimal weights that minimize the mean squared error (MSE). The stability and convergence of the training process are proven, establishing the unique equilibrium state of the RSPSNN; this is one of the most significant indices of effectiveness and overcomes the instability problem in training this network. Finally, to evaluate its validity more precisely, five empirical experiments are conducted. The RSPSNN is successfully applied to function approximation, prediction, the parity problem, classification, and image simulation, which verifies its effectiveness and practicability. |
format | Article |
id | doaj-art-7c2fbda0718e4a1db837f8b4329a96e8 |
institution | Kabale University |
issn | 2045-2322 |
language | English |
publishDate | 2025-01-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
spelling | doaj-art-7c2fbda0718e4a1db837f8b4329a96e82025-01-05T12:20:46ZengNature PortfolioScientific Reports2045-23222025-01-0115111410.1038/s41598-024-84299-yA recurrent sigma pi sigma neural networkFei Deng0Shibin Liang1Kaiguo Qian2Jing Yu3Xuanxuan Li4College of Information Engineering, Kunming UniversityYunnan Electric Power Test and Research Institute Group Co., LtdCollege of Information Engineering, Kunming UniversityCollege of Information Engineering, Kunming UniversityCollege of Information Engineering, Kunming UniversityAbstract In this paper, a novel recurrent sigma-pi-sigma neural network (RSPSNN) that contains the same advantages as the higher-order and recurrent neural networks is proposed. The batch gradient algorithm is used to train the RSPSNN to search for the optimal weights based on the minimal mean squared error (MSE). To substantiate the unique equilibrium state of the RSPSNN, the characteristic of stability convergence is proven, which is one of the most significant indices for reflecting the effectiveness and overcoming the instability problem in the training of this network. Finally, to establish a more precise evaluation of its validity, five empirical experiments are used. The RSPSNN is successfully applied to the function approximation problem, prediction problem, parity problem, classification problem, and image simulation, which verifies its effectiveness and practicability.https://doi.org/10.1038/s41598-024-84299-y |
spellingShingle | Fei Deng Shibin Liang Kaiguo Qian Jing Yu Xuanxuan Li A recurrent sigma pi sigma neural network Scientific Reports |
title | A recurrent sigma pi sigma neural network |
title_full | A recurrent sigma pi sigma neural network |
title_fullStr | A recurrent sigma pi sigma neural network |
title_full_unstemmed | A recurrent sigma pi sigma neural network |
title_short | A recurrent sigma pi sigma neural network |
title_sort | recurrent sigma pi sigma neural network |
url | https://doi.org/10.1038/s41598-024-84299-y |
work_keys_str_mv | AT feideng arecurrentsigmapisigmaneuralnetwork AT shibinliang arecurrentsigmapisigmaneuralnetwork AT kaiguoqian arecurrentsigmapisigmaneuralnetwork AT jingyu arecurrentsigmapisigmaneuralnetwork AT xuanxuanli arecurrentsigmapisigmaneuralnetwork AT feideng recurrentsigmapisigmaneuralnetwork AT shibinliang recurrentsigmapisigmaneuralnetwork AT kaiguoqian recurrentsigmapisigmaneuralnetwork AT jingyu recurrentsigmapisigmaneuralnetwork AT xuanxuanli recurrentsigmapisigmaneuralnetwork |
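The abstract above describes training a sigma-pi-sigma network by batch gradient descent on the mean squared error, with the parity problem among the test cases. As a rough illustration of that idea only, the sketch below implements a minimal sigma-pi-sigma unit (without the recurrent feedback loop of the paper's RSPSNN) on 2-bit parity. The layer sizes, logistic activations, learning rate, and weight names are illustrative assumptions, not the authors' exact architecture or hyperparameters.

```python
import numpy as np

# Hedged sketch, not the paper's exact model: two "sigma" (weighted-sum)
# layers feed a "pi" layer of elementwise products, which feeds an output
# sigma unit. Trained with batch gradient descent on the MSE.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-bit parity (XOR) data, one of the problems listed in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = 4                                      # hidden units per sigma group (assumed)
W1 = rng.normal(scale=0.5, size=(2, H))    # first sigma layer weights
W2 = rng.normal(scale=0.5, size=(2, H))    # second sigma layer weights
v = rng.normal(scale=0.5, size=H)          # output sigma weights
lr = 0.3                                   # illustrative learning rate

def forward(X):
    s1 = sigmoid(X @ W1)                   # sigma layer 1
    s2 = sigmoid(X @ W2)                   # sigma layer 2
    p = s1 * s2                            # pi layer: elementwise products
    return s1, s2, p, sigmoid(p @ v)       # output sigma unit

mse_init = np.mean((forward(X)[3] - y) ** 2)

for _ in range(10000):
    s1, s2, p, out = forward(X)
    err = out - y
    # Batch gradients of the MSE (constant factors absorbed into lr).
    d_out = err * out * (1.0 - out) / len(X)
    grad_v = p.T @ d_out
    d_p = np.outer(d_out, v)
    d_s1 = d_p * s2 * s1 * (1.0 - s1)
    d_s2 = d_p * s1 * s2 * (1.0 - s2)
    W1 -= lr * (X.T @ d_s1)
    W2 -= lr * (X.T @ d_s2)
    v -= lr * grad_v

mse_final = np.mean((forward(X)[3] - y) ** 2)
```

The pi layer supplies the multiplicative (higher-order) interactions that distinguish sigma-pi-sigma networks from a plain multilayer perceptron; the paper's recurrent variant additionally feeds past outputs back into the network, which is omitted here for brevity.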