Synchronization in Fractional-Order Delayed Non-Autonomous Neural Networks


Bibliographic Details
Main Authors: Dingping Wu, Changyou Wang, Tao Jiang
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/13/7/1048
Description
Summary: Neural networks, mimicking the structural and functional aspects of the human brain, have found widespread applications in diverse fields such as pattern recognition, control systems, and information processing. A critical phenomenon in these systems is synchronization, where multiple neurons or neural networks harmonize their dynamic behaviors to a common rhythm, contributing significantly to their efficient operation. However, the inherent complexity and nonlinearity of neural networks pose significant challenges in understanding and controlling this synchronization process. In this paper, we focus on the synchronization of a class of fractional-order, delayed, non-autonomous neural networks. Fractional-order dynamics, characterized by their ability to capture memory effects and non-local interactions, add further complexity to the synchronization problem. Time delays, which are ubiquitous in real-world systems, further complicate the analysis by introducing temporal asynchrony among the neurons. To address these challenges, we propose a straightforward yet powerful global synchronization framework. Our approach uses a novel state feedback control scheme to derive an analytical formula for the synchronization controller. This controller adjusts the states of the response network so that they converge to the trajectory of the drive network, achieving synchronization. To establish the asymptotic stability of the error system, which measures the deviation between the states of the two networks, we construct a Lyapunov function. This function provides a scalar measure of the system's energy, and by showing that this measure decreases over time, we demonstrate the stability of the synchronized state. Our analysis yields sufficient conditions that guarantee global synchronization of fractional-order neural networks with time delays and Caputo derivatives. These conditions provide a clear roadmap for designing neural networks with robust and stable synchronization properties. To validate the theoretical findings, we present numerical simulations that demonstrate the effectiveness of the proposed approach: under the derived conditions, the neural networks successfully synchronize, confirming the practical applicability of the framework.
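The drive–response scheme described in the abstract can be illustrated numerically. The following Python sketch is not the paper's model: the scalar network, its coefficients, the delay, the fractional order, and the feedback gain are all illustrative assumptions. It discretizes a fractional-order delayed system with an explicit Grünwald–Letnikov-type scheme (a common approximation related to the Caputo derivative) and applies a linear state feedback control u = -gain * (y - x) to drive the response state toward the drive state:

```python
import math

def gl_coeffs(alpha, n):
    """Grunwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j),
    via the recurrence c_j = c_{j-1} * (1 - (alpha + 1)/j)."""
    c = [1.0]
    for j in range(1, n + 1):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / j))
    return c

def simulate_sync(alpha=0.9, h=0.01, t_end=10.0, tau=0.5, gain=10.0):
    """Synchronize a drive/response pair of the illustrative scalar
    delayed network  D^alpha x = -x + 2 tanh(x) + 0.8 tanh(x(t - tau)) + sin(t)
    using state feedback u = -gain * (y - x).
    Returns (initial error, final error)."""
    steps = int(t_end / h)
    d = int(tau / h)                 # delay measured in grid points
    c = gl_coeffs(alpha, steps)
    h_a = h ** alpha

    def f(v, v_delay, t):
        return -v + 2.0 * math.tanh(v) + 0.8 * math.tanh(v_delay) + math.sin(t)

    # constant pre-histories on [-tau, 0]; the two systems start apart
    x = [0.5] * (d + 1)              # drive network
    y = [-0.8] * (d + 1)             # response network
    for m in range(1, steps + 1):
        i = d + m                    # index of the sample being computed
        t = (m - 1) * h
        # Grunwald-Letnikov memory sums over the simulated past
        mem_x = sum(c[j] * x[i - j] for j in range(1, m + 1))
        mem_y = sum(c[j] * y[i - j] for j in range(1, m + 1))
        u = -gain * (y[i - 1] - x[i - 1])    # state feedback controller
        x.append(h_a * f(x[i - 1], x[i - 1 - d], t) - mem_x)
        y.append(h_a * (f(y[i - 1], y[i - 1 - d], t) + u) - mem_y)
    return abs(y[d] - x[d]), abs(y[-1] - x[-1])
```

With the gain large enough relative to the nonlinearity, the synchronization error decays toward zero, mirroring the role of the sufficient conditions derived in the paper; with the controller switched off (gain = 0) the two trajectories need not converge.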
ISSN: 2227-7390