Enhancing temporal learning in recurrent spiking networks for neuromorphic applications
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IOP Publishing, 2025-01-01 |
| Series: | Neuromorphic Computing and Engineering |
| Subjects: | |
| Online Access: | https://doi.org/10.1088/2634-4386/add293 |
| Summary: | Training Recurrent Spiking Neural Networks (RSNNs) with binary spikes on tasks spanning extended timescales is difficult because the vanishing gradient problem is amplified during backpropagation through time. This paper introduces three crucial elements that significantly enhance the memory and capabilities of RSNNs, with a strong emphasis on compatibility with hardware and neuromorphic systems. First, we incorporate neuron-level synaptic delays, which both allow the gradient to skip time steps and reduce the neuron population’s overall firing rate. Second, we apply a biologically inspired branching-factor regularization rule that stabilizes the network’s dynamics and eases training by adding a time-local error term to the loss function. Third, we widen the support of a commonly used surrogate gradient function to facilitate learning over longer timescales with binary spikes. By integrating these three elements, we solve several challenging benchmarks and achieve state-of-the-art results on the spiking permuted sequential MNIST (psMNIST) task, demonstrating the practicality and relevance of our approach for digital and analog neuromorphic systems. (A hedged code sketch of the delay and surrogate-gradient elements appears after this record.) |
| ISSN: | 2634-4386 |
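
The summary names three concrete mechanisms: neuron-level synaptic delays, branching-factor regularization, and a surrogate gradient with widened support. The PyTorch sketch below illustrates how the first and third of these might look. It is a minimal illustration under stated assumptions, not the authors' implementation: the triangular surrogate shape, the rolling spike-history buffer for per-neuron integer delays, and all names and values (`WideSurrogateSpike`, `width`, `lif_step`, `tau`, `theta`, `delays`) are hypothetical choices made for this sketch, since the record gives no formulas.

```python
import torch

class WideSurrogateSpike(torch.autograd.Function):
    """Binary Heaviside spike in the forward pass; the backward pass uses a
    triangular surrogate whose support |v| < width can be enlarged, an
    assumed analogue of the abstract's 'increasing its support'."""

    width = 4.0  # assumed; wider than the usual ~1.0 to pass gradients farther

    @staticmethod
    def forward(ctx, v_minus_theta):
        ctx.save_for_backward(v_minus_theta)
        return (v_minus_theta > 0).float()  # emit a binary spike

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        w = WideSurrogateSpike.width
        # Triangular window: nonzero for |v| < w, peaking at 1/w at threshold.
        return grad_out * torch.clamp(1.0 - v.abs() / w, min=0.0) / w

def lif_step(v, spike_hist, x_t, w_rec, delays, tau=0.9, theta=1.0):
    """One leaky integrate-and-fire step with neuron-level synaptic delays.

    spike_hist[d, j] holds neuron j's spike from d steps ago; neuron j
    broadcasts through its own integer delay delays[j], so a gradient
    flowing through w_rec hops delays[j] time steps in one backward pass.
    """
    n = w_rec.shape[0]
    delayed = spike_hist[delays, torch.arange(n)]   # per-neuron delayed spikes
    v = tau * v + x_t + w_rec @ delayed             # leaky integration
    s = WideSurrogateSpike.apply(v - theta)         # spike where v crosses theta
    v = v - s * theta                               # soft reset after a spike
    spike_hist = torch.cat([s.unsqueeze(0), spike_hist[:-1]])  # shift history
    return v, s, spike_hist

# Illustrative usage with random inputs and fixed random delays.
n, d_max = 64, 8
v, hist = torch.zeros(n), torch.zeros(d_max + 1, n)
w_rec = 0.1 * torch.randn(n, n)
delays = torch.randint(0, d_max + 1, (n,))
for t in range(100):
    v, s, hist = lif_step(v, hist, 0.5 * torch.randn(n), w_rec, delays)
```

The widened support keeps a nonzero pseudo-derivative even when membrane potentials sit far from threshold, which is the property the abstract credits for learning over longer timescales. The branching-factor regularizer is omitted here because the record gives no formula for it.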