Advancing Spiking Neural Networks: Bio-Inspired Learning and Efficient Spatio-Temporal Processing

The field of spiking neural networks (SNNs) is advancing rapidly, particularly in bio-inspired learning algorithms and efficient spatio-temporal data processing. Recent work integrates fractional-order calculus into gradient descent, improving the learning capability of SNNs by more closely mimicking biological neural dynamics; this approach has yielded substantial improvements in classification accuracy across multiple datasets and marks a shift toward models that are both more biologically plausible and more computationally efficient. There is also a growing focus on building and using high-quality neuromorphic datasets that better exercise the spatio-temporal capabilities of SNNs, enabling progress in cross-modality fusion and attention mechanisms; such datasets are central to tasks like human action recognition, where event-based cameras offer distinct advantages. In parallel, gradient-free learning methods based on spike-timing-dependent plasticity (STDP) are emerging as alternatives to backpropagation and have been shown to scale to deeper network architectures. Together, these developments point toward more sophisticated, efficient, and biologically inspired SNN models.
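
To make the fractional-order idea concrete, here is a minimal sketch of a single-weight update in which the ordinary gradient is scaled by a Caputo-style fractional factor. The function name, the fractional order alpha, and the learning rate are illustrative assumptions and are not taken from the cited paper.

```python
import math

def fractional_gd_step(w, w_prev, grad, lr=0.1, alpha=0.9, eps=1e-8):
    """One fractional-order gradient descent step (illustrative sketch).

    A common Caputo-derivative approximation scales the ordinary gradient by
    |w - w_prev|**(1 - alpha) / Gamma(2 - alpha). With alpha = 1 the scale
    collapses to 1 and the update reduces to plain gradient descent.
    """
    scale = (abs(w - w_prev) + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return w - lr * grad * scale
```

With alpha slightly below 1, the effective step shrinks when recent weight changes are small and grows when they are large, which is one way the fractional order modulates learning dynamics relative to standard gradient descent.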
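The gradient-free direction mentioned above typically builds on spike-timing-dependent plasticity. The following is a minimal sketch of the standard pair-based STDP rule; all parameter values are chosen for illustration rather than drawn from the listed papers.

```python
import math

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight change; dt = t_post - t_pre in ms.

    Pre-before-post spiking (dt >= 0) potentiates the synapse, post-before-pre
    (dt < 0) depresses it, and the result is clipped to [w_min, w_max].
    """
    if dt >= 0:
        dw = a_plus * math.exp(-dt / tau_plus)
    else:
        dw = -a_minus * math.exp(dt / tau_minus)
    return min(max(w + dw, w_min), w_max)
```

Supervised variants of this rule gate or steer the update with label information, which is how STDP-based methods serve as a backpropagation-free training signal.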

Sources

Fractional-order spike-timing-dependent gradient descent for multi-layer spiking neural networks

Enhancing SNN-based Spatio-Temporal Learning: A Benchmark Dataset and Cross-Modality Attention Model

Gradient-Free Supervised Learning using Spike-Timing-Dependent Plasticity for Image Recognition

SpikMamba: When SNN meets Mamba in Event-based Human Action Recognition

Neuronal Competition Groups with Supervised STDP for Spike-Based Classification

Research on gesture recognition method based on SEDCNN-SVM

Spatial-Temporal Search for Spiking Neural Networks
