Residual Connections in Spiking Neural Networks

Skipping deeper: Unveiling the Power of Residual Connections in Multi-Spiking Neural Networks

Master Thesis (2024)
Authors

A. de Los Santos Subirats (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Supervisors

Nergis Tömen (TU Delft - Pattern Recognition and Bioinformatics)

A. Micheli (TU Delft - Pattern Recognition and Bioinformatics)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2024
Language
English
Graduation Date
28-08-2024
Awarding Institution
Delft University of Technology
Programme
Computer Science
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In recent years, the emergence of Spiking Neural Networks (SNNs) has shown that these networks are a promising alternative to traditional Artificial Neural Networks (ANNs) due to their low-power computing capabilities and noise robustness. Nevertheless, recent approaches have either strayed away from spike time backpropagation, used discrete-time, single-spike neurons, and/or limited themselves to shallow networks. These approaches limit the potential of SNNs by sacrificing significant aspects of what makes them special in order to obtain a functional network. For this reason, we believe that implementing residual connections in a model with multi-spiking neurons, precise time, and spike time backpropagation is the path that allows these networks to truly shine, as it addresses one of this model's most severe limitations, the vanishing of spikes at deeper layers, which prevents it from being used to build deeper networks. Hence, we aim to remedy this problem by feeding inputs from further up the network to revitalize the spike counts at deeper layers.

In this paper, we explore the implementation of residual connections in precise-time multi-spiking neural networks as a way of solving the disappearing-spikes problem that inhibits spike time backpropagation. We explore two alternatives for implementing the residual connection and analyze how they affect the accuracy of the network as the depth increases, in both Multi-Layer Perceptrons and Convolutional Neural Networks. We also developed an architecture that allows swapping the fuse function in order to take full advantage of the flexibility that precise time provides. Results show that the implemented residual connections allow for deeper training, which has the potential to aid the network's performance, although in some cases some hurdles remain.
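To make the idea concrete, the following is a minimal, hypothetical Python sketch (not the thesis implementation) of the mechanism the abstract describes: a residual skip path around a spiking sub-network whose "fuse" function, the rule that merges the skip-path spike trains with the block's output spike trains, can be swapped out. Spike trains are represented as lists of precise (continuous) spike times per neuron; all names (fuse_union, fuse_earliest, ResidualSpikingBlock, delay_layer) are illustrative assumptions.

from typing import Callable, List

SpikeTrain = List[float]          # sorted spike times of one neuron
Layer = Callable[[List[SpikeTrain]], List[SpikeTrain]]
Fuse = Callable[[List[SpikeTrain], List[SpikeTrain]], List[SpikeTrain]]


def fuse_union(skip: List[SpikeTrain], out: List[SpikeTrain]) -> List[SpikeTrain]:
    """Merge the two spike trains per neuron, keeping every spike."""
    return [sorted(s + o) for s, o in zip(skip, out)]


def fuse_earliest(skip: List[SpikeTrain], out: List[SpikeTrain]) -> List[SpikeTrain]:
    """Per neuron, keep whichever train spikes first (a sparser alternative)."""
    def first(train): return train[0] if train else float("inf")
    return [s if first(s) <= first(o) else o for s, o in zip(skip, out)]


class ResidualSpikingBlock:
    """Wraps a spiking sub-network and fuses its output with the skip input,
    so spikes from earlier layers can revitalize activity at deeper layers."""

    def __init__(self, body: Layer, fuse: Fuse = fuse_union):
        self.body = body
        self.fuse = fuse          # swappable fuse function

    def __call__(self, spikes: List[SpikeTrain]) -> List[SpikeTrain]:
        return self.fuse(spikes, self.body(spikes))


# Toy usage: a "layer" that delays every spike and drops the last one,
# mimicking how activity can die out in a deep stack; the residual path
# keeps the earlier spikes flowing to the next layer.
delay_layer: Layer = lambda trains: [[t + 1.0 for t in tr[:-1]] for tr in trains]

block = ResidualSpikingBlock(delay_layer, fuse=fuse_union)
print(block([[0.5, 1.2], [0.8]]))   # skip spikes preserved alongside delayed ones

Swapping fuse_union for fuse_earliest changes only the merge rule, which is the flexibility the abstract attributes to the swappable fuse function; the two alternatives compared in the thesis may differ from these illustrative choices.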
