Backpropagating in time-discretized multi-spike spiking neural networks

How are the training accuracy and training speed (in epochs and time) of a spiking neural network affected when the network is numerically integrated with the forward-Euler and Parker-Sochacki methods?

Bachelor Thesis (2024)
Author(s)

M. Guichard (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Nergis Tömen – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

A. Micheli – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

O. Booij – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Evangelia Anna Markatou – Graduation committee member (TU Delft - Cyber Security)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2024
Language
English
Graduation Date
23-06-2024
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Spiking neural networks have gained traction as both a tool for neuroscience research and a new frontier in machine learning. A large body of neuroscience literature explores the realistic simulation of neurons, with complex models requiring the formulation and integration of ordinary differential equations. Overcoming this challenge has led to the exploration of various numerical integration techniques aimed at highly stable and accurate simulations. In contrast, training spiking neural networks is often done with simple leaky integrate-and-fire models and rudimentary integration methods such as the forward-Euler method. In this research we explore how more complex numerical integration methods, borrowed from neuroscience research, affect the training of networks based on current-based leaky integrate-and-fire neurons. We derive the equations required for the integration process and suggest the use of spike time interpolation. Furthermore, we provide insights into applying backpropagation on numerically integrated networks and highlight possible pitfalls of the process. We conclude that numerically integrated networks can achieve training accuracies close to their theoretical limits, with good convergence and training time characteristics. Specifically, high-order integration achieves robust and computationally viable training. Additionally, we explore the effects of spike time interpolation on network accuracy and use our findings to provide insights into the role of different integration parameters in the effective training of spiking neural networks.
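To illustrate the baseline integration scheme the abstract contrasts with the Parker-Sochacki method, the following is a minimal sketch (not the thesis implementation) of a single forward-Euler step for a current-based leaky integrate-and-fire neuron. All names and values (euler_step, tau_mem, tau_syn, v_th, dt) are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def euler_step(v, i_syn, input_spikes, w, dt=1e-3,
               tau_mem=20e-3, tau_syn=5e-3, v_th=1.0, v_reset=0.0):
    """Advance membrane potential v and synaptic current i_syn by one step of size dt."""
    # Synaptic current decays exponentially and jumps when input spikes arrive.
    i_syn = i_syn + dt * (-i_syn / tau_syn) + w @ input_spikes
    # Membrane potential leaks toward rest while integrating the synaptic current.
    v = v + dt * (-v + i_syn) / tau_mem
    # Threshold crossing emits an output spike; the membrane is then reset.
    spikes = (v >= v_th).astype(v.dtype)
    v = np.where(spikes > 0, v_reset, v)
    return v, i_syn, spikes

# Example usage: 4 input channels feeding 3 neurons over 100 time steps.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(3, 4))
v, i_syn = np.zeros(3), np.zeros(3)
for _ in range(100):
    inp = (rng.random(4) < 0.1).astype(float)  # sparse random input spikes
    v, i_syn, spikes = euler_step(v, i_syn, inp, w)
```

Because the step size dt fixes the temporal resolution, spike times produced this way are quantized to the grid, which is the motivation for the spike time interpolation and higher-order integration discussed in the abstract.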
