Backpropagating in time-discretized multi-spike spiking neural networks

How are the training accuracy and training speed (in epochs and time) of a spiking neural network affected when numerically integrating with the forward-Euler and Parker-Sochacki methods?


Abstract

Spiking neural networks have gained traction both as a tool for neuroscience research and as a new frontier in machine learning. A wealth of neuroscience literature explores the realistic simulation of neurons, with complex models requiring the formulation and integration of ordinary differential equations. Overcoming this challenge has led to the exploration of various numerical integration techniques aimed at highly stable and accurate simulations. In contrast, spiking neural networks are often trained with simple leaky integrate-and-fire models and rudimentary integration methods such as the forward-Euler method. In this research we explore how more complex numerical integration methods, borrowed from neuroscience research, affect the training of networks based on current-based leaky integrate-and-fire neurons. We derive the equations required for the integration process and suggest the use of spike time interpolation. Furthermore, we provide insights into applying backpropagation to numerically integrated networks and highlight possible pitfalls of the process. We conclude that numerically integrated networks can achieve training accuracies close to their theoretical limits, with good convergence and training time characteristics. In particular, high-order integration achieves robust and computationally viable training. Additionally, we explore the effects of spike time interpolation on network accuracy and use our findings to provide insights into the role of different integration parameters in the effective training of spiking neural networks.
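
To make two of the techniques named above concrete, the sketch below shows forward-Euler integration of a current-based leaky integrate-and-fire neuron with linear spike time interpolation at threshold crossings. This is a minimal illustration under assumed dynamics, not the paper's implementation: the parameter values, the reset-to-rest rule, and all function names are hypothetical choices made for readability.

import numpy as np

# Illustrative constants; the paper's actual values and units may differ.
TAU_M = 20.0    # membrane time constant (ms)
TAU_S = 5.0     # synaptic current time constant (ms)
V_REST = 0.0    # resting potential
V_TH = 1.0      # firing threshold
DT = 1.0        # forward-Euler step size (ms)

def euler_step(v, i_syn, i_in):
    """One forward-Euler step of a current-based LIF neuron.

    Assumed dynamics:
        dv/dt = (-(v - V_REST) + i_syn) / TAU_M
        di/dt = -i_syn / TAU_S + i_in
    """
    v_new = v + DT * (-(v - V_REST) + i_syn) / TAU_M
    i_new = i_syn + DT * (-i_syn / TAU_S + i_in)
    return v_new, i_new

def interpolate_spike_time(t, v_prev, v_new):
    """Linearly interpolate the threshold crossing within a step.

    Assumes v_prev < V_TH <= v_new; returns a spike time in (t, t + DT].
    """
    return t + DT * (V_TH - v_prev) / (v_new - v_prev)

def simulate(input_current, n_steps):
    v, i_syn = V_REST, 0.0
    spike_times = []
    for k in range(n_steps):
        v_prev = v
        v, i_syn = euler_step(v, i_syn, input_current[k])
        if v >= V_TH:
            # Record the interpolated crossing rather than the step boundary.
            spike_times.append(interpolate_spike_time(k * DT, v_prev, v))
            v = V_REST  # assumed reset rule: clamp back to rest
    return spike_times

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    drive = rng.uniform(0.0, 0.5, size=200)  # random input drive
    print(simulate(drive, 200))

Interpolating the crossing rather than reporting spikes at step boundaries keeps the recorded spike times sub-step accurate even at coarse DT, which is the effect on accuracy that the abstract refers to; higher-order schemes such as Parker-Sochacki refine the per-step state update itself and are not sketched here.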