Active Dendrites Enable Efficient Continual Learning in Time-To-First-Spike Neural Networks

Abstract

While the human brain efficiently adapts to new tasks from a continuous stream of information, neural network models struggle to learn from sequential information without catastrophically forgetting previously learned tasks. This limitation presents a significant hurdle in deploying edge devices in real-world scenarios where information is presented in an inherently sequential manner. Active dendrites of pyramidal neurons play an important role in the brain's ability to learn new tasks incrementally. By exploiting key properties of time-to-first-spike (TTFS) encoding and leveraging its high sparsity, we present a novel spiking neural network (SNN) model enhanced with active dendrites. Our model can efficiently mitigate catastrophic forgetting in temporally-encoded SNNs, which we demonstrate with an end-of-training accuracy across tasks of 88.3% on the test set using the Split MNIST dataset. Furthermore, we provide a novel digital hardware architecture that paves the way for real-world deployment in edge devices. Using a Xilinx Zynq-7020 SoC FPGA, we demonstrate a 100% match with our quantized software model, achieving an average inference time of 37.3 ms and an 80.0% accuracy.
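For readers unfamiliar with the two mechanisms named in the abstract, the NumPy sketch below illustrates the general ideas of TTFS encoding (stronger inputs spike earlier) and context-dependent gating by active dendritic segments. It is a minimal, assumption-laden illustration, not the authors' model or hardware implementation; all names (ttfs_encode, dendritic_gate, T_MAX) and parameter choices are hypothetical.

```python
# Hedged sketch of TTFS encoding and active-dendrite gating (illustrative only;
# not the paper's implementation).
import numpy as np

T_MAX = 100  # assumed length of the encoding window, in timesteps


def ttfs_encode(intensities, t_max=T_MAX):
    """Time-to-first-spike encoding: stronger inputs spike earlier.

    intensities: array of values in [0, 1]; returns integer spike times,
    with zero-intensity inputs mapped to t_max (effectively no spike).
    """
    intensities = np.clip(intensities, 0.0, 1.0)
    return np.round((1.0 - intensities) * t_max).astype(int)


def dendritic_gate(feedforward, dendrite_weights, context):
    """Modulate feedforward activations with active dendritic segments.

    feedforward:      (n_neurons,) feedforward activations
    dendrite_weights: (n_neurons, n_segments, d_context) segment weights
    context:          (d_context,) task/context signal

    Each neuron keeps the response of its best-matching dendritic segment
    and uses it as a sigmoidal gate, so neurons whose dendrites match the
    current context stay active while the rest are suppressed.
    """
    segment_acts = dendrite_weights @ context        # (n_neurons, n_segments)
    best = segment_acts.max(axis=1)                  # strongest segment per neuron
    gate = 1.0 / (1.0 + np.exp(-best))               # sigmoid gating factor
    return feedforward * gate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pixels = rng.random(784)                         # a flattened MNIST-like input
    spike_times = ttfs_encode(pixels)
    print("earliest spike at t =", spike_times.min())

    ff = rng.standard_normal(64)
    dw = rng.standard_normal((64, 4, 10)) * 0.1
    ctx = rng.standard_normal(10)
    print("gated activations shape:", dendritic_gate(ff, dw, ctx).shape)
```

In continual-learning setups of this kind, the context signal typically identifies (or is inferred from) the current task, so different tasks recruit different subsets of neurons and interfere less with one another.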

Files

Active_Dendrites_Enable_Effici... (pdf | 4.73 MB)
File under embargo until 20-01-2025