
Evolutionary algorithms support recurrent plasticity in spiking neural network models of neocortical task learning

Authors: Ivyer Mingwei Qu, Huaze Liu, Jiayue Dora Li, Yuqing Zhu

Presentation type: Poster at SNUFA 2024 online workshop (5-6 Nov 2024)

Abstract

Task-trained recurrent spiking neural networks (RSNNs) can provide insights into how the brain performs spike-based computations. Training RSNNs with backpropagation through time (BPTT) faces the challenge of a non-differentiable spiking function, requiring a surrogate (approximate) gradient through each time step of the network. Evolutionary algorithms (EAs) offer an alternative to BPTT: they generate random populations of models and select the best performers, providing a broader initial search space. When training RSNNs with BPTT, we observe reservoir-like behavior, in which changes in the output-layer weights support learning while the main recurrent weights remain largely unchanged. It has been unclear whether this behavior is due to improper gradient backpropagation through the recurrent layer or because reservoirs are the optimal solution for learning the high-dimensional dynamics of the recurrent layer. By comparing RSNNs trained with BPTT and with EAs, we investigate how the different layers of the models change throughout training. Our RSNN models have three layers: a linear input layer, a hidden recurrent layer of leaky integrate-and-fire neurons that includes three inhibitory neuron types, and a linear output layer. The recurrent layer has the biologically realistic connectivity found in mouse neocortex. From the initial to the final population of models, RSNNs trained with EAs show the greatest weight changes in the recurrent-layer connections, whereas BPTT-trained RSNNs show the greatest weight changes in the input and output layers (see Figure). This demonstrates that reservoirs are not always the optimal solution for these temporal tasks: EAs discover alternative network solutions that involve genuine changes in recurrent connectivity. Furthermore, training RSNNs with EAs can better capture the recurrent plasticity of the brain than BPTT does. This makes EAs highly valuable for future investigations into how recurrent neocortical circuits can change their structure to support novel computations.
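
To make the BPTT challenge concrete: the spike is a step function with zero gradient almost everywhere, so training substitutes a smooth surrogate derivative in the backward pass. The abstract does not specify which surrogate was used, so the following is only a generic PyTorch sketch of one leaky integrate-and-fire step with a fast-sigmoid surrogate; `scale`, `beta`, and `v_th` are illustrative values, not the paper's.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""
    scale = 10.0  # illustrative surrogate steepness, not from the abstract

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()  # non-differentiable step

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Approximate d(spike)/dv with 1 / (scale*|v| + 1)^2
        return grad_output / (SurrogateSpike.scale * v.abs() + 1.0) ** 2

def lif_step(v, i_in, beta=0.9, v_th=1.0):
    """One Euler step of a leaky integrate-and-fire neuron."""
    v = beta * v + i_in                    # leaky membrane integration
    spk = SurrogateSpike.apply(v - v_th)   # spike when membrane crosses threshold
    v = v - spk * v_th                     # soft reset after spiking
    return v, spk
```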
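The EA described in the abstract (random populations of models, selection of the best performers) can be instantiated in many ways. As a minimal sketch only, here is a truncation-selection loop over a flat RSNN parameter vector; `fitness_fn` is a hypothetical stand-in for running the network on the task, and the population size, elite count, and mutation scale are assumptions.

```python
import numpy as np

def evolve(init_params, fitness_fn, pop_size=64, n_elite=8,
           sigma=0.02, n_generations=100, seed=0):
    """Truncation-selection EA over a flat parameter vector.

    fitness_fn(params) -> float is assumed to evaluate the RSNN on the
    task and return performance (higher is better)."""
    rng = np.random.default_rng(seed)
    # Broad initial search space: a random population around init_params
    pop = init_params + sigma * rng.standard_normal((pop_size, init_params.size))
    best = init_params
    for _ in range(n_generations):
        fitness = np.array([fitness_fn(p) for p in pop])
        order = np.argsort(fitness)[::-1]
        elite = pop[order[:n_elite]]   # select the best performers
        best = elite[0]
        # Next generation: mutated copies of randomly chosen elites
        parents = elite[rng.integers(n_elite, size=pop_size)]
        pop = parents + sigma * rng.standard_normal(pop.shape)
    return best
```

Unlike BPTT, this loop never differentiates through the spiking dynamics, so it is free to modify recurrent weights as readily as input or output weights.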
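The per-layer comparison in the figure requires some scalar measure of how much each layer's weights moved during training. The abstract does not state the metric used; one common choice, shown below as an assumption, is the relative Frobenius-norm change per layer, with the three layer names matching the architecture described above.

```python
import numpy as np

def layerwise_change(initial, final):
    """Relative Frobenius-norm weight change per layer.

    initial/final: dicts mapping layer name -> weight matrix,
    e.g. keys 'input', 'recurrent', 'output' for the model described."""
    return {name: np.linalg.norm(final[name] - initial[name])
                  / np.linalg.norm(initial[name])
            for name in initial}
```

Under this kind of measure, the reported result is that the 'recurrent' entry dominates for EA-trained models while the 'input' and 'output' entries dominate for BPTT-trained ones.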