Colloquium: Smooth Exact Gradient Descent Learning in Spiking Neural Networks
11 November 2024
Colloquium by Dr. Christian Klos, University of Bonn
Abstract:
In my talk, I will show that the problems that discrete spikes pose for gradient descent learning in spiking neural networks can be solved, and how. Specifically, I demonstrate exact gradient descent learning based on spiking dynamics that change continuously, i.e. without disruptive spike (dis)appearances, or even smoothly. The dynamics are generated by neuron models whose spikes vanish and appear only at the end of a trial, where this does not influence any future dynamics. Among others, neuron models that generate spikes via a self-amplification mechanism, like real neurons, and that reach infinite voltage in finite time fulfill this condition; this includes the standard quadratic integrate-and-fire (QIF) neuron. Besides spike removal, such neuron models also enable gradient-based spike addition by means of what we call pseudospikes. The timings of pseudospikes are continuous and mostly smooth extensions of the times of ordinary spikes disappearing at the trial end. To demonstrate the applicability of our scheme, I will show results for the training of individual QIF neurons and of networks of them. In particular, our scheme allows us to induce spikes and to continuously move them to desired times, in single neurons and in recurrent networks. Further, it achieves competitive performance on MNIST using deep, initially silent networks and time-to-first-spike coding. Taken together, our results show how non-disruptive, exact learning is possible despite discrete spikes.
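A minimal sketch of the core idea (illustrative only, not the speaker's implementation; the nondimensionalized QIF form dV/dt = V^2 + I, the quadratic loss, the learning rate, and the finite-difference gradient are assumptions made here for illustration): with constant suprathreshold input I > 0, the QIF voltage diverges to infinity in finite time, so the spike time t* = (pi/2 - arctan(V0/sqrt(I)))/sqrt(I) is a smooth, closed-form function of the parameters and can be moved to a desired time by ordinary gradient descent, without any threshold discontinuity.

    import math

    def qif_spike_time(I, V0=0.0):
        """First spike time of a nondimensionalized QIF neuron dV/dt = V**2 + I
        with constant input I > 0. The voltage reaches infinity in finite time,
        so the spike time is a closed-form, smooth function of I and V0."""
        s = math.sqrt(I)
        return (math.pi / 2 - math.atan(V0 / s)) / s

    # Gradient descent on the input current to move the spike to a target time.
    # A finite-difference gradient is used for brevity; an analytic or autodiff
    # gradient works equally well, since the spike time depends smoothly on I.
    I, target, lr, eps = 1.0, 1.2, 0.5, 1e-6
    for step in range(200):
        t = qif_spike_time(I)
        grad = (qif_spike_time(I + eps) - qif_spike_time(I - eps)) / (2 * eps)
        I -= lr * (t - target) * grad  # gradient of the loss 0.5*(t - target)**2
    print(I, qif_spike_time(I))  # the spike time converges to the target

The sketch covers only continuous spike-time learning in a single neuron; the pseudospike mechanism for gradient-based spike addition described in the abstract is not captured here.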
Date and time: 11 Nov 2024, 11:00 p.m.; hybrid event: Seminar Room W 36, Computational Neuroscience, UKE, or Zoom webinar (link mailed to the list)