Journal article
Trainable Reference Spikes Improve Temporal Information Processing of SNNs With Supervised Learning
Neural Computation, 36(10), p. 1
23 Aug 2024
Abstract
Spiking neural networks (SNNs) are the next-generation neural networks composed of biologically plausible neurons that communicate through trains of spikes. By modifying the plastic parameters of SNNs, including weights and time delays, SNNs can be trained to perform various AI tasks, although in general not at the same level of performance as typical artificial neural networks (ANNs). One possible solution to improve the performance of SNNs is to consider plastic parameters beyond weights and time delays, drawn from the inherent complexity of the neural system of the brain, which may help SNNs improve their information processing ability and achieve brainlike functions. Here, we propose reference spikes as a new type of plastic parameter in a supervised learning scheme for SNNs. A neuron receives reference spikes through dedicated synapses that provide reference information independent of the input to help during learning; the number and timings of these spikes are trainable by error backpropagation. Theoretically, reference spikes improve the temporal information processing of SNNs by modulating the integration of incoming spikes at a detailed level. Through comparative computational experiments with supervised learning, we demonstrate that reference spikes improve the memory capacity of SNNs to map input spike patterns to target output spike patterns and increase classification accuracy on the MNIST, Fashion-MNIST, and SHD data sets, where both input and target output are temporally encoded. Our results demonstrate that applying reference spikes improves the performance of SNNs by enhancing their temporal information processing ability.
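The abstract's core mechanism can be illustrated with a minimal sketch. This is not the paper's exact model: the neuron model (current-based leaky integrate-and-fire), all weights, time constants, and spike times below are illustrative assumptions, and the trainability of the reference spike count and timings (by error backpropagation in the paper) is not implemented here. The sketch only shows how well-timed reference spikes, delivered on a dedicated synapse alongside the input, modulate the integration of incoming spikes and change the output spike train.

```python
# Hedged sketch (not the paper's model): a current-based leaky
# integrate-and-fire (LIF) neuron receiving input spikes plus extra
# "reference spikes" on a dedicated synapse. All parameters below are
# illustrative assumptions chosen so the effect is visible.

def lif_response(input_times, ref_times, w_in=0.5, w_ref=0.6,
                 tau_m=10.0, tau_s=2.5, threshold=1.0, T=50.0, dt=0.1):
    """Simulate a LIF neuron (Euler integration); return output spike times (ms)."""
    steps = int(T / dt)
    v = 0.0        # membrane potential
    i_syn = 0.0    # synaptic current with exponential decay
    out = []
    # Merge input and reference spikes into one sorted event stream,
    # each carrying its synaptic weight.
    events = sorted([(t, w_in) for t in input_times] +
                    [(t, w_ref) for t in ref_times])
    k = 0
    for n in range(steps):
        t = n * dt
        while k < len(events) and events[k][0] <= t:
            i_syn += events[k][1]          # instantaneous synaptic kick
            k += 1
        v += dt * (-v / tau_m + i_syn)     # leaky membrane integration
        i_syn += dt * (-i_syn / tau_s)     # synaptic current decay
        if v >= threshold:                 # fire and reset
            out.append(t)
            v = 0.0
    return out

# Input spikes alone stay subthreshold; reference spikes placed just
# before each input push the combined potential over threshold, so the
# neuron's output spike train depends on the reference spike timings.
base = lif_response([10.0, 30.0], ref_times=[])
ref = lif_response([10.0, 30.0], ref_times=[9.0, 29.0])
print("output spikes without references:", base)
print("output spikes with references:", ref)
```

In the paper these reference spike timings (and their number) are the trainable parameters; here they are fixed by hand purely to show the modulation effect.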
Metrics
Details
- Title
- Trainable Reference Spikes Improve Temporal Information Processing of SNNs With Supervised Learning
- Creators
- Zeyuan Wang - Drexel University
- Luis R. Cruz Cruz - Drexel University, Physics
- Publication Details
- Neural Computation, 36(10), p. 1
- Publisher
- MIT PRESS
- Number of pages
- 34
- Grant note
- National Science Foundation: 1919691
This work was carried out at the Department of Physics of Drexel University. This research did not receive any specific grant from funding agencies in the public, commercial, or nonprofit sectors. The work reported here was run on hardware supported by the National Science Foundation under grant MRI#1919691 and Drexel's University Research Computing Facility.
- Resource Type
- Journal article
- Language
- English
- Academic Unit
- Physics
- Web of Science ID
- WOS:001326682300001
- Scopus ID
- 2-s2.0-85205741328
- Other Identifier
- 991021903712704721
InCites Highlights
Data related to this publication, from the InCites Benchmarking & Analytics tool:
- Web of Science research areas
- Computer Science, Artificial Intelligence
- Neurosciences