Spiking Neural Network Regularization With Fixed and Adaptive Drop-Keep Probabilities

Research output: Journal Publications and Reviews (RGC: 21, 22, 62); publication in refereed journal, peer-reviewed

7 Scopus Citations



Original language: English
Pages (from-to): 4096-4109
Journal / Publication: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 8
Online published: 11 Feb 2021
Publication status: Published - Aug 2022


Dropout and DropConnect are two techniques for regularizing neural network models and have achieved state-of-the-art results on several benchmarks. In this paper, to improve the generalization capability of spiking neural networks (SNNs), the two drop techniques are first applied to the state-of-the-art SpikeProp learning algorithm, resulting in two improved learning algorithms: SPDO (SpikeProp with Dropout) and SPDC (SpikeProp with DropConnect). Since a higher membrane potential of a biological neuron implies a higher probability of neural activation, three adaptive drop algorithms, SpikeProp with Adaptive Dropout (SPADO), SpikeProp with Adaptive DropConnect (SPADC), and SpikeProp with Group Adaptive Drop (SPGAD), are proposed that adaptively adjust the keep probability during SNN training. A convergence theorem for SPDC is proven under the assumptions of a bounded norm of the connection weights and a finite number of equilibria. In addition, the five proposed algorithms are embedded in a collaborative neurodynamic optimization framework to further improve the learning performance of SNNs. Experimental results on four benchmark data sets demonstrate that the three adaptive algorithms converge faster than SpikeProp, SPDO, and SPDC, and that the generalization errors of the five proposed algorithms are significantly smaller than those of SpikeProp. Furthermore, the results also show that the five algorithms can be improved on several measures when run within the collaborative neurodynamic optimization framework.
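The core idea behind the adaptive variants, that a neuron's keep probability should grow with its membrane potential, can be sketched in a few lines. The following is a minimal, hypothetical illustration of potential-dependent dropout (the mapping from potential to keep probability and the function names are assumptions for illustration, not the paper's exact SPADO rule):

```python
import numpy as np

def adaptive_keep_prob(potentials, p_min=0.5, p_max=1.0):
    """Map each neuron's membrane potential to a keep probability.

    A higher potential yields a higher keep probability, mirroring the
    biological intuition that high-potential neurons are more likely to
    fire. The linear min-max mapping here is an assumed example rule.
    """
    v = np.asarray(potentials, dtype=float)
    span = v.max() - v.min()
    scaled = (v - v.min()) / span if span > 0 else np.ones_like(v)
    return p_min + (p_max - p_min) * scaled

def adaptive_dropout(activations, potentials, rng):
    """Drop each neuron with probability tied to its membrane potential."""
    p_keep = adaptive_keep_prob(potentials)
    mask = rng.random(len(p_keep)) < p_keep
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return activations * mask / p_keep

rng = np.random.default_rng(0)
potentials = np.array([0.1, 0.5, 0.9, 1.3])
activations = np.ones(4)
out = adaptive_dropout(activations, potentials, rng)
```

Applying the same potential-dependent probability to individual connection weights rather than whole neurons would give a DropConnect-style analogue of this sketch.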

Research Area(s)

  • Adaptive, Adaptive systems, Biological neural networks, convergence, DropConnect, Dropout, Membrane potentials, Neurons, Optimization, SpikeProp, spiking neural networks (SNNs), Synapses, Training