Stabilizing Direct Training of Spiking Neural Networks: Membrane Potential Initialization and Threshold-robust Surrogate Gradient
Hyunho Kook, Byeongho Yu, Jeong Min Oh, Eunhyeok Park
Recent advancements in the direct training of Spiking Neural Networks (SNNs) have demonstrated high-quality outputs even at early timesteps, paving the way for novel energy-efficient AI paradigms. However, the inherent non-linearity and temporal dependencies in SNNs introduce persistent challenges, such as temporal covariate shift (TCS) and unstable gradient flow with learnable neuron thresholds. In this paper, we present two key innovations: MP-Init (Membrane Potential Initialization) and TrSG (Threshold-robust Surrogate Gradient). MP-Init addresses TCS by aligning the initial membrane potential with its stationary distribution, while TrSG stabilizes the gradient flow with respect to the threshold voltage during training. Extensive experiments validate our approach, achieving state-of-the-art accuracy on both static and dynamic image datasets. Code is available at: https://github.com/kookhh0827/SNN-MP-Init-TRSG
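To illustrate the idea behind membrane potential initialization, the sketch below runs a simple leaky integrate-and-fire (LIF) neuron and, instead of starting every sequence from zero potential, draws the initial membrane potential from empirically estimated steady-state statistics. This is a minimal NumPy illustration under our own assumptions (the `lif_forward` helper, the soft-reset rule, and the warm-up-based estimate are hypothetical), not the paper's actual MP-Init procedure:

```python
import numpy as np

def lif_forward(x, v_init, tau=0.5, v_th=1.0):
    """Run a leaky integrate-and-fire neuron over the timesteps in x,
    returning the spike train and the final membrane potential."""
    v = v_init.copy()
    spikes = []
    for t in range(x.shape[0]):
        v = tau * v + x[t]                     # leaky integration
        s = (v >= v_th).astype(x.dtype)        # fire when threshold is crossed
        v = v - s * v_th                       # soft reset after spiking
        spikes.append(s)
    return np.stack(spikes), v

rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.5, size=(4, 256))        # (timesteps, neurons)

# Baseline: zero initialization. Early timesteps then see a membrane
# potential distribution unlike later ones (a source of temporal
# covariate shift); a warm-up pass reveals the settled statistics.
_, v_final = lif_forward(x, v_init=np.zeros(256))

# MP-Init-style (hypothetical): reuse the empirical steady-state mean
# and std of the membrane potential as the initial state.
v_init = rng.normal(v_final.mean(), v_final.std() + 1e-8, size=256)
spikes, _ = lif_forward(x, v_init)
print(spikes.shape)  # (4, 256)
```

The key design point is that the initial state is matched to the distribution the potential would reach anyway, so the first timesteps are statistically consistent with the rest of the sequence.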