Spikingformer: A Key Foundation Model for Spiking Neural Networks
Chenlin Zhou, Liutao Yu, Zhaokun Zhou, Han Zhang, Jiaqi Wang, Huihui Zhou, Zhengyu Ma, Yonghong Tian
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks, due to their event-driven spiking computation. However, some foundation SNN backbones (including Spikformer and SEW ResNet) suffer from non-spike computations (integer-float multiplications) caused by the structure of their residual connections. These non-spike computations increase SNNs' power consumption and make them unsuitable for deployment on mainstream neuromorphic hardware. In this paper, we analyze the spike-driven behavior of residual connection methods in SNNs. We then introduce Spikingformer, a novel spiking transformer backbone that combines the MS residual connection with self-attention, resolving the non-spike computation challenge in Spikformer in a biologically plausible manner while retaining global modeling capability. We evaluate Spikingformer on 13 datasets covering large-scale static images, neuromorphic data, and natural language tasks, demonstrating its effectiveness and generality and establishing an important baseline for spiking neural networks. Moreover, with its spike-driven nature and global modeling capability, Spikingformer is expected to serve as a more efficient general-purpose SNN backbone toward energy-efficient artificial intelligence. Code: https://github.com/TheBrainLab/Spikingformer
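To make the non-spike computation issue concrete, the following is a minimal PyTorch sketch (not the authors' code; SpikingNeuron is a hypothetical toy stand-in for a LIF neuron) contrasting a SEW-style residual, where spikes are summed after the neuron, with an MS-style residual, where the shortcut joins the path before the neuron so every convolution receives only binary spikes.

import torch
import torch.nn as nn

class SpikingNeuron(nn.Module):
    """Toy surrogate for a LIF neuron: emits a binary spike when input exceeds threshold."""
    def __init__(self, threshold: float = 1.0):
        super().__init__()
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x > self.threshold).float()  # binary spikes in {0, 1}

class SEWBlock(nn.Module):
    """SEW-style residual: spike outputs are added together, so the sum can reach 2.
    The next block's convolution then multiplies these integers by float weights,
    i.e. a non-spike (integer-float) computation."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn = nn.BatchNorm2d(channels)
        self.neuron = SpikingNeuron()

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        out = self.neuron(self.bn(self.conv(spikes)))
        return out + spikes  # values in {0, 1, 2}: no longer binary spikes

class MSBlock(nn.Module):
    """MS-style residual: the shortcut is added on the membrane (pre-neuron) path,
    so the convolution input is always the neuron's binary spike output."""
    def __init__(self, channels: int):
        super().__init__()
        self.neuron = SpikingNeuron()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.bn(self.conv(self.neuron(x)))  # conv sees only binary spikes
        return out + x  # residual on the membrane path, not the spike path

In the MS arrangement, every multiplication in the convolution involves a binary spike, so it can be implemented as an addition gated by spike events, which is what makes the block compatible with event-driven neuromorphic hardware.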