Keywords
Spike (software development), Computer science, Spiking neural network, Artificial neural network, Artificial intelligence, Coding (social sciences), Exponential function, Pattern recognition (psychology), Neural coding, Distilling spikes of the auditory nerve, Algorithm, Speech recognition, Mathematics, Statistics, Mathematical analysis, Software engineering
Authors
Yunhua Chen,Feng Ren,Zuhong Xiong,Jinsheng Xiao,Jian K. Liu
Identifier
DOI:10.1016/j.neunet.2024.106346
Abstract
Spiking neural networks (SNNs) provide necessary models and algorithms for neuromorphic computing. A popular way of building high-performance deep SNNs is to convert well-trained ANNs into SNNs, taking advantage of advanced ANN architectures. Here we propose an ANN-to-SNN conversion methodology built on a time-based coding scheme, named At-most-two-spike Exponential Coding (AEC), and a corresponding AEC spiking neuron model. AEC neurons employ quantization-compensating spikes to improve coding accuracy and capacity, with each neuron generating at most two spikes within the time window. Two exponential decay functions with tunable parameters represent the dynamic encoding thresholds, based on which pixel intensities are encoded into spike times and spike times are decoded back into pixel intensities. The hyper-parameters of AEC neurons are fine-tuned to minimize the loss between SNN-decoded values and ANN activation values. In addition, we design two regularization terms on the number of spikes, making it possible to achieve the best trade-off among accuracy, latency, and power consumption. Experimental results show that, compared to similar methods, the proposed scheme not only yields deep SNNs with higher accuracy but also has significant advantages in energy efficiency and inference latency. More details can be found at https://github.com/RPDS2020/AEC.git.
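The core idea of the abstract — encoding an intensity as the time at which an exponentially decaying threshold crosses it, then optionally emitting a second spike that compensates the quantization residual — can be sketched as follows. This is a minimal illustration of time-based exponential coding under assumed parameter names (`theta0`, `tau`, `tau2`, window `T`), not the authors' exact AEC formulation; see the linked repository for the real implementation.

```python
import math

def encode(x, theta0=1.0, tau=4.0, T=16):
    """Encode an intensity x in (0, theta0] as an integer spike time.

    The neuron fires when the decaying threshold theta0*exp(-t/tau)
    falls to x; the continuous firing time is quantized to [0, T).
    """
    t = tau * math.log(theta0 / x)      # continuous firing time
    return min(int(round(t)), T - 1)    # quantized spike time

def decode(t, theta0=1.0, tau=4.0):
    """Decode a spike time back into an intensity estimate."""
    return theta0 * math.exp(-t / tau)

def encode_with_compensation(x, theta0=1.0, tau=4.0, T=16, tau2=1.0):
    """At most two spikes: the second (hypothetical) compensating spike
    encodes the quantization residual with its own decay constant tau2."""
    t1 = encode(x, theta0, tau, T)
    residual = x - decode(t1, theta0, tau)
    if residual <= 0:
        return t1, None                 # first spike already sufficient
    t2 = min(int(round(tau2 * math.log(theta0 / residual))), T - 1)
    return t1, t2

def decode_with_compensation(t1, t2, theta0=1.0, tau=4.0, tau2=1.0):
    est = decode(t1, theta0, tau)
    if t2 is not None:
        est += theta0 * math.exp(-t2 / tau2)
    return est
```

For example, with the defaults, x = 0.5 encodes to a single spike at t = 3, which decodes to about 0.472; the compensating spike shrinks that quantization error, illustrating why the second spike improves coding accuracy at the cost of extra spike activity (hence the paper's regularization terms on spike counts).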