Spiking neural network
Computer science
Neuromorphic engineering
Quantization (signal processing)
Latency (audio)
Artificial neural network
Artificial intelligence
Clipping (morphology)
Computation
Efficient energy use
Pattern recognition (psychology)
Algorithm
Engineering
Philosophy
Electrical engineering
Telecommunications
Linguistics
Authors
Nguyen-Dong Ho, Ik-Joon Chang
Identifier
DOI: 10.1109/jetcas.2023.3328863
Abstract
Spiking Neural Networks (SNNs) mimic the behavior of biological neurons. Unlike traditional Artificial Neural Networks (ANNs), which operate in a continuous domain and use activation functions to process information, SNNs operate in a discrete, event-driven manner, encoding and communicating data through spikes. This approach offers several advantages, such as efficient computation and lower power consumption, making SNNs particularly attractive for energy-constrained and neuromorphic applications. However, training SNNs poses significant challenges due to the discrete nature of spikes and the non-differentiable behavior they exhibit. As a result, converting pre-trained ANNs into SNNs has gained attention as a convenient alternative. While this approach simplifies training, it introduces certain drawbacks, including high latency. The conversion of ANNs to SNNs also typically incurs a loss of accuracy, attributable to several factors, including quantization, clipping, and timing errors. Previous studies have proposed techniques to mitigate quantization and clipping errors during conversion, but they do not consider timing errors, which degrade SNN accuracy under low-latency conditions. This work introduces the MiCE conversion method, a joint optimization strategy that simultaneously alleviates quantization, clipping, and timing errors. At a moderate latency of 8 time-steps, our converted ResNet-20 achieves classification accuracies of 79.02% and 95.74% on the CIFAR-100 and CIFAR-10 datasets, respectively.
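To make the quantization and clipping errors mentioned in the abstract concrete, the sketch below shows how a ReLU activation is approximated by a spike count over T time-steps in a rate-coded ANN-to-SNN conversion. This is a generic illustration under assumed conventions (threshold `theta`, uniform levels), not the paper's MiCE algorithm; all names are illustrative.

```python
import numpy as np

def snn_activation(a, theta=1.0, T=8):
    """Effective activation a converted SNN neuron can represent
    with at most T spikes and firing threshold theta (illustrative)."""
    # Clipping error: activations above theta saturate (at most T spikes).
    clipped = np.clip(a, 0.0, theta)
    # Quantization error: only T+1 discrete rate levels are representable.
    spikes = np.floor(clipped / theta * T)
    return spikes / T * theta

# Example: small activations and values above threshold both lose accuracy.
a = np.array([0.05, 0.33, 0.80, 1.70])
approx = snn_activation(a, theta=1.0, T=8)
error = a - approx  # quantization error for in-range values,
                    # clipping error for a > theta
```

With only T = 8 time-steps the representable levels are coarse, which is why low-latency conversion is accuracy-sensitive; timing errors (spikes arriving in the wrong time-step relative to downstream integration) are a separate effect not captured by this static view.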