Spiking neural network
Computer science
Neuromorphic engineering
Artificial intelligence
Recurrent neural network
Artificial neural network
Deep learning
Robustness (evolution)
Machine learning
Biochemistry
Gene
Chemistry
Authors
Bojian Yin, Federico Corradi, Sander M. Bohté
Identifier
DOI:10.1038/s42256-023-00650-4
Abstract
With recent advances in learning algorithms, recurrent networks of spiking neurons are achieving performance that is competitive with vanilla recurrent neural networks. However, these algorithms are limited to small networks of simple spiking neurons and modest-length temporal sequences, as they impose high memory requirements, have difficulty training complex neuron models and are incompatible with online learning. Here, we show how the recently developed Forward-Propagation Through Time (FPTT) learning combined with novel liquid time-constant spiking neurons resolves these limitations. Applying FPTT to networks of such complex spiking neurons, we demonstrate online learning of exceedingly long sequences while outperforming current online methods and approaching or outperforming offline methods on temporal classification tasks. The efficiency and robustness of FPTT enable us to directly train a deep and performant spiking neural network for joint object localization and recognition, demonstrating the ability to train large-scale dynamic and complex spiking neural network architectures.

Memory-efficient online training of recurrent spiking neural networks without compromising accuracy is an open challenge in neuromorphic computing. Yin and colleagues demonstrate that training a recurrent neural network consisting of so-called liquid time-constant spiking neurons using an algorithm called Forward-Propagation Through Time allows for online learning and state-of-the-art performance at a reduced computational cost compared with existing approaches.
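The abstract's key architectural idea is a spiking neuron whose membrane time constant is not a fixed parameter but a function of the current input and state. The following is a minimal illustrative sketch of one such "liquid time-constant" update step; the gating parameterization, weight names, and reset rule here are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of a liquid time-constant (LTC) spiking neuron layer:
# the effective leak factor is computed per step from input and
# membrane state, instead of being a fixed constant.
N_IN, N_OUT = 4, 3
W_in = rng.normal(scale=0.5, size=(N_OUT, N_IN))   # input weights (illustrative)
W_tau = rng.normal(scale=0.5, size=(N_OUT, N_IN))  # weights of the tau gate (illustrative)
V_TH = 1.0                                         # firing threshold

def ltc_step(x, v):
    """One time step: update membrane potentials v, emit binary spikes."""
    # Input- and state-dependent leak in (0, 1): the "liquid" time constant,
    # playing the role of exp(-dt/tau) in a standard leaky integrator.
    alpha = 1.0 / (1.0 + np.exp(-(W_tau @ x + v)))
    v = alpha * v + (1.0 - alpha) * (W_in @ x)      # leaky integration of input
    spikes = (v >= V_TH).astype(float)              # threshold crossing
    v = v * (1.0 - spikes)                          # reset neurons that fired
    return spikes, v

v = np.zeros(N_OUT)
for t in range(20):
    x = rng.normal(size=N_IN)
    s, v = ltc_step(x, v)
```

Because the state update at each step depends only on the current input and the previous state, such a neuron pairs naturally with the step-by-step parameter updates of FPTT-style online learning described in the abstract.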