Neuromorphic engineering
Spike (software development)
Computer science
Backpropagation
Spiking neural network
Artificial neural network
Artificial intelligence
Coding (social science)
Energy consumption
Learning rule
Neural coding
Deep learning
Energy (signal processing)
Efficient energy use
Engineering
Mathematics
Statistics
Software engineering
Electrical engineering
Authors
Julian Göltz,Laura Kriener,Andreas Baumbach,Sebastian Billaudelle,Oliver Breitwieser,Benjamin Cramer,Dominik Dold,Ákos F. Kungl,Walter Senn,Johannes Schemmel,K. Meier,Mihai A. Petrovici
Identifiers
DOI:10.1038/s42256-021-00388-x
Abstract
For a biological agent operating under environmental pressure, energy consumption and reaction times are of critical importance. Similarly, engineered systems are optimized for short time-to-solution and low energy-to-solution characteristics. At the level of neuronal implementation, this implies achieving the desired results with as few and as early spikes as possible. With time-to-first-spike coding, both of these goals are inherently emerging features of learning. Here, we describe a rigorous derivation of a learning rule for such first-spike times in networks of leaky integrate-and-fire neurons, relying solely on input and output spike times, and show how this mechanism can implement error backpropagation in hierarchical spiking networks. Furthermore, we emulate our framework on the BrainScaleS-2 neuromorphic system and demonstrate its capability of harnessing the system’s speed and energy characteristics. Finally, we examine how our approach generalizes to other neuromorphic platforms by studying how its performance is affected by typical distortive effects induced by neuromorphic substrates.

Spiking neural networks promise fast and energy-efficient information processing. The ‘time-to-first-spike’ coding scheme, where the time elapsed before a neuron’s first spike is utilized as the main variable, is a particularly efficient approach, and Göltz and Kriener et al. demonstrate that error backpropagation, an essential ingredient for learning in neural networks, can be implemented in this scheme.
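To make the coding scheme concrete, the following is a minimal sketch of time-to-first-spike (TTFS) encoding and readout. It assumes a simple linear intensity-to-latency map (stronger inputs fire earlier) rather than the paper's exact LIF-based dynamics; the function names `ttfs_encode` and `ttfs_decode` are illustrative, not from the authors' code.

```python
def ttfs_encode(intensities, t_max=10.0):
    """Map normalized input intensities in [0, 1] to first-spike times.

    Stronger inputs spike earlier: t = t_max * (1 - x). This is a
    stand-in for the input layer of a TTFS-coded spiking network.
    """
    return [t_max * (1.0 - x) for x in intensities]


def ttfs_decode(spike_times):
    """Read out a classification under TTFS coding.

    The predicted label is the index of the output neuron that fires
    first, so early spikes directly shorten time-to-solution.
    """
    return min(range(len(spike_times)), key=lambda i: spike_times[i])


times = ttfs_encode([0.9, 0.2, 0.5])  # strongest input -> earliest spike
label = ttfs_decode(times)
```

Under this map, learning rules that pull the correct neuron's first spike earlier simultaneously reduce both latency and the number of spikes needed, which is the efficiency argument the abstract makes.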