Topics
Computer science
Artificial intelligence
Spiking neural network
Optical flow
Neuromorphic engineering
Motion estimation
Convolutional neural network
Artificial neural network
Feature extraction
Feature (linguistics)
Spike (software development)
Pattern recognition (psychology)
Unsupervised learning
Computer vision
Image (mathematics)
Philosophy
Software engineering
Linguistics
Authors
Federico Paredes-Vallés,Kirk Y. W. Scheper,Guido C. H. E. de Croon
Identifier
DOI:10.1109/tpami.2019.2903179
Abstract
The combination of spiking neural networks and event-based vision sensors holds the potential of highly efficient and high-bandwidth optical flow estimation. This paper presents the first hierarchical spiking architecture in which motion (direction and speed) selectivity emerges in an unsupervised fashion from the raw stimuli generated with an event-based camera. A novel adaptive neuron model and stable spike-timing-dependent plasticity formulation are at the core of this neural network governing its spike-based processing and learning, respectively. After convergence, the neural architecture exhibits the main properties of biological visual motion systems, namely feature extraction and local and global motion perception. Convolutional layers with input synapses characterized by single and multiple transmission delays are employed for feature and local motion perception, respectively; while global motion selectivity emerges in a final fully-connected layer. The proposed solution is validated using synthetic and real event sequences. Along with this paper, we provide the cuSNN library, a framework that enables GPU-accelerated simulations of large-scale spiking neural networks. Source code and samples are available at https://github.com/tudelft/cuSNN.
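The paper's adaptive neuron model and stable spike-timing-dependent plasticity (STDP) formulation are defined in the full text and implemented in the cuSNN library linked above. Purely as an illustration of the general mechanism the abstract describes, and not the authors' exact formulation, a minimal leaky integrate-and-fire layer with a pair-based STDP update driven by event-camera-style spikes might look like the following sketch; all names, parameter values, and the random input are assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's adaptive neuron or stable STDP rule):
# a leaky integrate-and-fire (LIF) layer driven by binary input spikes,
# with a simple pair-based STDP weight update. All values are illustrative.

rng = np.random.default_rng(0)

n_in, n_out = 64, 8                    # flattened input pixels, output neurons
dt, tau_v, tau_tr = 1.0, 20.0, 20.0    # ms: time step, membrane and trace constants
v_thresh, v_reset = 1.0, 0.0
a_plus, a_minus = 0.01, 0.012          # STDP potentiation / depression rates

w = rng.uniform(0.0, 0.5, size=(n_out, n_in))   # synaptic weights
v = np.zeros(n_out)                              # membrane potentials
pre_trace = np.zeros(n_in)                       # presynaptic spike traces
post_trace = np.zeros(n_out)                     # postsynaptic spike traces

def step(in_spikes):
    """Advance the layer by one time step given a binary input spike vector."""
    global v, pre_trace, post_trace, w

    # Leaky integration of weighted input spikes, threshold, and reset.
    v += dt / tau_v * (-v) + w @ in_spikes
    out_spikes = (v >= v_thresh).astype(float)
    v = np.where(out_spikes > 0, v_reset, v)

    # Exponentially decaying spike traces used by the STDP rule.
    pre_trace += -dt / tau_tr * pre_trace + in_spikes
    post_trace += -dt / tau_tr * post_trace + out_spikes

    # Pair-based STDP: potentiate on postsynaptic spikes (pre-before-post),
    # depress on presynaptic spikes (post-before-pre), then clip to [0, 1].
    w += a_plus * np.outer(out_spikes, pre_trace)
    w -= a_minus * np.outer(post_trace, in_spikes)
    np.clip(w, 0.0, 1.0, out=w)
    return out_spikes

# Drive the layer with random synthetic "events" standing in for an event camera.
for _ in range(1000):
    events = (rng.random(n_in) < 0.05).astype(float)
    step(events)
```

In the architecture described above, such layers would additionally use convolutional connectivity with single or multiple synaptic transmission delays for feature and local motion selectivity; the sketch omits those aspects for brevity.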