MNIST database
Neuromorphic engineering
Spiking neural network
Computer science
Artificial neural network
Spike-timing-dependent plasticity
Hebbian theory
Postsynaptic potential
Artificial intelligence
Synaptic plasticity
Synaptic weight
Learning rule
Exploit
Realization (probability)
Synapse
Edge device
Property (philosophy)
Computer architecture
Materials science
Neuroscience
Microfluidics
Authors
Seongjun Kim,Jeong‐Ick Cho,S.-B. Lee,Yoonchul Shin,Je‐Jun Lee,Taehyuk Jang,Hyeonjung Kim,Junhwa Oh,Sanghyun Lee,Kwanghee Ko,Juncheol Kang,Jun‐Seo Lee,Matthew T. Flavin,Dong‐Ho Kang,Byung Chul Jang,Ji‐Hoon Ahn,Yoonmyung Lee,Sang Min Won,Jin‐Hong Park,Seyong Oh
Identifier
DOI:10.1002/adma.202517613
Abstract
The rapid growth of unstructured data in applications such as autonomous systems and edge AI underscores the urgent need for energy-efficient, real-time computing exemplified by biological brains, where synaptic weights are adjusted according to the timing of neural spikes, known as spike-timing-dependent plasticity (STDP). This work presents the first experimental realization of a multi-channel timing-dependent spiking neural network (TD-SNN) at the board level by integrating photoelectroactive synaptic devices with an analog leaky integrate-and-fire (LIF) neuron circuit. The synaptic devices exploit the precise timing dependency between electrical presynaptic and optical postsynaptic spikes to emulate STDP, enabling reversible and bidirectional modulation of synaptic weights through photoelectroactive doping. By engineering the shape of presynaptic pulses, the devices demonstrate diverse biological STDP learning rules, including Hebbian, anti-Hebbian, all-LTP, and all-LTD. Integrated single- and multi-channel networks exhibit self-learning, system-level adaptive, and competitive behaviors. Experimentally extracted STDP parameters are implemented in SNN simulations, where network performance is determined by the long-term potentiation/depression area ratio (LTP/D area ratio, PDR) of the STDP curve. When PDR ≥ 1.25, robust pattern classification is achieved, reaching up to 90.9% accuracy on MNIST tasks. These results mark a milestone in timing-dependent neuromorphic hardware, demonstrating device-level feasibility toward adaptive and real-time learning hardware.
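The abstract's central quantities, the STDP weight update as a function of spike-timing difference and the LTP/LTD area ratio (PDR) of the resulting curve, can be illustrated with the textbook exponential STDP window. This is a minimal sketch, not the paper's extracted device model: the amplitudes (`a_plus`, `a_minus`) and time constants (`tau_plus`, `tau_minus`) are hypothetical placeholder values, chosen only so that the computed PDR exceeds the 1.25 threshold the abstract reports as sufficient for robust classification.

```python
import math

def stdp_dw(dt, a_plus=1.0, a_minus=0.6, tau_plus=20.0, tau_minus=25.0):
    """Weight change for one pre/post spike pair separated by dt (ms).
    dt > 0 (pre before post) -> potentiation (LTP, Hebbian branch);
    dt < 0 (post before pre) -> depression (LTD).
    Parameters are illustrative, not device-extracted values."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    if dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

def pdr(a_plus=1.0, a_minus=0.6, tau_plus=20.0, tau_minus=25.0):
    """LTP/LTD area ratio of the STDP curve: each exponential lobe
    integrates in closed form to amplitude * time constant."""
    return (a_plus * tau_plus) / (a_minus * tau_minus)

print(round(pdr(), 3))  # 1.333  (>= 1.25, the regime the abstract ties to robust classification)
```

With these placeholder parameters the LTP lobe area is 1.0 × 20 = 20 and the LTD lobe area is 0.6 × 25 = 15, giving PDR ≈ 1.33. Flipping the sign convention of `stdp_dw` would yield the anti-Hebbian rule the abstract also mentions.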