Neuromorphic engineering
MNIST database
Memristor
Materials science
Computer science
Activation function
Artificial neural network
Topology (electrical circuits)
Nanotechnology
Artificial intelligence
Electronic engineering
Electrical engineering
Engineering
Authors
Jungyeop Oh, Sung-Kyu Kim, Changhyeon Lee, Jun-Hwe Cha, Sang Yoon Yang, Sung Gap Im, Cheolmin Park, Byung Chul Jang, Sung-Yool Choi
Identifier
DOI:10.1002/adma.202300023
Abstract
With advances in artificial intelligence services, brain-inspired neuromorphic systems with synaptic devices have recently attracted significant interest as a way to circumvent the von Neumann bottleneck. However, the growing number of deep neural network parameters leads to large power consumption and area overhead in nonlinear neuron electronic circuits, and it also incurs a vanishing gradient problem. Here, a memristor-based compact and energy-efficient neuron device is presented to implement a rectified linear unit (ReLU) activation function. To emulate the volatile and gradual switching of the ReLU function, a copolymer memristor with a hybrid structure is proposed using a copolymer/inorganic bilayer. The functional copolymer film, developed by introducing imidazole functional groups, enables the formation of nanocluster-type pseudo-conductive filaments by boosting the nucleation of Cu nanoclusters, causing gradual switching. The ReLU neuron device is successfully demonstrated by integrating the memristor with amorphous InGaZnO thin-film transistors, and achieves 0.5 pJ energy consumption based on a sub-10 µA operation current and high-speed switching of 650 ns. Furthermore, device-to-system-level simulation using the neuron devices on the MNIST dataset demonstrates that the vanishing gradient problem is effectively resolved in five-layer deep neural networks. The proposed neuron device will enable the implementation of high-density and energy-efficient hardware neuromorphic systems.
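As a rough illustration of the activation function the neuron device emulates (and not the authors' device-to-system simulation), the NumPy sketch below defines ReLU and contrasts its derivative with a sigmoid's, showing why back-propagated gradients through a five-layer network stay usable with ReLU. The function names, the example pre-activation value, and the toy product-of-derivatives calculation are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: passes positive inputs, maps negatives to zero.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 on the active (positive) side, 0 elsewhere.
    return np.where(np.asarray(x) > 0, 1.0, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.asarray(x)))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Back-propagated gradients through n layers scale roughly with the product of
# per-layer activation derivatives. Sigmoid derivatives are at most 0.25, so the
# product shrinks geometrically with depth; ReLU derivatives are exactly 1 on the
# active side, so the product does not decay.
x = 2.0          # example pre-activation value (illustrative)
n_layers = 5     # depth matching the five-layer network in the abstract
print("sigmoid derivative product:", float(np.prod([sigmoid_grad(x)] * n_layers)))  # ~1.3e-5
print("relu derivative product:   ", float(np.prod([relu_grad(x)] * n_layers)))     # 1.0
```

This is only a functional-level picture; the device-level behavior reported in the paper (volatile, gradual switching of Cu nanocluster filaments in the copolymer/inorganic bilayer) is not modeled here.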