Dendritic spine
MNIST database
Overfitting
Artificial neural network
Synaptic weight
Computer science
Synapse
Neuroscience
Spine (molecular biology)
Artificial intelligence
Synaptic plasticity
Biology
Biochemistry
Hippocampal formation
Molecular biology
Receptor
Authors
Feifei Zhao, Yi Zeng, Jun Bai
Abstract
Neural networks with a large number of parameters are prone to overfitting when trained on a relatively small training set. Introducing weight penalties as regularization is a promising technique for addressing this problem. Taking inspiration from the dynamic plasticity of dendritic spines, which plays an important role in the maintenance of memory, this letter proposes a brain-inspired developmental neural network based on dendritic spine dynamics (BDNN-dsd). The dynamic structural changes of dendritic spines include appearing, enlarging, shrinking, and disappearing. Such spine plasticity depends on synaptic activity and can be modulated by experience; in particular, long-lasting synaptic enhancement/suppression (LTP/LTD) is coupled with synapse formation (or enlargement) and elimination (or shrinkage), respectively. Spine density therefore provides an approximate estimate of the total number of synapses between neurons. Motivated by this, we constrain each weight to a tunable bound that is adaptively modulated based on synaptic activity. The dynamic weight bound limits relatively redundant synapses and facilitates contributing synapses. Extensive experiments demonstrate the effectiveness of our method on classification tasks of different complexity with the MNIST, Fashion-MNIST, and CIFAR-10 data sets. Furthermore, compared with dropout and L2 regularization, our method improves the network convergence rate and classification performance even for a compact network.
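The abstract only sketches the mechanism, so the Python snippet below is a minimal illustration under assumed choices, not the authors' BDNN-dsd algorithm: each synapse carries a tunable bound that grows when the synapse is active (analogous to spine enlargement under LTP) and shrinks when it is not (shrinkage under LTD, with elimination once the bound collapses), and weights are clipped into that bound. The activity measure, update rates, and pruning threshold used here are assumptions.

```python
# Hypothetical sketch of an activity-modulated weight bound, loosely inspired by the
# dendritic-spine idea in the abstract. All constants and the activity measure are
# assumed for illustration; the paper's actual update rules are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected layer: 100 inputs -> 10 outputs.
W = rng.normal(0.0, 0.1, size=(10, 100))   # synaptic weights
B = np.full_like(W, 0.2)                   # per-synapse weight bounds ("spine sizes")

ETA_B = 0.01      # how fast a bound grows or shrinks (assumed)
B_MAX = 1.0       # maximum bound, i.e. a fully enlarged spine (assumed)
PRUNE_EPS = 1e-3  # bounds below this count as eliminated synapses (assumed)

def update_bounds(W, B, x):
    """Grow bounds of active synapses, shrink bounds of inactive ones,
    then clamp weights into [-B, B]. `x` is one input activity vector."""
    activity = np.abs(W * x[None, :])              # per-synapse contribution
    active = activity > activity.mean()            # crude LTP/LTD split (assumed)
    B = np.where(active, B + ETA_B, B - ETA_B)     # enlarge / shrink
    B = np.clip(B, 0.0, B_MAX)
    B[B < PRUNE_EPS] = 0.0                         # spine elimination
    W = np.clip(W, -B, B)                          # bounded ("spine-limited") weights
    return W, B

x = rng.normal(size=100)
W, B = update_bounds(W, B, x)
print("surviving synapses:", int((B > 0).sum()), "of", B.size)
```

In a training loop, such a bound update would run alongside the usual gradient step, so that weakly contributing weights are progressively squeezed toward zero while contributing ones keep room to grow, which is the regularizing effect the abstract attributes to the dynamic bound.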