Computer science
Artificial intelligence
Memory model
Convolutional neural network
Generalization
Representation (politics)
Artificial neural network
Machine learning
Shared memory
Parallel computing
Political science
Mathematics
Politics
Mathematical analysis
Law
Authors
Weiping Ding, Yurui Ming, Yukai Wang, Chin‐Teng Lin
Identifier
DOI:10.1016/j.neucom.2021.09.012
Abstract
The long short-term memory (LSTM) network underpins many achievements and breakthroughs, especially in natural language processing. Essentially, it is endowed with certain memory capabilities that boost its performance. Currently, the volume and speed of big data generation are increasing exponentially, and such data require efficient models that can acquire memory-augmented knowledge. In this paper, we propose a memory augmented convolutional neural network (MACNN) that uses self-organizing maps (SOM) as the memory module. First, we describe the potential challenge of applying a convolutional neural network (CNN) alone, to highlight the advantage of augmenting it with SOM memory for better network generalization. Then, we present a corresponding network architecture that incorporates memory to instantiate a distributed knowledge representation mechanism, tactically combining SOM and CNN. Each component of the input vector is connected to a neuron in a two-dimensional lattice. Finally, we evaluate the proposed network on various datasets; the experimental results reveal that MACNN achieves competitive performance, especially on bioimage datasets. We further illustrate the learned representations to interpret the SOM behavior and to explain the achieved results, which indicates that the proposed memory-incorporating model exhibits better performance.
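The abstract describes a SOM used as a memory module, with each input component connected to a neuron in a two-dimensional lattice. The sketch below is a minimal, generic SOM (best-matching-unit search plus a Gaussian neighborhood update), offered only to illustrate that lattice idea; it is not the authors' MACNN implementation, and the class name, method names, and hyperparameters are all assumptions.

```python
import numpy as np

class SOM:
    """Minimal self-organizing map: a 2-D lattice of weight vectors."""

    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Each lattice neuron holds a weight vector matching the input dimension.
        self.weights = rng.random((rows, cols, dim))
        self.rows, self.cols = rows, cols

    def bmu(self, x):
        # Best-matching unit: lattice coordinates of the neuron whose
        # weight vector is closest (Euclidean) to the input x.
        d = np.linalg.norm(self.weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    def fit(self, data, epochs=10, lr0=0.5, sigma0=1.0):
        ii, jj = np.meshgrid(np.arange(self.rows),
                             np.arange(self.cols), indexing="ij")
        for t in range(epochs):
            lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
            sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighborhood
            for x in data:
                bi, bj = self.bmu(x)
                # Gaussian neighborhood centered on the BMU pulls nearby
                # neurons toward the input, preserving lattice topology.
                h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2)
                           / (2 * sigma ** 2))
                self.weights += lr * h[..., None] * (x - self.weights)
```

After training on clustered data, inputs from different clusters map to different lattice neurons, which is the sense in which the lattice acts as an addressable memory of the input distribution.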