Forgetting
Ranking (information retrieval)
Computer science
Property (philosophy)
Artificial intelligence
Convolutional neural network
Class (philosophy)
Feature (linguistics)
Set (abstract data type)
Machine learning
Artificial neural network
Pattern recognition (psychology)
Linguistics
Epistemology
Philosophy
Programming language
Authors
Yu Liu,Xiaopeng Hong,Xiaoyu Tao,Songlin Dong,Jingang Shi,Yihong Gong
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2022-01-01
Volume/Issue: pp. 1-12
Cited by: 1
Identifier
DOI:10.1109/tnnls.2022.3144183
Abstract
Deep models have been shown to be vulnerable to catastrophic forgetting, a phenomenon in which recognition performance on old data degrades when a pretrained model is fine-tuned on new data. Knowledge distillation (KD) is a popular incremental approach to alleviating catastrophic forgetting. However, it usually fixes the absolute values of neural responses for isolated historical instances, without considering the intrinsic structure of the responses produced by a convolutional neural network (CNN) model. To overcome this limitation, we recognize the importance of the global property of the whole instance set and treat it as a behavior characteristic of a CNN model relevant to incremental learning. On this basis: 1) we design an instance neighborhood-preserving (INP) loss to maintain the order of pair-wise instance similarities of the old model in the feature space; 2) we devise a label priority-preserving (LPP) loss to preserve the label ranking lists within instance-wise label probability vectors in the output space; and 3) we introduce an efficient differentiable ranking algorithm for calculating the two loss functions. Extensive experiments conducted on CIFAR100 and ImageNet show that our approach achieves state-of-the-art performance.
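To make the neighborhood-preserving idea concrete, the following is a minimal NumPy sketch, not the paper's actual INP loss: it penalizes, with a hinge term, every anchor/neighbor triple whose pairwise cosine-similarity ordering under the old model's features is reversed under the new model's features. The function names (`pairwise_cosine`, `neighborhood_order_loss`), the brute-force O(n³) triple enumeration, and the hinge formulation are illustrative assumptions; the paper instead uses an efficient differentiable ranking algorithm.

```python
import numpy as np

def pairwise_cosine(feats):
    # L2-normalize each row, then compute all pairwise cosine similarities.
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def neighborhood_order_loss(old_feats, new_feats, margin=0.0):
    """Hinge-style penalty whenever the new model reverses a pairwise
    similarity ordering established by the old model (an illustrative
    sketch of the neighborhood-preserving idea, not the paper's loss)."""
    s_old = pairwise_cosine(old_feats)
    s_new = pairwise_cosine(new_feats)
    n = s_old.shape[0]
    loss, count = 0.0, 0
    # For each anchor i, compare every ordered pair of neighbors (j, k).
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if len({i, j, k}) < 3:
                    continue
                if s_old[i, j] > s_old[i, k]:
                    # Old model ranks j closer to i than k; penalize the
                    # new model if it flips (or nearly flips) this order.
                    loss += max(0.0, margin - (s_new[i, j] - s_new[i, k]))
                    count += 1
    return loss / max(count, 1)
```

With identical old and new features the loss is zero, since every ordering is preserved; swapping two instances' feature vectors flips orderings relative to the remaining anchors and yields a positive penalty.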