Forgetting
Computer science
Grasp
Robustness (evolution)
Artificial intelligence
Class (philosophy)
Machine learning
Benchmark (surveying)
Task (project management)
Philosophy
Economics
Gene
Chemistry
Biochemistry
Management
Programming language
Geography
Linguistics
Geodesy
Authors
Jia-yi Han, Jian-wei Liu
Identifier
DOI:10.1109/ijcnn55064.2022.9892699
Abstract
Recently, the catastrophic forgetting problem has attracted increasing attention: a model's ability to recognize old tasks degrades dramatically when new tasks are added incrementally. Previous studies focused on making the outputs or intermediate features of the new model as similar as possible to those of the old model, but ignored inner-class assignment information. We argue that inner-class information effectively reflects the association patterns and intrinsic relationships among samples, so maintaining the inner-class relationships among task data helps alleviate the negative impact of catastrophic forgetting. Contrastive learning exhibits excellent performance on self-supervised tasks, enhancing robustness and making representations more compact. We propose an Incremental Learning algorithm with Instance-level and Class-level Contrastive loss and Knowledge Distillation (IL-ICCKD) as common constraints. Specifically, we encourage the model to maintain previously learned knowledge from the perspectives of instance characteristics and inner-class assignment distribution. At the same time, the model uses a spatial group-wise enhanced attention mechanism so that the learned representations capture the spatial distribution of sub-features. We extensively evaluate our framework on three popular benchmark datasets and demonstrate performance that surpasses other models.
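The abstract does not give the exact loss formulations, so the following is a minimal PyTorch-style sketch of how the three constraints it names (an instance-level contrastive loss, a class-level contrastive loss over assignment distributions, and knowledge distillation against the frozen old model) could be combined. The function names, temperatures, and weighting factors below are illustrative assumptions, not the authors' implementation; the instance term follows a standard NT-Xent form and the class term contrasts class-assignment columns, as in contrastive-clustering methods.

```python
# Hypothetical sketch of the IL-ICCKD loss terms; all names and
# hyperparameters are assumptions, not taken from the paper.
import torch
import torch.nn.functional as F


def instance_contrastive_loss(z1, z2, temperature=0.5):
    """NT-Xent-style loss between embeddings of two augmented views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                 # (2n, d)
    sim = z @ z.t() / temperature                  # pairwise similarities
    sim.fill_diagonal_(float("-inf"))              # exclude self-pairs
    # positives: view i <-> view i of the other half of the batch
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets.to(z.device))


def class_contrastive_loss(p1, p2, temperature=0.5):
    """Contrast class-assignment columns (one per class) across two views,
    encouraging consistent inner-class assignment distributions."""
    p1, p2 = F.normalize(p1.t(), dim=1), F.normalize(p2.t(), dim=1)  # (C, n)
    c = p1.size(0)
    p = torch.cat([p1, p2], dim=0)
    sim = p @ p.t() / temperature
    sim.fill_diagonal_(float("-inf"))
    targets = torch.cat([torch.arange(c, 2 * c), torch.arange(0, c)])
    return F.cross_entropy(sim, targets.to(p.device))


def distillation_loss(new_logits, old_logits, T=2.0):
    """Standard temperature-scaled KD: match the new model's outputs on old
    classes to the frozen old model's soft outputs."""
    return F.kl_div(F.log_softmax(new_logits / T, dim=1),
                    F.softmax(old_logits / T, dim=1),
                    reduction="batchmean") * T * T


# Illustrative combination with the task's cross-entropy loss
# (the weights a, b, c are assumptions):
# loss = ce_loss \
#        + a * instance_contrastive_loss(z1, z2) \
#        + b * class_contrastive_loss(p1, p2) \
#        + c * distillation_loss(new_logits_on_old_classes, old_logits)
```

In this reading, the instance term preserves per-sample feature characteristics, the class term preserves the inner-class assignment distribution, and the distillation term keeps the new model's predictions on old classes close to the old model's, which together correspond to the "common constraints" described in the abstract.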