Topics
Computer science
Artificial intelligence
Forgetting
Machine learning
Benchmark
Task
Feature
Multi-task learning
Feature learning
Deep learning
Cognitive psychology
Philosophy
Geography
Management
Economics
Linguistics
Psychology
Geodesy
Authors
Shuai Chen, Mingyi Zhang, Junge Zhang, Kaiqi Huang
Source
Journal: IEEE Transactions on Artificial Intelligence [Institute of Electrical and Electronics Engineers]
Date: 2024-01-22
Volume/Issue: 5 (7): 3313-3324
Citations: 3
Identifier
DOI: 10.1109/tai.2024.3355879
Abstract
Despite the impressive performance of deep learning models, they suffer from catastrophic forgetting: a significant decline in overall performance when a model is trained on new classes added incrementally. The primary cause of this phenomenon is overlap or confusion between the feature-space representations of old and new classes. In this study, we examine this issue and propose a model that mitigates it by learning more transferable features. We employ contrastive learning, a recent breakthrough in deep learning that can learn visual representations better than task-specific supervised methods. Specifically, we introduce an exemplar-based continual learning method that uses contrastive learning to learn a task-agnostic, continuously improving feature representation. However, the class imbalance between old and new samples in continual learning can degrade the final learned features. To address this issue, we propose two approaches. First, we use a novel exemplar-based method, called determinantal point processes experience replay, to improve buffer diversity during the memory update. Second, we propose an old-sample compensation weight to resist, during memory retrieval, the corruption of the old model caused by learning new tasks. Experimental results on benchmark datasets demonstrate that our approach performs on par with or better than state-of-the-art methods.
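The abstract names two mechanisms that lend themselves to a short illustration: a determinantal point process (DPP) for picking a diverse exemplar buffer, and a compensation weight that upweights replayed old-class samples against class imbalance. The Python sketch below is not the authors' implementation; it assumes a cosine-similarity kernel with naive greedy log-determinant maximization for the DPP step, and a simple weighted-mean loss for the compensation step. The function names, the ridge term, and the scalar w_old are all illustrative assumptions.

import numpy as np

def dpp_select_exemplars(features, k, ridge=1e-6):
    # Greedily pick k diverse exemplars under a DPP objective.
    # Sketch only: cosine-similarity kernel and naive greedy
    # log-determinant maximization; the paper's exact kernel and
    # update rule may differ.
    X = features / np.linalg.norm(features, axis=1, keepdims=True)
    L = X @ X.T                          # PSD similarity kernel
    selected, remaining = [], list(range(len(L)))
    for _ in range(min(k, len(L))):
        best_i, best_logdet = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            sub = L[np.ix_(idx, idx)] + ridge * np.eye(len(idx))
            _, logdet = np.linalg.slogdet(sub)   # "volume" of the candidate set
            if logdet > best_logdet:
                best_i, best_logdet = i, logdet
        selected.append(best_i)
        remaining.remove(best_i)
    return selected                       # indices into `features`

def compensated_loss(per_sample_loss, is_old, w_old=2.0):
    # Weighted mean loss that upweights replayed old-class samples.
    # `w_old` is a hypothetical scalar standing in for the paper's
    # old-sample compensation weight.
    w = np.where(is_old, w_old, 1.0)
    return float((w * per_sample_loss).sum() / w.sum())

# Tiny usage example with random features for 100 candidate exemplars.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(100, 32))
    buffer_idx = dpp_select_exemplars(feats, k=10)
    print("buffer:", buffer_idx)

Greedy selection is used here because exact DPP MAP inference is NP-hard; the greedy log-determinant heuristic is a standard approximation, and the determinant of the selected submatrix grows when the chosen exemplars are mutually dissimilar, which is the diversity effect the abstract attributes to the buffer update.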