Overfitting
Computer science
Inference
Artificial intelligence
Class (philosophy)
Feature (linguistics)
Machine learning
Session (web analytics)
Coding (set theory)
Generalization
Benchmark (surveying)
Pattern recognition (psychology)
Artificial neural network
Mathematics
Set (abstract data type)
Geodesy
Programming language
Geography
Philosophy
World Wide Web
Mathematical analysis
Linguistics
Authors
Weiwei Zhou,Guoqiang Xiao,Michael S. Lew,Song Wu
Identifier
DOI:10.1145/3652583.3658098
Abstract
Few-Shot Class-Incremental Learning (FSCIL) aims to recognize novel classes from a limited number of samples after training on abundant base-class data, while maintaining performance on the old classes. The challenge, however, is that the limited data from new classes leads not only to overfitting but also to catastrophic forgetting. To address these two issues, we propose a causal inference strategy within the mainstream FSCIL framework, which encourages the model to learn significant knowledge in the base training session and enhances its ability to extract features for unseen classes in the incremental sessions, by improving the learning of causal relationships between features and predictions for perturbed samples. In addition, to improve the learning of new tasks in the incremental sessions while preventing the model from overfitting to the novel-class data, we freeze the feature extractor and add a Fourier transform after it in the incremental sessions. This denoises the features, strengthens the features of the novel classes, and suppresses errors that arise when features of a limited number of samples are extracted directly by the feature extractor. Extensive experiments on the CIFAR100, Caltech-UCSD Birds-200-2011, and miniImageNet datasets show that our proposed framework achieves state-of-the-art performance on FSCIL. The source code of our framework is available at https://github.com/SWU-CS-MediaLab/CIFSCIL.
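The abstract's Fourier-transform step after the frozen feature extractor can be illustrated with a minimal sketch. This is an assumption about the general idea (frequency-domain denoising of feature vectors), not the paper's exact mechanism; the function name `fourier_denoise` and the `keep_ratio` parameter are hypothetical.

```python
import numpy as np

def fourier_denoise(features, keep_ratio=0.5):
    """Hypothetical sketch: low-pass filter feature vectors in the
    frequency domain. `keep_ratio` (an assumed knob, not from the
    paper) sets the fraction of low-frequency components retained."""
    spec = np.fft.rfft(features, axis=-1)       # per-sample 1-D FFT
    cutoff = int(spec.shape[-1] * keep_ratio)   # keep low frequencies
    spec[..., cutoff:] = 0                      # zero high-frequency content
    return np.fft.irfft(spec, n=features.shape[-1], axis=-1)

# Toy usage: a smooth "feature" signal corrupted by high-frequency noise.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 2 * np.pi, 64))[None, :]   # one clean feature vector
noisy = x + 0.3 * rng.standard_normal((1, 64))
denoised = fourier_denoise(noisy, keep_ratio=0.2)
```

With a small `keep_ratio`, the reconstruction discards most of the noise energy while retaining the dominant low-frequency structure, which matches the abstract's claim that the transform suppresses errors in features extracted from few samples.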