Authors
Shuo Li, Fang Liu, Licheng Jiao, Lingling Li, Puhua Chen, Xu Liu, Wenping Ma
Identifier
DOI:10.1109/tcsvt.2025.3525545
Abstract
Few-Shot Class-Incremental Learning (FSCIL) faces a severe stability-plasticity challenge: it must continually learn new classes from only a small number of training samples without forgetting the knowledge of previously seen old classes. To alleviate this challenge, we propose a novel method called Prompt-based Concept Learning (PCL) for FSCIL, which generalizes conceptual knowledge learned from old classes to new classes by simulating human learning capabilities. In our PCL, the base session jointly learns common basic concepts from the training data and the class-concept weights of each class in a prompt-learning manner; in each incremental session, the class-concept weights between the new classes and the previously learned basic concepts are learned to achieve incremental learning. Furthermore, to avoid catastrophic forgetting, we propose a distribution estimation module that retains the feature distributions of previously seen classes and a data replay module that randomly samples features of previously seen classes in incremental sessions. We verify the effectiveness of our PCL on widely used benchmarks, including miniImageNet, CIFAR-100, and CUB-200. Experimental results show that PCL achieves competitive results compared with other state-of-the-art methods; in particular, it reaches an average accuracy of 94.02% across all sessions on the miniImageNet benchmark.
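The abstract gives no implementation details, so the following NumPy sketch only illustrates the two mechanisms it names: scoring classes through class-concept weights over shared basic concepts, and estimating per-class feature distributions that are later sampled for replay. All dimensions, the cosine-similarity concept activations, the Gaussian form of the stored distributions, and the weight initialization are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical dimensions (assumptions, not from the paper) ---
feat_dim = 64        # backbone feature dimension
num_concepts = 20    # number of shared basic concepts
num_base = 5         # classes learned in the base session
num_new = 2          # classes arriving in one incremental session
shots = 5            # few-shot samples per new class

# Basic concepts learned in the base session (random stand-ins here).
concepts = rng.normal(size=(num_concepts, feat_dim))
concepts /= np.linalg.norm(concepts, axis=1, keepdims=True)

def concept_activations(features):
    """Cosine similarity between each feature and each basic concept."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    return f @ concepts.T                              # (N, num_concepts)

def class_scores(features, class_concept_w):
    """Score classes by weighting concept activations with class-concept weights."""
    return concept_activations(features) @ class_concept_w.T   # (N, num_classes)

# Base-session class-concept weights (jointly optimized with the concepts in practice).
W_base = rng.normal(size=(num_base, num_concepts))

# --- Distribution estimation: keep one Gaussian per seen class ---
def estimate_distribution(class_features):
    mu = class_features.mean(axis=0)
    cov = np.cov(class_features, rowvar=False) + 1e-4 * np.eye(feat_dim)
    return mu, cov

# Pretend these are base-session features grouped by class label.
base_feats = {c: rng.normal(loc=c, size=(50, feat_dim)) for c in range(num_base)}
memory = {c: estimate_distribution(f) for c, f in base_feats.items()}

# --- Data replay: sample pseudo-features of previously seen classes ---
def replay(memory, n_per_class):
    feats, labels = [], []
    for c, (mu, cov) in memory.items():
        feats.append(rng.multivariate_normal(mu, cov, size=n_per_class))
        labels.append(np.full(n_per_class, c))
    return np.vstack(feats), np.concatenate(labels)

# Incremental session: few-shot features of new classes plus replayed old features.
new_feats = rng.normal(loc=10.0, size=(shots * num_new, feat_dim))
old_feats, old_labels = replay(memory, n_per_class=shots)

# Only the new rows of the class-concept weight matrix would be optimized here;
# this sketch simply initializes them from the concept activations of the few shots.
W_new = concept_activations(new_feats).reshape(num_new, shots, num_concepts).mean(axis=1)
W_all = np.vstack([W_base, W_new])

print(class_scores(old_feats, W_all).shape)   # (num_base * shots, num_base + num_new)
```

In this reading, incremental learning stays cheap because the shared concepts are frozen after the base session and each new class only adds one row of class-concept weights, while the stored Gaussians let old-class features be regenerated without keeping raw exemplars; whether the paper uses exactly this parameterization is not stated in the abstract.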