Computer science
Forgetting
Artificial intelligence
Machine learning
Class (philosophy)
Incremental learning
Benchmark (surveying)
Deep learning
Classifier (UML)
Source code
Geodesy
Linguistics
Operating system
Philosophy
Geography
Authors
Da-Wei Zhou, Qiwei Wang, Zhihong Qi, Han-Jia Ye, De-Chuan Zhan, Ziwei Liu
Identifier
DOI: 10.1109/tpami.2024.3429383
Abstract
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results on many vision tasks in the closed world. However, novel classes emerge from time to time in our ever-changing world, requiring a learning system to acquire new knowledge continually. Class-Incremental Learning (CIL) enables the learner to incorporate the knowledge of new classes incrementally and build a universal classifier over all seen classes. Correspondingly, when directly training the model on new class instances, a fatal problem occurs: the model tends to catastrophically forget the characteristics of former classes, and its performance drastically degrades. There have been numerous efforts to tackle catastrophic forgetting in the machine learning community. In this paper, we comprehensively survey recent advances in class-incremental learning and summarize these methods from several aspects. We also provide a rigorous and unified evaluation of 17 methods on benchmark image classification tasks to empirically identify the characteristics of different algorithms. Furthermore, we notice that the current comparison protocol ignores the influence of the memory budget in model storage, which may result in unfair comparisons and biased results. Hence, we advocate fair comparison by aligning the memory budget in evaluation, together with several memory-agnostic performance measures. The source code is available at https://github.com/zhoudw-zdw/CIL_Survey/.
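To make the setting described in the abstract concrete, the following is a minimal sketch of class-incremental learning with exemplar replay, one common family of methods covered by such surveys. It is not the survey's released code: the synthetic Gaussian data, the small MLP, the two-classes-per-task split, and the per-class memory size are all illustrative assumptions. Classes arrive in stages, a small exemplar buffer of old classes is replayed alongside new data, and accuracy over all seen classes after each stage exposes forgetting.

```python
# Minimal class-incremental learning sketch with exemplar replay (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

NUM_CLASSES, FEAT_DIM, PER_CLASS = 6, 16, 100
TASKS = [list(range(i, i + 2)) for i in range(0, NUM_CLASSES, 2)]  # 2 new classes per stage
MEMORY_PER_CLASS = 10  # assumed fixed exemplar budget per seen class

# Synthetic data: one Gaussian blob per class.
means = torch.randn(NUM_CLASSES, FEAT_DIM) * 3
def make_split(cls_ids):
    x = torch.cat([means[c] + torch.randn(PER_CLASS, FEAT_DIM) for c in cls_ids])
    y = torch.cat([torch.full((PER_CLASS,), c) for c in cls_ids])
    return x, y

model = nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

memory_x, memory_y = [], []  # rehearsal buffer holding stored exemplars of old classes
seen = []

for task_id, new_classes in enumerate(TASKS):
    x_new, y_new = make_split(new_classes)
    # Mix the new-class data with replayed exemplars of previously seen classes.
    if memory_x:
        x_train = torch.cat([x_new, torch.cat(memory_x)])
        y_train = torch.cat([y_new, torch.cat(memory_y)])
    else:
        x_train, y_train = x_new, y_new

    for _ in range(200):
        opt.zero_grad()
        loss = F.cross_entropy(model(x_train), y_train)
        loss.backward()
        opt.step()

    # Store a few exemplars per new class for replay in later stages.
    for c in new_classes:
        idx = (y_new == c).nonzero(as_tuple=True)[0][:MEMORY_PER_CLASS]
        memory_x.append(x_new[idx])
        memory_y.append(y_new[idx])

    seen += new_classes
    # Evaluate over all classes seen so far; dropping accuracy on old classes signals forgetting.
    x_test, y_test = make_split(seen)
    acc = (model(x_test).argmax(1) == y_test).float().mean().item()
    print(f"stage {task_id}: seen classes {seen}, accuracy {acc:.3f}")
```

Setting MEMORY_PER_CLASS to 0 reproduces the naive fine-tuning baseline and the catastrophic forgetting the abstract describes; aligning the total memory budget across methods, as the survey advocates, would mean giving every compared algorithm the same exemplar-plus-model storage.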