Margin (machine learning)
Class (philosophy)
Computer science
Artificial intelligence
Meta-learning (computer science)
Machine learning
Shot (few-shot learning)
Pattern recognition (psychology)
Task (project management)
Engineering
Materials science
Metallurgy
Systems engineering
Authors
Hang Ran, Weijun Li, Lusi Li, Songsong Tian, Xin Ning, Prayag Tiwari
Identifier
DOI:10.1016/j.ipm.2024.103664
Abstract
Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with only a limited number of samples per class. It faces two issues: forgetting previously learned classes and overfitting to the few-shot classes. An effective strategy is to learn features that remain discriminative across both the base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification and thus a basis for computing the margin adaptively. However, the theory is derived for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, which computes the optimal margin adaptively and maintains it at each incremental session. Specifically, we first compute the theoretically optimal margin based on NC theory. We then introduce a novel loss function whose value is minimized precisely when the inter-class margin reaches this theoretical optimum. Motivated by the intuition that "learning how to preserve the margin" matches meta-learning's goal of "learning how to learn", we embed the loss function in base-session meta-training so that the margin is preserved in future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, which achieves superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL.
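The abstract only sketches the margin computation, so a bit of background may help. NC theory predicts that, at the end of training on balanced data, the class-mean features and classifier vectors of a K-class problem align with a simplex equiangular tight frame (ETF): K unit vectors whose pairwise cosine similarity is -1/(K-1), the maximal equiangular separation and hence the "theoretically optimal inter-class margin" the abstract refers to. The PyTorch sketch below shows one way to construct such an ETF and a loss that is minimized exactly when each feature aligns with its class's ETF vertex; the function names and the cosine-based loss are illustrative assumptions, not the paper's actual implementation (see the linked repository for that).

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Fixed simplex equiangular tight frame (ETF): K unit vectors in R^d
    whose pairwise cosine similarity is -1/(K-1), the maximal equiangular
    inter-class margin predicted by Neural Collapse theory."""
    assert feat_dim >= num_classes, "ETF needs feature dim >= number of classes"
    # Partial orthogonal matrix U (d x K) with U^T U = I_K (reduced QR).
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    # M = sqrt(K / (K - 1)) * U (I_K - 1/K * 11^T); columns are ETF vertices.
    center = torch.eye(num_classes) - torch.ones(num_classes, num_classes) / num_classes
    m = (num_classes / (num_classes - 1)) ** 0.5 * u @ center
    # Columns already have unit norm; normalize again for numerical safety.
    return F.normalize(m, dim=0)

def etf_alignment_loss(features: torch.Tensor, labels: torch.Tensor,
                       etf: torch.Tensor) -> torch.Tensor:
    """Illustrative loss: zero exactly when each normalized feature coincides
    with its class's ETF vertex, i.e. when the inter-class margin attains the
    NC-optimal value of -1/(K-1) in pairwise cosine."""
    feats = F.normalize(features, dim=1)
    target = etf[:, labels].t()                       # (batch, d) vertex per sample
    return (1.0 - (feats * target).sum(dim=1)).mean()  # mean of 1 - cosine similarity

# Hypothetical usage with a 512-d backbone over 100 base classes:
etf = simplex_etf(num_classes=100, feat_dim=512)
features = torch.randn(32, 512)             # stand-in for backbone output
labels = torch.randint(0, 100, (32,))
loss = etf_alignment_loss(features, labels, etf)
```

Because the ETF target is fixed rather than learned, a loss of this form can in principle be reused unchanged across incremental sessions, which is consistent with the abstract's idea of preserving the margin from base-session meta-training into later meta-testing sessions.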