Artificial intelligence
Calibration
Computer science
Shot (pellet)
Class (philosophy)
One-shot
Pattern recognition (psychology)
Computer vision
Mathematics
Statistics
Engineering
Mechanical engineering
Chemistry
Organic chemistry
Authors
Binghao Liu,Boyu Yang,Lingxi Xie,Ren Wang,Qi Tian,Qixiang Ye
Identifier
DOI:10.1109/tpami.2023.3273291
Abstract
Few-shot class-incremental learning (FSCIL) faces the challenges of memorizing old class distributions and estimating new class distributions given few training samples. In this study, we propose a learnable distribution calibration (LDC) approach to systematically solve these two challenges in a unified framework. LDC is built upon a parameterized calibration unit (PCU), which initializes biased distributions for all classes based on classifier vectors (memory-free) and a single covariance matrix. The covariance matrix is shared by all classes, so the memory cost is fixed. During base training, PCU is endowed with the ability to calibrate biased distributions by recurrently updating sampled features under the supervision of real distributions. During incremental learning, PCU recovers distributions for old classes to avoid 'forgetting', while estimating distributions and augmenting samples for new classes to alleviate the 'over-fitting' caused by the biased distributions of few-shot samples. LDC is theoretically plausible, as it can be formulated as a variational inference procedure. It improves FSCIL's flexibility, as the training procedure requires no class-similarity prior. Experiments on the CUB200, CIFAR100, and mini-ImageNet datasets show that LDC outperforms the state of the art by 4.64%, 1.98%, and 3.97%, respectively. LDC's effectiveness is also validated in few-shot learning scenarios. The code is available at https://github.com/Bibikiller/LDC .
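The core idea in the abstract can be illustrated with a minimal sketch of distribution-based feature augmentation: each class is modeled as a Gaussian whose mean comes from a memory-free source (here, stand-in classifier vectors) and whose covariance matrix is shared across all classes, so memory cost does not grow with the number of classes. This is a toy illustration under assumed shapes and random data, not the paper's actual PCU; `class_means`, `shared_cov`, and `augment_features` are hypothetical names.

```python
import numpy as np

rng = np.random.default_rng(0)

feat_dim = 8    # toy feature dimensionality
n_classes = 3   # number of classes
n_aug = 16      # synthetic samples drawn per class

# Stand-in classifier weight vectors used as memory-free class means.
class_means = rng.normal(size=(n_classes, feat_dim))

# One covariance matrix shared by all classes (symmetric positive definite),
# so the memory footprint is fixed regardless of how many classes arrive.
A = rng.normal(size=(feat_dim, feat_dim)) * 0.1
shared_cov = A @ A.T + 0.01 * np.eye(feat_dim)

def augment_features(mean, cov, n, rng):
    """Draw n synthetic features from the class Gaussian N(mean, cov)."""
    return rng.multivariate_normal(mean, cov, size=n)

# Augment every class; in a few-shot setting these synthetic features
# would supplement the scarce real samples during classifier training.
augmented = {c: augment_features(class_means[c], shared_cov, n_aug, rng)
             for c in range(n_classes)}
```

In LDC itself the sampled features are additionally refined recurrently under supervision of real distributions; the sketch above only shows the shared-covariance sampling step.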