Keywords
Forgetting
Computer science
Class (philosophy)
Artificial intelligence
Base (topology)
Machine learning
Synthetic aperture radar
Encoder
Pattern recognition (psychology)
Data mining
Mathematics
Linguistics
Operating system
Mathematical analysis
Philosophy
Authors
Yuan Tai, Yihua Tan, Shengzhou Xiong, Jinwen Tian
Identifier
DOI: 10.1109/TGRS.2023.3248601
Abstract
Recently, few-shot learning (FSL) has received increasing attention because of the difficulty of collecting samples in some application scenarios, such as maritime surveillance using synthetic aperture radar (SAR) or infrared images. In such scenarios, it is also a common requirement that the model recognize novel classes incrementally, namely class-incremental learning (CIL). Considering this requirement, this article proposes a novel problem of recognizing novel classes incrementally when samples of both the base and novel classes are scarce. It is called complete few-shot CIL (C-FSCIL) to distinguish it from FSCIL, which assumes sufficient samples of the base classes. Specifically, two challenges of C-FSCIL are addressed: 1) distance measurement is used to recognize novel classes incrementally, but the encoder is difficult to learn well when base class samples are scarce, so some features become unsuitable for computing the distance and performance degrades; and 2) the catastrophic forgetting problem becomes harder to alleviate than in FSCIL because of the scarcity of base class samples. To tackle both challenges, the mine-distill-prototypes (MDP) algorithm is proposed, which consists of two parts: 1) a prototypes-distillation (PD) network that learns to distill the features and prototypes into a lower-dimensional space in which ineffective features are eliminated, and 2) a prototypes-weight (PW) network and a prototypes-selection (PS) training strategy for the catastrophic forgetting problem, which aim to capture the relationship between the base and novel prototypes. The superior performance of the proposed algorithm is demonstrated by experiments on three datasets.
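To make the distance-measurement idea in the abstract concrete, below is a minimal sketch of prototype-based class-incremental classification: class prototypes are averaged support embeddings, a fixed random projection stands in for mapping features and prototypes into a lower-dimensional space, and a query is assigned to the nearest prototype. This is not the authors' MDP/PD/PW implementation; every name, dimension, and the random "encoder" features are assumptions for illustration only.

```python
# Sketch of nearest-prototype, class-incremental classification (assumed setup,
# NOT the paper's MDP algorithm).
import numpy as np

rng = np.random.default_rng(0)
feat_dim, low_dim, shots = 64, 16, 5

# Fixed random projection: a stand-in for the idea of mapping features and
# prototypes to a lower-dimensional space (the paper learns this with its PD network).
proj = rng.normal(size=(feat_dim, low_dim)) / np.sqrt(feat_dim)

def embed(x):
    """Project raw encoder features into the lower-dimensional space."""
    return x @ proj

def build_prototypes(features, labels):
    """Average the embedded support samples of each class into one prototype."""
    return {int(c): features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify_by_distance(query, prototypes):
    """Assign the query to the class whose prototype is nearest (Euclidean)."""
    classes = list(prototypes)
    dists = [np.linalg.norm(query - prototypes[c]) for c in classes]
    return classes[int(np.argmin(dists))]

# Hypothetical base session: 3 classes, 5 shots each, 64-D "encoder" features.
support = embed(rng.normal(size=(3 * shots, feat_dim)))
support_labels = np.repeat(np.arange(3), shots)
prototypes = build_prototypes(support, support_labels)

# Incremental session: a novel class arrives with its own few shots; its
# prototype is added alongside the base prototypes without retraining them.
novel_support = embed(rng.normal(loc=2.0, size=(shots, feat_dim)))
prototypes[3] = novel_support.mean(axis=0)

query = embed(rng.normal(loc=2.0, size=feat_dim))
print("predicted class:", classify_by_distance(query, prototypes))
```

In this sketch, adding a class only adds a prototype, which is why distance-based recognition suits incremental settings; the abstract's point is that with scarce base samples the embedding itself is weak, which is what the proposed PD, PW, and PS components are designed to compensate for.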