Forgetting
Softmax function
Incremental learning
Distillation
Computer science
Artificial intelligence
Machine learning
Class (philosophy)
Function (biology)
Artificial neural network
Linguistics
Evolutionary biology
Biology
Philosophy
Organic chemistry
Chemistry
Authors
Darian M. Onchis, Ioan-Valentin Samuila
Identifier
DOI: 10.1109/synasc54541.2021.00039
Abstract
Incremental learning models belong to an area of machine learning algorithms that deal with streams of data arriving sequentially over time. As such, these algorithms are vulnerable to catastrophic forgetting, the tendency of incremental models to forget past information when new data is added. This phenomenon can be attenuated with a knowledge distillation component and with the support of a sample memory of past classes. Here, we propose a double-distillation incremental learning recipe for the class-incremental learning scenario, starting from a proof of the classification limits of the relaxed SoftMax function and coupling it with a modified version of the iCaRL algorithm in which the last classification layer is remodeled by varying the temperature parameter. We performed extensive experiments to outline the advantages of our method compared to the standard algorithm.
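The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the two ingredients named in the abstract: a temperature-scaled ("relaxed") softmax and a knowledge-distillation term computed against a frozen copy of the previous model, combined with the usual cross-entropy loss on new classes. The temperature value, the mixing weight `alpha`, and the function names are illustrative assumptions, not the paper's settings.

```python
import torch
import torch.nn.functional as F


def relaxed_softmax(logits: torch.Tensor, temperature: float = 2.0) -> torch.Tensor:
    """Temperature-scaled ("relaxed") softmax.

    temperature > 1 softens the output distribution; temperature = 1 recovers
    the standard softmax. The default of 2.0 is an assumption for illustration.
    """
    return F.softmax(logits / temperature, dim=-1)


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between temperature-softened teacher and student outputs,
    the standard knowledge-distillation term used to preserve responses on
    previously learned classes."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scaling by T^2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2


def incremental_loss(student_logits: torch.Tensor,
                     teacher_logits: torch.Tensor,
                     targets: torch.Tensor,
                     temperature: float = 2.0,
                     alpha: float = 0.5) -> torch.Tensor:
    """Combine supervision on new data with distillation against the frozen
    previous model (iCaRL-style); alpha is a hypothetical mixing weight."""
    ce = F.cross_entropy(student_logits, targets)                 # new-class supervision
    kd = distillation_loss(student_logits, teacher_logits,        # old-class retention
                           temperature)
    return alpha * ce + (1.0 - alpha) * kd
```

As a usage sketch, at each incremental step the previous model is frozen, its logits on the current batch (including exemplars from the sample memory) serve as `teacher_logits`, and the new model is trained by minimizing `incremental_loss`.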