Authors
Qinglai Wei, Weiqin Zhang
Identifier
DOI:10.1016/j.neunet.2024.106487
Abstract
Class incremental learning aims to solve representation learning and classification while avoiding catastrophic forgetting as new categories arrive. In this work, a unified method named Balanced Embedding Discrimination Maximization (BEDM) is developed to make the intermediate embedding more distinctive. Specifically, we impose an orthogonality constraint based on a doubly-blocked Toeplitz matrix to minimize the correlation between convolution kernels, and introduce an algorithm for similarity visualization. Furthermore, uneven samples and distribution shift between old and new tasks result in strongly biased classifiers. To mitigate this imbalance, we propose an adaptive balance weighting in the softmax that dynamically compensates for under-represented categories. In addition, hybrid embedding learning is introduced to preserve knowledge from old models while involving fewer hyper-parameters than conventional knowledge distillation. Our proposed method outperforms existing approaches on three mainstream benchmark datasets. Moreover, visualizations show that our method produces a more uniform similarity histogram and a more stable spectrum; Grad-CAM and t-SNE visualizations further confirm its effectiveness. Code is available at https://github.com/wqzh/BEDM.
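The orthogonality constraint on convolution kernels can be illustrated with a minimal sketch. The abstract states that BEDM formulates the constraint through a doubly-blocked Toeplitz matrix of the convolution; the simpler filter-level version below (penalizing the Gram matrix's distance from the identity, ||W Wᵀ − I||²_F) conveys the same decorrelation idea but is an assumption, not the paper's exact formulation.

```python
def kernel_orthogonality_penalty(kernels):
    """Penalty encouraging decorrelated convolution kernels.

    `kernels` is a list of flattened filters (lists of floats).
    Computes the Gram matrix G = W W^T and penalizes its squared
    Frobenius distance from the identity.  Note: BEDM expresses the
    constraint via a doubly-blocked Toeplitz form of the convolution
    operator; this is only a simplified filter-level sketch.
    """
    n = len(kernels)
    penalty = 0.0
    for i in range(n):
        for j in range(n):
            # inner product between filter i and filter j
            g = sum(a * b for a, b in zip(kernels[i], kernels[j]))
            target = 1.0 if i == j else 0.0
            penalty += (g - target) ** 2
    return penalty
```

In training, such a penalty would be added to the task loss with a weighting coefficient; orthonormal filters drive it to zero, while redundant (highly correlated) filters are penalized.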
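The adaptive balance weighting in the softmax can likewise be sketched. The abstract does not give BEDM's exact weighting, so the snippet below follows the generic balanced-softmax idea of shifting each logit by the log of its class sample count during training, which compensates for under-represented (e.g. newly added) categories; treat the details as an illustrative assumption.

```python
import math

def balanced_softmax(logits, class_counts):
    """Softmax with a per-class balance term (generic sketch).

    Adds log(n_c) to each logit during training so the loss accounts
    for the class prior: head classes absorb their frequency advantage
    here instead of biasing the classifier.  `class_counts[c]` is the
    number of training samples seen for class c.  NOTE: BEDM's exact
    adaptive weighting is not specified in the abstract.
    """
    adjusted = [z + math.log(n) for z, n in zip(logits, class_counts)]
    m = max(adjusted)                      # subtract max for stability
    exps = [math.exp(a - m) for a in adjusted]
    total = sum(exps)
    return [e / total for e in exps]
```

With equal class counts this reduces to a plain softmax; when counts are imbalanced, the training-time probability mass shifts toward frequent classes, so the learned logits are pushed to favor rare classes correspondingly.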