Computer science
Distillation
Similarity (geometry)
Artificial intelligence
Relational database
Data mining
Chromatography
Chemistry
Image (mathematics)
Authors
Xiaomeng Xin, Heping Song, Jianping Gou
Identifier
DOI: 10.1109/icassp48485.2024.10447596
Abstract
Previous relation-based knowledge distillation methods tend to construct a global similarity relationship matrix within a mini-batch while ignoring neighbourhood relationship knowledge. In this paper, we propose a new similarity-based relational knowledge distillation method that transfers neighbourhood relationship knowledge by selecting the K nearest neighbours of each sample. Our method consists of two components: Neighbourhood Feature Relationship Distillation and Neighbourhood Logits Relationship Distillation. We perform extensive experiments on the CIFAR-100 and Tiny ImageNet classification datasets and show that our method outperforms state-of-the-art knowledge distillation methods. Our code is available at: https://github.com/xinxiaoxiaomeng/NRKD.git.
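
The abstract describes transferring relational knowledge restricted to each sample's K nearest neighbours rather than the full mini-batch similarity matrix. The PyTorch sketch below is a minimal illustration of that general idea only: the function name neighbourhood_relation_loss, the use of cosine similarity, teacher-selected neighbours, the MSE matching loss, and the value of k are all assumptions for illustration, not the authors' implementation (refer to the linked NRKD repository for the actual method).

# Illustrative sketch of a K-nearest-neighbour relational distillation loss.
# Names and design choices here are assumptions, not the NRKD implementation.
import torch
import torch.nn.functional as F


def neighbourhood_relation_loss(student_feat: torch.Tensor,
                                teacher_feat: torch.Tensor,
                                k: int = 4) -> torch.Tensor:
    """Match student and teacher pairwise similarities, restricted to each
    sample's K nearest neighbours in the mini-batch (chosen by the teacher)."""
    # Cosine-normalise features so pairwise similarities lie in [-1, 1].
    s = F.normalize(student_feat, dim=1)
    t = F.normalize(teacher_feat, dim=1)

    sim_s = s @ s.t()  # (B, B) student similarity matrix
    sim_t = t @ t.t()  # (B, B) teacher similarity matrix

    # For each sample, pick the K most similar other samples according to the
    # teacher; the top entry is the sample itself (similarity 1), so drop it.
    _, idx = sim_t.topk(k + 1, dim=1)
    idx = idx[:, 1:]  # (B, K) neighbour indices

    # Gather the neighbourhood similarity entries and match them with MSE.
    nbr_s = sim_s.gather(1, idx)
    nbr_t = sim_t.gather(1, idx)
    return F.mse_loss(nbr_s, nbr_t)


if __name__ == "__main__":
    # Toy usage: an 8-sample batch with 64-dim features for both networks
    # (in practice a projection is needed if the dimensions differ).
    student = torch.randn(8, 64)
    teacher = torch.randn(8, 64)
    print(neighbourhood_relation_loss(student, teacher, k=4).item())

In a full training loop, a loss of this kind would typically be added to the standard cross-entropy and logit-distillation terms with a weighting hyperparameter; the abstract indicates the authors apply the neighbourhood idea at both the feature and the logit level.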