Distillation
Artificial intelligence
Mathematics
Chromatography
Computer science
Machine learning
Chemistry
Authors
Chakkrit Termritthikun,Ayaz Umer,Suwichaya Suwanwimolkul,Ivan Lee
Source
Journal: ICT Express
[Elsevier BV]
Date: 2024-11-28
Volume/Issue: 11 (2): 364-370
Citations: 1
Identifiers
DOI:10.1016/j.icte.2024.11.004
Abstract
In saliency prediction, Knowledge Distillation (KD) is leveraged to improve the predictive performance of compact Student Networks. However, the challenge is searching for an optimal teacher–student pair while handling the unavailability of large-scale annotations in Pseudoknowledge Distillation (PKD). To overcome this challenge, a semi-supervised method, Semi-PKD, is proposed. This method involves pseudo-label generation on unlabeled data by a Teacher Network trained using the exponential moving average KD (EMA-KD) method. The EMA-KD method utilizes only the Student Network by acquiring self-knowledge, solving the problem of optimal teacher–student pair selection. Semi-PKD outperforms other state-of-the-art saliency prediction models across various evaluation metrics. The code is available at https://github.com/chakkritte/Semi-PKD.
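The core mechanism the abstract describes, a teacher whose weights are an exponential moving average of the student's weights, so that only the student is directly optimized, can be sketched as follows. This is a minimal illustration of the EMA idea only, not the authors' implementation; the function name `ema_update` and the `decay` value are hypothetical and not taken from the paper.

```python
import numpy as np

def ema_update(teacher_params, student_params, decay=0.9):
    """Update teacher weights as an exponential moving average of student weights.

    Hedged sketch of the EMA-KD idea from the abstract: the teacher is never
    trained by gradient descent; it only tracks the student, so no separate
    teacher-student pair search is needed. `decay` is an assumed smoothing
    coefficient, not a value reported in the paper.
    """
    return [decay * t + (1.0 - decay) * s
            for t, s in zip(teacher_params, student_params)]

# Toy illustration: the teacher drifts slowly toward the (fixed) student weights.
teacher = [np.zeros(3)]
student = [np.ones(3)]
for _ in range(5):
    teacher = ema_update(teacher, student, decay=0.9)
print(teacher[0])  # each entry approaches 1.0 as updates accumulate
```

In the semi-supervised setting the abstract outlines, this slowly-updated teacher would then produce pseudo-labels on the unlabeled images that supervise the student's training.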