Continuous Review and Timely Correction: Enhancing the Resistance to Noisy Labels Via Self-Not-True and Class-Wise Distillation

Keywords: Computer science, Memory, Artificial intelligence, Machine learning, Class, Artificial neural network, Mechanism, Process (computing), Deep learning, Adaptation, Early stopping, Training set, Task, Recurrent neural network
Authors
Long Lan, Jingyi Wang, Xinghao Wu, Bo Han, Xinwang Liu
Source
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence [Institute of Electrical and Electronics Engineers]
Volume/Issue: PP: 1-15
Identifier
DOI: 10.1109/tpami.2025.3649111
Abstract

Deep neural networks possess remarkable learning capacity and expressive power, but this very capacity makes them prone to overfitting, especially when they encounter mislabeled data. A well-documented phenomenon known as the memorization effect shows that networks first learn the correctly labeled data and only later memorize the mislabeled instances. While early stopping can mitigate overfitting, it does not entirely prevent the network from fitting incorrect labels during the initial training phases, and halting training early also sacrifices valuable knowledge that could still be extracted from the accurate data. Moreover, early stopping cannot rectify the mistakes already caused by mislabeled inputs, underscoring the need for improved strategies. In this paper, we introduce a mechanism for continuous review and timely correction of learned knowledge. Our approach lets the network repeatedly revisit and reinforce correct information while promptly correcting inaccuracies stemming from mislabeled data. We present a novel method called self-not-true distillation (SNTD). This technique employs self-distillation, in which the network from earlier training iterations acts as a teacher, guiding the current network to review and consolidate its understanding of accurate labels. Crucially, SNTD masks the true-class label in the logits during this process, concentrating on the non-true classes to correct any erroneous knowledge that may have been acquired. We also observe that different classes follow distinct learning trajectories, so a single teacher network may struggle to guide the learning of all classes at once; this motivates selecting a different teacher network for each class. In addition, the appropriate strength of the teacher's guidance varies over the course of training. To address these challenges, we propose SNTD+, which integrates a class-wise distillation strategy with a dynamic weight-adjustment mechanism. Together, these enhancements substantially strengthen SNTD's robustness in complex label-noise scenarios.
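The abstract does not spell out the SNTD objective, but the masking step it describes maps naturally onto a temperature-scaled distillation loss computed only over the non-true classes. The PyTorch sketch below is a minimal reading of that idea, assuming KL-divergence distillation with a temperature `tau`; the function name and default temperature are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def not_true_distillation_loss(student_logits, teacher_logits, labels, tau=2.0):
    """Distill only over the NON-true classes (hypothetical SNTD sketch).

    The logit of each sample's labeled (possibly noisy) class is masked out
    in both teacher and student before the softmax, so the distillation
    signal reviews the relations among the remaining classes without
    reinforcing a potentially wrong label.
    """
    num_classes = student_logits.size(1)
    true_mask = F.one_hot(labels, num_classes).bool()   # (B, C), True at the labeled class
    neg_inf = torch.finfo(student_logits.dtype).min
    s = student_logits.masked_fill(true_mask, neg_inf)  # true class gets ~zero probability
    t = teacher_logits.masked_fill(true_mask, neg_inf)
    log_p_student = F.log_softmax(s / tau, dim=1)
    p_teacher = F.softmax(t / tau, dim=1)
    # Standard temperature-scaled KD objective, restricted to the
    # non-true classes by the masking above.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau**2
```

In a training loop, `teacher_logits` would come from a frozen snapshot of the same network taken at an earlier epoch, and this term would be added to the usual cross-entropy on the given labels, e.g. `loss = F.cross_entropy(student_logits, labels) + lam * not_true_distillation_loss(student_logits, teacher_logits, labels)`, where `lam` balances review against fitting the labels.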

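For SNTD+, the abstract states only that a separate teacher is selected for each class and that the weight of the teacher's guidance is adjusted dynamically over training. One plausible realization is sketched below; the snapshot criterion (per-class held-out accuracy), the routing of samples by their labeled class, and the linear warm-up schedule are all assumptions made for illustration, not details from the paper.

```python
import copy
import torch

class ClassWiseTeacherBank:
    """One frozen teacher snapshot per class (hypothetical SNTD+ sketch).

    A class's teacher is refreshed whenever the model's held-out accuracy
    on that class improves; this selection criterion is an assumption,
    not taken from the paper.
    """

    def __init__(self, num_classes):
        self.num_classes = num_classes
        self.teachers = [None] * num_classes
        self.best_acc = [-1.0] * num_classes

    def maybe_update(self, model, per_class_acc):
        # per_class_acc: iterable of held-out accuracies, one per class.
        for c, acc in enumerate(per_class_acc):
            if acc > self.best_acc[c]:
                self.best_acc[c] = acc
                snapshot = copy.deepcopy(model).eval()
                for p in snapshot.parameters():
                    p.requires_grad_(False)
                self.teachers[c] = snapshot

    @torch.no_grad()
    def logits_for(self, x, labels):
        # Route each sample to the teacher stored for its labeled class.
        # Assumes every class already has a snapshot (e.g., seed all
        # teachers with a warm-up checkpoint before calling this).
        logits = None
        for c in labels.unique().tolist():
            idx = (labels == c).nonzero(as_tuple=True)[0]
            out = self.teachers[c](x[idx])
            if logits is None:
                logits = x.new_empty(x.size(0), out.size(1))
            logits[idx] = out
        return logits


def distillation_weight(epoch, warmup_epochs=10, max_weight=1.0):
    """Illustrative dynamic weight: ramp the distillation term up linearly
    during warm-up, then hold it constant. The actual schedule in SNTD+
    may differ; the abstract says only that the weight varies over training."""
    return max_weight * min(1.0, epoch / warmup_epochs)
```

Routing by the labeled class means each sample is reviewed by the teacher that was strongest on that class, which matches the abstract's observation that classes follow distinct learning trajectories; the returned logits would then feed the same masked distillation loss as plain SNTD, scaled by `distillation_weight(epoch)`.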