Keywords
Memorization, MNIST database, Computer science, Deep neural networks, Robustness, Artificial intelligence, Deep learning, Artificial neural networks, Noisy data, Training set, Machine learning, Data modeling, Databases, Mathematics
Authors
Bo Han, Quanming Yao, Xingrui Yu, Gang Niu, Miao Xu, Weihua Hu, Ivor W. Tsang, Masashi Sugiyama
Source
Venue: Neural Information Processing Systems (NeurIPS)
Date: 2018-01-01
Volume/Pages: 31: 8527-8537
Citations: 1265
Identifier
DOI: 10.5555/3327757.3327944
Abstract
Deep learning with noisy labels is practically challenging, as the capacity of deep models is so high that they can eventually memorize these noisy labels during training. Nonetheless, recent studies on the memorization effects of deep neural networks show that they first memorize training data with clean labels and only later the data with noisy labels. Therefore, in this paper, we propose a new deep learning paradigm called "Co-teaching" for combating noisy labels. Namely, we train two deep neural networks simultaneously and let them teach each other on every mini-batch: first, each network feeds forward all data and selects some data with possibly clean labels; second, the two networks communicate to each other which data in this mini-batch should be used for training; finally, each network back-propagates on the data selected by its peer network and updates itself. Empirical results on noisy versions of MNIST, CIFAR-10, and CIFAR-100 demonstrate that Co-teaching is far superior to state-of-the-art methods in the robustness of the trained deep models.
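The per-mini-batch exchange described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `small_loss_select`, `co_teaching_step`, and the `keep_ratio` parameter are hypothetical names, and the per-sample losses are stand-ins for the cross-entropy losses each network would compute on the mini-batch.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_loss_select(losses, keep_ratio):
    """Return indices of the `keep_ratio` fraction of samples with the
    smallest loss, treated as likely-clean per the memorization effect."""
    k = max(1, int(len(losses) * keep_ratio))
    return np.argsort(losses)[:k]

def co_teaching_step(losses_a, losses_b, keep_ratio):
    """One Co-teaching exchange on a mini-batch.

    Each network picks its small-loss samples and hands them to its peer;
    each peer then updates only on the indices it received.
    Returns (indices network A trains on, indices network B trains on).
    """
    clean_for_b = small_loss_select(losses_a, keep_ratio)  # A teaches B
    clean_for_a = small_loss_select(losses_b, keep_ratio)  # B teaches A
    return clean_for_a, clean_for_b

# Toy mini-batch of 8 samples: per-sample losses from two hypothetical networks.
losses_a = rng.uniform(0, 2, size=8)
losses_b = rng.uniform(0, 2, size=8)
idx_a, idx_b = co_teaching_step(losses_a, losses_b, keep_ratio=0.5)
```

In a real training loop, `idx_a` and `idx_b` would index the mini-batch tensors before the backward pass of each network; because the two networks start from different initializations, they make different errors, so the cross-selection filters noise that a single self-selecting network would keep.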