Second International Workshop on Symbolic-Neural Learning (SNL-2018)

July 5-6, 2018
Nagoya Congress Center (Nagoya, Japan)

Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels

Bo Han (UTS & RIKEN), Quanming Yao (4Paradigm), Xingrui Yu (UTS), Gang Niu (UTS), Miao Xu (UTS), Weihua Hu (The University of Tokyo/RIKEN), Ivor Tsang (UTS), and Masashi Sugiyama (RIKEN/The University of Tokyo)

Abstract:

It is challenging to train deep neural networks robustly with noisy labels, because the capacity of deep networks is so high that they can eventually fit even the noisy labels completely. In this paper, motivated by the memorization effect of deep networks, which shows that networks fit clean instances first and noisy ones later, we present a new paradigm called "Co-teaching" for combating noisy labels. We train two networks simultaneously. First, within each mini-batch, each network filters out likely noisy instances based on the memorization effect. Then, it teaches the remaining instances to its peer network, which uses them to update its parameters. Empirical results on benchmark datasets demonstrate that deep models trained with the Co-teaching approach are far more robust than those trained with state-of-the-art methods.
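The co-training loop described above can be sketched on a toy problem. The sketch below is a minimal illustration, not the authors' implementation: it uses two logistic-regression "networks", selects the small-loss fraction of each mini-batch as the likely-clean instances (the selection criterion and the keep ratio are assumptions for illustration), and has each model update on the subset chosen by its peer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data with 30% of labels flipped (symmetric label noise).
n, d = 400, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)
flip = rng.random(n) < 0.3
y[flip] = 1 - y[flip]

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def per_sample_loss(w, Xb, yb):
    # Cross-entropy loss for each instance in the mini-batch.
    p = sigmoid(Xb @ w)
    eps = 1e-9
    return -(yb * np.log(p + eps) + (1 - yb) * np.log(1 - p + eps))

def gradient(w, Xb, yb):
    p = sigmoid(Xb @ w)
    return Xb.T @ (p - yb) / len(yb)

# Two peer models, trained simultaneously.
w1 = rng.normal(size=d) * 0.01
w2 = rng.normal(size=d) * 0.01
lr = 0.5
keep = 0.7  # fraction of small-loss instances kept (assumed noise-rate estimate)

for epoch in range(50):
    for start in range(0, n, 50):
        Xb, yb = X[start:start + 50], y[start:start + 50]
        k = int(keep * len(yb))
        # Each model selects its small-loss (likely clean) instances ...
        sel1 = np.argsort(per_sample_loss(w1, Xb, yb))[:k]
        sel2 = np.argsort(per_sample_loss(w2, Xb, yb))[:k]
        # ... and teaches them to its peer, which updates on that subset.
        w1 -= lr * gradient(w1, Xb[sel2], yb[sel2])
        w2 -= lr * gradient(w2, Xb[sel1], yb[sel1])

# Evaluate against the clean (pre-noise) labels.
clean = (X @ w_true > 0).astype(float)
acc = np.mean((sigmoid(X @ w1) > 0.5) == clean)
```

Exchanging the selected subsets between the two models, rather than having each model train on its own selection, is the key design choice: the two models develop different error patterns, so one model's mistaken selections are less likely to be reinforced by its peer.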