Meta-Learning with label noise

A step towards few-shot meta-learning with label noise.


Abstract

Few-shot learning presents the challenging problem of learning a task from only a few provided examples. Gradient-Based Meta-Learners (GBMLs) offer a solution to such few-shot problems: they learn an initial parameterization that requires only a few adaptation steps to fit a new task. Although GBMLs are well studied under correctly labeled training data, little work has examined the impact of training them with noisy labels.
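The initialization-learning idea above can be illustrated with a minimal first-order sketch (in the spirit of MAML/FOMAML, not the thesis's own implementation): an inner loop adapts a shared initialization to one task's support set with a single gradient step, and an outer loop nudges the initialization toward parameters that do well on the query set after adaptation. The toy tasks (scalar linear regression with random slopes), learning rates, and function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """MSE loss and its gradient for the scalar linear model y_hat = w * x."""
    pred = w * x
    grad = 2.0 * np.mean((pred - y) * x)
    return np.mean((pred - y) ** 2), grad

def sample_task():
    """Each 'task' is a linear regression with its own slope: a stand-in
    for one few-shot task, split into support and query sets."""
    slope = rng.uniform(-2.0, 2.0)
    x_support, x_query = rng.normal(size=5), rng.normal(size=5)
    return (x_support, slope * x_support), (x_query, slope * x_query)

w_meta = 0.0                  # the meta-learned initialization
inner_lr, outer_lr = 0.1, 0.05

for _ in range(500):
    (x_s, y_s), (x_q, y_q) = sample_task()
    # Inner loop: one adaptation step from the shared initialization.
    _, g_support = loss_grad(w_meta, x_s, y_s)
    w_task = w_meta - inner_lr * g_support
    # Outer loop (first-order approximation): move the initialization
    # toward parameters that perform well on the query set *after* adaptation.
    _, g_query = loss_grad(w_task, x_q, y_q)
    w_meta -= outer_lr * g_query
```

After meta-training, a single inner-loop step from `w_meta` on a new task's support set should already lower that task's query loss, which is exactly the property label noise in the support sets can corrupt.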
In this thesis, we show that GBMLs are negatively affected by label noise. We propose a training strategy, BatMan-CLR, that leverages a novel subsampling approach to address the impact of meta-training with label noise. To train and evaluate the different GBMLs, we implement nmfw, a novel framework for extensible training-loop definition and few-shot data generation.
Our results show that BatMan-CLR is capable of learning few-shot classification models and that our approach effectively mitigates the impact of meta-training label noise. Even with 60% wrong labels, BatMan and Man limit the drop in meta-testing accuracy with existing meta-learners to 2.5, 9.4, and 1.1 percentage points on the Omniglot, CifarFS, and MiniImagenet datasets, respectively.
