BatMan-CLR
Making Few-Shots Meta-learners Resilient Against Label Noise
J.M. Galjaard (TU Delft - Data-Intensive Systems)
Robert Birke (University of Turin)
Juan F. Perez (Universidad de los Andes)
Lydia Y. Chen (TU Delft - Data-Intensive Systems, University of Neuchâtel)
Abstract
The negative impact of label noise is well studied in classical supervised learning, yet it remains an open research question in meta-learning. Meta-learners aim to adapt to unseen tasks by learning a good initial model during meta-training and fine-tuning it to new tasks during meta-testing. In this paper, we present an extensive analysis of the impact of label noise on the performance of meta-learners, specifically gradient-based N-way K-shot learners. We show that the accuracy of Reptile, iMAML, and foMAML drops by up to 34% when meta-training is affected by label noise on three representative datasets: Omniglot, CifarFS, and MiniImageNet. To strengthen resilience against label noise, we propose two sampling techniques, namely manifold (Man) and batch manifold (BatMan), which transform noisy supervised learners into semi-supervised learners and thereby increase the utility of noisy labels. We construct N-way 2-contrastive-shot tasks through augmentation, learn the embedding via a contrastive loss in meta-training, and perform classification through zeroing on the embeddings in meta-testing. We show that our approach can effectively mitigate the impact of meta-training label noise. Even with 60% wrong labels, BatMan and Man limit the meta-testing accuracy drop with existing meta-learners to 2.5, 9.4, and 1.1 percentage points on Omniglot, CifarFS, and MiniImageNet, respectively. Our code is available online: https://gitlab.ewi.tudelft.nl/dmls/publications/batman-clr-noisy-meta-learning.
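The abstract's core mechanism is learning embeddings from pairs of augmented "contrastive shots" rather than from the (possibly wrong) labels. Below is a minimal sketch of an NT-Xent-style contrastive loss over such pairs, written in NumPy for self-containment. This is an illustrative assumption about how the contrastive objective could look, not the authors' implementation; the function name `nt_xent_loss` and the temperature value are hypothetical.

```python
import numpy as np


def nt_xent_loss(emb_a, emb_b, temperature=0.5):
    """Illustrative NT-Xent contrastive loss over paired augmented views.

    emb_a, emb_b: (n, d) arrays holding embeddings of the two augmented
    views ("2 contrastive shots") of the same n support samples. The
    positive for row i in emb_a is row i in emb_b; all other rows in the
    combined batch act as negatives. Labels are never used, which is why
    the objective is robust to label noise.
    """
    z = np.concatenate([emb_a, emb_b], axis=0)           # (2n, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # L2-normalize
    sim = z @ z.T / temperature                          # scaled cosine sims
    n = emb_a.shape[0]
    # Positive index of sample i is i+n, and of sample i+n is i.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Exclude self-similarity from the softmax denominator.
    np.fill_diagonal(sim, -np.inf)
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)     # cross-entropy per row
    return loss.mean()
```

As a sanity check, two identical views yield a lower loss than two unrelated random views, since each positive pair then attains the maximum similarity.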
Files
File under embargo until 03-04-2026