Self-Supervised Few Shot Learning
Prototypical Contrastive Learning with Graphs
Abstract
A primary trait of humans is the ability to learn rich representations and relationships between entities from just a handful of examples, without much guidance. Unsupervised few-shot learning aims to narrow this fundamental gap between human adaptability and machines. We present a contrastive learning scheme for unsupervised few-shot classification, in which we supplement a convolutional network’s strong inductive prior with a self-attention-based message-passing neural network that exploits intra-batch relations between images. We also show that an optimal-transport (OT) based task-awareness algorithm generates task-representative prototypes that lead to more accurate classification and improve the robustness of pre-trained models. Our approach (SAMPTransfer) offers appreciable performance improvements over its competitors in both in-domain and cross-domain few-shot classification, setting new standards on the miniImagenet, tieredImagenet, and CDFSL benchmarks.
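The abstract does not spell out the OT-based prototype computation, but the general idea can be sketched with a generic entropic optimal-transport (Sinkhorn) refinement of class prototypes. Everything below is an illustrative assumption: the function names (`sinkhorn`, `ot_prototypes`), the uniform marginals, the squared-Euclidean cost, and the hyperparameters (`eps`, `n_iters`) are not taken from the paper.

```python
import numpy as np

def sinkhorn(cost, n_iters=50, eps=0.05):
    """Entropic OT: transport plan between uniform row/column marginals.

    Illustrative sketch, not the paper's exact algorithm.
    """
    cost = cost / cost.max()                     # normalise to avoid exp underflow
    K = np.exp(-cost / eps)                      # Gibbs kernel
    u = np.ones(cost.shape[0]) / cost.shape[0]   # uniform marginal over samples
    v = np.ones(cost.shape[1]) / cost.shape[1]   # uniform marginal over prototypes
    a = np.ones_like(u)
    for _ in range(n_iters):                     # alternating scaling updates
        a = u / (K @ (v / (K.T @ a)))
    b = v / (K.T @ a)
    return a[:, None] * K * b[None, :]           # plan P = diag(a) K diag(b)

def ot_prototypes(embeddings, init_protos):
    """Refine prototypes as transport-weighted means of the support embeddings."""
    # cost: squared Euclidean distance from each embedding to each prototype
    cost = ((embeddings[:, None, :] - init_protos[None, :, :]) ** 2).sum(-1)
    P = sinkhorn(cost)
    # column-normalise so each prototype is a convex combination of embeddings
    W = P / P.sum(axis=0, keepdims=True)
    return W.T @ embeddings

rng = np.random.default_rng(0)
emb = rng.normal(size=(25, 64))                  # e.g. 5-way 5-shot support set
init = np.stack([emb[i * 5:(i + 1) * 5].mean(0) for i in range(5)])  # mean protos
protos = ot_prototypes(emb, init)
print(protos.shape)                              # (5, 64)
```

Using the transport plan (rather than hard nearest-prototype assignment) lets every support embedding contribute to every prototype in proportion to its transport mass, which is one way such "task-representative" prototypes can be made.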