Self-Attention Message Passing for Contrastive Few-Shot Learning

Conference Paper (2023)
Author(s)

Ojas Kishorkumar Shirekar (Student TU Delft)

Anuj Singh (Student TU Delft)

Hadi Jamali-Rad (Shell Global Solutions International B.V., TU Delft - Pattern Recognition and Bioinformatics)

DOI
https://doi.org/10.1109/WACV56688.2023.00539 (final published version)
Publication Year
2023
Language
English
Pages (from-to)
5415-5425
ISBN (print)
978-1-6654-9347-5
ISBN (electronic)
978-1-6654-9346-8
Event
2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
Collections
Institutional Repository
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward, or distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) to discover complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined SAMP-CLR) for U-FSL pre-training. We also propose an optimal transport (OT) based fine-tuning strategy (we call OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer). Our extensive experimental results corroborate the efficacy of SAMPTransfer in a variety of downstream few-shot classification scenarios, setting a new state of the art for U-FSL on both the miniImageNet and tieredImageNet benchmarks, with improvements of up to 7% and 5%, respectively. Our further investigations also confirm that SAMPTransfer remains on par with some supervised baselines on miniImageNet and outperforms all existing U-FSL baselines in a challenging cross-domain scenario. Our code can be found in our GitHub repository: https://github.com/ojss/SAMPTransfer/.
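To make the abstract's two ingredients concrete, the following is a minimal, illustrative Python/PyTorch sketch of self-attention based message passing over a batch of sample embeddings. This is an assumption-laden toy, not the authors' SAMP-CLR implementation; the module name, dimensions, and single-head attention are all illustrative choices.

import torch
import torch.nn as nn

class SelfAttentionMessagePassing(nn.Module):
    # One round of message passing on a fully connected graph of samples,
    # with edge weights given by scaled dot-product self-attention.
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_samples, dim); each row is one sample's backbone embedding
        q, k, v = self.q(x), self.k(x), self.v(x)
        attn = torch.softmax(q @ k.t() / x.shape[-1] ** 0.5, dim=-1)  # (N, N) soft adjacency
        return x + self.out(attn @ v)  # residual update from aggregated messages

# usage: refine a batch of embeddings before computing a contrastive loss
refined = SelfAttentionMessagePassing(128)(torch.randn(64, 128))

For the OT-based fine-tuning idea (OpT-Tune), a standard way to obtain a soft assignment between query features and class prototypes is entropic-regularised optimal transport solved with Sinkhorn iterations. The sketch below assumes uniform marginals; eps and n_iters are illustrative hyperparameters, not the paper's settings.

def sinkhorn(cost: torch.Tensor, eps: float = 0.1, n_iters: int = 50) -> torch.Tensor:
    # Transport plan for a cost matrix under uniform row/column marginals.
    n, m = cost.shape
    cost = cost / cost.max()             # normalise to keep the kernel numerically stable
    K = torch.exp(-cost / eps)           # Gibbs kernel
    r = torch.full((n,), 1.0 / n)        # uniform row marginal
    c = torch.full((m,), 1.0 / m)        # uniform column marginal
    u, v = torch.ones(n), torch.ones(m)
    for _ in range(n_iters):
        u = r / (K @ v)
        v = c / (K.t() @ u)
    return torch.diag(u) @ K @ torch.diag(v)

# usage: soft-assign 75 query embeddings to 5 class prototypes
plan = sinkhorn(torch.cdist(torch.randn(75, 128), torch.randn(5, 128)))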

Files

Self_Attention_Message_Passing... (PDF, 3.12 MB)
Embargo expired on 06-08-2023
License info not available