Data-efficient resolution transfer with Continuous Kernel Convolutional Neural Networks

Master Thesis (2023)
Author(s)

L.A. Haarman (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

J.C. Van Gemert – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

R. Bruintjes – Graduation committee member (TU Delft - Pattern Recognition and Bioinformatics)

Michael Weinmann – Coach (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2023 Luuk Haarman
Publication Year
2023
Language
English
Graduation Date
11-09-2023
Awarding Institution
Delft University of Technology
Programme
Computer Science | Data Science and Technology
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Convolutional Neural Networks (CNNs) benefit from fine-grained details in high-resolution images, but such images are not always readily available, as data collection can be expensive or time-consuming. Transfer learning, which pre-trains models on data from a related domain before fine-tuning on the main domain, is a common strategy for dealing with limited data. However, transfer learning requires that a similar domain with sufficient data exists, and transferability varies from task to task. To deal with limited high-resolution data we propose resolution transfer: using low-resolution data to improve high-resolution accuracy. For resolution transfer, we use Continuous Kernel CNNs (CKCNNs), which can adapt their kernel size to changes in resolution and perform well on unseen resolutions. Training CKCNNs on high-resolution images is currently significantly slower than training CNNs. We lower the inference costs of CKCNNs to enable training on high-resolution data. We introduce a CKCNN parameterization that constrains the frequencies of kernels to avoid distortions when the kernel size is changed, improving resolution transfer accuracy. We improve fine-tuning with a High-Frequency Adaptation module that complements our constrained kernels. We demonstrate that CKCNNs with kernel resolution adaptation outperform CNNs on resolution transfer tasks with no fine-tuning or with limited fine-tuning data. We compare against transfer learning and achieve classification accuracy competitive with an ImageNet pre-trained ResNet-18. Our method provides an alternative to transfer learning that uses low-resolution data to improve classification accuracy when high-resolution data is limited.
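The core idea behind resolution transfer with CKCNNs is that the kernel is a continuous function of spatial coordinates, so it can be resampled on a denser grid when the input resolution grows. The sketch below is a minimal, hypothetical illustration of that idea (not the thesis's actual parameterization): a toy 1-D kernel defined as a sum of sinusoids with fixed, bounded frequencies, echoing the abstract's frequency-constrained kernels, sampled at two different discrete kernel sizes from the same continuous parameters.

```python
import numpy as np

def continuous_kernel(coords, freqs, amps):
    """Evaluate a toy continuous kernel over normalized coordinates
    in [-1, 1]: a sum of sinusoids. Constraining `freqs` to low values
    mimics the frequency-constrained parameterization described in the
    abstract (illustrative only; not the thesis's exact formulation)."""
    return sum(a * np.sin(f * np.pi * coords) for f, a in zip(freqs, amps))

def sample_kernel(size, freqs, amps):
    """Sample the same continuous kernel on a grid of `size` points.
    Changing `size` changes the discrete kernel resolution without
    changing the underlying continuous parameters."""
    coords = np.linspace(-1.0, 1.0, size)
    return continuous_kernel(coords, freqs, amps)

# Hypothetical parameters: two low frequencies with fixed amplitudes.
freqs, amps = [1.0, 2.0], [0.8, 0.2]
k_low = sample_kernel(3, freqs, amps)   # kernel matched to low-resolution input
k_high = sample_kernel(7, freqs, amps)  # same kernel, resampled for high resolution
```

Because both discrete kernels are samples of one continuous function, a model trained at low resolution can, in principle, be evaluated at high resolution by enlarging the sampling grid rather than relearning the weights.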
