What Affects Learned Equivariance in Deep Image Recognition Models?

Conference Paper (2023)
Author(s)

R. Bruintjes (TU Delft - Pattern Recognition and Bioinformatics)

Tomasz Motyka (Synerise)

Jan van Gemert (TU Delft - Pattern Recognition and Bioinformatics)

Research Group
Pattern Recognition and Bioinformatics
Copyright
© 2023 R. Bruintjes, Tomasz Motyka, J.C. van Gemert
DOI (related publication)
https://doi.org/10.1109/CVPRW59228.2023.00512
Publication Year
2023
Language
English
Pages (from-to)
4839-4847
ISBN (print)
979-8-3503-0250-9
ISBN (electronic)
979-8-3503-0249-3
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Equivariance with respect to geometric transformations in neural networks improves data efficiency, parameter efficiency, and robustness to out-of-domain perspective shifts. When equivariance is not designed into a neural network, the network can still learn equivariant functions from the data. We quantify this learned equivariance by proposing an improved measure of equivariance. We find evidence of a correlation between learned translation equivariance and validation accuracy on ImageNet. We therefore investigate what can increase learned equivariance in neural networks, and find that data augmentation, reduced model capacity, and inductive bias in the form of convolutions all induce higher learned equivariance.
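
As an illustration of what "measuring learned translation equivariance" means, the sketch below compares a network's features for a shifted image against shifted features of the original image. This is a minimal sketch, not the paper's proposed improved measure: the PyTorch setup, the function name translation_equivariance_score, and the use of circular shifts (torch.roll) with cosine similarity are all illustrative assumptions.

# Minimal sketch (illustrative, not the authors' measure): a feature
# extractor f is translation equivariant when f(shift(x)) == shift(f(x)).
# We score this with cosine similarity between the two feature maps.
import torch
import torch.nn as nn
import torch.nn.functional as F

def translation_equivariance_score(features, image, shift=(8, 8)):
    """Cosine similarity between f(shift(x)) and shift(f(x)).

    Assumes `features` maps images to spatial feature maps and that its
    overall stride divides the pixel shift. A score near 1 indicates
    approximate translation equivariance for this shift (circular shifts
    wrap around, so zero padding at borders keeps the score below 1).
    """
    dy, dx = shift
    with torch.no_grad():
        f_x = features(image)                              # f(x)
        x_shifted = torch.roll(image, shifts=(dy, dx), dims=(-2, -1))
        f_shifted_x = features(x_shifted)                  # f(shift(x))
        # Convert the pixel shift into a feature-map shift via the stride.
        stride_y = image.shape[-2] // f_x.shape[-2]
        stride_x = image.shape[-1] // f_x.shape[-1]
        shifted_f_x = torch.roll(                          # shift(f(x))
            f_x, shifts=(dy // stride_y, dx // stride_x), dims=(-2, -1))
    return F.cosine_similarity(
        f_shifted_x.flatten(), shifted_f_x.flatten(), dim=0).item()

# Usage: a small conv stack (overall stride 4) on a random image.
net = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
)
x = torch.randn(1, 3, 64, 64)
print(translation_equivariance_score(net, x, shift=(8, 8)))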

Files

What_Affects_Learned_Equivaria... (pdf, 4.28 MB)
Embargo expired on 14-02-2024