Title: Learned equivariance in Convolutional Neural Networks
Author: Motyka, Tomasz (TU Delft Electrical Engineering, Mathematics and Computer Science)
Contributors: van Gemert, J.C. (mentor); Bruintjes, R. (mentor); de Weerdt, M.M. (graduation committee)
Degree granting institution: Delft University of Technology
Date: 2022-01-12

Abstract: Aside from developing methods to embed equivariant priors into network architectures, one can also study how networks learn equivariant properties. In this work, we conduct a study of the influence of different factors on learned equivariance. We propose a method to quantify equivariance and argue why using correlation to compare intermediate representations may be a better choice than other commonly used metrics. We show that imposing equivariance or invariance in the objective function does not lead to more equivariant features in the early parts of the network. We also study how different data augmentations influence translation equivariance. Furthermore, we show that models with lower capacity learn more translation equivariant features. Lastly, we quantify translation and rotation equivariance in different state-of-the-art image classification models and analyse the correlation between the amount of equivariance and accuracy.

Subjects: Deep Learning; CNN; Equivariance
To reference this document use: http://resolver.tudelft.nl/uuid:5c486ba2-02a9-4893-9842-796491199e7a
Part of collection: Student theses
Document type: master thesis
Rights: © 2022 Tomasz Motyka
Files: Thesis.pdf (PDF, 3.42 MB)
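
The abstract describes quantifying equivariance by correlating intermediate representations. As a rough illustration only, the following is a minimal sketch, assuming a PyTorch/torchvision setup, of how translation equivariance of a convolutional feature extractor could be measured by correlating the features of a shifted input with the shifted features of the original input. The model choice, shift size, and the helper names (pearson_corr, translation_equivariance_score) are illustrative assumptions, not the thesis' actual measurement procedure.

import torch
import torchvision.models as models

def pearson_corr(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Pearson correlation between two flattened tensors."""
    a = a.flatten().float() - a.float().mean()
    b = b.flatten().float() - b.float().mean()
    return (a * b).sum() / (a.norm() * b.norm() + 1e-8)

def translation_equivariance_score(trunk, x, shift_px=8):
    """Correlate f(shift(x)) with shift(f(x)) for a spatial feature extractor."""
    with torch.no_grad():
        feats = trunk(x)                                   # f(x), shape (N, C, H, W)
        x_shifted = torch.roll(x, shifts=shift_px, dims=-1)  # circular horizontal shift
        feats_of_shifted = trunk(x_shifted)                # f(shift(x))
        # Shift the original features by the same amount, scaled by the
        # spatial downsampling factor between input and feature map.
        factor = x.shape[-1] // feats.shape[-1]
        shifted_feats = torch.roll(feats, shifts=shift_px // factor, dims=-1)
    return pearson_corr(feats_of_shifted, shifted_feats).item()

if __name__ == "__main__":
    resnet = models.resnet18(weights=None)                 # random weights; a sketch, not a benchmark
    trunk = torch.nn.Sequential(*list(resnet.children())[:5])  # stem + first residual stage
    x = torch.randn(1, 3, 224, 224)
    print("translation equivariance (correlation):",
          translation_equivariance_score(trunk, x, shift_px=8))

A score near 1 would indicate that the intermediate representation shifts along with the input, i.e. high translation equivariance for that layer; circular shifts are used here purely to avoid boundary effects in the sketch.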