Exploring Convolutional Neural Networks


Abstract

In this thesis we study the complexity of neural networks, with a focus on convolutional neural networks (CNNs), which are widely used for image recognition. To better understand how these networks process their input, the first half of this report constructs a mathematical foundation for neural networks and CNNs. The second half then develops, through a series of experiments on a simple CNN, a better understanding of the variables that can be chosen for the network. In the first experiment, the network was trained on two translated images of a cat to test the common belief that CNNs are translation invariant. The result of this experiment refuted this belief and led to the notion of translation equivariance, with which translation invariance is often confused. Next, the network was trained to distinguish cats from dogs; this showed that the network can become translation invariant if suitable training data is chosen. We also examined the kernels of the convolution layer and the feature maps they produce. Finally, we investigated another transformation of the images, namely rotation, which suggested that the resolution of the images influences the classification of rotated cats and dogs.
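The distinction between equivariance and invariance mentioned above can be illustrated with a minimal sketch (not taken from the thesis itself): a convolution layer is translation *equivariant*, meaning that shifting the input shifts the feature map by the same amount, but its raw output is not translation *invariant*, since the two feature maps differ entry by entry. The `conv2d` helper and the test images below are hypothetical illustrations, assuming a plain valid-mode cross-correlation as used in CNN convolution layers.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode cross-correlation, the operation a CNN 'convolution' layer applies."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
kernel = rng.standard_normal((3, 3))

# A blank image with one small "feature", and a copy translated by (3, 3).
image = np.zeros((10, 10))
image[2:5, 2:5] = rng.standard_normal((3, 3))
shifted = np.roll(image, shift=(3, 3), axis=(0, 1))

a = conv2d(image, kernel)
b = conv2d(shifted, kernel)

# Equivariance: translating the input translates the feature map identically
# (exact here because the feature stays away from the image border).
equivariant = np.allclose(np.roll(a, shift=(3, 3), axis=(0, 1)), b)

# Non-invariance: the raw feature maps themselves are different.
invariant = np.allclose(a, b)

print(equivariant, invariant)  # True False
```

One standard way to obtain invariance from an equivariant feature map is to discard the spatial positions, e.g. by global max pooling (`a.max() == b.max()` in the sketch above); this is a common textbook remedy, offered here as illustration rather than as the thesis's own approach, which instead obtains invariance through the choice of training data.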
