Efficient Recurrent Residual Networks Improved by Feature Transfer

Master Thesis (2017)
Author(s)

Y. Liu (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

S. Pintea – Mentor

J.C. van Gemert – Mentor

Ildiko Suveg – Mentor

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2017
Language
English
Graduation Date
31-08-2017
Awarding Institution
Delft University of Technology
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Over the past several years, deep and wide neural networks have achieved great success in many tasks. However, in real-life applications these gains usually come at a cost in system resources (e.g., memory, computation, and power consumption), making it impractical to run top-performing but heavy networks such as VGGNet and GoogLeNet directly on mobile and embedded devices like smartphones and cameras. To tackle this problem, we propose using recurrent layers in residual networks to reduce redundant information and save parameters. Furthermore, with the help of feature map knowledge transfer, the performance of Recurrent Residual Networks (ReResNet) can be improved to reach accuracy similar to that of some complex state-of-the-art architectures on CIFAR-10, even with far fewer parameters. In this thesis, we demonstrate the efficiency of ReResNet, optionally improved by Feature Transfer, on three datasets: CIFAR-10, Scenes, and MiniPlaces.
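The abstract's two main ideas admit a compact illustration. Below is a minimal sketch, assuming PyTorch: a residual block whose single set of convolutional weights is reused across several unrolling steps (the recurrent residual idea, so several "virtual" layers cost the parameters of one), plus a FitNets-style L2 hint loss between student and teacher feature maps as one plausible reading of feature map knowledge transfer. The names RecurrentResidualBlock, feature_transfer_loss, and num_steps are illustrative, not the thesis's actual implementation.

import torch
import torch.nn as nn

class RecurrentResidualBlock(nn.Module):
    """A residual block unrolled num_steps times with shared weights."""
    def __init__(self, channels, num_steps=3):
        super().__init__()
        self.num_steps = num_steps
        # One set of weights, reused at every unrolling step.
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        for _ in range(self.num_steps):
            residual = x
            out = self.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            x = self.relu(out + residual)  # same parameters reused each pass
        return x

def feature_transfer_loss(student_feat, teacher_feat):
    # L2 distance pulling the student's feature maps toward a frozen teacher's;
    # an assumed form of "feature map knowledge transfer", not the thesis's exact loss.
    return torch.mean((student_feat - teacher_feat.detach()) ** 2)

x = torch.randn(1, 16, 32, 32)
block = RecurrentResidualBlock(16, num_steps=3)
print(block(x).shape)  # torch.Size([1, 16, 32, 32])

Note that the parameter count is independent of num_steps; whether batch-normalization statistics should also be shared across steps or kept per step is a design choice this sketch does not settle.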

Files

Thesis_YueLiu.pdf
(PDF | 3.14 MB)
License info not available