Regularization Effect of Dropout

Master's Thesis (2021)
Author(s)

X. Zhao (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

David Tax – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Marcel J.T. Reinders – Graduation committee member (TU Delft - Pattern Recognition and Bioinformatics)

F. H. van Meulen – Graduation committee member (TU Delft - Statistics)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2021 Xunyi Zhao
Publication Year
2021
Language
English
Graduation Date
27-05-2021
Awarding Institution
Delft University of Technology
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Dropout is one of the most popular regularization methods in deep learning. In its general form, dropout adds random noise to the training process, limiting model complexity and preventing overfitting. Evidence has shown that dropout can effectively reduce overfitting. This thesis presents results showing that dropout regularizes deep neural networks only under certain circumstances, and discusses potential explanations. Our major contributions are (1) summarizing the regularization behavior of dropout, including how different hyper-parameters affect its performance, and (2) proposing possible explanations for this behavior.
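
For reference, the noise-injection mechanism the abstract describes can be sketched as standard (inverted) dropout. The snippet below is illustrative only and is not taken from the thesis; the function name and parameters are placeholders.

import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p (0 <= p < 1) and rescale the survivors by 1/(1-p),
    so no rescaling is needed at test time."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# Noise is injected only while training:
activations = np.ones((2, 4))
print(dropout_forward(activations, p=0.5, training=True))   # some units zeroed, rest scaled to 2.0
print(dropout_forward(activations, p=0.5, training=False))  # unchanged at inference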
