Regularization Effect of Dropout

Abstract

Dropout is one of the most popular regularization methods in deep learning. In its general form, dropout injects random noise into the training process, limiting model complexity and preventing overfitting. Evidence has shown that dropout can effectively reduce overfitting. This thesis presents results showing that dropout regularizes deep neural networks only under certain circumstances, and discusses potential explanations. Our major contributions are (1) summarizing the regularization behaviors of dropout, including how different hyper-parameters affect its performance, and (2) proposing possible explanations for these behaviors.
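
To make the "random noise" view concrete, here is a minimal sketch of the standard inverted-dropout formulation (not the thesis's specific experimental setup; the function and parameter names are illustrative):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and scale survivors by 1/(1-p) so activations keep the same expected
    value; at inference, return x unchanged."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)       # rescale so E[output] == E[input]
```

The drop probability p is the central hyper-parameter here: it controls how much noise is injected per forward pass, which is the kind of setting whose effect on dropout's regularization behavior the thesis examines.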