The Effect of Different Initialization Methods on VAEs for Modeling Cancer using RNA Genome Expressions

Bachelor Thesis (2021)
Author(s)

I.S. Kroskinski (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Marcel J.T. Reinders – Graduation committee member (TU Delft - Pattern Recognition and Bioinformatics)

S. Makrodimitris – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Tamim R. Abdelaal – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Mohammed Charrout – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Mostafa Eltager – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

E. Isufi – Coach (TU Delft - Multimedia Computing)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2021 Ivo Kroskinski
Publication Year
2021
Language
English
Graduation Date
02-07-2021
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Variational Auto-Encoders (VAEs) are a class of machine learning models that have been used in varying contexts, such as cancer research. Earlier research has shown that initialization plays a crucial part in training these models, since it can increase performance. Therefore, this paper studies the effect of initialization methods on VAEs. This research shows that, when using only one hidden layer, uniform and Xavier methods perform best depending on the VAE model, with the standard VAE being the most sensitive to the choice of method. However, when using more hidden layers, the uniform method performs significantly worse than methods that take the number of inputs to a layer into account, such as the default PyTorch implementation, Xavier Normal, or Xavier Uniform. In all other models, these initialization methods converge after enough epochs.
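As a hedged illustration of the fan-in-aware schemes the abstract refers to (this sketch is not taken from the thesis itself): Xavier Uniform initialization draws each weight from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)), so layers with many inputs, such as those consuming thousands of RNA expression features, start with proportionally smaller weights. The layer sizes below are hypothetical, and the pure-Python implementation stands in for the equivalent PyTorch call (`torch.nn.init.xavier_uniform_`).

```python
import math
import random

def xavier_uniform_bound(fan_in, fan_out):
    # Xavier/Glorot uniform bound: a = sqrt(6 / (fan_in + fan_out)).
    # Larger fan_in or fan_out -> smaller initial weights.
    return math.sqrt(6.0 / (fan_in + fan_out))

def init_layer(fan_in, fan_out, seed=0):
    # Build a fan_out x fan_in weight matrix with entries drawn
    # uniformly from [-a, a]; seeded for reproducibility.
    rng = random.Random(seed)
    a = xavier_uniform_bound(fan_in, fan_out)
    return [[rng.uniform(-a, a) for _ in range(fan_in)]
            for _ in range(fan_out)]

# Hypothetical VAE encoder layer: 1000 expression features -> 128 hidden units.
W = init_layer(1000, 128)
```

By contrast, a plain uniform initialization with a fixed range ignores layer width, which is consistent with the abstract's finding that it degrades with deeper networks while the fan-in-aware methods remain stable.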

Files

RP_Research_Paper_6_.pdf
(pdf | 0.483 Mb)
License info not available