Loss functions and neural networks

Comparing different loss functions for NLP neural networks

Bachelor Thesis (2022)
Author(s)

J. Kirchner (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Haixiang Lin – Mentor (TU Delft - Mathematical Physics)

P.R. van Nieuwenhuizen – Graduation committee member (TU Delft - Numerical Analysis)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2022 Joris Kirchner
Publication Year
2022
Language
English
Graduation Date
29-08-2022
Awarding Institution
Delft University of Technology
Programme
Applied Mathematics | Applied Physics
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Neural networks are an active research field with many open questions, such as the configuration of network architectures and the choice of training strategies. Among these, the choice of loss (or cost) function plays an important role in how a neural network model is optimized (trained) and how it performs after training. A loss function measures, according to some chosen criterion, how far an estimated output is from its true value; the appropriate criterion depends on the task at hand and the goal to be met. The objective of this project is to understand the role of different loss functions and to evaluate how model performance depends on them, using the language prediction problem as a test case.
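As a minimal illustration of how different criteria score the same prediction differently (a sketch, not taken from the thesis; the vocabulary size and probability values below are invented for the example), the snippet compares mean squared error and cross-entropy on a one-hot next-word target:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error between two vectors of equal length."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy; eps guards against log(0)."""
    return -sum(t * math.log(p + eps) for t, p in zip(y_true, y_pred))

# One-hot target over a hypothetical 4-word vocabulary, and two
# predicted distributions from a (hypothetical) language model.
target    = [0.0, 1.0, 0.0, 0.0]
confident = [0.05, 0.85, 0.05, 0.05]
unsure    = [0.25, 0.40, 0.20, 0.15]

# Cross-entropy penalizes the unsure prediction far more sharply
# than MSE does, because it grows like -log(p) of the true class.
print(mse(target, confident), cross_entropy(target, confident))
print(mse(target, unsure), cross_entropy(target, unsure))
```

Both losses rank the confident prediction as better, but the gap between the two predictions is much larger under cross-entropy, which is one reason it is the usual choice for classification-style language tasks.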
