Ridge Regression and Random Neural Networks under High-Dimensional Conditions
Derivative-Based Calibration for Stable Learning
I. Ruban (TU Delft - Electrical Engineering, Mathematics and Computer Science)
N. Parolya – Mentor (TU Delft - Statistics)
N.V. Budko – Graduation committee member (TU Delft - Numerical Analysis)
Abstract
This thesis investigates the problem of selecting an optimal regularization parameter in high-dimensional ridge regression models with random features. The work is situated at the intersection of machine learning, statistical signal processing, and random matrix theory, and aims to improve the understanding and stability of regression-based learning in high-dimensional settings.
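To make the setting concrete, the following is a minimal sketch of ridge regression on random (ReLU) features, with the regularization parameter chosen on a held-out split. All dimensions, the feature map, and the validation-based selection are illustrative assumptions; the thesis itself studies a derivative-based calibration, which this simple surrogate does not implement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
n, d, p = 200, 50, 400          # samples, input dim, random features
X = rng.standard_normal((n, d))
beta = rng.standard_normal(d)
y = X @ beta + 0.5 * rng.standard_normal(n)

# Random-feature map: fixed random first layer plus nonlinearity,
# i.e. an untrained ("random") neural network layer.
W = rng.standard_normal((d, p)) / np.sqrt(d)
Phi = np.maximum(X @ W, 0.0)    # ReLU features

def ridge_fit(features, targets, lam):
    """Closed-form ridge solution (Phi^T Phi + lam I)^{-1} Phi^T y."""
    k = features.shape[1]
    return np.linalg.solve(features.T @ features + lam * np.eye(k),
                           features.T @ targets)

# Select lambda on a held-out split (a simple stand-in for a more
# principled high-dimensional calibration rule).
tr, va = slice(0, 150), slice(150, 200)
lams = np.logspace(-3, 3, 13)
errs = [np.mean((Phi[va] @ ridge_fit(Phi[tr], y[tr], lam) - y[va]) ** 2)
        for lam in lams]
best_lam = lams[int(np.argmin(errs))]
print(best_lam)
```

Note that here p > n, so the unregularized least-squares problem is ill-posed and the choice of lambda directly controls the stability of the fitted predictor.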