Ridge Regression and Random Neural Networks under High-Dimensional Conditions

Derivative-Based Calibration for Stable Learning

Master Thesis (2025)
Author(s)

I. Ruban (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

N. Parolya – Mentor (TU Delft - Statistics)

N.V. Budko – Graduation committee member (TU Delft - Numerical Analysis)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
27-10-2025
Awarding Institution
Delft University of Technology
Programme
Applied Mathematics
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This thesis investigates the problem of selecting an optimal regularization parameter in high-dimensional ridge regression models with random features. The work is situated at the intersection of machine learning, statistical signal processing, and random matrix theory, and aims to improve the understanding and stability of regression-based learning in high-dimensional settings.
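To make the setting concrete, the following is a minimal sketch of ridge regression on random (ReLU) features with the regularization parameter chosen by a held-out validation grid search. All names, dimensions, and the validation-based selection rule here are illustrative assumptions for exposition; the thesis itself concerns a derivative-based calibration of the regularization parameter, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data: p comparable to n (assumed sizes)
n, p, D = 200, 150, 300  # samples, input dimension, number of random features
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta + 0.5 * rng.standard_normal(n)

# Random-feature map z(x) = relu(W x) with a fixed random weight matrix W
W = rng.standard_normal((D, p)) / np.sqrt(p)
Z = np.maximum(X @ W.T, 0.0)

def ridge_fit(Z, y, lam):
    """Ridge solution theta = (Z'Z + lam * I)^{-1} Z' y."""
    d = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

# Choose lambda on a held-out split (plain grid search, for illustration only)
tr, va = slice(0, 150), slice(150, None)
lams = np.logspace(-3, 3, 13)
errs = [np.mean((Z[va] @ ridge_fit(Z[tr], y[tr], lam) - y[va]) ** 2)
        for lam in lams]
best_lam = lams[int(np.argmin(errs))]
```

A grid search like this degrades in the high-dimensional regime the abstract describes (feature count comparable to sample size), which is precisely what motivates more principled, theory-driven choices of the regularization parameter.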

Files

Thesis_Report.pdf
(pdf | 7.77 Mb)
License info not available