Exploring the benefits of Graph Transformers in Relational Deep Learning

Bachelor Thesis (2025)
Author(s)

R.A. Alani (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

H.Ç. Bilgi – Mentor (TU Delft - Data-Intensive Systems)

Kubilay Atasu – Graduation committee member (TU Delft - Data-Intensive Systems)

T. Höllt – Graduation committee member (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
30-06-2025
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Heterogeneous datasets account for a large share of all available digital data. With the rise of the digital medium, they have played a strong part in meeting the need for structured data storage, particularly through relational databases. To better leverage such data, researchers have largely favoured Graph Neural Networks (GNNs) for making predictions and inferring possible outcomes. With the rise of the Transformer model, and given the inherent limitations of GNNs caused by over-smoothing and over-squashing, a clear shift occurred towards combining and leveraging the strengths of both GNNs and Transformers in Graph Transformers (GTs). While this approach has been studied more thoroughly on homogeneous datasets, much remains unexplored in the heterogeneous setting. This paper addresses that gap by applying Graph Transformers to multiple heterogeneous datasets, examining the differences and trade-offs between interleaved and cascade GT architectures, and how positional encodings designed for homogeneous graphs transfer to the heterogeneous context.
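To make the interleaved-versus-cascade distinction concrete, the sketch below illustrates the two layer orderings in PyTorch. It is not code from the thesis: the module names (LocalMP, GlobalAttn, InterleavedGT, CascadeGT) are hypothetical, and heterogeneity and positional encodings are deliberately omitted. An interleaved GT alternates local message passing with global self-attention; a cascade GT runs all local layers first, then all global layers.

import torch
import torch.nn as nn

class LocalMP(nn.Module):
    """Toy message-passing layer: aggregate neighbours via a normalised
    adjacency matrix, transform, and add a residual connection."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x: (N, dim) node features; adj: (N, N) row-normalised adjacency
        return torch.relu(self.lin(adj @ x)) + x

class GlobalAttn(nn.Module):
    """Global self-attention over all nodes (the Transformer half of a GT)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        out, _ = self.attn(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0))
        return x + out.squeeze(0)  # residual

class InterleavedGT(nn.Module):
    """Alternate local and global blocks: MP -> Attn -> MP -> Attn -> ..."""
    def __init__(self, dim, depth=2):
        super().__init__()
        self.mps = nn.ModuleList([LocalMP(dim) for _ in range(depth)])
        self.attns = nn.ModuleList([GlobalAttn(dim) for _ in range(depth)])

    def forward(self, x, adj):
        for mp, attn in zip(self.mps, self.attns):
            x = attn(mp(x, adj))
        return x

class CascadeGT(nn.Module):
    """Run all local blocks first, then all global blocks."""
    def __init__(self, dim, depth=2):
        super().__init__()
        self.mps = nn.ModuleList([LocalMP(dim) for _ in range(depth)])
        self.attns = nn.ModuleList([GlobalAttn(dim) for _ in range(depth)])

    def forward(self, x, adj):
        for mp in self.mps:
            x = mp(x, adj)
        for attn in self.attns:
            x = attn(x)
        return x

# Smoke test on a random 6-node graph (identity adjacency for simplicity).
n, dim = 6, 16
x, adj = torch.randn(n, dim), torch.eye(n)
print(InterleavedGT(dim)(x, adj).shape, CascadeGT(dim)(x, adj).shape)

Under this reading, the interleaved variant lets global attention refine features after every round of message passing, while the cascade variant keeps the two mechanisms in separate stages; the abstract's comparison concerns which ordering works better on heterogeneous data.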
