Relational Deep Learning with Graph Transformers: Exploring Local and Global Message Passing

Bachelor Thesis (2025)
Author(s)

I. Cuñado (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

H.Ç. Bilgi – Mentor (TU Delft - Data-Intensive Systems)

Kubilay Atasu – Mentor (TU Delft - Data-Intensive Systems)

T. Höllt – Graduation committee member (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
27-06-2025
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Graph Transformers have played a key role in recent graph learning developments. However, their application and performance in Relational Deep Learning (RDL), which has the potential to replace inefficient, hand-crafted data pre-processing pipelines, remain largely unexplored. To close this gap, we present adaptations of two well-known Graph Transformer models: a relation-aware local message passing variant (FraudGT) that computes separate attention matrices for each edge and node type, and a simplified global-attention version that ignores heterogeneity (Graphormer). Our analysis demonstrates that local relation-aware attention achieves state-of-the-art results on node classification and regression tasks when evaluated on RelBench, a comprehensive set of RDL benchmarks. We show that local message passing is computationally cheaper, faster, and more accurate than global attention. Our code is available at https://github.com/ignaciocunado/gt-rdl.
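
To illustrate the idea of relation-aware local attention described in the abstract, the sketch below implements a toy heterogeneous attention layer in PyTorch that keeps a separate set of query/key/value projections per edge type and aggregates messages only over a node's incoming neighbours. This is a minimal, hypothetical example for intuition only, not the thesis's FraudGT adaptation; the class name, edge-type names, and the aggregation scheme are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class RelationAwareAttention(nn.Module):
    """Toy relation-aware local attention: one Q/K/V projection per edge type.

    Illustrative sketch only (not the implementation from the thesis repository):
    messages flow along edges, and attention weights are normalised per
    destination node and per edge type.
    """

    def __init__(self, dim: int, edge_types: list[str]):
        super().__init__()
        self.dim = dim
        self.proj = nn.ModuleDict({
            et: nn.ModuleDict({
                "q": nn.Linear(dim, dim),
                "k": nn.Linear(dim, dim),
                "v": nn.Linear(dim, dim),
            })
            for et in edge_types
        })

    def forward(self, x: torch.Tensor, edges: dict[str, torch.Tensor]) -> torch.Tensor:
        # x: [num_nodes, dim]; edges[et]: [2, num_edges] with (source, target) rows.
        out = torch.zeros_like(x)
        for et, edge_index in edges.items():
            src, dst = edge_index
            q = self.proj[et]["q"](x)[dst]           # query of the receiving node
            k = self.proj[et]["k"](x)[src]           # key of the sending node
            v = self.proj[et]["v"](x)[src]           # value (message content)
            score = (q * k).sum(-1) / self.dim ** 0.5
            score = score - score.max()              # numerical stability
            weight = torch.exp(score)
            # Softmax over the incoming edges of this type for each destination.
            denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, weight)
            alpha = weight / denom[dst].clamp(min=1e-9)
            out.index_add_(0, dst, alpha.unsqueeze(-1) * v)
        return out


# Minimal usage on a toy heterogeneous graph with two (hypothetical) edge types.
layer = RelationAwareAttention(dim=8, edge_types=["orders", "reviews"])
x = torch.randn(5, 8)
edges = {
    "orders": torch.tensor([[0, 1, 2], [3, 3, 4]]),
    "reviews": torch.tensor([[4, 2], [0, 1]]),
}
print(layer(x, edges).shape)  # torch.Size([5, 8])
```

Because each edge type has its own projections and attention is restricted to graph neighbours, the cost scales with the number of edges rather than quadratically with the number of nodes, which is the intuition behind local attention being cheaper than global attention.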

Files

CSE3000_Paper-4.pdf
(PDF | 0.546 MB)