Graph Learning on Tabular Data: Think Global And Local

Full Fusion and Interleaved architectures on IBM’s Anti-Money Laundering Data

Bachelor Thesis (2025)
Author(s)

A. Stefan (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Kubilay Atasu – Mentor (TU Delft - Data-Intensive Systems)

H.Ç. Bilgi – Mentor (TU Delft - Data-Intensive Systems)

T. Höllt – Graduation committee member (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
25-06-2025
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

As financial fraud becomes increasingly sophisticated, traditional detection methods struggle to uncover the complex relational patterns underlying illicit behavior. This paper investigates the effectiveness of combining Graph Neural Networks (GNNs) and Transformers for fraud detection on relational data transformed into graph structures. Focusing on the IBM Anti-Money Laundering (AML) dataset, two hybrid architectures are proposed: Interleaved, which alternates between GNN and Transformer layers to exploit local and global information sequentially, and Full-Fusion, which fuses parallel GNN and Transformer representations at both the feature and decision levels. The results show that integrating Transformers significantly boosts performance over standalone GNN baselines, with improvements of up to 10% in F1 score on the small-scale datasets. Gating-based fusion strategies are also shown to improve model stability and accuracy, whereas PEARL-based positional encodings yield no conclusive improvement. These findings highlight the value of combining local message passing with global attention for structured financial anomaly detection, and pave the way for more robust, adaptable graph-based solutions in fraud analytics and related domains.
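The thesis code is not reproduced on this page. As a rough illustration of the two hybrid ideas named in the abstract, the sketch below shows, in plain PyTorch / PyTorch Geometric, how an interleaved GNN-Transformer block and a feature-level gated fusion block might look. All module choices (GCNConv, TransformerEncoderLayer), dimensions, and the sigmoid gate are illustrative assumptions, not the architectures evaluated in the thesis; only the feature-level part of the Full-Fusion idea is sketched.

```python
# Illustrative sketch only (not the thesis implementation).
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv


class InterleavedBlock(nn.Module):
    """Alternates a local GNN step with a global self-attention step."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.gnn = GCNConv(dim, dim)                      # local message passing
        self.attn = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True)   # global attention

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        x = torch.relu(self.gnn(x, edge_index))           # neighbourhood view
        x = self.attn(x.unsqueeze(0)).squeeze(0)          # whole-graph view
        return x


class FullFusionBlock(nn.Module):
    """Runs GNN and Transformer in parallel and fuses them with a learned gate."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.gnn = GCNConv(dim, dim)
        self.attn = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True)
        self.gate = nn.Sequential(nn.Linear(2 * dim, dim), nn.Sigmoid())

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        local = torch.relu(self.gnn(x, edge_index))        # parallel GNN branch
        global_ = self.attn(x.unsqueeze(0)).squeeze(0)     # parallel Transformer branch
        g = self.gate(torch.cat([local, global_], dim=-1)) # feature-level gate (assumed form)
        return g * local + (1 - g) * global_


if __name__ == "__main__":
    x = torch.randn(6, 32)                                 # 6 nodes, 32 features
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
    print(InterleavedBlock(32)(x, edge_index).shape)       # torch.Size([6, 32])
    print(FullFusionBlock(32)(x, edge_index).shape)        # torch.Size([6, 32])
```

The learned sigmoid gate shown here is one plausible reading of "gating-based fusion": it lets the model weight the local (message-passing) and global (attention) representations per node and per feature, rather than simply concatenating or averaging them.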
