Hybrid Graph Representation Learning for Money Laundering Detection

Bachelor Thesis (2025)
Author(s)

M. Frija (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Kubilay Atasu – Mentor (TU Delft - Data-Intensive Systems)

H.Ç. Bilgi – Mentor (TU Delft - Data-Intensive Systems)

T. Höllt – Graduation committee member (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
20-06-2025
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Money laundering detection stands as one of the most important challenges in the anti-financial crime sector, given its grave repercussions on the financial industry. The evolving nature of fraud schemes and the increasing volume of financial transactions limit the detection capabilities of traditional anti-money laundering (AML) systems. In light of recent breakthroughs in graph machine learning, graph neural networks (GNNs) and graph transformers (GTs) have emerged as prominent solutions to these limitations, achieving remarkable performance in detecting complex and broad fraudulent patterns. However, fusing the complementary strengths of these two classes of graph models into a unified framework for fraud detection has received little attention. In this paper, we address this gap by presenting GraphFuse — a hybrid graph representation learning model tailored for money laundering detection in financial transaction graphs. The novel edge centrality and transaction signature encodings give GraphFuse a slight advantage over the best-performing GNN and GT models, improving upon the best GT baseline by 0.76 percentage points in F1 score. Additionally, we introduce three variants of the Transformer-based component of GraphFuse, each with a different level of computational complexity. The competitive performance of GraphFuse is supported by extensive experiments on open-source, large-scale synthetic financial transaction datasets. Our code is available at https://github.com/mfrija/aml-graphfuse.
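To illustrate what a hybrid GNN-plus-Transformer model for transaction (edge) classification can look like, the following is a minimal, hypothetical PyTorch sketch. The module names, dimensions, fusion strategy, and the HybridEdgeClassifier class itself are assumptions made for illustration only; they are not taken from the thesis or its repository, and the actual GraphFuse architecture, encodings, and Transformer variants may differ substantially.

```python
# Hypothetical sketch: a hybrid GNN + Transformer edge classifier for
# transaction graphs. All names and design choices are illustrative
# assumptions, not the GraphFuse implementation.
import torch
import torch.nn as nn


class HybridEdgeClassifier(nn.Module):
    def __init__(self, node_dim, edge_dim, hidden_dim=64, num_heads=4):
        super().__init__()
        # Local component: one simple message-passing update over neighbours.
        self.node_proj = nn.Linear(node_dim, hidden_dim)
        self.msg_proj = nn.Linear(hidden_dim + edge_dim, hidden_dim)
        # Global component: a standard Transformer encoder over node embeddings
        # (full self-attention, quadratic in the number of nodes).
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=1)
        # Edge classifier over fused endpoint embeddings and raw edge features.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim + edge_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2),  # licit vs. suspicious transaction
        )

    def forward(self, x, edge_index, edge_attr):
        # x: [N, node_dim], edge_index: [2, E], edge_attr: [E, edge_dim]
        h = torch.relu(self.node_proj(x))
        src, dst = edge_index
        # Messages along each transaction, conditioned on its edge features,
        # mean-aggregated at the destination node.
        msgs = torch.relu(self.msg_proj(torch.cat([h[src], edge_attr], dim=-1)))
        agg = torch.zeros_like(h).index_add_(0, dst, msgs)
        deg = torch.zeros(h.size(0), 1).index_add_(
            0, dst, torch.ones(dst.size(0), 1)
        ).clamp(min=1)
        h_local = h + agg / deg
        # Global attention over all node embeddings.
        h_global = self.transformer(h_local.unsqueeze(0)).squeeze(0)
        # Classify each edge from its two endpoint embeddings plus edge features.
        edge_repr = torch.cat([h_global[src], h_global[dst], edge_attr], dim=-1)
        return self.edge_mlp(edge_repr)
```

In this sketch, the local message-passing step captures neighbourhood structure while the Transformer layer lets every node attend to every other node; replacing that layer with a sparse or linear-attention variant would correspond to trading accuracy for lower computational complexity, in the spirit of the three Transformer variants mentioned in the abstract.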
