Applying Fine-Tuning Methods to FTTransformer in Anti-Money Laundering Applications

Bachelor Thesis (2024)
Author(s)

V.P. de Graaff (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Kubilay Atasu – Supervisor 1 (TU Delft - Data-Intensive Systems)

T.A. Akyıldız – Supervisor 2 (TU Delft - Data-Intensive Systems)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2024
Language
English
Graduation Date
27-06-2024
Awarding Institution
Delft University of Technology
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This research investigates the effectiveness of combining the Feature Tokenizer Transformer (FTTransformer) [6] with graph neural networks for anti-money laundering (AML) applications. We explore various fine-tuning techniques, including LoRA [9] and vanilla fine-tuning, applied to our baseline FTT architecture. Using the IBM AML dataset [1], we compare the performance of different models and fine-tuning approaches. Our results indicate that the FTT alone does not outperform GNNs and that careful configuration is required when working with multi-modal datasets. This work contributes to the development of more efficient and accurate methods for detecting financial fraud patterns.
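For readers unfamiliar with the LoRA technique the abstract mentions, the sketch below shows the general idea: freeze a pretrained linear layer and train only a low-rank update, as one might do for the attention projections of an FTTransformer. This is a minimal illustration of LoRA in general, not the thesis's actual configuration; the rank r, scaling alpha, and the choice of wrapped layer are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a pretrained nn.Linear with a trainable low-rank update
    (illustrative sketch; hyperparameters are assumptions)."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        # Freeze the pretrained weights; only the low-rank factors A and B train.
        for p in self.base.parameters():
            p.requires_grad = False
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = W x + b + (alpha / r) * B A x
        return self.base(x) + self.scaling * (x @ self.lora_a.T @ self.lora_b.T)

# Usage: replace a projection layer inside a pretrained model, e.g.
# model.attention.query = LoRALinear(model.attention.query, r=8)
# so that fine-tuning updates far fewer parameters than vanilla fine-tuning.
```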

Files

Research_paper_19_.pdf
(pdf | 0.521 MB)
License info not available