Privacy-Robustness Trade-Off Analysis in Decentralised Federated Learning

Master Thesis (2025)
Author(s)

Z. Palanciyan (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Richard Heusdens – Mentor (TU Delft - Signal Processing Systems)

Qiongxiu Li – Mentor (Aalborg University)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
20-08-2025
Awarding Institution
Delft University of Technology
Programme
Electrical Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Federated learning (FL) enables collaborative model training across multiple clients without sharing raw data, offering a promising solution for privacy-sensitive applications. However, as FL becomes more decentralised, balancing data privacy with resilience against adversarial attacks remains a fundamental challenge. This thesis investigates the interplay between privacy-preserving mechanisms, namely Differential Privacy (DP), Secure Multi-Party Computation (SMPC), and Subspace Perturbation, and the robustness of adversarial detection in fully decentralised FL networks. By extending information-theoretic bounds and conducting comprehensive experiments under a variety of attack scenarios, we show that stronger privacy guarantees often come at the cost of reduced detection capability. Notably, mechanisms that add noise to or mask updates in order to protect data privacy tend to obscure the test statistics that detectors rely on, resulting in higher false-alarm rates and missed detections. Our results highlight that while privacy and robustness cannot be maximised simultaneously, careful tuning of system parameters and defence strategies can help achieve a practical balance. This work provides theoretical insights and empirical evidence to inform the deployment of privacy-preserving and robust federated learning systems.
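The tension the abstract describes can be sketched in a few lines. The example below is not code from the thesis; it is a minimal illustration, assuming the standard Gaussian mechanism used in DP federated learning (clip each client's update to a bounded L2 norm, then add Gaussian noise). After clipping and noising, a scaled (poisoned) update and an honest one can end up with nearly indistinguishable norms, which is exactly the kind of test statistic a simple detector might rely on:

```python
import numpy as np

rng = np.random.default_rng(0)

def clip_and_noise(update, clip_norm=1.0, sigma=0.5):
    """Gaussian-mechanism sketch: clip an update to L2 norm at most
    clip_norm, then add isotropic Gaussian noise scaled by sigma."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / norm)
    return clipped + rng.normal(0.0, sigma * clip_norm, size=update.shape)

# An honest client update and a crudely scaled (poisoned) one.
honest = rng.normal(0.0, 0.1, size=100)
attack = 5.0 * honest

# A norm-based test statistic a detector might use: before the privacy
# mechanism, the attack stands out (its norm is 5x larger); after clipping
# and noising, both norms are close to clip_norm and hard to tell apart.
print("raw norms:     ", np.linalg.norm(honest), np.linalg.norm(attack))
print("private norms: ",
      np.linalg.norm(clip_and_noise(honest)),
      np.linalg.norm(clip_and_noise(attack)))
```

With `sigma = 0` the mechanism reduces to pure norm clipping, which already erases the scale difference the detector exploits; the added noise further randomises whatever signal remains. The parameter names (`clip_norm`, `sigma`) are illustrative choices, not notation from the thesis.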
