Learn together over time

Distributed Multi-frequency time series framework

Master Thesis (2025)
Author(s)

A. Chowdhury (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Y. Chen – Mentor (TU Delft - Data-Intensive Systems)

Thiago Guzella – Mentor (ASML)

A. Shankar – Mentor (TU Delft - Data-Intensive Systems)

Cynthia C. S. Liem – Graduation committee member (TU Delft - Multimedia Computing)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
26-08-2025
Awarding Institution
Delft University of Technology
Programme
Computer Science
Sponsors
ASML
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Modern industrial systems, from wind-farm monitoring to economic indicators like GDP, generate vast amounts of time series data from diverse sources. These data streams are sampled at varying and often inconsistent frequencies, which complicates accurate forecasting. Furthermore, in many real-world scenarios, data are distributed across nodes or tasks, introducing complications due to heterogeneity across tasks. Existing forecasting approaches typically address frequency misalignment and decentralized learning as separate problems, limiting their ability to model real-world deployments effectively. We propose CrossFreqNet, a unified multitask encoder–decoder architecture that addresses both challenges: (i) it integrates multi-frequency data streams without the need for up- or down-sampling to match frequencies, preserving signal integrity, and (ii) it introduces GradBal, a gradient-balancing mechanism that mitigates learning conflicts caused by task heterogeneity and promotes stable convergence across tasks in a distributed learning environment. Across four public benchmarks and one industrial dataset, our model reduces forecasting errors by up to 72% over the best multi-task baseline (UniTS) and by up to 48% over PCGrad, a state-of-the-art gradient conflict mitigation method. Code is available at https://github.com/arc-arnob/TS-MTL/.
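The abstract does not spell out GradBal's update rule. As a rough illustration of the general idea of gradient balancing in multi-task training, the PyTorch sketch below rescales each task's gradient to a common norm before aggregation; all names are hypothetical and are not taken from the TS-MTL code.

```python
# Illustrative sketch only: a generic gradient-balancing step for multi-task
# training. This is NOT the GradBal algorithm from the thesis; names are made up.
import torch

def balanced_multitask_step(model, task_losses, optimizer):
    """Scale each task's gradient to the mean gradient norm before summing,
    so that no single task dominates the shared-parameter update."""
    params = [p for p in model.parameters() if p.requires_grad]

    # Compute per-task gradients without touching the .grad buffers.
    task_grads = []
    for loss in task_losses:
        grads = torch.autograd.grad(loss, params, retain_graph=True, allow_unused=True)
        flat = torch.cat([(g if g is not None else torch.zeros_like(p)).reshape(-1)
                          for g, p in zip(grads, params)])
        task_grads.append(flat)

    # Rescale every task gradient to the mean norm, then average.
    norms = torch.stack([g.norm() for g in task_grads])
    target = norms.mean()
    scales = target / (norms + 1e-12)
    combined = sum(s * g for s, g in zip(scales, task_grads)) / len(task_grads)

    # Write the balanced gradient back into the parameters and take a step.
    optimizer.zero_grad()
    offset = 0
    for p in params:
        n = p.numel()
        p.grad = combined[offset:offset + n].view_as(p).clone()
        offset += n
    optimizer.step()
```

This sketch only conveys the flavour of conflict mitigation that methods such as PCGrad and the thesis's GradBal address; the actual mechanism and its distributed, multi-frequency setting are described in the thesis itself.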
