FedKNOW

Federated Continual Learning with Signature Task Knowledge Integration at Edge

Conference Paper (2023)
Author(s)

Yaxin Luopan (Beijing Institute of Technology)

Rui Han (Beijing Institute of Technology)

Qinglong Zhang (Beijing Institute of Technology)

Chi Harold Liu (Beijing Institute of Technology)

Guoren Wang (Beijing Institute of Technology)

Lydia Y. Chen (TU Delft - Data-Intensive Systems)

Research Group
Data-Intensive Systems
Copyright
© 2023 Yaxin Luopan, Rui Han, Qinglong Zhang, Chi Harold Liu, Guoren Wang, Lydia Y. Chen
DOI (related publication)
https://doi.org/10.1109/ICDE55515.2023.00033
Publication Year
2023
Language
English
Pages (from-to)
341-354
ISBN (print)
979-8-3503-2228-6
ISBN (electronic)
979-8-3503-2227-9
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks arriving on different edge devices. Federated continual learning is a promising technique that offers a partial solution but has yet to overcome the following difficulties: significant accuracy loss due to limited on-device processing, negative knowledge transfer caused by the limited communication of non-IID data, and limited scalability in the number of tasks and edge devices. In this paper, we propose FedKNOW, an accurate and scalable federated continual learning framework, built on the novel concept of signature task knowledge. FedKNOW is a client-side solution that continuously extracts and integrates the knowledge of signature tasks, i.e., past tasks that are highly influenced by the current task. Each FedKNOW client is composed of a knowledge extractor, a gradient restorer and, most importantly, a gradient integrator. When training on a new task, the gradient integrator prevents catastrophic forgetting and mitigates negative knowledge transfer by effectively combining the gradients of signature tasks identified from past local tasks with those of other clients' current tasks, obtained through the global model. We implement FedKNOW in PyTorch and extensively evaluate it against state-of-the-art techniques on popular federated continual learning benchmarks. Extensive evaluation results on heterogeneous edge devices show that FedKNOW improves model accuracy by 63.24% without increasing model training time, reduces communication cost by 34.28%, and achieves larger improvements in difficult scenarios such as large numbers of tasks or clients, and training with different complex networks.
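The abstract describes the gradient integrator only at a high level. The following is a minimal, illustrative PyTorch sketch of one way such an integrator could combine gradients: it assumes a GEM-style projection that removes components of the current task's gradient that conflict with stored signature-task gradients. All names (integrate_gradients, signature_grads) are hypothetical and are not taken from the paper, whose actual integration rule may differ.

import torch

def integrate_gradients(current_grad, signature_grads):
    # Hypothetical gradient integrator: project the current task's
    # flattened gradient so it no longer conflicts with stored
    # signature-task gradients (a GEM-style projection; an assumption,
    # not necessarily FedKNOW's exact rule).
    g = current_grad.clone()
    for g_sig in signature_grads:
        dot = torch.dot(g, g_sig)
        if dot < 0:  # conflict: this update would increase the signature task's loss
            g = g - (dot / torch.dot(g_sig, g_sig)) * g_sig
    return g

# Toy usage with two parameters and one stored signature gradient.
g_now = torch.tensor([1.0, -1.0])
g_sig = torch.tensor([0.0, 1.0])
print(integrate_gradients(g_now, [g_sig]))  # tensor([1., 0.]) - conflict removed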

Files

FedKNOW_Federated_Continual_Le... (pdf, 2.84 MB)
Embargo expired on 26-01-2024
License info not available