(1 - 15 of 15)
document
Zuo, Xiaojiang (author), Luopan, Yaxin (author), Han, Rui (author), Zhang, Qinglong (author), Liu, Chi Harold (author), Wang, Guoren (author), Chen, Lydia Y. (author)
Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks on different edge...
journal article 2024
document
Grataloup, Albin (author), Jonas, Stefan (author), Meyer, A. (author)
Federated learning has recently emerged as a privacy-preserving distributed machine learning approach. It enables collaborative training across multiple clients and entire fleets without sharing the involved training datasets. By preserving data privacy, federated learning has the potential to overcome the lack of data sharing in...
review 2024
document
Huang, J. (author), Zhao, Z. (author), Chen, Lydia Y. (author), Roos, S. (author)
Attacks on Federated Learning (FL) can severely reduce the quality of the generated models and limit the usefulness of this emerging learning paradigm that enables on-premise decentralized learning. However, existing untargeted attacks are not practical for many scenarios as they assume that i) the attacker knows every update of benign...
conference paper 2023
document
Zhao, Z. (author), Birke, Robert (author), Chen, Lydia Y. (author)
Generative Adversarial Networks (GANs) are typically trained to synthesize data, from images and more recently tabular data, under the assumption of directly accessible training data. While learning image GANs on Federated Learning (FL) and Multi-Discriminator (MD) systems has just been demonstrated, it is unknown if tabular GANs can be learned...
conference paper 2023
document
Zheng, Jingjing (author), Li, Kai (author), Mhaisen, N. (author), Ni, Wei (author), Tovar, Eduardo (author), Guizani, Mohsen (author)
Federated learning (FL) is increasingly considered to circumvent the disclosure of private data in mobile edge computing (MEC) systems. Training with large datasets can enhance FL learning accuracy but is associated with non-negligible energy use. Scheduling edge devices with small datasets saves energy but decreases FL learning accuracy due to a...
conference paper 2023
document
van Helden, G. (author), Zandbergen, B.T.C. (author), Specht, M.M. (author), Gill, E.K.A. (author)
Contribution: This article presents a comprehensive overview of characteristics of educational designs of collaborative engineering design activities found in literature and how these characteristics mediate students' collaboration. Background: Engineers have to solve complex problems that require collaboration. In education, various...
journal article 2023
document
Luopan, Yaxin (author), Han, Rui (author), Zhang, Qinglong (author), Liu, Chi Harold (author), Wang, Guoren (author), Chen, Lydia Y. (author)
Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks on different edge devices....
conference paper 2023
document
Huang, J. (author), Hong, C. (author), Liu, Yang (author), Chen, Lydia Y. (author), Roos, S. (author)
Federated learning (FL) enables collaborative learning between parties, called clients, without sharing the original and potentially sensitive data. To ensure fast convergence in the presence of such heterogeneous clients, it is imperative to timely select clients who can effectively contribute to learning. A realistic but overlooked case of...
conference paper 2023
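The selection problem sketched in this abstract (timely picking clients who can effectively contribute) can be illustrated with a deliberately simple heuristic. The sketch below ranks clients by their most recently reported local loss; the criterion, function name, and data are assumptions for illustration only, not the method proposed in the paper.

```python
# Illustrative-only client selection heuristic (not the paper's method):
# choose the k clients whose most recently reported local loss is highest,
# on the intuition that they currently have the most to contribute.
def select_clients(last_losses, k):
    """last_losses: dict mapping client_id -> most recent local loss."""
    ranked = sorted(last_losses, key=last_losses.get, reverse=True)
    return ranked[:k]

# Example round: pick 2 of 4 candidate clients
losses = {"c1": 0.42, "c2": 1.37, "c3": 0.85, "c4": 0.10}
print(select_clients(losses, k=2))  # -> ['c2', 'c3']
```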
document
Cox, B.A. (author), Chen, Lydia Y. (author), Decouchant, Jérémie (author)
Federated Learning (FL) is a popular deep learning approach that prevents centralizing large amounts of data, and instead relies on clients that update a global model using their local datasets. Classical FL algorithms use a central federator that, for each training round, waits for all clients to send their model updates before aggregating them...
conference paper 2022
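The abstract above recalls the classical synchronous setup: a central federator waits for every client's update in each round and then aggregates. Below is a minimal sketch of that baseline, assuming toy linear-regression clients and FedAvg-style size-weighted averaging; it illustrates the round structure being discussed, not the paper's own contribution.

```python
# Minimal sketch of a synchronous FL round: the federator waits for every
# client's update, then merges them with a size-weighted average.
# Client and model details are illustrative placeholders.
import numpy as np

def local_update(global_weights, data, lr=0.1):
    """Toy local step: one gradient step of linear regression on (X, y)."""
    X, y = data
    grad = X.T @ (X @ global_weights - y) / len(y)
    return global_weights - lr * grad

def synchronous_round(global_weights, client_datasets):
    """Federator blocks until all clients reply, then averages their updates."""
    updates, sizes = [], []
    for data in client_datasets:
        updates.append(local_update(global_weights, data))
        sizes.append(len(data[1]))
    sizes = np.asarray(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Example: three clients holding small synthetic datasets
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(3)
for _ in range(5):
    w = synchronous_round(w, clients)
```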
document
Zheng, Jingjing (author), Li, Kai (author), Mhaisen, N. (author), Ni, Wei (author), Tovar, Eduardo (author), Guizani, Mohsen (author)
Federated learning (FL) has been increasingly considered to preserve data training privacy from eavesdropping attacks in mobile-edge computing-based Internet of Things (EdgeIoT). On the one hand, the learning accuracy of FL can be improved by selecting the IoT devices with large data sets for training, which gives rise to a higher energy...
journal article 2022
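The accuracy-versus-energy trade-off described in this abstract can be phrased, very roughly, as selecting devices under an energy budget. The snippet below is only an illustrative formulation with assumed inputs (per-device data size and energy cost) and a naive greedy rule; it is not the scheduling method studied in the paper.

```python
# Purely illustrative: pick devices to maximize training data under an
# energy budget, greedily by data-per-energy ratio.
def select_devices(devices, energy_budget):
    """devices: list of (device_id, data_size, energy_cost) tuples."""
    chosen, used = [], 0.0
    for dev_id, data, energy in sorted(devices, key=lambda d: d[1] / d[2], reverse=True):
        if used + energy <= energy_budget:
            chosen.append(dev_id)
            used += energy
    return chosen

# Example: the budget admits the two most data-efficient devices
print(select_devices([("a", 500, 2.0), ("b", 300, 0.5), ("c", 900, 5.0)], energy_budget=3.0))
```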
document
Ottun, Abdul Rasheed (author), Mane, Pramod C. (author), Yin, Zhigang (author), Paul, Souvik (author), Liyanage, Mohan (author), Pridmore, Jason (author), Ding, Aaron Yi (author), Sharma, Rajesh (author), Nurmi, Petteri (author), Flores, Huber (author)
Federated learning (FL) is a promising privacy-preserving solution to build powerful AI models. In many FL scenarios, such as healthcare or smart city monitoring, users' devices may lack the capabilities required to collect suitable data, which limits their contributions to the global model. We contribute social-aware federated learning...
journal article 2022
document
Wu, Han (author), Zhao, Z. (author), Chen, Lydia Y. (author), van Moorsel, Aad (author)
Federated Learning (FL) has emerged as a potentially powerful privacy-preserving machine learning methodology, since it avoids exchanging data between participants and instead exchanges model parameters. FL has traditionally been applied to image, voice and similar data, but recently it has started to draw attention from domains including...
conference paper 2022
document
Jamali-Rad, H. (author), Abdizadeh, Mohammad (author), Singh, Anuj (author)
Classical federated learning approaches incur significant performance degradation in the presence of non-independent and identically distributed (non-IID) client data. A possible direction to address this issue is forming clusters of clients with roughly IID data. Most solutions following this direction are iterative and relatively slow, also...
journal article 2022
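One simple way to picture the clustering direction mentioned in this abstract is to group clients by their label histograms. The sketch below assumes label distributions are directly observable, which a privacy-preserving scheme would typically avoid; it is a didactic baseline, not the paper's method.

```python
# Didactic sketch: cluster clients toward roughly-IID groups by comparing
# their (assumed observable) label histograms with k-means.
import numpy as np
from sklearn.cluster import KMeans

def label_histogram(labels, num_classes):
    hist = np.bincount(labels, minlength=num_classes).astype(float)
    return hist / hist.sum()

def cluster_clients(client_labels, num_classes, n_clusters):
    """client_labels: list of 1-D integer label arrays, one per client."""
    H = np.stack([label_histogram(y, num_classes) for y in client_labels])
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(H)

# Example: two clients dominated by class 0, two by class 1
rng = np.random.default_rng(1)
clients = [rng.choice(2, 50, p=p) for p in ([.9, .1], [.8, .2], [.1, .9], [.2, .8])]
print(cluster_clients(clients, num_classes=2, n_clusters=2))
```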
document
He, Daojing (author), Du, Runmeng (author), Zhu, Shanshan (author), Zhang, Min (author), Liang, K. (author), Chan, Sammy (author)
Data islands effectively block the practical application of machine learning. To meet this challenge, a new framework known as federated learning was created. It allows model training on a large amount of scattered data owned by different data providers. This article presents a parallel solution for computing logistic regression based on...
journal article 2022
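For orientation on the federated logistic regression setting described above, the sketch below shows the plain, unencrypted pattern it builds on: each data provider computes a gradient on its own shard and only gradients are shared and merged. The paper's parallel, privacy-preserving protocol is not reproduced; the weighting scheme and all names here are assumptions.

```python
# Plain federated logistic regression baseline: providers share gradients,
# never raw data. No encryption or parallel protocol is modeled here.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_gradient(w, X, y):
    """Gradient of the logistic loss on one provider's shard."""
    return X.T @ (sigmoid(X @ w) - y) / len(y)

def federated_logreg(shards, dim, rounds=200, lr=0.5):
    w = np.zeros(dim)
    sizes = np.array([len(y) for _, y in shards], dtype=float)
    for _ in range(rounds):
        grads = np.stack([local_gradient(w, X, y) for X, y in shards])
        w -= lr * np.average(grads, axis=0, weights=sizes)  # size-weighted merge
    return w

# Example with two providers holding disjoint synthetic data
rng = np.random.default_rng(0)
w_true = np.array([1.5, -2.0])
def make_shard(n):
    X = rng.normal(size=(n, 2))
    return X, (sigmoid(X @ w_true) > 0.5).astype(float)
print(federated_logreg([make_shard(100), make_shard(150)], dim=2))
```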
document
Zhao, Jianxin (author), Han, Rui (author), Yang, Yongkai (author), Catterall, Benjamin (author), Liu, Chi Harold (author), Chen, Lydia Y. (author), Mortier, Richard (author), Crowcroft, Jon (author), Wang, Liang (author)
With the massive amount of data generated from mobile devices and the increase of computing power of edge devices, the paradigm of Federated Learning has gained great momentum. In federated learning, distributed and heterogeneous nodes collaborate to learn model parameters. However, while providing benefits such as privacy by design and...
journal article 2022