Y. Chen

103 records found

Distributed time series data presents a challenge for federated learning, as clients often possess different feature sets and have misaligned time steps. Existing federated time series models are limited by the assumption of perfect temporal or feature alignment across clients. I ...

BatMan-CLR

Making Few-Shots Meta-learners Resilient Against Label Noise

The negative impact of label noise is well studied in classical supervised learning yet remains an open research question in meta-learning. Meta-learners aim to adapt to unseen tasks by learning a good initial model in meta-training and fine-tuning it to new tasks during meta-tes ...
While diffusion models effectively generate remarkable synthetic images, a key limitation is the inference inefficiency, requiring numerous sampling steps. To accelerate inference and maintain high-quality synthesis, teacher-student distillation is applied to compress the diffusi ...

Loci

Federated Continual Learning of Heterogeneous Tasks at Edge

Federated continual learning (FCL) has attracted growing attention in achieving collaborative model training among edge clients, each of which learns its local model for a sequence of tasks. Most existing FCL approaches aggregate clients' latest local models to exchange knowledge ...
Edge applications are increasingly empowered by deep neural networks (DNN) and face the challenges of adapting or retraining models for the changes in input data domains and learning tasks. The existing techniques to enable DNN retraining on edge devices are to configure the memo ...
Federated Learning (FL) systems operate in heterogeneous and ever-evolving environments that challenge their performance. Under real deployments, the learning tasks of clients can also evolve with time, which calls for the integration of methodologies such as Continual Learning (C ...

EdgeTA

Neuron-Grained Scaling of Foundation Models in Edge-Side Retraining

Foundation models (FMs) such as large language models are becoming the backbone technology for artificial intelligence systems. It is particularly challenging to deploy multiple FMs on edge devices, which not only have limited computational resources, but also encounter unseen in ...

GIDM

Gradient Inversion of Federated Diffusion Models

Diffusion models are becoming the most prevalent generative models, producing exceptionally high-quality image data through a stochastic process of diffusion steps based on Gaussian noise. Recent studies explore the federated training of diffusion models, enabling the collaborativ ...

Spyker

Asynchronous Multi-Server Federated Learning for Geo-Distributed Clients

Federated learning (FL) systems enable multiple clients to train a machine learning model iteratively through synchronously exchanging the intermediate model weights with a single server. The scalability of such FL systems can be limited by two factors: server idle time due to sy ...
Generative Adversarial Networks (GANs) are increasingly adopted by the industry to synthesize realistic images using competing generator and discriminator neural networks. Because data is not centrally available, Multi-Discriminator (MD)-GANs training frameworks employ multiple ...

RobustDA

Lightweight Robust Domain Adaptation for Evolving Data at Edge

AI applications powered by deep learning models are increasingly run natively at edge. A deployed model not only encounters continuously evolving input distributions (domains) but also faces adversarial attacks from third parties. This necessitates adapting the model to shifting do ...
In a decade, AI frontier research transitioned from the researcher's workstation to thousands of high-end hardware-accelerated compute nodes. This rapid evolution shows no signs of slowing down in the foreseeable future. While top cloud providers may be able to keep pace with thi ...

ElasticDNN

On-Device Neural Network Remodeling for Adapting Evolving Vision Domains at Edge

Executing deep neural networks (DNN) based vision tasks on edge devices encounters challenging scenarios of significant and continually evolving data domains (e.g. background or subpopulation shift). With limited resources, the state-of-the-art domain adaptation (DA) methods eith ...

FedViT

Federated Continual Learning of Vision Transformer at Edge

Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling the evolving learning tasks in real world, such as classifying different types of objects, DNNs face the challenge to continually re ...

SiloFuse

Cross-silo Synthetic Data Generation with Latent Tabular Diffusion Models

Synthetic tabular data is crucial for sharing and augmenting data across silos, especially for enterprises with proprietary data. However, existing synthesizers are designed for centrally stored data. Hence, they struggle with real-world scenarios where features are distributed a ...

Amalur

The Convergence of Data Integration and Machine Learning

Machine learning (ML) training data is often scattered across disparate collections of datasets, called data silos. This fragmentation poses a major challenge for data-intensive ML applications: integrating and transforming data residing in different ...

FCT-GAN

Enhancing Global Correlation of Table Synthesis via Fourier Transform

Synthetic tabular data has emerged as an alternative method for sharing knowledge while complying with strict data access regulations, such as the European General Data Protection Regulation (GDPR). Mainstream table synthesizers utilize methodologies derived from Generative ...
Federated learning is a private-by-design distributed learning paradigm where clients train local models on their own data before a central server aggregates their local updates to compute a global model. Depending on the aggregation method used, the local updates are either the ...
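The aggregation step described above, where a central server combines clients' local updates into a global model, can be sketched minimally as data-weighted federated averaging (in the spirit of FedAvg). This is an illustrative sketch, not the method of any specific paper listed here; the function name and flat weight-vector representation are assumptions for brevity.

```python
# Minimal federated-averaging sketch: each client trains a local model,
# then the server computes a data-size-weighted mean of the weights.
# Names and the flat list-of-floats weight format are illustrative only.

def fedavg(client_weights, client_sizes):
    """Aggregate per-client weight vectors by a weighted mean,
    where each client's weight is proportional to its data size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_weights[i] += w[i] * (n / total)
    return global_weights

# Two clients with equal data sizes: the aggregate is the plain mean.
print(fedavg([[1.0, 2.0], [3.0, 4.0]], [10, 10]))  # [2.0, 3.0]
```

In practice the server would aggregate either the weights themselves or the gradients, as the abstract notes, and repeat this exchange over many communication rounds.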

Fabricated Flips

Poisoning Federated Learning without Data

Attacks on Federated Learning (FL) can severely reduce the quality of the generated models and limit the usefulness of this emerging learning paradigm that enables on-premise decentralized learning. However, existing untargeted attacks are not practical for many scenarios as they ...

EdgeVisionBench

A Benchmark of Evolving Input Domains for Vision Applications at Edge

Vision applications powered by deep neural networks (DNNs) are widely deployed on edge devices and solve the learning tasks of incoming data streams whose class labels and input features continuously evolve, known as domain shift. Despite its prominent presence in real-world edge s ...