CASA: Clustered Federated Learning with Asynchronous Clients
Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review
Author(s)
Liu, Boyi; Ma, Yiming; Zhou, Zimu et al.
Detail(s)
| Original language | English |
| --- | --- |
| Title of host publication | KDD '24 - Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining |
| Place of Publication | New York, NY |
| Publisher | Association for Computing Machinery |
| Pages | 1851-1862 |
| ISBN (print) | 9798400704901 |
| Publication status | Published - Aug 2024 |
Publication series
| Name | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining |
| --- | --- |
| ISSN (Print) | 2154-817X |
Conference
| Title | 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2024) |
| --- | --- |
| Location | Centre de Convencions Internacional de Barcelona |
| Place | Spain |
| City | Barcelona |
| Period | 25 - 29 August 2024 |
Abstract
Clustered Federated Learning (CFL) is an emerging paradigm for extracting insights from data on IoT devices. Through iterative client clustering and model aggregation, CFL adeptly manages data heterogeneity, preserves privacy, and delivers personalized models to heterogeneous devices. Traditional CFL approaches operate synchronously and therefore suffer prolonged latency from waiting for slow devices during clustering and aggregation. This paper advocates a shift to asynchronous CFL, allowing the server to process client updates as they arrive. This shift enhances training efficiency yet introduces complexities into the iterative training cycle. To this end, we present CASA, a novel CFL scheme for Clustering-Aggregation Synergy under Asynchrony. Built upon a holistic theoretical understanding of asynchrony's impact on CFL, CASA adopts a bi-level asynchronous aggregation method and a buffer-aided dynamic clustering strategy to harmonize clustering and aggregation. Extensive evaluations on standard benchmarks show that CASA outperforms representative baselines in model accuracy and achieves 2.28-6.49× higher convergence speed. © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
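To make the abstract's idea of buffer-aided, asynchronous clustered aggregation concrete, the sketch below shows a generic toy server loop that buffers client updates, greedily groups buffered updates by cosine similarity, and aggregates at two levels (per cluster, then into a global model). This is a minimal illustration only, not CASA's actual algorithm; the class and function names, the buffer size, and the similarity threshold are all assumptions introduced for the example.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D update vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class AsyncClusteredServer:
    """Toy asynchronous FL server (illustrative only, not the CASA method):
    buffers client updates as they arrive, groups buffered updates by
    similarity, and aggregates per cluster and then into a global model."""

    def __init__(self, dim, buffer_size=4, sim_threshold=0.5):
        self.global_model = np.zeros(dim)
        self.cluster_models = {}      # cluster id -> cluster model vector
        self.buffer = []              # buffered (client_id, update) pairs
        self.buffer_size = buffer_size
        self.sim_threshold = sim_threshold

    def receive(self, client_id, update):
        """Called whenever a (possibly stale) client update arrives."""
        self.buffer.append((client_id, update))
        if len(self.buffer) >= self.buffer_size:
            self._flush()

    def _flush(self):
        # Greedy similarity-based grouping of buffered updates; cluster
        # assignments are recomputed on every flush in this toy version.
        clusters = []
        for _, upd in self.buffer:
            placed = False
            for group in clusters:
                if cosine_similarity(group[0], upd) >= self.sim_threshold:
                    group.append(upd)
                    placed = True
                    break
            if not placed:
                clusters.append([upd])
        # Level 1: aggregate within each cluster; level 2: fold into the global model.
        for cid, group in enumerate(clusters):
            cluster_update = np.mean(group, axis=0)
            prev = self.cluster_models.get(cid, self.global_model)
            self.cluster_models[cid] = prev + cluster_update
            self.global_model += cluster_update / len(clusters)
        self.buffer.clear()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    server = AsyncClusteredServer(dim=10)
    for t in range(8):  # simulate updates arriving one by one, out of sync
        server.receive(client_id=t, update=rng.normal(size=10))
    print(server.global_model)
```

The two-level structure mirrors the general idea of separating cluster-wise personalization from global aggregation under asynchrony; the paper's actual staleness handling and clustering criteria are more involved than this sketch.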
Research Area(s)
- asynchronous federated learning, clustered federated learning, sparse training
Citation Format(s)
CASA: Clustered Federated Learning with Asynchronous Clients. / Liu, Boyi; Ma, Yiming; Zhou, Zimu et al.
KDD '24 - Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. New York, NY: Association for Computing Machinery, 2024. p. 1851-1862 (Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining).