CASA : Clustered Federated Learning with Asynchronous Clients

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review


Author(s)

  • Boyi Liu
  • Yiming Ma
  • Zimu Zhou
  • Yexuan Shi
  • Shuyuan Li
  • Yongxin Tong


Detail(s)

Original language: English
Title of host publication: KDD '24 - Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Place of Publication: New York, NY
Publisher: Association for Computing Machinery
Pages: 1851-1862
ISBN (print): 9798400704901
Publication status: Published - Aug 2024

Publication series

Name: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
ISSN (Print): 2154-817X

Conference

Title: 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2024)
Location: Centre de Convencions Internacional de Barcelona
Place: Spain
City: Barcelona
Period: 25 - 29 August 2024

Abstract

Clustered Federated Learning (CFL) is an emerging paradigm to extract insights from data on IoT devices. Through iterative client clustering and model aggregation, CFL adeptly manages data heterogeneity, ensures privacy, and delivers personalized models to heterogeneous devices. Traditional CFL approaches operate synchronously and therefore suffer prolonged latency while waiting for slow devices during clustering and aggregation. This paper advocates a shift to asynchronous CFL, allowing the server to process client updates as they arrive. This shift improves training efficiency yet introduces complexities into the iterative training cycle. To this end, we present CASA, a novel CFL scheme for Clustering-Aggregation Synergy under Asynchrony. Built upon a holistic theoretical understanding of asynchrony's impact on CFL, CASA adopts a bi-level asynchronous aggregation method and a buffer-aided dynamic clustering strategy to harmonize clustering and aggregation. Extensive evaluations on standard benchmarks show that CASA outperforms representative baselines in model accuracy and achieves 2.28-6.49× faster convergence. © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
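The abstract describes a server that buffers asynchronously arriving client updates, periodically re-clusters clients, and aggregates within each cluster. As a rough illustration only (the buffer size, staleness discount, and cosine-similarity assignment below are assumptions of this sketch, not CASA's published bi-level aggregation or buffer-aided clustering), a minimal asynchronous clustered-FL server loop in Python might look like this:

import numpy as np

rng = np.random.default_rng(0)
DIM, BUFFER_SIZE, NUM_CLIENTS, ROUNDS = 8, 4, 12, 60

# Two latent data distributions -> two "true" client groups (synthetic stand-in for non-IID data).
true_group = np.array([i % 2 for i in range(NUM_CLIENTS)])
cluster_models = {0: np.zeros(DIM), 1: np.zeros(DIM)}   # per-cluster server-side models
client_cluster = {i: 0 for i in range(NUM_CLIENTS)}     # current cluster assignment
buffer = []                                              # buffered (client_id, update, round_sent)

def local_update(client_id, model):
    # Stand-in for local training: pull the model toward a group-specific target, plus noise.
    target = np.ones(DIM) if true_group[client_id] == 0 else -np.ones(DIM)
    return 0.5 * (target - model) + 0.05 * rng.standard_normal(DIM)

def staleness_weight(staleness, alpha=0.5):
    # Common asynchronous-FL heuristic: down-weight stale updates (assumed, not from the paper).
    return (1.0 + staleness) ** (-alpha)

for t in range(ROUNDS):
    # A client finishes at an arbitrary time and pushes its update (asynchronous arrival).
    cid = int(rng.integers(NUM_CLIENTS))
    upd = local_update(cid, cluster_models[client_cluster[cid]])
    buffer.append((cid, upd, t))

    if len(buffer) < BUFFER_SIZE:
        continue  # keep buffering until enough updates have arrived

    # Re-assign each buffered client to the cluster whose model its update aligns with best
    # (cosine similarity; purely illustrative clustering rule).
    for b_cid, b_upd, _ in buffer:
        sims = {k: float(np.dot(b_upd, m) / (np.linalg.norm(b_upd) * np.linalg.norm(m) + 1e-12))
                for k, m in cluster_models.items()}
        client_cluster[b_cid] = max(sims, key=sims.get)

    # Staleness-weighted averaging of buffered updates within each cluster, then flush the buffer.
    for k in cluster_models:
        members = [(u, t - sent) for c, u, sent in buffer if client_cluster[c] == k]
        if members:
            w = np.array([staleness_weight(s) for _, s in members])
            U = np.stack([u for u, _ in members])
            cluster_models[k] += (w[:, None] * U).sum(axis=0) / w.sum()
    buffer.clear()

print({k: np.round(m, 2) for k, m in cluster_models.items()})

The point of the sketch is only the control flow: updates are consumed as they arrive, clustering decisions are deferred until a small buffer fills, and stale updates are discounted rather than discarded.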

Research Area(s)

  • asynchronous federated learning, clustered federated learning, sparse training

Citation Format(s)

CASA: Clustered Federated Learning with Asynchronous Clients. / Liu, Boyi; Ma, Yiming; Zhou, Zimu et al.
KDD '24 - Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. New York, NY: Association for Computing Machinery, 2024. p. 1851-1862 (Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining).
