A Framework of Large-Scale Peer-to-Peer Learning System

Yongkang Luo, Peiyi Han*, Wenjian Luo, Shaocong Xue, Kesheng Chen, Linqi Song

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

1 Citation (Scopus)

Abstract

Federated learning (FL) is a distributed machine learning paradigm in which numerous clients train a model dispatched by a central server while retaining their training data locally. Nonetheless, a failure of the central server can disrupt the entire training framework. Peer-to-peer approaches enhance the robustness of the system, as all clients interact directly with other clients without a server. However, a downside of these peer-to-peer approaches is their low efficiency: communication among a large number of clients is highly costly, and a synchronous learning framework becomes unworkable in the presence of stragglers. In this paper, we propose a semi-asynchronous peer-to-peer learning system (P2PLSys) suitable for large-scale deployments. This system features a server that manages all clients but does not participate in model aggregation. The server distributes a partial client list to selected clients that have completed local training for local model aggregation. Subsequently, clients adjust their own models based on staleness and communicate through a secure multi-party computation protocol for secure aggregation. Through our experiments, we demonstrate the effectiveness of P2PLSys on image classification problems, achieving a performance level similar to classical FL algorithms and centralized training. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
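To make the staleness-adjusted aggregation step concrete, the sketch below shows one plausible way a client might discount peer updates by how many rounds old they are before averaging. The weighting function, parameter names, and data layout are illustrative assumptions for exposition; the abstract does not specify the paper's actual formulas, and the secure multi-party computation layer is omitted entirely.

```python
# Hypothetical sketch of staleness-weighted model aggregation in a
# semi-asynchronous peer-to-peer setting. All names and the weighting
# function are assumptions, not the paper's actual method.

def staleness_weight(staleness: int, alpha: float = 0.5) -> float:
    """Down-weight a peer update that is `staleness` rounds old."""
    return (1.0 + staleness) ** -alpha

def aggregate(local_model: dict, peer_models: list) -> dict:
    """Average the local model with peer models, discounted by staleness.

    local_model: dict mapping parameter name -> value
    peer_models: list of (model_dict, staleness) pairs received from the
                 partial client list handed out by the server
    """
    # The fresh local model gets full weight; stale peers get less.
    weights = [1.0] + [staleness_weight(s) for _, s in peer_models]
    models = [local_model] + [m for m, _ in peer_models]
    total = sum(weights)
    return {
        name: sum(w * m[name] for w, m in zip(weights, models)) / total
        for name in local_model
    }
```

For instance, a peer update with staleness 0 is weighted equally with the local model, so `aggregate({"w": 1.0}, [({"w": 3.0}, 0)])` yields a plain average of 2.0, while stale peers pull the average less strongly.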
Original language: English
Title of host publication: Neural Information Processing
Subtitle of host publication: 30th International Conference, ICONIP 2023, Changsha, China, November 20–23, 2023, Proceedings, Part II
Editors: Biao Luo, Long Cheng, Zheng-Guang Wu, Hongyi Li, Chaojie Li
Publisher: Springer
Pages: 27-41
ISBN (Electronic): 978-981-99-8082-6
ISBN (Print): 978-981-99-8081-9
DOIs
Publication status: Published - 2024
Event: 30th International Conference on Neural Information Processing (ICONIP 2023) - Changsha, China
Duration: 20 Nov 2023 – 23 Nov 2023
http://iconip2023.org/

Publication series

Name: Lecture Notes in Computer Science
Volume: 14448
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 30th International Conference on Neural Information Processing (ICONIP 2023)
Abbreviated title: ICONIP2023
Place: China
City: Changsha
Period: 20/11/23 – 23/11/23
Internet address: http://iconip2023.org/

Research Keywords

  • Federated learning
  • Peer-to-peer learning system
  • Semi-asynchronous learning
