Clustered Federated Multi-Task Learning with Non-IID Data
Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review
Author(s)
Related Research Unit(s)
Detail(s)
Original language | English |
---|---|
Title of host publication | Proceedings - 2021 IEEE 27th International Conference on Parallel and Distributed Systems |
Subtitle of host publication | ICPADS 2021 |
Publisher | Institute of Electrical and Electronics Engineers, Inc. |
Pages | 50-57 |
ISBN (electronic) | 978-1-6654-0878-3 |
ISBN (print) | 978-1-6654-0879-0 |
Publication status | Published - Dec 2021 |
Publication series
Name | Proceedings of the International Conference on Parallel and Distributed Systems - ICPADS |
---|---|
ISSN (Print) | 1521-9097 |
ISSN (electronic) | 2690-5965 |
Conference
Title | 27th IEEE International Conference on Parallel and Distributed Systems (ICPADS 2021) |
---|---|
Location | Jiuhua International Convention and Exhibition Center Hotel |
Place | China |
City | Beijing |
Period | 14 - 16 December 2021 |
Link(s)
Abstract
Federated learning enables collaborative learning across clients while keeping each client's data local for privacy. The presence of non-IID data is one of the major challenges in federated learning. To address this statistical challenge, federated multi-task learning treats the local training of each client as a separate task. However, all clients must participate in every training round, which is impractical for mobile or IoT devices with constrained communication capability. To achieve communication efficiency and high accuracy with non-IID data, we propose clustered federated multi-task learning, which combines client clustering and multi-task learning. We measure the similarity of clients' local data indirectly through their model parameters and design a client clustering strategy that places clients with similar data distributions in the same group. Training models for groups instead of individual clients removes the requirement of full participation. Convergence analysis and experimental evaluation on real-world datasets show that our work outperforms basic federated learning in accuracy and is more communication-efficient than existing federated multi-task learning.
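The abstract describes grouping clients by the similarity of their model parameters (as a proxy for data distribution) and then training one model per group. The sketch below illustrates that general idea only; the cosine-style similarity, k-means clustering, and per-group averaging shown here are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical sketch of clustering clients by local-model similarity and
# aggregating per group. Similarity measure (cosine via normalized vectors),
# clustering method (k-means), and group aggregation (plain averaging) are
# assumed for illustration and may differ from the paper's method.
import numpy as np
from sklearn.cluster import KMeans


def flatten_params(model_params):
    """Concatenate a client's parameter tensors into a single vector."""
    return np.concatenate([p.ravel() for p in model_params])


def cluster_clients(client_params, num_groups):
    """Assign each client to a group based on its local model parameters,
    used here as an indirect signal of the client's data distribution."""
    vectors = np.stack([flatten_params(p) for p in client_params])
    # Normalize so Euclidean k-means approximates cosine similarity.
    vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)
    return KMeans(n_clusters=num_groups, n_init=10).fit_predict(vectors)


def group_average(client_params, groups, num_groups):
    """Average parameters within each group, so each group maintains its own
    model and not every client needs to be synchronized each round."""
    group_models = []
    for g in range(num_groups):
        members = [client_params[i] for i, gi in enumerate(groups) if gi == g]
        group_models.append([np.mean(layer, axis=0) for layer in zip(*members)])
    return group_models
```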
Research Area(s)
- clustering, Federated learning, multi-task learning, non-IID data
Citation Format(s)
Clustered Federated Multi-Task Learning with Non-IID Data. / Xiao, Yao; Shu, Jiangang; Jia, Xiaohua et al.
Proceedings - 2021 IEEE 27th International Conference on Parallel and Distributed Systems: ICPADS 2021. Institute of Electrical and Electronics Engineers, Inc., 2021. p. 50-57 (Proceedings of the International Conference on Parallel and Distributed Systems - ICPADS).