Communication Efficient Federated Learning With Heterogeneous Structured Client Models

Research output: Journal Publications and Reviews · Publication in refereed journal · peer-review




Original language: English
Pages (from-to): 753-767
Number of pages: 15
Journal / Publication: IEEE Transactions on Emerging Topics in Computational Intelligence
Issue number: 3
Online published: 5 Oct 2022
Publication status: Published - Jun 2023


Federated learning (FL) has recently attracted much attention due to its superior performance in privacy protection when processing data from different terminals. However, homogeneous deep learning models are pervasively adopted without considering the differences among the data held by distinct clients, resulting in low learning performance and high communication costs. This paper thus proposes a novel FL framework with heterogeneous structured client models for handling different data scales and investigates its superiority over canonical FL with homogeneous models. Additionally, singular value decomposition (SVD) is applied to the client models to reduce the amount of transmitted data, i.e., the communication costs. An aggregation mechanism for multiple models on the central server is then presented, based on the heterogeneous characteristics of the uploaded parameters and models. The proposed framework is applied to four benchmark classification datasets and a trend-following task on electromagnetic radiation intensity time series data. Experimental results demonstrate that the proposed method can effectively improve the accuracy of local learning models and significantly reduce communication costs. © 2022 IEEE.
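The idea of using SVD to cut upload size, as described in the abstract, can be sketched as follows. This is an illustrative NumPy sketch under assumed details, not the paper's actual implementation: the matrix size, the truncation rank, and the function names are all hypothetical.

```python
import numpy as np

def svd_compress(weight, rank):
    """Truncated SVD: keep the top-`rank` singular components of a weight matrix."""
    U, s, Vt = np.linalg.svd(weight, full_matrices=False)
    # The client uploads only U_k (m x k), s_k (k,), Vt_k (k x n)
    # instead of the full m x n matrix.
    return U[:, :rank], s[:rank], Vt[:rank, :]

def svd_decompress(U_k, s_k, Vt_k):
    """Server side: rebuild an approximate weight matrix from the uploaded factors."""
    return (U_k * s_k) @ Vt_k

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))       # a hypothetical client layer weight
U_k, s_k, Vt_k = svd_compress(W, rank=16)
W_hat = svd_decompress(U_k, s_k, Vt_k)

full_params = W.size                              # 256 * 128 = 32768
sent_params = U_k.size + s_k.size + Vt_k.size     # 16 * (256 + 128 + 1) = 6160
print(f"compression ratio: {full_params / sent_params:.2f}x")
```

With rank 16 on a 256 x 128 matrix, the upload shrinks by roughly a factor of five; the rank controls the trade-off between communication cost and reconstruction accuracy.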

Research Area(s)

  • Servers, Costs, Matrix decomposition, Training, Data models, Optimization, Data privacy, Federated learning, heterogeneous structured model, neural network, singular value decomposition, FACTORIZATION, SYSTEMS