TY - JOUR
T1 - IoTSL: Towards Efficient Distributed Learning for Resource-constrained Internet of Things
T2 - IEEE Internet of Things Journal
AU - Feng, Xingyu
AU - Luo, Chengwen
AU - Chen, Jiongzhang
AU - Huang, Yijing
AU - Zhang, Jin
AU - Xu, Weitao
AU - Li, Jianqiang
AU - Leung, Victor C.M.
PY - 2023/6/1
Y1 - 2023/6/1
N2 - Recently proposed Split Learning (SL) is a promising distributed machine learning paradigm that enables machine learning without accessing the raw data of the clients. SL can be viewed as a specific type of serial federated learning. However, deploying SL on resource-constrained IoT devices still faces limitations, including high communication costs and catastrophic forgetting caused by the imbalanced data distribution across devices. In this paper, we design and implement IoTSL, an efficient distributed learning framework for cloud-edge collaboration in IoT systems. IoTSL combines generative adversarial networks (GANs) and differential privacy techniques to train generators on the local data of participating devices and to generate data with privacy protection. On the one hand, IoTSL pre-trains the global model on the generated data and then fine-tunes it on the local data to lower the communication cost. On the other hand, the generated data is used to impute the missing classes on devices, alleviating the commonly observed catastrophic forgetting phenomenon. We evaluate the proposed framework on three common datasets. Extensive experimental results show that, compared to conventional SL, IoTSL significantly reduces communication costs and effectively alleviates catastrophic forgetting. © 2023 IEEE.
KW - Cloud computing
KW - Computational modeling
KW - Costs
KW - Data models
KW - GANs
KW - Internet of Things
KW - Privacy Protection
KW - Resource-constrained IoT Devices
KW - Servers
KW - Split Learning
KW - Training
UR - http://www.scopus.com/inward/record.url?scp=85147281527&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2023.3235765
DO - 10.1109/JIOT.2023.3235765
M3 - RGC 21 - Publication in refereed journal
SN - 2327-4662
VL - 10
SP - 9892
EP - 9905
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 11
ER -