TY - JOUR
T1 - Toward Knowledge as a Service over Networks
T2 - A Deep Learning Model Communication Paradigm
AU - Chen, Ziqian
AU - Duan, Ling-Yu
AU - Wang, Shiqi
AU - Lou, Yihang
AU - Huang, Tiejun
AU - Wu, Dapeng Oliver
AU - Gao, Wen
PY - 2019/6
Y1 - 2019/6
N2 - The advent of artificial intelligence and the Internet of Things has driven a seamless transition from big data to big knowledge. Deep learning models, which assimilate knowledge from large-scale data, can be regarded as an alternative and promising modality of knowledge for artificial intelligence services. However, the compression, storage, and communication of deep learning models for better knowledge services, especially over networks, pose a set of challenging problems in both industry and academia. This paper presents a deep learning model communication paradigm based on multiple-model compression, which exploits the redundancy among multiple deep learning models deployed in different application scenarios. We analyze the potential of this compression strategy and demonstrate its promise for deep learning model communication through a set of experiments. Moreover, interoperability in deep learning model communication, enabled by the standardization of compact deep learning model representations, is also discussed and envisioned.
KW - Deep learning
KW - deep learning model communication
KW - knowledge-centric network
KW - neural network compression
UR - http://www.scopus.com/inward/record.url?scp=85065913555&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85065913555&origin=recordpage
U2 - 10.1109/JSAC.2019.2904360
DO - 10.1109/JSAC.2019.2904360
M3 - RGC 21 - Publication in refereed journal
SN - 0733-8716
VL - 37
SP - 1349
EP - 1363
JO - IEEE Journal on Selected Areas in Communications
JF - IEEE Journal on Selected Areas in Communications
IS - 6
ER -