AoI-Aware Inference Services in Edge Computing via Digital Twin Network Slicing
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Detail(s)
Original language | English |
---|---|
Journal / Publication | IEEE Transactions on Services Computing |
Publication status | Online published - 2 Aug 2024 |
Abstract
The advance of Digital Twin (DT) technology promises seamless cyber-physical integration under the Industry 4.0 initiative. Through continuous synchronization with their physical objects, DTs can power inference service models for the analysis, emulation, optimization, and prediction of those objects. With the proliferation of DTs, Digital Twin Network (DTN) slicing is emerging as a new paradigm through which service providers deliver differentiated quality of service: each DTN is a virtual network consisting of a set of inference service models whose source data comes from a group of DTs, and these models offer users different levels of service quality. Mobile Edge Computing (MEC), a computing paradigm that shifts computing power toward the edge of core networks, is well suited to such delay-sensitive inference services. In this paper, we consider Age of Information (AoI)-aware inference service provisioning in an MEC network through DTN slicing requests, where the accuracy of the inference services provided by each DTN slice is determined by the Expected Age of Information (EAoI) of its inference model. Specifically, we first introduce a novel AoI-aware inference service framework for DTN slicing requests. We then formulate the expected cost minimization problem of jointly placing DT and inference service model instances, and develop efficient algorithms for the problem based on the proposed framework. We also consider dynamic DTN slicing request admissions, where requests arrive one by one without knowledge of future arrivals; for this setting we devise an online algorithm with a provable competitive ratio, assuming that the DTs of all objects have already been placed. Finally, we evaluate the performance of the proposed algorithms through simulations. Simulation results demonstrate that the proposed algorithms are promising. © 2024 IEEE.
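For readers unfamiliar with the AoI metric that drives the abstract, the following minimal sketch illustrates the standard definition of Age of Information: at time t, the age is t − u(t), where u(t) is the generation time of the freshest update received so far. This sketch is not taken from the paper; the function name, the fine sampling grid, and the assumption of instantaneous update delivery are illustrative only.

```python
def average_aoi(update_times, horizon, step=0.01):
    """Time-average Age of Information over [0, horizon].

    update_times: sorted generation times of synchronization updates
    (delivery is assumed instantaneous here, an illustrative simplification).
    The age grows linearly between updates and resets when one arrives,
    producing the familiar sawtooth curve; we average it on a fine grid.
    """
    total, t, latest, idx, samples = 0.0, 0.0, 0.0, 0, 0
    while t <= horizon:
        # Advance to the freshest update generated by time t.
        while idx < len(update_times) and update_times[idx] <= t:
            latest = update_times[idx]
            idx += 1
        total += t - latest  # instantaneous age at time t
        samples += 1
        t += step
    return total / samples

# More frequent synchronization yields a lower average AoI (dense < sparse).
sparse = average_aoi([0.0, 5.0], 10.0)
dense = average_aoi([0.0, 2.0, 4.0, 6.0, 8.0], 10.0)
```

The sawtooth structure is why the paper's EAoI ties inference accuracy to synchronization frequency: a DT whose physical object syncs rarely serves stale state, and any inference model fed by it degrades accordingly.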
Research Area(s)
- Computational modeling, cost modeling, Costs, Data models, Digital twin network slicing, DT and inference service instance placements, Heuristic algorithms, Inference algorithms, inference service models, mobile edge computing, Network slicing, Resource management, the expected age of information (EAoI)
Citation Format(s)
AoI-Aware Inference Services in Edge Computing via Digital Twin Network Slicing. / Zhang, Yuncan; Liang, Weifa; Xu, Zichuan et al.
In: IEEE Transactions on Services Computing, 02.08.2024.