
Deep learning-based energy-efficient computational offloading strategy in heterogeneous fog computing networks.

Authors :
Sarkar, Indranil
Kumar, Sanjay
Source :
Journal of Supercomputing; Sep 2022, Vol. 78, Issue 13, p15089-15106, 18p
Publication Year :
2022

Abstract

In the era of the Internet of Things (IoT), the volume of data is increasing immensely, causing rapid growth in network data communication and data congestion. Computational offloading thus becomes a crucial and imperative action for delay-sensitive task completion and data processing on resource-constrained end-user devices. Nowadays, fog computing, as a complement to cloud computing, has emerged as a well-known concept for enhancing data processing capability as well as energy conservation in low-powered networks. In this paper, we consider a heterogeneous fog-cloud network architecture in which data processing is performed on a local or remote computing device according to a binary offloading policy. Based on the proposed system model, we calculate the total delay and energy consumption of data processing throughout the network and formulate a mixed-integer optimization problem to jointly optimize the offloading decision and bandwidth allocation. To solve this NP-hard problem, we propose a deep-learning-based binary offloading strategy that employs multiple parallel deep neural networks (DNNs) to make offloading decisions. These offloading decisions are subsequently placed in a replay memory to train and test all DNNs. Simulation results show near-optimal performance of the proposed offloading strategy while remarkably maintaining quality of service by decreasing overall delay and energy consumption. [ABSTRACT FROM AUTHOR]
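The decision step the abstract describes (generate several candidate binary offloading decisions, evaluate each by its delay-plus-energy cost, and store the best pair in a replay memory for later DNN training) can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's actual formulation: the quantization rule, the additive cost model, and all numeric parameters are assumptions made up for the example, and the `relaxed` vector stands in for a DNN's sigmoid outputs.

```python
def quantize(relaxed, k):
    """Generate up to k candidate binary decisions from a relaxed
    (probabilistic) decision vector: start from elementwise rounding,
    then flip the entries closest to 0.5 (most uncertain), one at a time.
    (Illustrative order-preserving quantization; an assumption, not the
    paper's exact rule.)"""
    n = len(relaxed)
    base = [1 if p > 0.5 else 0 for p in relaxed]
    order = sorted(range(n), key=lambda i: abs(relaxed[i] - 0.5))
    candidates = [base[:]]
    for j in range(min(k - 1, n)):
        cand = base[:]
        cand[order[j]] ^= 1  # flip the (j+1)-th most uncertain bit
        candidates.append(cand)
    return candidates

def cost(decision, d_local, d_remote, e_local, e_remote):
    """Hypothetical combined delay+energy cost of a binary decision
    (0 = process the task locally, 1 = offload to the fog/cloud)."""
    delay = sum(dr if x else dl for x, dl, dr in zip(decision, d_local, d_remote))
    energy = sum(er if x else el for x, el, er in zip(decision, e_local, e_remote))
    return delay + energy

# Made-up per-task delay/energy figures for local vs. remote execution.
d_local, d_remote = [4.0, 1.0, 3.0], [2.0, 2.5, 1.5]
e_local, e_remote = [3.0, 0.5, 2.0], [1.0, 1.5, 1.0]

relaxed = [0.7, 0.4, 0.55]  # stand-in for one DNN's sigmoid outputs
candidates = quantize(relaxed, k=4)
best = min(candidates, key=lambda c: cost(c, d_local, d_remote, e_local, e_remote))

replay_memory = []           # (input state, best decision) pairs for retraining
replay_memory.append((relaxed, best))
print(best)  # → [1, 0, 1]: offload tasks 0 and 2, run task 1 locally
```

In the paper's scheme, each of the parallel DNNs would produce its own relaxed vector, and sampled (state, best decision) pairs from the replay memory would serve as labels when the DNNs are periodically retrained.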

Details

Language :
English
ISSN :
0920-8542
Volume :
78
Issue :
13
Database :
Complementary Index
Journal :
Journal of Supercomputing
Publication Type :
Academic Journal
Accession number :
158432410
Full Text :
https://doi.org/10.1007/s11227-022-04461-z