
Online optimization for low-latency computational caching in Fog networks

Authors:
Walid Saad
Mehdi Bennis
Gilsoo Lee
Source:
FWC
Publication Year:
2017
Publisher:
IEEE, 2017.

Abstract

Enabling effective computation for emerging applications such as augmented reality or virtual reality via fog computing requires processing data with low latency. In this paper, a novel computational caching framework is proposed to minimize fog latency by storing and reusing intermediate computation results (IRs). Under this paradigm, a fog node can store IRs from previous computations and can also download IRs from neighboring nodes at the expense of additional transmission latency. However, due to the unpredictable arrival of future computation operations and the limited memory size of the fog node, it is challenging to properly maintain the set of stored IRs. Thus, under uncertainty about future computations, the goal of the proposed framework is to minimize the sum of the transmission and computational latency by selecting the IRs to be downloaded and stored. To solve this problem, an online computational caching algorithm is developed that enables the fog node to schedule, download, and manage IRs to compute arriving operations. Competitive analysis is used to derive an upper bound on the competitive ratio of the online algorithm. Simulation results show that the total latency can be reduced by up to 26.8% by leveraging the computational caching method when compared to the case without computational caching.
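The abstract describes the trade-off only at a high level; the minimal Python sketch below illustrates the idea it outlines: reusing a locally stored IR adds no latency, downloading an IR from a neighbor adds transmission latency, and computing from scratch adds computational latency, all under a limited cache. The class, the operation format, and the FIFO-style eviction rule are illustrative assumptions, not the algorithm from the paper.

```python
# Hypothetical sketch of the computational caching idea from the abstract.
# All names, data structures, and the eviction policy are assumptions,
# not the paper's online algorithm.

class FogNode:
    def __init__(self, memory_size, neighbor_latency):
        self.memory_size = memory_size          # max number of IRs the node can store
        self.cache = {}                         # ir_id -> stored intermediate result (IR)
        self.neighbor_latency = neighbor_latency  # ir_id -> latency to download from a neighbor

    def latency_for(self, operation):
        """Latency incurred to serve one arriving operation."""
        ir_id = operation["ir_id"]
        if ir_id in self.cache:
            return 0.0                                      # reuse a stored IR: no extra latency
        if ir_id in self.neighbor_latency:
            latency = self.neighbor_latency[ir_id]          # download the IR: transmission latency
        else:
            latency = operation["compute_latency"]          # compute from scratch: computational latency
        self._store(ir_id)
        return latency

    def _store(self, ir_id):
        # Simple FIFO eviction stands in for the paper's IR-management
        # policy under limited memory (details not given in the abstract).
        if len(self.cache) >= self.memory_size:
            oldest = next(iter(self.cache))
            del self.cache[oldest]
        self.cache[ir_id] = True


# Example: total latency over a stream of operations arriving online.
node = FogNode(memory_size=2, neighbor_latency={"irB": 1.5})
operations = [
    {"ir_id": "irA", "compute_latency": 4.0},
    {"ir_id": "irB", "compute_latency": 3.0},
    {"ir_id": "irA", "compute_latency": 4.0},   # cache hit: the stored IR is reused
]
print(sum(node.latency_for(op) for op in operations))   # 4.0 + 1.5 + 0.0 = 5.5
```

The paper's online algorithm instead decides which IRs to download and retain so as to minimize the sum of transmission and computational latency under uncertainty about future arrivals, with a competitive-ratio guarantee; the sketch above only fixes the cost model that decision operates on.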

Details

Database:
OpenAIRE
Journal:
2017 IEEE Fog World Congress (FWC)
Accession number:
edsair.doi.dedup.....972dc639d518a76c7727728dceeed274