
QoE-Driven Edge Caching in Vehicle Networks Based on Deep Reinforcement Learning.

Authors :
Song, Chunhe
Xu, Wenxiang
Wu, Tingting
Yu, Shimao
Zeng, Peng
Zhang, Ning
Source :
IEEE Transactions on Vehicular Technology; Jun 2021, Vol. 70 Issue 6, p5286-5295, 10p
Publication Year :
2021

Abstract

The Internet of vehicles (IoV) is a large information interaction network that collects information on vehicles, roads, and pedestrians. One important use of vehicle networks is to meet the entertainment needs of driving users through communication between vehicles and roadside units (RSUs). Because the storage space of RSUs is limited, determining which content to cache in each RSU is a key challenge. With the development of 5G and video editing technology, short video systems have become increasingly popular. Widely used cache update methods, such as partial file precaching and selection based on content popularity or user interest, are inefficient for such systems. To solve this problem, this paper proposes a QoE-driven edge caching method for the IoV based on deep reinforcement learning. First, a class-based user interest model is established; compared with traditional cache update methods based on file popularity or user interest distributions, the proposed method is better suited to systems with a large number of small files. Second, a quality of experience (QoE)-driven RSU cache model is established on top of the class-based user interest model. Third, a deep reinforcement learning method is designed to address the QoE-driven RSU cache update problem effectively. Experimental results verify the effectiveness of the proposed algorithm.
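To make the abstract's pipeline concrete, the following is a minimal, hypothetical sketch of the idea: an RSU with limited cache capacity learns which content *classes* to keep so that a crude QoE proxy is maximized. Tabular Q-learning stands in for the paper's deep RL agent, and the class count, capacity, popularity distribution, and reward are all illustrative assumptions, not the authors' actual model.

```python
import random
from collections import defaultdict

# Illustrative sketch only: tabular Q-learning (a stand-in for the
# paper's deep RL agent) decides which cached content class an RSU
# evicts on a cache miss. Reward is a crude QoE proxy: the expected
# hit probability of the resulting cache contents.

NUM_CLASSES = 5          # content classes in the class-based interest model
CACHE_SIZE = 2           # the RSU can hold two classes at a time
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

# Skewed class popularity: classes 0 and 1 are requested most often.
POPULARITY = [0.4, 0.3, 0.15, 0.1, 0.05]

def sample_request(rng):
    """Draw one content-class request from the popularity distribution."""
    return rng.choices(range(NUM_CLASSES), weights=POPULARITY)[0]

def run(steps=3000, seed=0):
    rng = random.Random(seed)
    q = defaultdict(float)               # Q[(cache_state, evict_slot)]
    cache = tuple(range(CACHE_SIZE))     # start by caching classes 0..1
    hits = 0
    for _ in range(steps):
        req = sample_request(rng)
        if req in cache:
            hits += 1
            continue
        # Cache miss: pick which slot to evict (epsilon-greedy).
        if rng.random() < EPS:
            slot = rng.randrange(CACHE_SIZE)
        else:
            slot = max(range(CACHE_SIZE), key=lambda s: q[(cache, s)])
        new_cache = tuple(sorted(cache[:slot] + cache[slot + 1:] + (req,)))
        # QoE proxy reward: expected hit probability of the new cache.
        reward = sum(POPULARITY[c] for c in new_cache)
        best_next = max(q[(new_cache, s)] for s in range(CACHE_SIZE))
        q[(cache, slot)] += ALPHA * (reward + GAMMA * best_next
                                     - q[(cache, slot)])
        cache = new_cache
    return cache, hits / steps

final_cache, hit_rate = run()
print(final_cache, round(hit_rate, 3))
```

In the paper this tabular update is replaced by a deep network so the agent can generalize over the much larger state space of real cache configurations; the loop structure (observe requests, act on misses, reward by QoE) is the same shape.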

Details

Language :
English
ISSN :
0018-9545
Volume :
70
Issue :
6
Database :
Complementary Index
Journal :
IEEE Transactions on Vehicular Technology
Publication Type :
Academic Journal
Accession number :
152969366
Full Text :
https://doi.org/10.1109/TVT.2021.3077072