With the growth of users in the Internet of Vehicles (IoV), heterogeneous user demands are also increasing. A central challenge in the development of Vehicular Edge Computing (VEC) is how to satisfy heterogeneous task requirements under dynamically changing channel conditions. This paper proposes an efficient demand-aware terminal collaboration scheme, based on spectrum-sharing techniques and Deep Reinforcement Learning (DRL), to dynamically satisfy the demands of heterogeneous tasks. Specifically, the delay and energy consumption of two types of tasks are modeled, and a multi-objective optimization problem is formulated. A heuristic algorithm is then proposed to obtain a suboptimal solution for the optimization variables. Furthermore, a DRL algorithm is used to allocate resources dynamically according to the channel state. Finally, the scheme's effectiveness is verified through extensive simulation experiments.
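To illustrate the kind of channel-aware resource allocation the abstract describes, the following is a minimal, hypothetical sketch (not the authors' actual algorithm or system model): tabular Q-learning that splits shared spectrum between a delay-sensitive task and an energy-sensitive task based on a discretized channel state. The reward function, state space, and allocation choices below are illustrative assumptions only.

```python
import random

random.seed(0)

N_STATES = 3                 # discretized channel quality: 0 = poor, 1 = fair, 2 = good
ACTIONS = [0.25, 0.5, 0.75]  # fraction of bandwidth given to the delay-sensitive task

def reward(state, alloc):
    """Toy multi-objective reward (assumed, not from the paper):
    negative sum of task-1 delay and task-2 energy-like cost.
    Better channels reduce both; the two tasks compete for bandwidth."""
    delay = 1.0 / ((state + 1) * alloc)          # delay-sensitive task on its share
    energy = 1.0 / ((state + 1) * (1 - alloc))   # energy cost on the leftover share
    return -(delay + energy)

# Q-table: one row per channel state, one column per allocation action
Q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
alpha, gamma, eps = 0.1, 0.9, 0.1

state = 0
for _ in range(20000):
    # epsilon-greedy action selection
    if random.random() < eps:
        a = random.randrange(len(ACTIONS))
    else:
        a = max(range(len(ACTIONS)), key=lambda i: Q[state][i])
    r = reward(state, ACTIONS[a])
    next_state = (state + 1) % N_STATES          # toy channel evolution
    # standard Q-learning update
    Q[state][a] += alpha * (r + gamma * max(Q[next_state]) - Q[state][a])
    state = next_state

# learned allocation per channel state
policy = [ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[s][i])]
          for s in range(N_STATES)]
```

Under this symmetric toy reward the learned policy settles on an even split (0.5) in every channel state; with an asymmetric reward (e.g., weighting delay more heavily) the agent would shift bandwidth toward the delay-sensitive task, which is the kind of demand-aware behavior the scheme targets.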
DRL-Based Terminal Collaboration in Vehicular Edge Computing
Sijun Wu; Liang Yang, Hunan University