Journal of Beijing University of Posts and Telecommunications


Journal of Beijing University of Posts and Telecommunications, 2019, Vol. 42, Issue (6): 64-69, 104. doi: 10.13190/j.jbupt.2019-155


Tasks Offloading and Resource Scheduling Algorithm Based on Deep Reinforcement Learning in MEC

XUE Ning1, HUO Ru1,2, ZENG Shi-qing3, WANG Shuo2,3, HUANG Tao2,3   

  1. Beijing Advanced Innovation Center for Future Internet Technology, Beijing University of Technology, Beijing 100124, China;
    2. Purple Mountain Laboratories, Nanjing 211111, China;
    3. State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received: 2019-07-11  Online: 2019-12-28  Published: 2019-11-15

Abstract: To improve task offloading efficiency in multi-access edge computing (MEC), a joint optimization model for task offloading and heterogeneous resource scheduling is proposed. The model accounts for heterogeneous communication and computing resources and jointly minimizes the energy consumption of user equipment, the task execution delay, and the payment for offloading. A deep reinforcement learning method is adopted to obtain the optimal offloading strategy. Simulations show that, compared with the Banker's algorithm, the proposed algorithm improves the composite index of device energy consumption, delay, and payment by 27.6%.
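
The following is a minimal sketch, not the authors' method: it replaces the paper's deep reinforcement learning agent with a tabular Q-learning stand-in to illustrate how an offloading decision (local execution vs. offloading to the MEC server) can be learned against a weighted cost of energy, delay, and payment. All environment parameters (task sizes, CPU and uplink rates, prices, objective weights) are illustrative assumptions, not values from the paper.

import numpy as np

# Discretized state: (task-size level, channel-quality level); both assumed.
TASK_LEVELS = 4       # task size buckets
CHANNEL_LEVELS = 3    # channel quality buckets
ACTIONS = 2           # 0 = execute locally, 1 = offload to the MEC server

# Weights of the joint objective (energy, delay, payment); assumed values.
W_ENERGY, W_DELAY, W_PAY = 0.4, 0.4, 0.2

def cost(task_level, channel_level, action):
    """Weighted cost of one task; a toy stand-in for the joint objective."""
    size = (task_level + 1) * 0.5          # task size in Mbit, assumed
    if action == 0:                        # local execution
        delay = size / 1.0                 # local processing rate, assumed
        energy = 0.8 * size                # local energy per Mbit, assumed
        pay = 0.0                          # no payment for local execution
    else:                                  # offload: transmit + edge execution
        rate = 0.5 + 0.5 * channel_level   # uplink rate, assumed
        delay = size / rate + size / 4.0   # transmission + edge computing delay
        energy = 0.3 * size / rate         # transmission energy, assumed
        pay = 0.2 * size                   # payment to the MEC operator, assumed
    return W_ENERGY * energy + W_DELAY * delay + W_PAY * pay

# Q-table over (task level, channel level, action)
Q = np.zeros((TASK_LEVELS, CHANNEL_LEVELS, ACTIONS))
alpha, gamma, eps = 0.1, 0.9, 0.1          # learning rate, discount, exploration

rng = np.random.default_rng(0)
state = (rng.integers(TASK_LEVELS), rng.integers(CHANNEL_LEVELS))
for _ in range(20000):
    t, c = state
    # epsilon-greedy action selection
    a = rng.integers(ACTIONS) if rng.random() < eps else int(np.argmax(Q[t, c]))
    reward = -cost(t, c, a)                # minimizing cost = maximizing reward
    next_state = (rng.integers(TASK_LEVELS), rng.integers(CHANNEL_LEVELS))
    nt, nc = next_state
    # Q-learning update
    Q[t, c, a] += alpha * (reward + gamma * np.max(Q[nt, nc]) - Q[t, c, a])
    state = next_state

# Learned policy: offloading decision for each (task size, channel quality) pair
print(np.argmax(Q, axis=-1))

In the paper's setting, the discrete Q-table would be replaced by a deep neural network so that the agent can handle the much larger joint state space of heterogeneous communication and computing resources.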

Key words: multi-access edge computing, task offloading, heterogeneous resource scheduling, deep reinforcement learning
