Journal of Beijing University of Posts and Telecommunications ›› 2020, Vol. 43 ›› Issue (2): 87-93. doi: 10.13190/j.jbupt.2019-103

• PAPERS •

An Integrated Energy Service Channel Optimization Mechanism Based on Deep Reinforcement Learning

MA Qing-liu1, YU Peng1, WU Jia-hui1, XIONG Ao1, YAN Yong2   

  1. State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China;
    2. State Grid Zhejiang Electric Power Company Limited, Hangzhou 310007, China
  • Received: 2019-05-31; Published: 2020-04-28

Abstract: To ensure the stable operation of the integrated energy system, integrated energy services must be carried by the communication network with high reliability and low risk. Based on the channel requirements of these services, a deep reinforcement learning algorithm is proposed to find globally optimal paths for large-scale integrated energy services over the underlying power communication network. The method takes the overall delay and the network load balance as its optimization objectives, trains on the network topology and saves the resulting model, and then obtains the optimal result through iterative learning. Simulation results show that the routes found by this method keep the overall delay low while maintaining a balanced load across the network. Moreover, for scenarios with a large network and a large number of services, the deep reinforcement learning algorithm effectively improves computational efficiency.
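
To make the approach concrete, the sketch below uses tabular Q-learning as a simplified stand-in for the deep reinforcement learning agent described above: an agent repeatedly selects next hops between a source and a destination, with a reward that trades off link delay against a penalty on already-used links, mirroring the joint delay and load-balance objective. The topology, delays, and hyperparameters are hypothetical and purely illustrative; they are not taken from the paper.

```python
# Minimal sketch, NOT the paper's implementation: tabular Q-learning stands in
# for the deep RL agent, on a hypothetical 6-node topology. The reward combines
# negative link delay with a penalty on frequently used links, mirroring the
# joint delay / load-balance objective described in the abstract.
import random

# Hypothetical topology: adjacency list of (neighbor, delay_ms) edges.
TOPOLOGY = {
    0: [(1, 2.0), (2, 5.0)],
    1: [(0, 2.0), (3, 3.0), (4, 6.0)],
    2: [(0, 5.0), (4, 2.0)],
    3: [(1, 3.0), (5, 4.0)],
    4: [(1, 6.0), (2, 2.0), (5, 1.0)],
    5: [(3, 4.0), (4, 1.0)],
}
SRC, DST = 0, 5
ALPHA, GAMMA, EPS, EPISODES = 0.1, 0.9, 0.2, 2000
LOAD_WEIGHT = 1.0            # weight of the load-balance penalty

# Q[node][neighbor] -> expected return of forwarding to that neighbor.
Q = {u: {v: 0.0 for v, _ in nbrs} for u, nbrs in TOPOLOGY.items()}
link_load = {}               # running count of how often each link was used

def reward(u, v, delay):
    """Negative delay minus a penalty proportional to how often (u, v) was used,
    normalized by the episode budget so it stays comparable to the delay term."""
    return -delay - LOAD_WEIGHT * link_load.get((u, v), 0) / (EPISODES + 1)

for _ in range(EPISODES):
    node, hops = SRC, 0
    while node != DST and hops < 20:
        nbrs = TOPOLOGY[node]
        if random.random() < EPS:                        # explore
            nxt, delay = random.choice(nbrs)
        else:                                            # exploit
            nxt, delay = max(nbrs, key=lambda e: Q[node][e[0]])
        r = reward(node, nxt, delay)
        best_next = 0.0 if nxt == DST else max(Q[nxt].values())
        Q[node][nxt] += ALPHA * (r + GAMMA * best_next - Q[node][nxt])
        link_load[(node, nxt)] = link_load.get((node, nxt), 0) + 1
        node, hops = nxt, hops + 1

# Greedy rollout of the learned policy from source to destination.
path, node = [SRC], SRC
while node != DST and len(path) < 20:
    node = max(Q[node], key=Q[node].get)
    path.append(node)
print("learned path:", path)
```

In the paper's setting, the tabular Q-values would be replaced by a neural network trained on the actual network topology and service set, but the reward shaping and the greedy rollout of the learned policy follow the same pattern as this sketch.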

Key words: deep reinforcement learning, routing optimization, time delay, load balancing
