Reinforcement Learning for Delay-Constrained Energy-Aware Small Cells with Multi-Sleeping Control
Main Authors: Dini, Paolo; El Amine, Ali; Nuaymi, Loutfi
Format: Proceeding eJournal
Language: English
Published: 2020
Online Access: https://zenodo.org/record/4118122
Contents:
- In 5G networks, specific requirements are defined on the periodicity of Synchronization Signaling (SS) bursts. This imposes a constraint on the maximum period during which a Base Station (BS) can be deactivated. At the same time, BS densification is expected in the 5G architecture, which will cause a drastic increase in network energy consumption along with more complex interference management. In this paper, we study the Energy-Delay Tradeoff (EDT) problem in a Heterogeneous Network (HetNet) where small cells can switch to different sleep mode levels to save energy while maintaining a good Quality of Service (QoS). We propose a distributed Q-learning controller for small cells that adapts cell activity while taking into account the co-channel interference between cells. Our numerical results show that the multi-level sleep scheme outperforms the binary (on/off) sleep scheme, saving up to 80% of energy when users are delay tolerant, while still respecting the periodicity of the SS bursts in 5G.
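- The controller described above pairs tabular Q-learning with multi-level sleep modes. A minimal single-cell sketch of that idea is given below; the state space, per-mode energy/delay numbers, and reward shape are hypothetical placeholders chosen for illustration, not the paper's exact interference-aware multi-cell formulation:

```python
import random

# Sketch: a small cell learns which sleep level to pick per traffic state.
# ACTIONS, STATES, ENERGY, DELAY, and the reward are illustrative assumptions.
ACTIONS = ["active", "SM1", "SM2", "SM3"]   # deeper sleep modes save more energy
STATES = ["low", "medium", "high"]          # coarse traffic-load buckets

ENERGY = {"active": 1.0, "SM1": 0.5, "SM2": 0.2, "SM3": 0.05}  # normalized power
DELAY = {"active": 0.0, "SM1": 0.1, "SM2": 0.4, "SM3": 0.9}    # wake-up penalty

def reward(state: str, action: str, delay_weight: float) -> float:
    """Energy-Delay Tradeoff: energy saved minus a load-scaled delay penalty."""
    load = {"low": 0.1, "medium": 0.5, "high": 1.0}[state]
    return (1.0 - ENERGY[action]) - delay_weight * load * DELAY[action]

def train(delay_weight=2.0, episodes=20000, alpha=0.05, gamma=0.9, eps=0.1):
    rng = random.Random(0)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    state = rng.choice(STATES)
    for _ in range(episodes):
        # Epsilon-greedy selection over sleep levels.
        if rng.random() < eps:
            action = rng.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        r = reward(state, action, delay_weight)
        next_state = rng.choice(STATES)          # toy traffic dynamics
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        # Standard Q-learning update.
        q[(state, action)] += alpha * (r + gamma * best_next - q[(state, action)])
        state = next_state
    # Greedy policy per traffic state after training.
    return {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}

policy = train()
print(policy)
```

The `delay_weight` knob plays the role of the users' delay tolerance: a small weight lets the cell favor deep sleep (large energy savings), while a large weight pushes it toward lighter sleep or staying active under high load.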
- Grant number: 5G-REFINE - Resource EfFIcient 5G NEtworks (TEC2017-88373-R). © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.