DECENTRALIZED TRAFFIC REGULATION IN ADVERTISING NETWORKS USING ENERGY-AWARE HIERARCHICAL DEEP REINFORCEMENT LEARNING
Volume 2, Issue 2, Pp 28-34, 2025
DOI: https://doi.org/10.61784/adsj3022
Author(s)
JingYi Cao
Affiliation(s)
Beijing University of Technology, Beijing 100124, China.
Corresponding Author
JingYi Cao
ABSTRACT
Online advertising networks face increasing challenges in traffic regulation due to the decentralized nature of ad serving, fluctuating demand patterns, and growing energy consumption concerns. Traditional centralized traffic management approaches fail to scale effectively across distributed advertising infrastructures while struggling to balance Quality of Service (QoS) requirements with energy efficiency constraints. The heterogeneous nature of advertising traffic, including display ads, video content, and real-time bidding requests, requires sophisticated regulation mechanisms that can adapt to varying workload characteristics and network conditions. This study proposes an Energy-Aware Hierarchical Deep Reinforcement Learning (EA-HDRL) framework for decentralized traffic regulation in advertising networks. The framework employs a multi-tier architecture in which regional controllers manage local traffic optimization while a global coordinator ensures network-wide efficiency and energy conservation. Deep Q-Networks (DQNs) and Proximal Policy Optimization (PPO) algorithms enable adaptive traffic regulation policies that simultaneously optimize throughput, latency, and energy consumption across distributed advertising infrastructure. Experimental evaluation using real-world advertising network traces demonstrates that the proposed framework achieves a 52% improvement in traffic throughput while reducing energy consumption by 41% compared to traditional centralized regulation methods. The hierarchical approach successfully balances local optimization autonomy with global coordination requirements, resulting in 36% better QoS compliance and a 28% reduction in network congestion incidents.
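To make the two-tier control structure described above concrete, the following is a minimal sketch (not the authors' code) of how regional controllers and a global coordinator could interact. It substitutes a tabular Q-learning update for the paper's DQN/PPO policies, and all names (RegionalController, GlobalCoordinator, the throttle action set) as well as the state discretization and reward weights are illustrative assumptions.

# Minimal sketch of an EA-HDRL-style layout: regional controllers choose local
# throttling actions with an epsilon-greedy tabular Q-update (a stand-in for
# the paper's DQN/PPO policies), and a global coordinator turns an energy
# budget into a penalty weight that regions fold into their local rewards.
import random
from collections import defaultdict

ACTIONS = ["throttle_low", "throttle_med", "throttle_high"]  # hypothetical action set

class RegionalController:
    def __init__(self, region_id, epsilon=0.1, alpha=0.5, gamma=0.9):
        self.region_id = region_id
        self.q = defaultdict(float)                 # Q[(state, action)] table
        self.epsilon, self.alpha, self.gamma = epsilon, alpha, gamma

    def act(self, state):
        # epsilon-greedy selection over the discrete throttle levels
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(state, a)])

    def learn(self, state, action, reward, next_state):
        # one-step Q-learning backup on the local multi-objective reward
        best_next = max(self.q[(next_state, a)] for a in ACTIONS)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])

class GlobalCoordinator:
    """Aggregates per-region energy reports and returns a penalty weight
    (assumed coordination signal) that is zero while the network stays
    within its energy budget and grows with the overshoot."""
    def __init__(self, energy_budget):
        self.energy_budget = energy_budget

    def penalty_weight(self, energy_reports):
        total = sum(energy_reports.values())
        overshoot = max(0.0, total - self.energy_budget)
        return overshoot / self.energy_budget

def local_reward(throughput, latency, energy, energy_penalty):
    # illustrative reward: favor throughput, penalize latency and energy,
    # with the coordinator's penalty scaling the energy term
    return throughput - 0.5 * latency - (1.0 + energy_penalty) * energy

In this sketch the coordinator never overrides local decisions; it only reshapes each region's reward, which mirrors the balance between local optimization autonomy and global coordination that the abstract describes.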
KEYWORDS
Decentralized traffic regulation; Advertising networks; Energy-aware computing; Hierarchical deep reinforcement learning; Deep Q-Networks; Network optimization; Quality of service; Energy efficiency
CITE THIS PAPER
JingYi Cao. Decentralized traffic regulation in advertising networks using energy-aware hierarchical deep reinforcement learning. AI and Data Science Journal. 2025, 2(2): 28-34. DOI: https://doi.org/10.61784/adsj3022.
REFERENCES
[1] Barakabitze A A, Barman N, Ahmad A, et al. QoE management of multimedia streaming services in future networks: A tutorial and survey. IEEE Communications Surveys & Tutorials, 2019, 22(1): 526-565.
[2] Xing S, Wang Y. Proactive Data Placement in Heterogeneous Storage Systems via Predictive Multi-Objective Reinforcement Learning. IEEE Access, 2025.
[3] Hodaei A, Babaie S. A survey on traffic management in software-defined networks: challenges, effective approaches, and potential measures. Wireless Personal Communications, 2021, 118(2): 1507-1534.
[4] Cao J, Zheng W, Ge Y, et al. DriftShield: Autonomous Fraud Detection via Actor-Critic Reinforcement Learning with Dynamic Feature Reweighting. IEEE Open Journal of the Computer Society, 2025.
[5] Tache M D, Pascutoiu O, Borcoci E. Optimization algorithms in SDN: Routing, load balancing, and delay optimization. Applied Sciences, 2024, 14(14): 5967.
[6] Zhang H, Ge Y, Zhao X, et al. Hierarchical Deep Reinforcement Learning for Multi-Objective Integrated Circuit Physical Layout Optimization with Congestion-Aware Reward Shaping. IEEE Access, 2025.
[7] Jain T K, Jain N. Service quality in the energy sector and its impact on sustainability. In Affordable and Clean Energy, Cham: Springer International Publishing, 2020: 1-9.
[8] Mai N, Cao W. Personalized Learning and Adaptive Systems: AI-Driven Educational Innovation and Student Outcome Enhancement. International Journal of Education and Humanities, 2025.
[9] Onifade A Y, Ogeawuchi J C, Abayomi A A. A Conceptual Framework for Cost Optimization in IT Infrastructure Using Resource Monitoring Tool, 2023.
[10] Garmani H, El Amrani M, Omar D A, et al. Analysis of Interactions Among ISPs in Information Centric Network with Advertiser Involvement. Infocommunications Journal, 2024, 16(4).
[11] Santoso B. Predictive Traffic Regulation Methodologies Using 5G-Enhanced Sensor Fusion Across Vehicle and Drone Platforms. International Journal of Applied Machine Learning, 2024, 4(12): 1-15.
[12] Rhanizar A, El Akkaoui Z. A Survey About Learning-Based Variable Speed Limit Control Strategies: RL, DRL and MARL. Modern Artificial Intelligence and Data Science 2024: Tools, Techniques and Systems, 2024: 565-580.
[13] Chowdhary M A M. Financial Network Infrastructure: Scalability, Security and Optimization, 2025.
[14] Rózycki R, Solarska D A, Waligóra G. Energy-Aware Machine Learning Models-A Review of Recent Techniques and Perspectives. Energies, 2025, 18(11): 2810.
[15] Alhachem C, Kellil M, Bouabdallah A. Complex communication networks management with distributed AI: challenges and open issues, 2025.
[16] Hammad A, Abu-Zaid R. Applications of AI in decentralized computing systems: harnessing artificial intelligence for enhanced scalability, efficiency, and autonomous decision-making in distributed architectures. Applied Research in Artificial Intelligence and Cloud Computing, 2024, 7(6): 161-187.
[17] Ji E, Wang Y, Xing S, et al. Hierarchical Reinforcement Learning for Energy-Efficient API Traffic Optimization in Large-Scale Advertising Systems. IEEE Access, 2025.
[18] Goyal P, Rishiwal V, Negi A. A comprehensive survey on QoS for video transmission in heterogeneous mobile ad hoc network. Transactions on Emerging Telecommunications Technologies, 2023, 34(7): e4775.
[19] Kocot B, Czarnul P, Proficz J. Energy-aware scheduling for high-performance computing systems: A survey. Energies, 2023, 16(2): 890.
[20] Hathwar D K, Bharadwaj S R, Basha S M. Power-Aware Virtualization: Dynamic Voltage Frequency Scaling Insights and Communication-Aware Request Stacking. In Computational Intelligence for Green Cloud Computing and Digital Waste Management, IGI Global Scientific Publishing, 2024: 84-108.
[21] Lu Y, Chen C, Mei Y. Evaluation of the vertical synergy of science and technology financial policies from a structural-functional perspective: Based on the experience of Guangdong Province and cities. Economic Management and Practice, 2025, 3(3): 22-34. DOI: https://doi.org/10.61784/emp2002.
[22] Gures E, Shayea I, Ergen M, et al. Machine learning-based load balancing algorithms in future heterogeneous networks: A survey. IEEE Access, 2022, 10: 37689-37717.
[23] Munikoti S, Agarwal D, Das L, et al. Challenges and opportunities in deep reinforcement learning with graph neural networks: A comprehensive review of algorithms and applications. IEEE Transactions on Neural Networks and Learning Systems, 2023, 35(11): 15051-15071.
[24] Wang Z. Analysis of the shared energy storage business model for building clusters in commercial pedestrian blocks. Economic Management and Practice, 2025, 3(3): 1-21. DOI: https://doi.org/10.61784/emp2001.
[25] Talaat F M. Effective deep Q-networks (EDQN) strategy for resource allocation based on optimized reinforcement learning algorithm. Multimedia Tools and Applications, 2022, 81(28): 39945-39961.
[26] Wang Z. Research on the Elderly-Friendly Design of Subway Ticket Machines Based on FBM Behavioral Model. Modern Engineering and Applications, 2025, 3(3): 1-13. DOI: https://doi.org/10.61784/mea2001.
[27] Hutsebaut-Buysse M, Mets K, Latré S. Hierarchical reinforcement learning: A survey and open research challenges. Machine Learning and Knowledge Extraction, 2022, 4(1): 172-221.
[28] Pateria S, Subagdja B, Tan A H, et al. Hierarchical reinforcement learning: A comprehensive survey. ACM Computing Surveys (CSUR), 2021, 54(5): 1-35.
[29] Wang M, Zhang X, Yang Y, et al. Explainable Machine Learning in Risk Management: Balancing Accuracy and Interpretability. Journal of Financial Risk Management, 2025, 14(3): 185-198.
[30] Singh O, Rishiwal V, Chaudhry R, et al. Multi-objective optimization in WSN: Opportunities and challenges. Wireless Personal Communications, 2021, 121(1): 127-152.
[31] Cao W, Mai N, Liu W. Adaptive Knowledge Assessment via Symmetric Hierarchical Bayesian Neural Networks with Graph Symmetry-Aware Concept Dependencies. Symmetry, 2025.