Reinforcement Learning
Computer Science
Cloud Computing
Scheduling (Production Processes)
Distributed Computing
Edge Device
Artificial Intelligence
Operating System
Mathematical Optimization
Mathematics
Authors
Zhiyang Zhang, Fengli Zhang, Zehui Xiong, Kuan Zhang, Dajiang Chen
Identifier
DOI: 10.1109/jiot.2024.3386888
Abstract
Task scheduling in large-scale industrial Internet of Things (IIoT) is characterized by diverse resources and the need for efficient, synchronized processing across distributed edge clouds, which poses a significant challenge. This paper proposes a task scheduling framework across edge clouds, namely LsiA3CS, which employs deep reinforcement learning (DRL) and heuristic guidance to achieve distributed, asynchronous task scheduling for large-scale IIoT. Specifically, a Markov game-based model and the asynchronous advantage actor-critic (A3C) algorithm are leveraged to orchestrate diverse computational resources, effectively balancing workloads and reducing communication latency. Moreover, the incorporation of heuristic policy annealing and action masking techniques further refines the adaptability of the proposed framework to the unpredictable requirements of large-scale IIoT systems. Real-world task datasets are used to conduct extensive experimental evaluations on a simulated large-scale multi-edge-cloud IIoT. The results show that LsiA3CS significantly reduces task completion time and energy consumption while managing unpredictable task arrivals and variable resource capacities.
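The action-masking technique mentioned in the abstract can be illustrated with a minimal sketch: infeasible edge-cloud choices (e.g., nodes with no free capacity) are masked out of the actor's logits before an action is sampled, so the scheduler never selects them. The PyTorch code below is a hypothetical illustration only; the class name MaskedActor, the network shape, and the feasibility mask are assumptions and do not reflect the paper's actual implementation.

```python
# Hypothetical sketch of policy action masking for edge-cloud task scheduling.
# All names and shapes here are illustrative assumptions, not the LsiA3CS code.
import torch
import torch.nn as nn


class MaskedActor(nn.Module):
    def __init__(self, state_dim: int, num_clouds: int):
        super().__init__()
        # Small policy network mapping a task/resource state to per-cloud logits.
        self.policy = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, num_clouds)
        )

    def forward(self, state: torch.Tensor, feasible: torch.Tensor):
        logits = self.policy(state)
        # Set logits of infeasible edge clouds to -inf so their probability is zero.
        masked = logits.masked_fill(~feasible, float("-inf"))
        dist = torch.distributions.Categorical(logits=masked)
        action = dist.sample()
        return action, dist.log_prob(action)


# Example: three candidate edge clouds, the second currently has no free capacity.
actor = MaskedActor(state_dim=8, num_clouds=3)
state = torch.randn(1, 8)
feasible = torch.tensor([[True, False, True]])
action, logp = actor(state, feasible)
```

In an A3C-style setup, the sampled action and its log-probability would feed the usual actor and critic losses; masking only changes which actions are eligible, not the update rule.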