Reinforcement learning
Markov decision process
Photovoltaic system
Computer science
Scheduling (production processes)
Electricity
Demand response
Energy management
Energy consumption
Markov process
Real-time computing
Mathematical optimization
Artificial intelligence
Engineering
Energy (signal processing)
Operations management
Electrical engineering
Statistics
Mathematics
Authors
Xu Xu, Youwei Jia, Yan Xu, Zhao Xu, Songjian Chai, Chun Sing Lai
Source
Journal: IEEE Transactions on Smart Grid
[Institute of Electrical and Electronics Engineers]
Date: 2020-07-01
Volume/Issue: 11(4): 3201-3211
Citations: 218
Identifier
DOI:10.1109/tsg.2020.2971427
Abstract
This paper proposes a novel reinforcement-learning framework for home energy management (HEM) that achieves efficient home-based demand response (DR). The hour-ahead energy consumption scheduling problem is formulated as a finite Markov decision process (FMDP) with discrete time steps. To solve it, a data-driven method combining a neural network (NN) with the Q-learning algorithm is developed, which produces cost-effective schedules for the HEM system. Specifically, real electricity-price and solar photovoltaic (PV) generation data are processed by an extreme learning machine (ELM) in rolling time windows to predict their uncertainty. Scheduling decisions for the household appliances and electric vehicles (EVs) are then obtained through the proposed framework, whose objective is twofold: to minimize the electricity bill as well as the DR-induced dissatisfaction. Simulations are performed at the level of a residential house with multiple home appliances, an EV, and several PV panels. The test results demonstrate the effectiveness of the proposed data-driven HEM framework.
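The core idea of the abstract, casting hour-ahead appliance scheduling as a finite MDP and learning a cost-minimizing policy with Q-learning, can be sketched in miniature. The paper combines Q-learning with an NN and ELM price/PV forecasts; the toy below instead uses a tabular Q-table, a single deferrable appliance, and hypothetical fixed prices (`PRICES`, `LOAD_KWH`, `DELAY_PENALTY` are all illustrative assumptions, not values from the paper). The reward trades off the electricity bill against a dissatisfaction penalty for deferral, mirroring the paper's dual objective.

```python
import random

# Hypothetical hour-ahead prices ($/kWh) over a 6-hour horizon; the paper
# forecasts these with an ELM, here they are fixed for illustration.
PRICES = [0.30, 0.25, 0.10, 0.08, 0.20, 0.35]
HORIZON = len(PRICES)
LOAD_KWH = 2.0        # energy drawn when the appliance runs (assumed)
DELAY_PENALTY = 0.01  # dissatisfaction cost per deferred hour (assumed)

def step(hour, done, action):
    """One transition of the toy FMDP: action 1 = run now, 0 = defer."""
    if done:
        return hour + 1, True, 0.0
    if action == 1:
        return hour + 1, True, PRICES[hour] * LOAD_KWH
    if hour == HORIZON - 1:  # deferring in the last hour forces a run
        return hour + 1, True, PRICES[hour] * LOAD_KWH + DELAY_PENALTY
    return hour + 1, False, DELAY_PENALTY

# Tabular Q-learning over states (hour, done) and actions {0, 1};
# Q approximates cost-to-go, so the greedy action minimizes Q.
Q = {(h, d): [0.0, 0.0] for h in range(HORIZON + 1) for d in (False, True)}
alpha, gamma, eps = 0.1, 1.0, 0.2
random.seed(0)

for _ in range(20000):
    hour, done = 0, False
    while hour < HORIZON:
        # Epsilon-greedy exploration.
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = min((0, 1), key=lambda x: Q[(hour, done)][x])
        nh, nd, cost = step(hour, done, a)
        future = min(Q[(nh, nd)]) if nh < HORIZON else 0.0
        Q[(hour, done)][a] += alpha * (cost + gamma * future - Q[(hour, done)][a])
        hour, done = nh, nd

# Greedy rollout of the learned policy.
hour, done, plan = 0, False, []
while hour < HORIZON:
    a = min((0, 1), key=lambda x: Q[(hour, done)][x])
    plan.append(a if not done else 0)
    hour, done, _ = step(hour, done, a)
print(plan)  # → [0, 0, 0, 1, 0, 0]: defer until the cheapest hour, then run
```

With these numbers the learned policy defers through three hours of small penalties and runs at hour 3, the cheapest slot (total cost 0.03 + 0.16 = 0.19, versus 0.22 for running an hour earlier). The paper's framework generalizes this to many appliances and EV charging, with the NN replacing the table.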