Keywords
Computer Science, Transformer, Smart Grid, Energy Consumption, Energy Conservation, Predictive Modeling, Grid, Efficient Energy Use, Artificial Intelligence, Machine Learning, Reliability Engineering, Engineering, Voltage, Geometry, Mathematics, Electrical Engineering
Authors
Zhuyi Rao,Yunxiang Zhang
Source
Venue: 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC)
Date: 2020-06-01
Citations: 3
Identifiers
DOI: 10.1109/itoec49072.2020.9141649
Abstract
With the rapid development of smart grid construction, building energy consumption prediction is gaining increasing attention in energy planning, management, and conservation. Improving the accuracy of energy consumption forecasting is a key factor in ensuring efficient operation of the energy system. In addition, the model must be able to adapt quickly to changes in energy consumption and respond to various emergencies. This paper therefore proposes a modified Transformer model based on multi-head attention and a positional encoding mechanism. Through self-attention, the Transformer focuses on the information that most influences prediction performance. Compared with LSTM, the combination of the Transformer and positional encoding achieves higher prediction accuracy with less training time. The experimental results demonstrate that the proposed method performs well on the electric load prediction task and is also suitable for other periodic-change prediction scenarios such as sales forecasting and price prediction.
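As a rough illustration of the general technique the abstract names (an encoder-style Transformer with multi-head self-attention and sinusoidal positional encoding applied to a load-history window), the sketch below shows one possible setup. It is a minimal example assuming PyTorch; the module names, hyperparameters (d_model, nhead, window length), and one-step forecasting head are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch: Transformer encoder + sinusoidal positional encoding for
# one-step-ahead load forecasting. Hyperparameters and structure are assumed,
# not taken from the paper.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Standard sinusoidal position encoding added to the input embeddings."""

    def __init__(self, d_model: int, max_len: int = 1024):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)  # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)  # fixed, not a learnable parameter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]


class LoadTransformer(nn.Module):
    """Encoder-only Transformer mapping a history window to a one-step forecast."""

    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # embed each scalar load reading
        self.pos_enc = PositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)         # predict the next load value

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, 1) past load readings
        h = self.pos_enc(self.input_proj(x))
        h = self.encoder(h)                       # multi-head self-attention over the window
        return self.head(h[:, -1])                # forecast from the last time step


if __name__ == "__main__":
    model = LoadTransformer()
    history = torch.randn(8, 96, 1)               # e.g. 96 past readings per sample
    print(model(history).shape)                   # torch.Size([8, 1])
```

In this kind of setup the positional encoding supplies the ordering information that self-attention alone lacks, which is what lets the attention layers pick out the periodic patterns the abstract refers to.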