Transformer
Term (time)
Computer Science
Reliability Engineering
Artificial Intelligence
Engineering
Electrical Engineering
Voltage
Physics
Quantum Mechanics
Authors
Ahmad Ahmad,Xun Xiao,Huadong Mo,Daoyi Dong
Identifier
DOI: 10.1016/j.ijepes.2025.110549
Abstract
Electrical load forecasting is essential for the efficient operation and planning of power systems. Recent studies have employed Transformer models in forecasting due to their unique attention mechanisms and ability to extract correlations in data. However, these models face challenges in integrating varied data types and capturing long-term dependencies. To address these limitations, this study proposes TFTformer, a Transformer-based neural network designed to enhance the accuracy and generalisability of load forecasting models. The TFTformer incorporates transposed feature-specific embeddings for weather, time, and load data to more accurately capture their unique characteristics. A post-embedding linear transformation layer improves feature representation, aligning and standardising features across sequences for improved pattern recognition. Additionally, a Temporal Convolutional Network is integrated within the Transformer's encoder, employing causal convolutions and dilation to adapt to the sequential nature of the data with an expanded receptive field. The effectiveness of the TFTformer is demonstrated through a comparative study against several state-of-the-art methods using load datasets from Belgium, New Zealand, and five Australian states. The results demonstrate that the TFTformer achieves significant MSE improvements across different locations, with over 50% improvement over most models, 42% over CARD, and 16%–17% improvement compared to iFlowformer and iReformer. Furthermore, an Analysis of Variance (ANOVA) is conducted to evaluate the impact of each component of the TFTformer. A SHAP-based interpretability analysis, using surrogate models, is conducted to elucidate the decision-making process of TFTformer, highlighting the critical role of time factors and weather features in its predictions.
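The two architectural ideas named in the abstract, transposed feature-specific embeddings and a causal dilated convolution inside the encoder, can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration only: the class names, the dimensions, the grouping of weather, time, and load inputs, and the placement of the residual connection are not taken from the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransposedFeatureEmbedding(nn.Module):
    """Embed each variate's whole series as one token (assumed design).

    Inputs of shape (batch, seq_len, n_vars) are transposed to
    (batch, n_vars, seq_len) and projected along time, so every feature
    becomes a single d_model-dimensional token.
    """

    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        # Separate projections so weather, time, and load series each get
        # their own embedding weights (assumed grouping of the features).
        self.weather_proj = nn.Linear(seq_len, d_model)
        self.time_proj = nn.Linear(seq_len, d_model)
        self.load_proj = nn.Linear(seq_len, d_model)
        # Post-embedding linear layer aligning features across groups,
        # per the abstract's description.
        self.align = nn.Linear(d_model, d_model)

    def forward(self, weather, time_feats, load):
        tokens = torch.cat(
            [
                self.weather_proj(weather.transpose(1, 2)),
                self.time_proj(time_feats.transpose(1, 2)),
                self.load_proj(load.transpose(1, 2)),
            ],
            dim=1,
        )  # (batch, total_vars, d_model)
        return self.align(tokens)


class CausalDilatedBlock(nn.Module):
    """TCN-style block: left-padded (causal) dilated 1-D convolution."""

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 2):
        super().__init__()
        # Padding only on the left keeps the convolution causal: the
        # output at position t never depends on positions after t.
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, length)
        out = self.conv(F.pad(x, (self.left_pad, 0)))
        return torch.relu(out) + x  # residual connection


if __name__ == "__main__":
    batch, seq_len = 4, 96
    weather = torch.randn(batch, seq_len, 3)     # e.g. temperature, humidity, wind
    time_feats = torch.randn(batch, seq_len, 2)  # e.g. hour-of-day, day-of-week
    load = torch.randn(batch, seq_len, 1)        # historical load

    emb = TransposedFeatureEmbedding(seq_len, d_model=64)
    tokens = emb(weather, time_feats, load)
    print(tokens.shape)            # torch.Size([4, 6, 64])

    tcn = CausalDilatedBlock(channels=6)
    print(tcn(tokens).shape)       # torch.Size([4, 6, 64])
```

The left-padding trick is what makes the dilated convolution causal, and stacking such blocks with increasing dilation grows the receptive field exponentially with depth, which matches the abstract's stated motivation for integrating a TCN into the encoder.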