Keywords
Computer science
Transformer
Computation
Data mining
Time series
Artificial intelligence
Deep learning
Temporal databases
Real-time computing
Traffic (computer networking)
Artificial neural networks
Machine learning
Algorithms
Engineering
Voltage
Computer security
Electrical engineering
Authors
Junhao Zhang, Junjie Tang, Jun-Cheng Jin, Zehui Qu
Identifier
DOI:10.1109/ijcnn54540.2023.10191072
Abstract
Traffic flow forecasting plays a vital role in Intelligent Transportation Systems (ITS). Accurate traffic flow forecasting is challenging due to the intricate spatio-temporal correlations in traffic data. Recently, GNN-based and Transformer-based methods have significantly improved prediction accuracy. However, existing GNN-based methods struggle to capture long-range dependencies, and existing Transformer-based methods treat traffic flow data as time series and extract temporal and spatial relationships separately. This makes it difficult for the Transformer to process the input series without information loss and increases computation time. To address these issues, we propose a novel framework: Spatio-Temporal Pre-training enhanced Fast Pure Transformer Network (STP-FPTN). First, the traffic flow data are split along the sensor dimension rather than the time dimension. Then, a pure Transformer-based model is designed to capture complex, long-range spatio-temporal correlations simultaneously and quickly. Finally, to strengthen the model's ability to extract spatio-temporal features, STP-FPTN uses unsupervised pre-training to learn representations of intricate spatio-temporal patterns. Extensive experiments on 4 real-world datasets against 19 baselines demonstrate that our framework outperforms state-of-the-art methods. Meanwhile, STP-FPTN runs 67 times faster than the state-of-the-art method, with significantly reduced computing-resource requirements.
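The key data-layout idea in the abstract — tokenizing along the sensor dimension, so each token is one sensor's full history rather than one time-step snapshot — can be sketched with NumPy. This is an illustrative sketch only; the tensor names and toy sizes are assumptions, not from the paper:

```python
import numpy as np

# Toy traffic-flow matrix: T time steps recorded by N road sensors.
T, N = 12, 5
flow = np.arange(T * N, dtype=float).reshape(T, N)  # shape (T, N)

# Conventional time-dimension split: T tokens, each a length-N
# spatial snapshot; spatial and temporal patterns need separate modules.
time_tokens = flow            # shape (T, N)

# Sensor-dimension split (the layout STP-FPTN describes): N tokens,
# each a sensor's full length-T history, so a single attention pass
# can relate sensors across both space and time.
sensor_tokens = flow.T        # shape (N, T)

print(time_tokens.shape)      # (12, 5)
print(sensor_tokens.shape)    # (5, 12)
```

With this layout, token i is exactly sensor i's series, so attention weights between tokens directly express cross-sensor (spatial) dependencies over the whole window.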