Computer science
Encoder
Graph
Convolution (computer science)
Deep learning
Data mining
Recurrent neural network
Spatial analysis
Autoencoder
Artificial neural network
Attention network
Fuse (electrical)
Artificial intelligence
Theoretical computer science
Geology
Engineering
Electrical engineering
Operating system
Remote sensing
Authors
Yanjun Qin, Fang Zhao, Yuchen Fang, Haiyong Luo, Chenxing Wang
Abstract
In recent years, traffic forecasting has gradually attracted attention in data mining because of the increasing availability of large-scale traffic data. However, it faces the substantial challenge of complex temporal-spatial correlations in traffic. Recent studies mainly focus on modeling local spatial correlations with graph neural networks and neglect the influence of long-distance spatial correlations. Besides, most existing works use a recurrent neural network based encoder–decoder architecture to forecast multistep traffic volume and suffer from the accumulated errors of recurrent networks. To deal with these issues, we propose the memory attention (MA) enhanced graph convolution long short-term memory network (MAEGCLSTM), a novel deep learning model for traffic forecasting. Specifically, MAEGCLSTM combines the MA and the vanilla graph convolution long short-term memory (GCLSTM) to capture global and local spatio-temporal dependencies, respectively. MAEGCLSTM then uses a simplified GCLSTM to effectively fuse the global and local information. Moreover, we integrate MAEGCLSTM into an encoder–decoder architecture to forecast multistep traffic volume. Beyond MAEGCLSTM itself, we add a convolutional neural network and encoder–decoder attention to the decoder to ease the accumulated errors caused by iterative prediction and to draw on the full historical information from the encoder. Experiments on four real-world traffic data sets show that our model significantly outperforms 14 baselines, with up to a $6.07\%$ improvement in the $L_1$ measure.
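The abstract builds on graph-convolutional LSTMs, in which the dense gate projections of a standard LSTM operate on features that are first aggregated over the road-sensor graph. The sketch below is a minimal illustration of that general idea under stated assumptions, not the paper's MAEGCLSTM: the class name GraphConvLSTMCell, the single-hop propagation through a_hat, and the row-normalized adjacency are all choices made for this example.

```python
import torch
import torch.nn as nn

class GraphConvLSTMCell(nn.Module):
    """Minimal graph-convolutional LSTM cell (illustrative sketch, not
    the paper's MAEGCLSTM): standard LSTM gating, but the input/hidden
    projection first mixes each node with its neighbors through a
    normalized adjacency matrix a_hat."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        # One shared projection produces all four gates (i, f, o, g).
        self.proj = nn.Linear(in_dim + hidden_dim, 4 * hidden_dim)

    def forward(self, x, h, c, a_hat):
        # x: (N, in_dim) node features at one time step
        # h, c: (N, hidden_dim) hidden and cell states
        # a_hat: (N, N) normalized adjacency encoding local spatial structure
        z = a_hat @ torch.cat([x, h], dim=-1)  # one-hop neighbor aggregation
        i, f, o, g = self.proj(z).chunk(4, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

# Usage: encode 12 historical steps for a toy 20-node sensor graph.
N, F_IN, H = 20, 2, 32
adj = torch.rand(N, N)
a_hat = adj / adj.sum(dim=-1, keepdim=True)  # crude row normalization for the sketch
cell = GraphConvLSTMCell(F_IN, H)
h = c = torch.zeros(N, H)
for x_t in torch.rand(12, N, F_IN):
    h, c = cell(x_t, h, c, a_hat)
```

In an encoder–decoder arrangement like the one the abstract describes, a cell of this kind would be unrolled over the historical steps in the encoder, and the decoder would iterate forward to produce the multistep forecast; the paper's memory attention and fusion components are layered on top of this backbone.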