Computer science
Recurrent neural network
Leverage (statistics)
Artificial intelligence
Deep learning
Feature learning
Dependency (UML)
Machine learning
Transformer
Convolutional neural network
Short-term memory
Margin (machine learning)
Time series
Graph
Sequence learning
Artificial neural network
Theoretical computer science
Engineering
Voltage
Electrical engineering
Authors
Ling Cai, Krzysztof Janowicz, Gengchen Mai, Bo Yan, Rui Zhu
Abstract
Traffic forecasting is a challenging problem due to the complexity of jointly modeling spatio‐temporal dependencies at different scales. Recently, several hybrid deep learning models have been developed to capture such dependencies. These approaches typically utilize convolutional neural networks or graph neural networks (GNNs) to model spatial dependency and leverage recurrent neural networks (RNNs) to learn temporal dependency. However, RNNs can only capture sequential information in a time series and cannot model its periodicity (e.g., weekly patterns). Moreover, RNNs are difficult to parallelize, making training and prediction less efficient. In this work we propose a novel deep learning architecture called Traffic Transformer to capture the continuity and periodicity of time series and to model spatial dependency. Our work takes inspiration from Google’s Transformer framework for machine translation. We conduct extensive experiments on two real‐world traffic data sets, and the results demonstrate that our model outperforms baseline models by a substantial margin.
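The abstract contrasts RNNs, which only see sequential order, with a Transformer-style encoding that can expose both continuity and periodicity to the model. A minimal sketch of this idea is shown below: the standard sinusoidal positional encoding from the original Transformer paper marks sequential position, while a hypothetical `periodic_encoding` helper (an illustrative assumption, not the paper's actual method) encodes the phase within a recurring cycle such as a week, so that time steps exactly one period apart receive identical features.

```python
import numpy as np

def sinusoidal_encoding(positions, d_model):
    """Standard Transformer sinusoidal positional encoding (continuity/order)."""
    positions = np.asarray(positions, dtype=float)[:, None]     # shape (T, 1)
    dims = np.arange(d_model)[None, :]                          # shape (1, d)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                            # shape (T, d)
    enc = np.zeros_like(angles)
    enc[:, 0::2] = np.sin(angles[:, 0::2])                      # even dims: sine
    enc[:, 1::2] = np.cos(angles[:, 1::2])                      # odd dims: cosine
    return enc

def periodic_encoding(timestamps, period, d_model):
    """Hypothetical periodic feature: encode phase within a cycle (e.g. a week),
    so steps exactly one period apart map to identical vectors."""
    phase = (np.asarray(timestamps, dtype=float) % period) / period  # in [0, 1)
    k = np.arange(1, d_model // 2 + 1)[None, :]                      # harmonics
    ang = 2.0 * np.pi * phase[:, None] * k
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)

# Combine both signals for, e.g., hourly data with a weekly period (168 hours).
t = np.arange(336)                                  # two weeks of hourly steps
features = np.concatenate(
    [sinusoidal_encoding(t, 16), periodic_encoding(t, 168, 8)], axis=1
)
```

Because the periodic features repeat exactly every `period` steps, an attention layer consuming them can directly relate a time step to the same hour in previous weeks, which an RNN's purely sequential state cannot express as easily.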