Authors
Sicheng He, J. Ji, Min Lei
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2025-04-11
Volume/Issue: 39 (11): 11772-11780
Identifiers
DOI: 10.1609/aaai.v39i11.33281
Abstract
Traffic prediction provides vital support for urban traffic management and has received extensive research interest. By virtue of their ability to learn spatial and temporal dependencies from a global view, Transformers have achieved superior performance in long-term traffic prediction. However, existing methods usually underestimate the complex spatio-temporal entanglement in long-range sequences. Compared with purely temporal entanglement, spatio-temporal data exhibits entangled dynamics constrained by the traffic network, which brings additional difficulty. Moreover, the computational cost of spatio-temporal Transformers scales quadratically with sequence length, limiting their application to long-range and large-scale scenarios. To address these problems, we propose a decomposed spatio-temporal Mamba (DST-Mamba) for traffic prediction. We apply temporal decomposition to the entangled sequences to obtain seasonal and trend parts. Shifting from the temporal view to the spatial view, we leverage Mamba, a state space model with near-linear complexity, to capture seasonal variations in a node-centric manner. Meanwhile, multi-scale trend information is extracted and aggregated by simple linear layers. This combination equips DST-Mamba with a superior capability to model long-range spatio-temporal dependencies while remaining efficient compared with Transformers. Experimental results across five real-world datasets demonstrate that DST-Mamba captures both local fluctuations and global trends within traffic patterns, achieving state-of-the-art performance with favorable efficiency.
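As a rough illustration of the decompose-then-route structure the abstract describes (not the authors' released code), the sketch below splits each input window into a trend part via a moving average and a seasonal residual, sends the seasonal part through a sequence model, and projects the trend part with a plain linear layer. The `SeasonalMixer` stand-in (here a GRU), the moving-average kernel size, and all layer dimensions are assumptions made so the example runs; the paper's node-centric Mamba block and multi-scale trend aggregation are not reproduced.

```python
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Split a sequence into a trend (moving average) and a seasonal residual."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x):  # x: (batch, time, nodes)
        pad = (self.kernel_size - 1) // 2
        # Replicate the first/last step so the moving average keeps the length.
        front = x[:, :1, :].repeat(1, pad, 1)
        back = x[:, -1:, :].repeat(1, pad, 1)
        padded = torch.cat([front, x, back], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend


class DecomposedPredictor(nn.Module):
    """Hypothetical sketch: seasonal branch -> sequence model,
    trend branch -> linear projection over the time axis."""

    def __init__(self, in_len: int, out_len: int, num_nodes: int, d_model: int = 64):
        super().__init__()
        self.decomp = SeriesDecomposition()
        # Placeholder for the state-space (Mamba) block described in the
        # abstract; a GRU is used here purely so the sketch is runnable.
        self.seasonal_model = nn.GRU(num_nodes, d_model, batch_first=True)
        self.seasonal_head = nn.Linear(d_model, num_nodes)
        self.seasonal_time = nn.Linear(in_len, out_len)
        # Trend branch: a simple linear map from input steps to output steps.
        self.trend_time = nn.Linear(in_len, out_len)

    def forward(self, x):  # x: (batch, in_len, nodes)
        seasonal, trend = self.decomp(x)
        h, _ = self.seasonal_model(seasonal)                  # (batch, in_len, d_model)
        seasonal_out = self.seasonal_head(h)                  # (batch, in_len, nodes)
        seasonal_out = self.seasonal_time(
            seasonal_out.transpose(1, 2)).transpose(1, 2)     # (batch, out_len, nodes)
        trend_out = self.trend_time(
            trend.transpose(1, 2)).transpose(1, 2)            # (batch, out_len, nodes)
        return seasonal_out + trend_out


if __name__ == "__main__":
    model = DecomposedPredictor(in_len=96, out_len=24, num_nodes=207)
    y = model(torch.randn(8, 96, 207))
    print(y.shape)  # torch.Size([8, 24, 207])
```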