Computer science
Leverage (statistics)
Artificial intelligence
Context (archaeology)
Machine learning
Time series
Series (stratigraphy)
Language model
Pattern
Geography
Social science
Biology
Sociology
Paleontology
Archaeology
Authors
Ming Jin, Shiyu Wang, Lintao Ma, Zhixuan Chu, James Y. Zhang, Xiaoming Shi, Pin-Yu Chen, Yuxuan Liang, Yuan-Fang Li, Shirui Pan, Qingsong Wen
Source
Journal: Cornell University - arXiv
Date: 2023-01-01
Citations: 16
Identifiers
DOI: 10.48550/arxiv.2310.01728
Abstract
Time series forecasting holds significant importance in many real-world dynamic systems and has been extensively studied. Unlike natural language processing (NLP) and computer vision (CV), where a single large model can tackle multiple tasks, models for time series forecasting are often specialized, necessitating distinct designs for different tasks and applications. While pre-trained foundation models have made impressive strides in NLP and CV, their development in time series domains has been constrained by data sparsity. Recent studies have revealed that large language models (LLMs) possess robust pattern recognition and reasoning abilities over complex sequences of tokens. However, the challenge remains in effectively aligning the modalities of time series data and natural language to leverage these capabilities. In this work, we present Time-LLM, a reprogramming framework to repurpose LLMs for general time series forecasting with the backbone language models kept intact. We begin by reprogramming the input time series with text prototypes before feeding it into the frozen LLM to align the two modalities. To augment the LLM's ability to reason with time series data, we propose Prompt-as-Prefix (PaP), which enriches the input context and directs the transformation of reprogrammed input patches. The transformed time series patches from the LLM are finally projected to obtain the forecasts. Our comprehensive evaluations demonstrate that Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models. Moreover, Time-LLM excels in both few-shot and zero-shot learning scenarios.
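To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of the reprogramming idea: time series patches cross-attend to text prototypes built from a frozen word-embedding table, a prompt prefix is prepended (Prompt-as-Prefix), and the frozen backbone's transformed patch outputs are projected to forecasts. This is not the authors' implementation; the patch length, prototype count, mean-pooling step, and the stand-in backbone and embedding table are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class ReprogrammingLayer(nn.Module):
    """Cross-attention from time series patches (queries) to a small set of
    text prototypes (keys/values) derived from a frozen word-embedding table,
    mapping the series modality into the language model's input space."""
    def __init__(self, d_model: int, n_prototypes: int, vocab_embed: torch.Tensor):
        super().__init__()
        self.register_buffer("vocab_embed", vocab_embed)   # (V, d_model), frozen
        # Prototypes = learned linear combination of the frozen vocabulary embeddings.
        self.proto_map = nn.Linear(vocab_embed.size(0), n_prototypes, bias=False)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)

    def forward(self, patches: torch.Tensor) -> torch.Tensor:  # (B, P, d_model)
        protos = self.proto_map(self.vocab_embed.T).T           # (n_prototypes, d_model)
        protos = protos.unsqueeze(0).repeat(patches.size(0), 1, 1)
        out, _ = self.attn(query=patches, key=protos, value=protos)
        return out

class TimeLLMSketch(nn.Module):
    def __init__(self, backbone: nn.Module, vocab_embed: torch.Tensor,
                 patch_len: int = 16, d_model: int = 256, horizon: int = 96):
        super().__init__()
        self.patch_len = patch_len
        self.patch_embed = nn.Linear(patch_len, d_model)
        self.reprogram = ReprogrammingLayer(d_model, n_prototypes=100,
                                            vocab_embed=vocab_embed)
        self.backbone = backbone                     # frozen LLM stand-in, kept intact
        for p in self.backbone.parameters():
            p.requires_grad = False
        self.head = nn.Linear(d_model, horizon)      # output projection to forecasts

    def forward(self, series: torch.Tensor, prompt_embed: torch.Tensor) -> torch.Tensor:
        # series: (B, T) with T divisible by patch_len; prompt_embed: (B, L, d_model)
        B, T = series.shape
        patches = series.view(B, T // self.patch_len, self.patch_len)
        tokens = self.reprogram(self.patch_embed(patches))       # (B, P, d_model)
        # Prompt-as-Prefix: prepend context embeddings so the prompt steers how
        # the frozen backbone transforms the reprogrammed patches.
        hidden = self.backbone(torch.cat([prompt_embed, tokens], dim=1))
        patch_out = hidden[:, prompt_embed.size(1):]             # transformed patches
        # The paper projects the transformed patch outputs; mean-pooling here
        # keeps the sketch shape-agnostic.
        return self.head(patch_out.mean(dim=1))                  # (B, horizon)

# Toy usage: a 2-layer transformer encoder stands in for the frozen LLM, and a
# random matrix stands in for its word-embedding table.
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True), num_layers=2)
vocab_embed = torch.randn(1000, 256)
model = TimeLLMSketch(backbone, vocab_embed)
forecast = model(torch.randn(8, 512), torch.randn(8, 10, 256))
print(forecast.shape)   # torch.Size([8, 96])
```

Note the design choice the abstract emphasizes: only the patch embedding, prototype mapping, attention, and output head are trainable, while the backbone language model itself stays frozen.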