A novel LLM time series forecasting method based on integer-decimal decomposition.

Journal: Scientific Reports
Published Date:

Abstract

Traditional deep learning models have demonstrated strong performance in time series forecasting within specific domains, but their domain-specific nature limits generalization and restricts broader applicability. Inspired by advances in natural language processing (NLP) and computer vision (CV), large language models (LLMs) have emerged as a promising approach to time series forecasting. However, fundamental differences between time series and textual data make it challenging to adapt time series for LLM-based forecasting. To address this, we propose IDDLLM, an Integer-Decimal Decomposition and cross-modal fine-tuning framework for LLMs. Our approach introduces the Splitting time series Data Cross-attention (SDC) module, which decomposes time series data into integer and decimal components, enabling better correlation analysis and improving the model's understanding of time series patterns. We further design a dual cross-attention module that aligns the time series and text modalities, allowing time series to be adapted more effectively within LLMs. Comprehensive evaluations demonstrate that IDDLLM outperforms state-of-the-art models in long-term time series forecasting, ranking first in 34 of 46 experimental settings and second in 9. It also achieves competitive performance on few-shot and zero-shot forecasting tasks, highlighting its robustness and adaptability.
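To make the titular idea concrete, the sketch below shows one plausible form of the integer-decimal split on a raw series. The abstract does not specify the internals of the SDC module, so the integer_decimal_decompose helper and the floor/fraction formulation here are assumptions for exposition, not the paper's implementation.

import numpy as np

def integer_decimal_decompose(x):
    # Split each value into an integer part (coarse magnitude/trend)
    # and a decimal part in [0, 1) (fine-grained residual).
    integer_part = np.floor(x)
    decimal_part = x - integer_part
    return integer_part, decimal_part

# Toy usage: the two components reconstruct the original series exactly.
series = np.array([12.37, 12.91, 13.05, 12.58])
i_part, d_part = integer_decimal_decompose(series)
assert np.allclose(i_part + d_part, series)
print(i_part)  # [12. 12. 13. 12.]
print(d_part)  # approximately [0.37 0.91 0.05 0.58]

In a framework like IDDLLM, the two resulting sub-series could then be embedded separately and related through cross-attention, which is presumably where the correlation-analysis benefit described above comes from.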

Authors

  • Lei Wang
    Department of Nursing, Beijing Hospital, National Center of Gerontology, Institute of Geriatric Medicine, Chinese Academy of Medical Sciences, Beijing, China.
  • Keyao Dong
    School of Management Science and Engineering, Beijing Information Science and Technology University, Beijing, 100192, China. dongkeyao@outlook.com.
  • Xiaoyong Zhao
    School of Management Science and Engineering, Beijing Information Science and Technology University, Beijing, 100192, China.
