Multi-Granularity Autoformer for long-term deterministic and probabilistic power load forecasting.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Apr 24, 2025
Abstract
Long-term power load forecasting is critical for power system planning but is constrained by intricate temporal patterns. Transformer-based models emphasize modeling long- and short-term dependencies, yet they are limited by computational complexity and parameter overhead. This paper introduces a novel Multi-Granularity Autoformer (MG-Autoformer) for long-term load forecasting. The model leverages a Multi-Granularity Auto-Correlation Attention Mechanism (MG-ACAM) to capture both fine-grained and coarse-grained temporal dependencies, enabling accurate modeling of short-term fluctuations and long-term trends. To improve efficiency, a shared query-key (Q-K) mechanism identifies key temporal patterns across multiple resolutions while reducing model complexity. To address uncertainty in power load forecasting, the model incorporates a quantile loss function, yielding probabilistic predictions that quantify forecast uncertainty. Extensive experiments on benchmark datasets from Portugal, Australia, America, and ISO New England demonstrate the superior performance of the proposed MG-Autoformer on long-term power load point and probabilistic forecasting tasks.
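The abstract does not give the exact formulation of the MG-ACAM or of the quantile loss. As an illustrative sketch only, the Python snippet below shows two standard building blocks that such a model plausibly relies on: the pinball (quantile) loss commonly used for probabilistic forecasting, and FFT-based selection of dominant autocorrelation lags, the core idea behind Autoformer-style auto-correlation attention. The function names, the toy load series, and the stand-in quantile forecasts are hypothetical and not taken from the paper.

    import numpy as np

    def quantile_loss(y_true, y_pred, quantiles):
        """Pinball (quantile) loss averaged over quantiles and time steps.

        y_true:    shape (T,)                  observed load
        y_pred:    shape (len(quantiles), T)   one forecast track per quantile
        quantiles: e.g. [0.1, 0.5, 0.9]
        """
        losses = []
        for q, pred in zip(quantiles, y_pred):
            err = y_true - pred
            # Under-prediction is penalised by q, over-prediction by (1 - q).
            losses.append(np.mean(np.maximum(q * err, (q - 1) * err)))
        return float(np.mean(losses))

    def top_k_autocorrelation_lags(x, k=3):
        """Pick the k lags with the strongest autocorrelation via FFT,
        as in Autoformer-style auto-correlation attention."""
        x = x - x.mean()
        spec = np.fft.rfft(x)
        acf = np.fft.irfft(spec * np.conj(spec), n=len(x))  # circular autocorrelation
        acf[0] = -np.inf                                     # drop the trivial zero lag
        return np.argsort(acf)[-k:][::-1]

    # Toy usage: a daily-periodic load series sampled hourly for two weeks.
    t = np.arange(24 * 14)
    rng = np.random.default_rng(0)
    load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)
    print(top_k_autocorrelation_lags(load, k=3))   # expect lags near multiples of 24
    preds = np.stack([load - 5, load, load + 5])   # stand-in 0.1 / 0.5 / 0.9 forecasts
    print(quantile_loss(load, preds, [0.1, 0.5, 0.9]))

Averaging the pinball loss over several quantile levels, as above, is what turns a point forecaster into a probabilistic one: the 0.1 and 0.9 tracks bound a prediction interval, while the 0.5 track recovers an ordinary point forecast. In the multi-granularity setting described in the abstract, the lag-selection step would presumably be applied at several temporal resolutions (e.g. hourly and daily aggregates), though the paper's exact design is not specified here.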