Long-Term Time Series Forecasting

Report on Current Developments in Long-Term Time Series Forecasting

General Direction of the Field

The field of Long-Term Time Series Forecasting (LTSF) is shifting toward more efficient, compact, and interpretable models that can handle long-range dependencies under tight computational constraints. Recent work focuses on integrating frequency-domain analysis with neural network architectures to capture both short-term fluctuations and long-term trends more effectively. This approach improves generalization and adaptability across a range of forecasting tasks, including deployments with limited computational resources.

One of the key trends is the development of models that leverage multi-scale frequency decomposition and adaptive filtering techniques. These models aim to address the limitations of traditional approaches that often assume stationarity and filter out high-frequency components, which can be crucial for capturing short-term dynamics. By incorporating learnable masks and multi-scale analysis, these new models can adaptively filter out irrelevant components, leading to improved forecasting accuracy.
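The core idea can be made concrete with a minimal NumPy sketch: a series is viewed at several temporal scales, each view is transformed to the frequency domain, a mask zeroes out some coefficients, and the filtered signal is reconstructed. The function names, the choice of scales, and the fixed mask are illustrative; in a trained model such as MMFNet the mask would be a learnable parameter rather than the hard-coded placeholder used here.

```python
import numpy as np

def masked_frequency_filter(x, mask):
    """Filter a series by masking its FFT coefficients, then inverting."""
    spec = np.fft.rfft(x)
    return np.fft.irfft(spec * mask, n=len(x))

def multiscale_decompose(x, scales=(1, 2, 4)):
    """Downsample the series at several scales and filter each view in the
    frequency domain. The masks here are fixed placeholders; a model in the
    spirit of this line of work would learn them from data."""
    components = []
    for s in scales:
        xs = x[::s]                           # coarser view of the series
        mask = np.ones(len(xs) // 2 + 1)      # one weight per rfft bin
        mask[len(mask) // 2:] = 0.0           # placeholder: drop top half of bins
        components.append(masked_frequency_filter(xs, mask))
    return components
```

Each returned component isolates the low-frequency content at one scale; an adaptive (learned) mask would instead decide per-bin which components to keep, which is what lets these models retain useful high-frequency detail instead of discarding it wholesale.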

Another notable trend is the push towards ultra-lightweight models that can operate efficiently on resource-constrained devices. These models, often based on linear or hybrid architectures, reduce computational overhead by employing compact representations in both time and frequency domains. Despite their simplicity, these models demonstrate competitive performance compared to more complex, compute-intensive models, making them suitable for real-world deployment in environments with limited computational capacity.
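To illustrate how little machinery such models can need, here is a sketch of the simplest member of this family: a single linear map from a lookback window to the forecast horizon, fit by least squares. This is not the architecture of any specific paper above (MixLinear adds frequency-domain components on top of the linear backbone); it is just the baseline that shows why "a few hundred parameters" can be enough.

```python
import numpy as np

def fit_linear_forecaster(series, lookback, horizon):
    """Fit one linear map from a lookback window to the next `horizon`
    values via least squares. Parameter count is lookback * horizon."""
    X, Y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t:t + lookback])
        Y.append(series[t + lookback:t + lookback + horizon])
    X, Y = np.asarray(X), np.asarray(Y)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # shape (lookback, horizon)
    return W

def forecast(W, window):
    """Predict the next horizon values from the most recent window."""
    return window @ W
```

With lookback 8 and horizon 4 this model has 32 parameters, yet it extrapolates simple trends exactly; the hybrid models in this line of work keep that compactness while adding frequency-domain terms for seasonal structure.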

The integration of Fourier analysis into neural network architectures is also gaining traction. By embedding periodicity directly into the network structure, these models can better understand and predict recurring patterns in time series data. This approach not only enhances the model's ability to capture periodic phenomena but also allows for more efficient parameter usage, leading to more compact and interpretable models.
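A minimal sketch of the principle, assuming a known period: map the time index through sine/cosine features of that period and fit a linear readout. This linear version stands in for the sine-activated layers of Fourier-analysis networks such as FAN; the function names and the least-squares fit are illustrative, not the papers' actual architectures.

```python
import numpy as np

def fourier_features(t, n_harmonics, period):
    """Embed a time index as sin/cos features of an assumed period, so
    periodicity is built into the representation rather than learned."""
    k = np.arange(1, n_harmonics + 1)
    angles = 2 * np.pi * np.outer(t, k) / period
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

def fit_periodic_model(t, y, n_harmonics, period):
    """Least-squares readout on Fourier features -- a linear stand-in for
    the periodic layers embedded in Fourier-based networks."""
    F = fourier_features(t, n_harmonics, period)
    w, *_ = np.linalg.lstsq(F, y, rcond=None)
    return w
```

Because the periodic structure lives in the fixed feature map, only the small readout vector is learned, which is the source of the parameter efficiency these papers report.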

Overall, the current direction in LTSF is towards more efficient, interpretable, and adaptable models that can handle the complexities of real-world time series data while operating within the constraints of limited computational resources.

Noteworthy Papers

  • MMFNet: Introduces a multi-scale masked frequency decomposition approach that significantly reduces Mean Squared Error (MSE) compared to state-of-the-art models.
  • MixLinear: Achieves state-of-the-art performance with only 0.1K parameters, making it ideal for resource-constrained devices.
  • FAN: Proposes a Fourier Analysis Network that outperforms MLPs in modeling periodic phenomena with fewer parameters and FLOPs.
  • Neural Fourier Modelling (NFM): Achieves state-of-the-art performance across various tasks with highly compact models, requiring fewer than 40K parameters.
  • Esiformer: Reduces MSE and MAE by 6.5% and 5.8%, respectively, in multivariate time series forecasting, outperforming leading models like PatchTST.

Sources

MMFNet: Multi-Scale Frequency Masking Neural Network for Multivariate Time Series Forecasting

MixLinear: Extreme Low Resource Multivariate Time Series Forecasting with 0.1K Parameters

FAN: Fourier Analysis Networks

Neural Fourier Modelling: A Highly Compact Approach to Time-Series Analysis

Research on short-term load forecasting model based on VMD and IPSO-ELM

Less is more: Embracing sparsity and interpolation with Esiformer for time series forecasting

Adaptive Random Fourier Features Training Stabilized By Resampling With Applications in Image Regression
