Time series forecasting research is advancing rapidly, with an emphasis on models that capture complex temporal variability and sharp fluctuations. Recent work increasingly leverages frequency-domain decomposition, attention mechanisms, and 2D transformations to improve forecasting accuracy and robustness. Deep learning approaches now dominate, with many models reporting state-of-the-art results on standard benchmarks, and newer architectures are designed to capture long-term dependencies, seasonal patterns, and non-linear relationships in time series data.
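As a concrete illustration of what frequency-domain decomposition means in this context, the minimal sketch below splits a series into a dominant-frequency (seasonal) component and a residual by masking FFT coefficients. It is a generic illustration of the idea under assumed choices (keeping the top-k frequencies), not the method of any particular paper summarized here.

```python
# Minimal sketch of frequency-domain decomposition: keep the strongest
# non-DC frequencies as a "seasonal" component, treat the rest as residual.
# Illustrative only; not taken from any of the papers summarized here.
import numpy as np

def frequency_decompose(x: np.ndarray, keep_top_k: int = 3):
    """Split x into (seasonal, residual) by masking FFT coefficients."""
    spec = np.fft.rfft(x)
    amplitude = np.abs(spec)
    amplitude[0] = 0.0                      # ignore the DC (mean) term when ranking
    top = np.argsort(amplitude)[-keep_top_k:]
    mask = np.zeros_like(spec)
    mask[0] = spec[0]                       # keep the mean
    mask[top] = spec[top]                   # keep the dominant frequencies
    seasonal = np.fft.irfft(mask, n=len(x))
    residual = x - seasonal
    return seasonal, residual

if __name__ == "__main__":
    t = np.arange(256)
    x = np.sin(2 * np.pi * t / 24) + 0.3 * np.sin(2 * np.pi * t / 7) \
        + 0.1 * np.random.randn(256)
    seasonal, residual = frequency_decompose(x, keep_top_k=2)
    print(seasonal.shape, residual.shape)   # (256,) (256,)
```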
Some noteworthy papers include the following. SPDNet proposes a deep learning framework for residential demand forecasting that outperforms both traditional and advanced baselines in forecasting accuracy and computational efficiency. Times2D transforms a 1D time series into 2D space so that 2D convolutional operations can capture both short-term and long-term characteristics, achieving state-of-the-art performance in short-term and long-term forecasting (see the folding sketch below). Attention Mamba introduces a framework built around a novel Adaptive Pooling block that accelerates attention computation and incorporates global information, overcoming the constraints of limited receptive fields (see the pooled-attention sketch below). Learning Phase Distortion presents a turbulence mitigation method based on a Selective State Space Model and Learned Latent Phase Distortion, exceeding current state-of-the-art networks on synthetic and real-world benchmarks with significantly faster inference.
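The 1D-to-2D transformation underlying methods such as Times2D can be pictured as folding a length-L series into a (cycles x period) grid and convolving over it, so the kernel mixes intra-period (short-term) and inter-period (long-term) structure. The sketch below is a toy version of that folding idea under assumed settings (a fixed period, zero padding, a single Conv2d layer); it is not the paper's implementation.

```python
# Toy sketch of folding a 1D series into a 2D grid and applying a 2D convolution.
# The fixed period, zero padding, and single Conv2d are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def fold_to_2d(x: torch.Tensor, period: int) -> torch.Tensor:
    """x: (batch, length) -> (batch, 1, cycles, period), zero-padded as needed."""
    b, length = x.shape
    cycles = -(-length // period)                      # ceiling division
    x = F.pad(x, (0, cycles * period - length))
    return x.reshape(b, cycles, period).unsqueeze(1)   # add a channel dimension

class Tiny2DBlock(nn.Module):
    """One 2D conv over the folded series, then unfold back to 1D."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor, period: int) -> torch.Tensor:
        grid = self.conv(fold_to_2d(x, period))        # (b, 1, cycles, period)
        return grid.squeeze(1).reshape(x.shape[0], -1)[:, : x.shape[1]]

if __name__ == "__main__":
    series = torch.randn(4, 96)                        # batch of 4 series, length 96
    print(Tiny2DBlock()(series, period=24).shape)      # torch.Size([4, 96])
```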
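Similarly, the adaptive-pooling idea in Attention Mamba, accelerating attention while injecting global context, can be approximated generically by attending from full-resolution queries to pooled keys and values, which shrinks the attention matrix while every query still sees a coarse global summary. The sketch below uses a plain average pool and a single head as illustrative assumptions; it is not the paper's Adaptive Pooling block.

```python
# Generic pooled-attention sketch: queries at full resolution attend over
# average-pooled keys/values, reducing cost from O(L^2) to O(L * L/pool).
# Assumed for illustration; not the Attention Mamba architecture.
import torch
import torch.nn as nn

class PooledAttention(nn.Module):
    def __init__(self, d_model: int = 64, pool: int = 4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.pool = nn.AvgPool1d(kernel_size=pool, stride=pool)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        pooled = self.pool(x.transpose(1, 2)).transpose(1, 2)       # (b, length/pool, d)
        q, k, v = self.q(x), self.k(pooled), self.v(pooled)
        attn = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)
        return attn @ v                                             # (b, length, d_model)

if __name__ == "__main__":
    x = torch.randn(2, 96, 64)
    print(PooledAttention()(x).shape)   # torch.Size([2, 96, 64])
```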