Temporal knowledge graph forecasting is advancing rapidly through the integration of large language models (LLMs). Researchers are exploring ways to address the limitations LLMs face in this domain, such as bounded input length and inefficient output generation. Recent studies propose frameworks that use traditional temporal knowledge graph models as adapters to refine LLM outputs, yielding improved forecasting performance and robust generalization. Another line of work applies LLMs to cryptocurrency price prediction, where adapting pre-trained models to the distinctive characteristics of time series data has shown promising results. New benchmarks and evaluation metrics, including ones that assess inferability through causal inference, are also under development to support the field's advancement. Noteworthy papers include:
- Ignite Forecasting with SPARK, which introduces a sequence-level proxy-adapting framework for refining LLMs in temporal knowledge graph forecasting.
- Ethereum Price Prediction Employing Large Language Models, which demonstrates the effectiveness of adapting pre-trained LLMs for short-term and few-shot forecasting of Ethereum prices.
- PROPHET, which proposes a new benchmark for inferable future forecasting with causal intervened likelihood estimation.
- Efficient Model Selection for Time Series Forecasting via LLMs, which leverages LLMs for lightweight model selection, eliminating the need for explicit performance matrices.
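To make the proxy-adapter idea above more concrete, here is a minimal, purely illustrative sketch: a lightweight temporal-KG scorer re-rates candidate entities proposed by an LLM, and the two score lists are blended to produce the final ranking. The function name, entities, and weights are assumptions for illustration, not details from any of the cited papers.

```python
def refine_with_proxy(llm_scores, proxy_scores, alpha=0.5):
    """Blend per-candidate LLM and proxy-model scores, then rank descending.

    alpha controls how much weight the LLM's scores receive; the remainder
    goes to the traditional temporal-KG proxy model. Candidates unknown to
    the proxy default to a score of 0.0.
    """
    blended = {
        entity: alpha * llm_scores[entity]
        + (1 - alpha) * proxy_scores.get(entity, 0.0)
        for entity in llm_scores
    }
    return sorted(blended, key=blended.get, reverse=True)


# Hypothetical query (Germany, negotiates_with, ?, t+1):
# the LLM favors France, but the proxy TKG model favors Poland.
llm_scores = {"France": 0.6, "Poland": 0.3, "Austria": 0.1}
proxy_scores = {"France": 0.2, "Poland": 0.7, "Austria": 0.1}

ranking = refine_with_proxy(llm_scores, proxy_scores, alpha=0.4)
print(ranking)  # → ['Poland', 'France', 'Austria']
```

The single blending weight stands in for whatever learned combination an actual adapter framework would use; the point is only that the proxy can correct an LLM ranking without retraining the LLM itself.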