If you find our work useful in your research, please consider giving it a star ⭐ and a citation 📚:
@inproceedings{liu2025timekd,
  title={Efficient Multivariate Time Series Forecasting via Calibrated Language Models with Privileged Knowledge Distillation},
  author={Chenxi Liu and Hao Miao and Qianxiong Xu and Shaowen Zhou and Cheng Long and Yan Zhao and Ziyue Li and Rui Zhao},
  booktitle={ICDE},
  year={2025}
}
- Python 3.10
- PyTorch 2.1.2
- CUDA 12.1
- torchvision 0.16.2
> conda env create -f env_ts.yaml
Datasets can be obtained from TimesNet.
chmod +x Store.sh
./Store.sh
chmod +x Fcst.sh
./Fcst.sh
We conducted experiments on ETTm1, ETTm2, ETTh1, ETTh2, Weather, and Exchange with three different random seeds (i.e., 2024, 6666, and 8888) and report the mean and standard deviation (std) of the MSE and MAE in Table A. We observe that the standard deviations of MSE and MAE are small, ranging from 0.0014 to 0.0042. These results indicate that TimeKD exhibits strong robustness and stability across different random seeds.
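The seed-aggregation procedure above can be sketched as follows. This is a minimal illustration of how per-seed results are reduced to the mean ± std format of Table A; the metric values below are placeholders, not the paper's numbers, and the dictionary layout is an assumption for the example.

```python
import numpy as np

# Placeholder per-seed results (NOT the paper's numbers):
# metrics[seed] = (MSE, MAE) for one dataset/horizon.
seeds = [2024, 6666, 8888]
metrics = {
    2024: (0.320, 0.360),
    6666: (0.323, 0.363),
    8888: (0.318, 0.358),
}

mse = np.array([metrics[s][0] for s in seeds])
mae = np.array([metrics[s][1] for s in seeds])

# Report mean and standard deviation across seeds.
print(f"MSE: {mse.mean():.4f} ± {mse.std():.4f}")
print(f"MAE: {mae.mean():.4f} ± {mae.std():.4f}")
```

Note that `np.std` defaults to the population standard deviation (`ddof=0`); pass `ddof=1` for the sample standard deviation if that is the convention you want to report.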
For inquiries or further assistance, contact us at chenxi.liu@ntu.edu.sg or open an issue on this repository.