
(ICDE'25) Efficient Multivariate Time Series Forecasting via Calibrated Language Models with Privileged Knowledge Distillation

If you find our work useful in your research, please consider giving it a star ⭐ and a citation 📚:

@inproceedings{liu2025timekd,
  title={Efficient Multivariate Time Series Forecasting via Calibrated Language Models with Privileged Knowledge Distillation},
  author={Chenxi Liu and Hao Miao and Qianxiong Xu and Shaowen Zhou and Cheng Long and Yan Zhao and Ziyue Li and Rui Zhao},
  booktitle={ICDE},
  year={2025}
}

Dependencies

  • Python 3.10
  • PyTorch 2.1.2
  • CUDA 12.1
  • torchvision 0.16.2

Create the conda environment from the provided file:

> conda env create -f env_ts.yaml
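
A quick sanity check, not part of the repo, to confirm the installed versions match the pinned dependencies:

# Optional sanity check (not part of the repo): verify the pinned versions.
import torch
import torchvision

print(torch.__version__)          # expected: 2.1.2
print(torchvision.__version__)    # expected: 0.16.2
print(torch.version.cuda)         # expected: 12.1
print(torch.cuda.is_available())  # True if the CUDA runtime is usable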

Datasets

Datasets can be obtained from TimesNet.

Usage

  • Storage

chmod +x Store.sh
./Store.sh
  • Forecasting

chmod +x Fcst.sh
./Fcst.sh
  • Standard deviation of the MSE and MAE

[Table A: Mean and standard deviation of MSE and MAE on ETTm1, ETTm2, ETTh1, ETTh2, Weather, and Exchange across three random seeds]

We conducted experiments on ETTm1, ETTm2, ETTh1, ETTh2, Weather, and Exchange with three random seeds (2024, 6666, and 8888) and report the mean and standard deviation (Std) of the MSE and MAE in Table A. The standard deviations of MSE and MAE are small, ranging from 0.0014 to 0.0042, which indicates that TimeKD is robust and stable across random seeds.
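
For reference, the aggregation over seeds amounts to a mean and sample standard deviation per metric; a minimal sketch in which the metric values are hypothetical placeholders, not numbers from the paper:

# Sketch of the seed aggregation described above; the per-seed metric
# values here are hypothetical placeholders, not results from the paper.
import statistics

mse_per_seed = {2024: 0.3561, 6666: 0.3549, 8888: 0.3590}  # placeholder values

values = list(mse_per_seed.values())
mean = statistics.mean(values)
std = statistics.stdev(values)  # sample standard deviation over the three runs
print(f"MSE: {mean:.4f} ± {std:.4f}")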

Contact Us

For inquiries or further assistance, contact us at chenxi.liu@ntu.edu.sg or open an issue on this repository.
