Stars
A curated list of early-exiting methods (LLM, CV, NLP, etc.)
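For orientation, the common pattern behind early exiting is to attach intermediate classifiers and stop the forward pass once one of them is confident enough. A minimal sketch of that idea in PyTorch (the `layers`, `exit_heads`, and `threshold` names are illustrative placeholders, not taken from any repository in this list):

```python
import torch.nn.functional as F

def early_exit_forward(x, layers, exit_heads, threshold=0.9):
    """Run layers sequentially; return early once an intermediate
    classifier is confident enough. Purely illustrative sketch."""
    for layer, head in zip(layers, exit_heads):
        x = layer(x)                                      # x: [batch, seq, d_model]
        probs = F.softmax(head(x).mean(dim=1), dim=-1)    # pool over tokens -> [batch, classes]
        confidence, prediction = probs.max(dim=-1)
        if confidence.min() > threshold:                  # every item in the batch is confident
            return prediction, x
    return prediction, x                                  # fell through to the final layer
```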
Implementations, Pre-training Code and Datasets of Large Time-Series Models
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
[VLDB '25] ChatTS: Understanding, Chatting, and Reasoning about Time Series with a TS-MLLM
The open-source Mixture of Depths code and the official implementation of the paper "Router-Tuning: A Simple and Effective Approach for Enabling Dynamic Depth in Transformers."
AdaSkip: Adaptive Sublayer Skipping for Accelerating Long-Context LLM Inference
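Both of the entries above revolve around skipping transformer (sub)layers at inference time. The sketch below is a generic illustration of router-gated layer skipping under simple assumptions (a learned linear gate and a per-sequence decision); it is not the Router-Tuning or AdaSkip algorithm itself:

```python
import torch
import torch.nn as nn

class RoutedBlock(nn.Module):
    """Wrap a transformer block with a scalar router that decides, per
    sequence, whether to execute or skip the block. Illustrative only."""
    def __init__(self, block: nn.Module, d_model: int, threshold: float = 0.5):
        super().__init__()
        self.block = block
        self.router = nn.Linear(d_model, 1)
        self.threshold = threshold

    def forward(self, x):                                    # x: [batch, seq, d_model]
        score = torch.sigmoid(self.router(x.mean(dim=1)))    # [batch, 1]
        keep = score > self.threshold                        # [batch, 1] bool
        out = self.block(x)
        # Skipped sequences pass through unchanged; kept ones use the block output.
        return torch.where(keep.unsqueeze(-1), out, x)
```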
A novel time series forecasting method, Times2D, transforms 1D data into 2D space using multi-period decomposition and derivative heatmaps to achieve state-of-the-art performance.
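The core idea of reshaping a 1D series into 2D along dominant periods can be illustrated with an FFT-based folding step. This is a sketch of the general technique, not Times2D's pipeline; the choice of `k` and the truncation scheme are assumptions:

```python
import torch

def fold_by_periods(x, k=3):
    """x: [batch, length]. Pick the k strongest frequencies via FFT and
    reshape the series into [batch, cycles, period] grids. Illustrative."""
    length = x.shape[1]
    spectrum = torch.fft.rfft(x, dim=1).abs().mean(dim=0)   # average magnitude spectrum
    spectrum[0] = 0                                          # drop the DC component
    top_freqs = spectrum.topk(k).indices.clamp(min=1)
    grids = []
    for f in top_freqs:
        period = length // int(f)
        cycles = length // period
        grids.append(x[:, :cycles * period].reshape(x.shape[0], cycles, period))
    return grids
```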
PyTorch implementation of "ChatTime: A Unified Multimodal Time Series Foundation Model Bridging Numerical and Textual Data" (AAAI 2025 [oral])
AAAI 2024 Papers: Explore a comprehensive collection of innovative research papers presented at one of the premier artificial intelligence conferences. Seamlessly integrate code implementations for…
[ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts"
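The mixture-of-experts routing named in that title is, at its core, a top-k softmax gate mixing per-token expert outputs. Below is a minimal sketch of that general mechanism, not Time-MoE's implementation; the expert count, `k`, and layer sizes are placeholders:

```python
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    """Route each token to its top-k experts and mix their outputs by the
    renormalized gate weights. Generic illustration."""
    def __init__(self, d_model=64, n_experts=8, k=2):
        super().__init__()
        self.experts = nn.ModuleList([nn.Linear(d_model, d_model) for _ in range(n_experts)])
        self.gate = nn.Linear(d_model, n_experts)
        self.k = k

    def forward(self, x):                                   # x: [tokens, d_model]
        scores = self.gate(x)                               # [tokens, n_experts]
        weights, idx = scores.topk(self.k, dim=-1)          # [tokens, k]
        weights = torch.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                    # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out
```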
A Library for Advanced Deep Time Series Models.
The official code for "One Fits All: Power General Time Series Analysis by Pretrained LM (NeurIPS 2023 Spotlight)"
MTS-AD / LLM4TS
Forked from DAMO-DI-ML/NeurIPS2023-One-Fits-All. The official code for "One Fits All: Power General Time Series Analysis by Pretrained LM (NeurIPS 2023 Spotlight)"
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
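The "inverted" part of that title amounts to attending over variates instead of time steps: each variate's whole series becomes one token. A hedged sketch of that embedding step under simple assumptions (placeholder layer sizes, not the official code):

```python
import torch
import torch.nn as nn

class InvertedEmbedding(nn.Module):
    """Embed each variate's full history as a single token so that attention
    mixes variates rather than time steps. Illustrative sketch."""
    def __init__(self, seq_len=96, d_model=128):
        super().__init__()
        self.proj = nn.Linear(seq_len, d_model)

    def forward(self, x):                         # x: [batch, seq_len, n_variates]
        tokens = self.proj(x.transpose(1, 2))     # [batch, n_variates, d_model]
        return tokens                             # feed into a standard encoder over variates
```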
Natural Language Processing Tutorial for Deep Learning Researchers
PyTorch implementation of Google AI's 2018 BERT
A PyTorch implementation of the Transformer model in "Attention is All You Need".
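The core of that model is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A compact, self-contained version of the standard formula (independent of the linked repo's code):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: [batch, heads, seq, d_k]. Standard attention as defined in
    'Attention Is All You Need'."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])   # [batch, heads, seq, seq]
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float('-inf'))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v
```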
Monocular Depth Estimation Using Laplacian Pyramid-Based Depth Residuals
🧑🚀 Summary of the world's best LLM resources (video generation, agents, coding assistance, data processing, model training, model inference, o1 models, MCP, small language models, vision-language models).
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
Code for 'RRPN: Radar Region Proposal Network for Object Detection in Autonomous Vehicles' (ICIP 2019)
Code for "NeuralRecon: Real-Time Coherent 3D Reconstruction from Monocular Video", CVPR 2021 oral
Ultra Fast Structure-aware Deep Lane Detection (ECCV 2020)
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
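A minimal usage example of the library's high-level pipeline API (the model is downloaded on first use, and the default checkpoint is chosen by the library):

```python
from transformers import pipeline

# Sentiment analysis with the library's default checkpoint for this task.
classifier = pipeline("sentiment-analysis")
print(classifier("Early-exit inference makes this model noticeably faster."))
# Expected shape of the output: a list of {'label': ..., 'score': ...} dicts.
```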