Awesome Knowledge Distillation
Updated Nov 27, 2024
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
PyTorch implementation of various Knowledge Distillation (KD) methods.
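For readers new to the topic, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) in PyTorch. It is illustrative only and not taken from any repository listed here; the temperature `T` and mixing weight `alpha` are common hyperparameters chosen for the example, not values from any source.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-target KD: blend a distillation term with hard-label CE.

    student_logits, teacher_logits: (batch, num_classes)
    labels: (batch,) integer class indices
    T: softmax temperature; alpha: weight on the distillation term.
    """
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits standing in for teacher/student outputs.
s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
loss = kd_loss(s, t.detach(), y)
loss.backward()
```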
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". It breaks KD down into Knowledge Elicitation and Distillation Algorithms, and explores Skill and Vertical Distillation of LLMs.
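As a rough illustration of the elicitation/distillation split that survey describes, the sketch below first elicits responses from a frozen teacher, then fine-tunes the student on them with ordinary token-level cross-entropy (sequence-level KD). It assumes the Hugging Face transformers API; "teacher-model" and "student-model" are placeholder checkpoint names, and a shared tokenizer is assumed, which a real pipeline would need to handle.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoint names, not real model identifiers.
tok = AutoTokenizer.from_pretrained("teacher-model")
teacher = AutoModelForCausalLM.from_pretrained("teacher-model").eval()
student = AutoModelForCausalLM.from_pretrained("student-model")
opt = torch.optim.AdamW(student.parameters(), lr=1e-5)

prompts = ["Explain knowledge distillation in one sentence."]

for prompt in prompts:
    # Knowledge elicitation: sample a response from the frozen teacher.
    inputs = tok(prompt, return_tensors="pt")
    with torch.no_grad():
        out = teacher.generate(**inputs, max_new_tokens=64, do_sample=True)
    # Distillation: fine-tune the student on the teacher's output tokens
    # with next-token cross-entropy (sequence-level KD).
    loss = student(input_ids=out, labels=out).loss
    loss.backward()
    opt.step()
    opt.zero_grad()
```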
A beginner-friendly introduction to model compression.
Training ImageNet / CIFAR models with state-of-the-art strategies and techniques such as ViT, KD, Rep, etc.
Matching Guided Distillation (ECCV 2020)
Awesome-3D/Multimodal-Anomaly-Detection-and-Localization/Segmentation/3D-KD/3D-knowledge-distillation
[ECCV 2022] Factorizing Knowledge in Neural Networks
Official implementation of the paper "Masked Distillation with Receptive Tokens" (ICLR 2023).
Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
A Kotlin implementation of the KD language. It is feature-complete and passes all tests.
Pluto notebook for curve fitting