Awesome Knowledge-Distillation

屈晨
2023-12-01

Different forms of knowledge

Self-Knowledge Distillation

[ICCV 2019] Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation

[CVPR 2020] Regularizing Class-wise Predictions via Self-knowledge Distillation

[CVPR 2021] Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
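The papers above share a common recipe: the network supervises itself, typically by having its deepest (or final) classifier teach shallower auxiliary classifiers. Below is a minimal PyTorch sketch of such a self-distillation loss, in the spirit of "Be Your Own Teacher". The temperature `T`, the weight `alpha`, and the toy exit classifiers are illustrative assumptions, not the papers' exact settings.

```python
# Minimal self-distillation sketch: the deepest classifier acts as the teacher
# for shallower exit classifiers via a softened KL term plus the usual CE loss.
import torch
import torch.nn.functional as F

def self_distillation_loss(shallow_logits, deep_logits, labels, T=3.0, alpha=0.3):
    """CE on ground truth plus KL toward the deepest classifier's softened output."""
    ce = F.cross_entropy(shallow_logits, labels)
    kl = F.kl_div(
        F.log_softmax(shallow_logits / T, dim=1),
        F.softmax(deep_logits.detach() / T, dim=1),  # no gradient through the teacher side
        reduction="batchmean",
    ) * (T * T)  # standard temperature rescaling of the KD term
    return (1 - alpha) * ce + alpha * kl

# Toy usage: three exit classifiers, batch of 8 samples, 10 classes (all hypothetical).
exits = [torch.randn(8, 10, requires_grad=True) for _ in range(3)]
labels = torch.randint(0, 10, (8,))
loss = sum(self_distillation_loss(e, exits[-1], labels) for e in exits[:-1])
loss += F.cross_entropy(exits[-1], labels)  # the deepest exit trains on labels only
loss.backward()
```

CS-KD (CVPR 2020) and FRSKD (CVPR 2021) vary what is distilled (class-wise predictions across samples, refined features), but the self-teaching loss structure is the same.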

KD for ViTs

[ICML 2021] Training data-efficient image transformers & distillation through attention
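DeiT's "distillation through attention" appends a dedicated distillation token to the transformer input; its output head is trained against the teacher while the class token is trained against ground truth. Below is a minimal sketch of the hard-label variant, assuming a ViT that returns separate logits for the two tokens; the stand-in tensors are illustrative (DeiT itself uses a convnet teacher such as RegNet).

```python
# Minimal sketch of DeiT-style hard distillation: class token follows the labels,
# distillation token follows the teacher's hard predictions.
import torch
import torch.nn.functional as F

def deit_hard_distillation_loss(cls_logits, dist_logits, teacher_logits, labels):
    """Equal-weight sum of CE(class token, labels) and CE(dist token, teacher argmax)."""
    ce_cls = F.cross_entropy(cls_logits, labels)
    teacher_labels = teacher_logits.argmax(dim=1)      # hard teacher decision
    ce_dist = F.cross_entropy(dist_logits, teacher_labels)
    return 0.5 * ce_cls + 0.5 * ce_dist

# Toy usage with random logits (batch of 4, 1000 classes, all hypothetical).
cls_logits = torch.randn(4, 1000, requires_grad=True)
dist_logits = torch.randn(4, 1000, requires_grad=True)
teacher_logits = torch.randn(4, 1000)                  # teacher runs without gradients
labels = torch.randint(0, 1000, (4,))
deit_hard_distillation_loss(cls_logits, dist_logits, teacher_logits, labels).backward()
```

At inference, DeiT averages the predictions of the class and distillation heads; the sketch above covers only the training loss.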
