Knowledge Distillation

Sep 18, 2024

  • temp


  • A teacher model helps train a student model
  • The teacher is usually pre-trained
  • The student tries to imitate the teacher's outputs
  • Distillation Loss: a combined soft/hard loss (see the sketch after this list)
  • Knowledge Distillation Survey 2021
  • Distilling the Knowledge in a Neural Network
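
A minimal sketch of a distillation loss, assuming PyTorch. The soft-target term follows the temperature-scaled KL formulation from Distilling the Knowledge in a Neural Network; the temperature `T` and weight `alpha` are illustrative hyperparameters, not values prescribed by the note.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients keep a comparable magnitude.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two terms; alpha weights the distillation component.
    return alpha * soft + (1 - alpha) * hard
```

In training, the teacher's logits would typically be computed with gradients disabled (e.g. under `torch.no_grad()`) so only the student is updated.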


Backlinks

  • AutoDistill
  • DeiT
  • DistillBERT
  • TinyBERT
  • _Index_of_KB
