Restructuring the Teacher and Student in Self-Distillation
Knowledge distillation aims to achieve model compression by transferring knowledge from complex teacher models to lightweight student models. To reduce reliance on pre-trained teacher models, self-distillation methods utilize knowledge from the model itself as additional supervision. However, their...
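For orientation only, a minimal sketch of the standard softened-logit distillation loss (Hinton et al., 2015), not the method of the paper above; it assumes a generic PyTorch setup, and the temperature T and weight alpha are illustrative defaults. In a self-distillation setting, teacher_logits would come from the model itself (e.g., an auxiliary head or an earlier snapshot) rather than a separate pre-trained teacher.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Hard-label supervision: ordinary cross-entropy on ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label supervision: KL divergence between the teacher's and the
    # student's temperature-softened distributions. The T*T factor keeps
    # gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Weighted combination of the two supervision signals.
    return alpha * ce + (1.0 - alpha) * kd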
Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 33 (2024), dated: 23, pages 5551-5563
First author: Zheng, Yujie (author)
Other authors: Wang, Chong; Tao, Chenchen; Lin, Sunqi; Qian, Jiangbo; Wu, Jiafei
Format: Online article
Language: English
Published: 2024
Subjects: Journal Article