Unpacking the Gap Box Against Data-Free Knowledge Distillation
Data-free knowledge distillation (DFKD) improves the student model (S) by mimicking the class probabilities of a pre-trained teacher model (T) without access to the training data. Under such a setting, an ideal scenario is one in which T helps a generator (G) produce "good" samples that maximally benefit S...
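The abstract describes the basic DFKD loop: a generator synthesizes samples in place of real training data, and the student is trained to match the teacher's class probabilities on those samples. The sketch below illustrates only that generic student-update step, with a toy generator and a standard temperature-scaled KL objective; the architectures, names, and hyperparameters are illustrative assumptions, not the method proposed in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyGenerator(nn.Module):
    """Hypothetical generator G: maps noise z to synthetic 32x32 images."""
    def __init__(self, z_dim=100, img_ch=3, img_size=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, 128 * (img_size // 4) ** 2),
            nn.Unflatten(1, (128, img_size // 4, img_size // 4)),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, img_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def dfkd_student_step(teacher, student, generator, optimizer,
                      batch_size=64, z_dim=100, temperature=4.0):
    """One student update: match the teacher's class probabilities
    on generator samples, with no real training data involved."""
    z = torch.randn(batch_size, z_dim)
    with torch.no_grad():
        x = generator(z)                                   # synthetic samples from G
        p_teacher = F.softmax(teacher(x) / temperature, dim=1)
    log_p_student = F.log_softmax(student(x) / temperature, dim=1)
    # Temperature-scaled KL divergence between teacher and student distributions
    loss = F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a full DFKD pipeline the generator is trained alternately with the student (e.g., to produce samples the teacher classifies confidently or the student finds hard); the snippet above shows only the distillation half of that loop.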
Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 46(2024), no. 9, 20 Aug., pp. 6280-6291
Main author: Wang, Yang (author)
Other authors: Qian, Biao; Liu, Haipeng; Rui, Yong; Wang, Meng
Format: Online article
Language: English
Published: 2024
Parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article