Dynamic Support Network for Few-Shot Class Incremental Learning

Few-shot class-incremental learning (FSCIL) is challenged by catastrophic forgetting of old classes and over-fitting to new classes. Our analyses reveal that these problems are caused by feature distribution crumbling, which leads to class confusion when few samples are continuously embedded into a fixed feature space. In this study, we propose a Dynamic Support Network (DSN), an adaptively updating network with compressive node expansion that "supports" the feature space. In each training session, DSN tentatively expands network nodes to enlarge the feature representation capacity for incremental classes. It then dynamically compresses the expanded network through node self-activation to pursue a compact feature representation, which alleviates over-fitting. Simultaneously, DSN selectively recalls old class distributions during incremental learning to support the feature distributions and avoid confusion between classes. With compressive node expansion and class distribution recalling, DSN provides a systematic solution to catastrophic forgetting and over-fitting. Experiments on the CUB, CIFAR-100, and miniImageNet datasets show that DSN significantly improves upon the baseline approach, achieving new state-of-the-art results.
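The abstract names two mechanisms: compressive node expansion, which tentatively grows network nodes and then prunes them through node self-activation, and class distribution recalling, which replays old-class statistics so the feature space stays supported. A minimal PyTorch sketch of both ideas follows; every detail here (the ExpandableHead and DistributionRecall names, the sigmoid gating, the pruning threshold, the Gaussian recall model) is an illustrative assumption, not the authors' released implementation.

import torch
import torch.nn as nn


class ExpandableHead(nn.Module):
    """Feature head that tentatively adds nodes each session, then
    compresses them by a learned per-node self-activation gate."""

    def __init__(self, in_dim: int, base_nodes: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, base_nodes)
        self.extra = None   # tentatively expanded nodes
        self.gate = None    # per-node self-activation scores

    def expand(self, new_nodes: int):
        # Tentative expansion: extra capacity for the incremental classes.
        self.extra = nn.Linear(self.proj.in_features, new_nodes)
        self.gate = nn.Parameter(torch.zeros(new_nodes))  # sigmoid(0) = 0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.proj(x)
        if self.extra is not None:
            # Self-activation: each expanded node scales its own output, so
            # nodes whose gates decay toward 0 contribute almost nothing.
            out = torch.cat([out, torch.sigmoid(self.gate) * self.extra(x)], dim=1)
        return out

    @torch.no_grad()
    def compress(self, keep_thresh: float = 0.1):
        # Compression: drop weakly self-activated nodes to keep the
        # representation compact and limit over-fitting.
        keep = torch.sigmoid(self.gate) > keep_thresh
        if keep.any():
            w, b, g = self.extra.weight[keep], self.extra.bias[keep], self.gate[keep]
            self.extra = nn.Linear(w.shape[1], w.shape[0])
            self.extra.weight.copy_(w)
            self.extra.bias.copy_(b)
            self.gate = nn.Parameter(g.clone())
        else:
            self.extra, self.gate = None, None


class DistributionRecall:
    """Stores per-class feature statistics and replays sampled features
    so old-class distributions keep supporting the feature space."""

    def __init__(self):
        self.stats = {}  # class id -> (mean, std)

    def remember(self, cls: int, feats: torch.Tensor):
        # Population std avoids NaN when a class has a single shot.
        self.stats[cls] = (feats.mean(0), feats.std(0, unbiased=False) + 1e-6)

    def recall(self, cls: int, n: int) -> torch.Tensor:
        mean, std = self.stats[cls]
        return mean + std * torch.randn(n, mean.shape[0])

In a session loop, one would call expand() before fitting the new classes, mix recall()-ed old-class features into the training batches, and call compress() before moving to the next session.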

Detailed Description

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 45(2023), 3, 19 March, pages 2945-2951
Main Author: Yang, Boyu (Author)
Other Authors: Lin, Mingbao, Zhang, Yunxiao, Liu, Binghao, Liang, Xiaodan, Ji, Rongrong, Ye, Qixiang
Format: Online Article
Language: English
Published: 2023
Access to parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM341111570
003 DE-627
005 20231226011023.0
007 cr uuu---uuuuu
008 231226s2023 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2022.3175849  |2 doi 
028 5 2 |a pubmed24n1136.xml 
035 |a (DE-627)NLM341111570 
035 |a (NLM)35588416 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Yang, Boyu  |e verfasserin  |4 aut 
245 1 0 |a Dynamic Support Network for Few-Shot Class Incremental Learning 
264 1 |c 2023 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 07.04.2023 
500 |a Date Revised 07.04.2023 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Few-shot class-incremental learning (FSCIL) is challenged by catastrophic forgetting of old classes and over-fitting to new classes. Our analyses reveal that these problems are caused by feature distribution crumbling, which leads to class confusion when few samples are continuously embedded into a fixed feature space. In this study, we propose a Dynamic Support Network (DSN), an adaptively updating network with compressive node expansion that "supports" the feature space. In each training session, DSN tentatively expands network nodes to enlarge the feature representation capacity for incremental classes. It then dynamically compresses the expanded network through node self-activation to pursue a compact feature representation, which alleviates over-fitting. Simultaneously, DSN selectively recalls old class distributions during incremental learning to support the feature distributions and avoid confusion between classes. With compressive node expansion and class distribution recalling, DSN provides a systematic solution to catastrophic forgetting and over-fitting. Experiments on the CUB, CIFAR-100, and miniImageNet datasets show that DSN significantly improves upon the baseline approach, achieving new state-of-the-art results.
650 4 |a Journal Article 
700 1 |a Lin, Mingbao  |e verfasserin  |4 aut 
700 1 |a Zhang, Yunxiao  |e verfasserin  |4 aut 
700 1 |a Liu, Binghao  |e verfasserin  |4 aut 
700 1 |a Liang, Xiaodan  |e verfasserin  |4 aut 
700 1 |a Ji, Rongrong  |e verfasserin  |4 aut 
700 1 |a Ye, Qixiang  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 45(2023), 3 vom: 19. März, Seite 2945-2951  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:45  |g year:2023  |g number:3  |g day:19  |g month:03  |g pages:2945-2951 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2022.3175849  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 45  |j 2023  |e 3  |b 19  |c 03  |h 2945-2951