Memorizing Complementation Network for Few-Shot Class-Incremental Learning

Few-shot Class-Incremental Learning (FSCIL) aims to learn new concepts continually from only a few samples, and is therefore prone to catastrophic forgetting and overfitting. The inaccessibility of old classes and the scarcity of novel samples make it difficult to balance retaining old knowledge against learning novel concepts. Inspired by the observation that different models memorize different knowledge when learning novel concepts, we propose a Memorizing Complementation Network (MCNet) that ensembles multiple models whose memorized knowledge complements each other in novel tasks. Additionally, to update the model with few novel samples, we develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes novel samples away not only from each other in the current task but also from the old distribution. Extensive experiments on three benchmark datasets, i.e., CIFAR100, miniImageNet, and CUB200, demonstrate the superiority of our proposed method.
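The abstract describes the PSHT loss only at a high level, and the paper's exact formulation is not part of this record. As a rough illustration of the stated idea, the following minimal PyTorch sketch pulls each novel sample toward its class centroid while pushing it away from both the hardest other-class novel sample in the current task and the nearest old-class prototype (standing in for the old distribution). All names, the squared-Euclidean distance, and the margin value are illustrative assumptions, not the authors' method.

import torch
import torch.nn.functional as F

def psht_style_loss(novel_embs, novel_labels, old_prototypes, margin=0.5):
    # Illustrative sketch only; the paper's PSHT loss may differ in detail.
    # novel_embs:     (N, D) embeddings of novel-task samples
    # novel_labels:   (N,)   integer class labels for those samples
    # old_prototypes: (C, D) stored prototypes of old (base) classes
    novel_embs = F.normalize(novel_embs, dim=1)
    old_prototypes = F.normalize(old_prototypes, dim=1)

    losses = []
    for i in range(novel_embs.size(0)):
        same = novel_labels == novel_labels[i]
        # Positive term: squared distance to this sample's class centroid.
        pos_center = novel_embs[same].mean(dim=0)
        d_pos = (novel_embs[i] - pos_center).pow(2).sum()

        # Hardest negative among old-class prototypes (the old distribution).
        d_neg = (novel_embs[i] - old_prototypes).pow(2).sum(dim=1).min()

        # Hardest negative among other-class novel samples, if any exist.
        diff = ~same
        if diff.any():
            d_novel = (novel_embs[i] - novel_embs[diff]).pow(2).sum(dim=1).min()
            d_neg = torch.minimum(d_neg, d_novel)

        # Standard triplet margin: positive must beat negative by `margin`.
        losses.append(F.relu(d_pos - d_neg + margin))
    return torch.stack(losses).mean()

# Example usage with random data (5-way 5-shot novel task, 60 old classes):
embs = torch.randn(25, 64)
labels = torch.arange(5).repeat_interleave(5)
protos = torch.randn(60, 64)
print(psht_style_loss(embs, labels, protos))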

Detailed Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - PP(2023), 17 Jan.
Main Author: Ji, Zhong (Author)
Other Authors: Hou, Zhishen, Liu, Xiyao, Pang, Yanwei, Li, Xuelong
Format: Online Article
Language: English
Published: 2023
Access to parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM355265559
003 DE-627
005 20231226064003.0
007 cr uuu---uuuuu
008 231226s2023 xx |||||o 00| ||eng c
024 7 |a 10.1109/TIP.2023.3236160  |2 doi 
028 5 2 |a pubmed24n1184.xml 
035 |a (DE-627)NLM355265559 
035 |a (NLM)37021860 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Ji, Zhong  |e verfasserin  |4 aut 
245 1 0 |a Memorizing Complementation Network for Few-Shot Class-Incremental Learning 
264 1 |c 2023 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 06.04.2023 
500 |a published: Print-Electronic 
500 |a Citation Status Publisher 
520 |a Few-shot Class-Incremental Learning (FSCIL) aims to learn new concepts continually from only a few samples, and is therefore prone to catastrophic forgetting and overfitting. The inaccessibility of old classes and the scarcity of novel samples make it difficult to balance retaining old knowledge against learning novel concepts. Inspired by the observation that different models memorize different knowledge when learning novel concepts, we propose a Memorizing Complementation Network (MCNet) that ensembles multiple models whose memorized knowledge complements each other in novel tasks. Additionally, to update the model with few novel samples, we develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes novel samples away not only from each other in the current task but also from the old distribution. Extensive experiments on three benchmark datasets, i.e., CIFAR100, miniImageNet, and CUB200, demonstrate the superiority of our proposed method.
650 4 |a Journal Article 
700 1 |a Hou, Zhishen  |e verfasserin  |4 aut 
700 1 |a Liu, Xiyao  |e verfasserin  |4 aut 
700 1 |a Pang, Yanwei  |e verfasserin  |4 aut 
700 1 |a Li, Xuelong  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on image processing : a publication of the IEEE Signal Processing Society  |d 1992  |g PP(2023) vom: 17. Jan.  |w (DE-627)NLM09821456X  |x 1941-0042  |7 nnns 
773 1 8 |g volume:PP  |g year:2023  |g day:17  |g month:01 
856 4 0 |u http://dx.doi.org/10.1109/TIP.2023.3236160  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d PP  |j 2023  |b 17  |c 01