Progressive Transfer Learning

Model fine-tuning is a widely used transfer learning approach in person re-identification (ReID) applications: instead of training a model from scratch, a pre-trained feature extraction model is fine-tuned to the target scenario. This is challenging due to the significant variations within the target scenario, e.g., different camera viewpoints, illumination changes, and occlusion. These variations create a gap between each mini-batch's distribution and the whole dataset's distribution during mini-batch training. In this paper, we study model fine-tuning from the perspective of aggregating and using the dataset's global information during mini-batch training. Specifically, we introduce a novel network structure called the Batch-related Convolutional Cell (BConv-Cell), which progressively collects the dataset's global information into a latent state and uses it to rectify the extracted features. Based on BConv-Cells, we further propose the Progressive Transfer Learning (PTL) method, which facilitates model fine-tuning by jointly optimizing the BConv-Cells and the pre-trained ReID model. Empirical experiments show that our proposal greatly improves the ReID model's performance on the MSMT17, Market-1501, CUHK03, and DukeMTMC-reID datasets. Moreover, we extend our proposal to the general image classification task. Experiments on several image classification benchmark datasets demonstrate that our proposal significantly improves baseline models' performance. The code has been released at https://github.com/ZJULearning/PTL
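The core mechanism the abstract describes — a cell that progressively folds each mini-batch's statistics into a persistent latent state and uses that state to rectify the current batch's features — can be sketched as follows. This is a hypothetical, heavily simplified illustration, not the paper's implementation: the real BConv-Cell is convolutional and trained jointly with the ReID model, whereas this sketch uses plain vectors and a single illustrative gate (`W_z` is an invented parameter).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BConvCellSketch:
    """Simplified stand-in for a BConv-Cell: a latent state accumulates
    global information across mini-batches and rectifies per-batch
    features. Vector-valued for clarity; the real cell is convolutional."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.state = np.zeros(dim)                  # latent global state
        self.W_z = rng.normal(0, 0.1, (dim, dim))   # illustrative gate weights

    def __call__(self, feats):
        # Summarize the current mini-batch.
        batch_mean = feats.mean(axis=0)
        # Gate controls how much of the old state is kept vs. updated.
        z = sigmoid(self.W_z @ batch_mean)
        # Progressively fold the batch summary into the latent state.
        self.state = z * self.state + (1 - z) * batch_mean
        # Rectify the extracted features with the accumulated state.
        return feats + self.state

cell = BConvCellSketch(dim=4)
for _ in range(3):                 # three mini-batches of 8 samples
    batch = np.ones((8, 4))
    out = cell(batch)
print(out.shape)                   # (8, 4)
```

Because the state persists across calls, later batches are rectified with information gathered from earlier ones — the "progressive" aggregation the abstract refers to, which narrows the gap between a mini-batch's distribution and the whole dataset's.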

Detailed Description

Bibliographic Details

Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 31(2022), issue of the 13th, pages 1340-1348
First author: Yu, Zhengxu (Author)
Other authors: Shen, Dong, Jin, Zhongming, Huang, Jianqiang, Cai, Deng, Hua, Xian-Sheng
Format: Online article
Language: English
Published: 2022
Access to parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM335606105
003 DE-627
005 20231225230231.0
007 cr uuu---uuuuu
008 231225s2022 xx |||||o 00| ||eng c
024 7 |a 10.1109/TIP.2022.3141258  |2 doi 
028 5 2 |a pubmed24n1118.xml 
035 |a (DE-627)NLM335606105 
035 |a (NLM)35025744 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Yu, Zhengxu  |e verfasserin  |4 aut 
245 1 0 |a Progressive Transfer Learning 
264 1 |c 2022 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 27.01.2022 
500 |a Date Revised 27.01.2022 
500 |a published: Print-Electronic 
500 |a Citation Status MEDLINE 
520 |a Model fine-tuning is a widely used transfer learning approach in person re-identification (ReID) applications: instead of training a model from scratch, a pre-trained feature extraction model is fine-tuned to the target scenario. This is challenging due to the significant variations within the target scenario, e.g., different camera viewpoints, illumination changes, and occlusion. These variations create a gap between each mini-batch's distribution and the whole dataset's distribution during mini-batch training. In this paper, we study model fine-tuning from the perspective of aggregating and using the dataset's global information during mini-batch training. Specifically, we introduce a novel network structure called the Batch-related Convolutional Cell (BConv-Cell), which progressively collects the dataset's global information into a latent state and uses it to rectify the extracted features. Based on BConv-Cells, we further propose the Progressive Transfer Learning (PTL) method, which facilitates model fine-tuning by jointly optimizing the BConv-Cells and the pre-trained ReID model. Empirical experiments show that our proposal greatly improves the ReID model's performance on the MSMT17, Market-1501, CUHK03, and DukeMTMC-reID datasets. Moreover, we extend our proposal to the general image classification task. Experiments on several image classification benchmark datasets demonstrate that our proposal significantly improves baseline models' performance. The code has been released at https://github.com/ZJULearning/PTL 
650 4 |a Journal Article 
700 1 |a Shen, Dong  |e verfasserin  |4 aut 
700 1 |a Jin, Zhongming  |e verfasserin  |4 aut 
700 1 |a Huang, Jianqiang  |e verfasserin  |4 aut 
700 1 |a Cai, Deng  |e verfasserin  |4 aut 
700 1 |a Hua, Xian-Sheng  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on image processing : a publication of the IEEE Signal Processing Society  |d 1992  |g 31(2022) vom: 13., Seite 1340-1348  |w (DE-627)NLM09821456X  |x 1941-0042  |7 nnns 
773 1 8 |g volume:31  |g year:2022  |g day:13  |g pages:1340-1348 
856 4 0 |u http://dx.doi.org/10.1109/TIP.2022.3141258  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 31  |j 2022  |b 13  |h 1340-1348