Transferring boosted detectors towards viewpoint and scene adaptiveness
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society, vol. 20, no. 5 (26 May 2011), pp. 1388-1400
Format: Online article
Language: English
Published: 2011
Parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article; Research Support, Non-U.S. Gov't
Abstract: In object detection, disparities between the distributions of the training samples and the test samples are often inevitable, resulting in degraded performance in application scenarios. In this paper, we focus on the disparities caused by viewpoint and scene changes and propose an efficient solution to these particular cases by adapting generic detectors, assuming a boosting-style framework. A pretrained boosting-style detector encodes a priori knowledge in the form of selected features and weak-classifier weights. Towards adaptiveness, the selected features are shifted to the most discriminative locations and scales to compensate for possible appearance variations. Moreover, the weighting coefficients are further adapted with covariate boost, which maximally exploits the related training data to enrich the limited new examples. Extensive experiments validate the proposed adaptation mechanism for viewpoint and scene adaptiveness and show encouraging improvements in detection accuracy over state-of-the-art methods.
Description: Date Completed 19.08.2011; Date Revised 21.04.2011; Published: Print-Electronic; Citation Status: MEDLINE
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2010.2103951 |
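
The abstract describes two adaptation steps: shifting the selected features to more discriminative locations and scales, and re-estimating the weak-classifier weights ("covariate boost") using abundant related source data alongside scarce new-domain examples. The sketch below illustrates only the second idea in a generic AdaBoost-style form; it is not the paper's published algorithm, and all names (adapt_weights, src_weight, the simulated data) are hypothetical.

```python
import numpy as np

def adapt_weights(weak_outputs_src, y_src, weak_outputs_tgt, y_tgt,
                  src_weight=0.3):
    """Re-estimate per-weak-classifier weights on a pooled sample set.

    Hypothetical sketch: keeps the pretrained weak classifiers fixed and
    re-fits their coefficients with AdaBoost-style updates, discounting
    source (old-viewpoint/scene) samples relative to target samples.

    weak_outputs_*: (n_samples, n_weak) arrays of weak predictions in {-1,+1}
    y_*: labels in {-1,+1}
    src_weight: weight of one source sample relative to one target sample
    """
    H = np.vstack([weak_outputs_src, weak_outputs_tgt])
    y = np.concatenate([y_src, y_tgt])
    # Initial sample weights: source discounted, target at full weight.
    w = np.concatenate([np.full(len(y_src), src_weight),
                        np.ones(len(y_tgt))])
    w /= w.sum()
    alpha = np.zeros(H.shape[1])
    for j in range(H.shape[1]):  # one pass over the pretrained weak learners
        err = np.sum(w * (H[:, j] != y))
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha[j] = 0.5 * np.log((1 - err) / err)
        # Standard AdaBoost re-weighting of the pooled samples.
        w *= np.exp(-alpha[j] * y * H[:, j])
        w /= w.sum()
    return alpha

# Usage with simulated weak-classifier outputs: source classifiers are
# slightly more reliable on source data than on the new domain.
rng = np.random.default_rng(0)
n_weak = 5
y_src = rng.choice([-1, 1], size=200)
y_tgt = rng.choice([-1, 1], size=20)
H_src = y_src[:, None] * rng.choice([1, -1], size=(200, n_weak), p=[0.7, 0.3])
H_tgt = y_tgt[:, None] * rng.choice([1, -1], size=(20, n_weak), p=[0.6, 0.4])
alpha = adapt_weights(H_src, y_src, H_tgt, y_tgt)
print(alpha)
```

Down-weighting the source samples (src_weight < 1) is one simple way to bias the re-fit toward the new viewpoint or scene while still exploiting the plentiful source data, which is the trade-off the abstract attributes to covariate boost.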