Embedding Visual Hierarchy with Deep Networks for Large-Scale Visual Recognition
Published in: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - (2018), 07 June |
Author: | |
Other authors: | , , , , , |
Format: | Online article |
Language: | English |
Published: | 2018 |
Access to parent work: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society |
Subjects: | Journal Article |
Abstract: | In this paper, a layer-wise mixture model (LMM) is developed to support hierarchical visual recognition, where a Bayesian approach is used to automatically adapt the visual hierarchy to the progressive improvements of the deep network over time. Our LMM algorithm provides an end-to-end approach for jointly learning: (a) the deep network, for achieving more discriminative deep representations of object classes and their inter-class visual similarities; (b) the tree classifier, for recognizing large numbers of object classes hierarchically; and (c) the visual hierarchy adaptation, for achieving more accurate assignment and organization of large numbers of object classes. By learning the tree classifier, the deep network, and the visual hierarchy adaptation jointly in an end-to-end manner, our LMM algorithm achieves higher accuracy rates on hierarchical visual recognition. Our experiments, carried out on the ImageNet1K and ImageNet10K image sets, demonstrate that our LMM algorithm achieves very competitive accuracy rates compared with the baseline methods. (An illustrative sketch of the tree-classifier idea follows this record.) |
Description: | Date revised: 27.02.2024; published: Print-Electronic; citation status: Publisher |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2018.2845118 |
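The abstract describes jointly training a deep network and a tree classifier end-to-end over a class hierarchy. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch: a toy backbone plus a two-level tree head whose leaf probabilities factor as P(class|x) = P(group|x) · P(class|group, x), trained with a single loss so gradients reach both modules. All names here (ToyBackbone, TreeClassifier, class_to_group) and the fixed two-group hierarchy are assumptions for illustration; the paper's actual LMM formulation, Bayesian hierarchy adaptation, and ImageNet-scale setup are not reproduced.

```python
# Illustrative sketch only, NOT the paper's LMM: a two-level tree classifier
# over deep features, trained end-to-end with the backbone.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 10  # assumption: toy setting, not ImageNet1K/10K
NUM_GROUPS = 2    # assumption: a fixed two-level hierarchy
# Assumption: a fixed class-to-group assignment; the paper instead *adapts*
# the hierarchy with a Bayesian approach as the network improves.
class_to_group = torch.tensor([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

class ToyBackbone(nn.Module):
    """Stand-in for the deep network that produces discriminative features."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, feat_dim), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class TreeClassifier(nn.Module):
    """Two-level tree: a coarse group head plus fine heads masked per group,
    so P(class|x) = P(group|x) * P(class|group, x)."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.group_head = nn.Linear(feat_dim, NUM_GROUPS)
        self.fine_head = nn.Linear(feat_dim, NUM_CLASSES)
    def forward(self, feats):
        log_p_group = F.log_softmax(self.group_head(feats), dim=1)
        fine_logits = self.fine_head(feats)
        log_p_class = torch.full_like(fine_logits, float('-inf'))
        for g in range(NUM_GROUPS):
            # Each class competes only with siblings inside its group.
            idx = (class_to_group == g).nonzero(as_tuple=True)[0]
            log_p_class[:, idx] = (
                F.log_softmax(fine_logits[:, idx], dim=1)
                + log_p_group[:, g:g + 1]
            )
        return log_p_class  # valid log-probabilities over leaf classes

backbone, tree = ToyBackbone(), TreeClassifier()
opt = torch.optim.SGD(
    list(backbone.parameters()) + list(tree.parameters()), lr=0.1
)

x = torch.randn(8, 3, 32, 32)            # dummy images
y = torch.randint(0, NUM_CLASSES, (8,))  # dummy labels
opt.zero_grad()
log_probs = tree(backbone(x))
loss = F.nll_loss(log_probs, y)  # one loss; gradients flow to both modules
loss.backward()
opt.step()
print(float(loss))
```

Because the leaf log-probabilities are a differentiable product of the group and within-group distributions, a single cross-entropy-style loss trains the coarse and fine levels together with the backbone, which is the "end-to-end" property the abstract emphasizes; the hierarchy adaptation step (re-assigning classes to groups during training) is the part this fixed class_to_group sketch deliberately omits.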