Mutual Component Convolutional Neural Networks for Heterogeneous Face Recognition
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - (2019), 23 Jan.
Format: Online article
Language: English
Published: 2019
Access to the parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Abstract: Heterogeneous face recognition (HFR) aims to identify a person from different facial modalities such as visible and near-infrared images. The main challenges of HFR lie in the large modality discrepancy and insufficient training samples. In this paper, we propose the Mutual Component Convolutional Neural Network (MC-CNN), a modal-invariant deep learning framework, to tackle these two issues simultaneously. Our MC-CNN incorporates a generative module, i.e., Mutual Component Analysis (MCA) [1], into modern deep convolutional neural networks by viewing MCA as a special fully-connected (FC) layer. Based on deep features, this FC layer is designed to extract modal-independent hidden factors, and it is updated by an analytic maximum-likelihood formulation instead of back-propagation, which naturally prevents over-fitting on limited data. In addition, we develop an MCA loss to update the network for modal-invariant feature learning. Extensive experiments show that our MC-CNN significantly outperforms several fine-tuned baseline models. Our method achieves state-of-the-art performance on the CASIA NIR-VIS 2.0, CUHK NIR-VIS, and IIIT-D Sketch datasets.
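To make the "MCA as a special FC layer" idea concrete, the following is a minimal NumPy sketch of a shared-latent ("mutual component") model x_k = m_k + U_k h + e_k fitted by closed-form EM, i.e., by maximum likelihood rather than back-propagation. It is an illustrative simplification under stated assumptions (paired features X1/X2 of equal dimension, a single shared factor h ~ N(0, I), isotropic noise), not the exact MCA update from the paper; the function name fit_mutual_component and all parameter names are hypothetical.

```python
import numpy as np

def fit_mutual_component(X1, X2, q=64, n_iter=50, seed=0):
    """Illustrative EM for a shared-latent model x_k = m_k + U_k h + e_k,
    with h ~ N(0, I) shared across modalities and e_k ~ N(0, s2 * I).
    X1, X2: (N, d) arrays of paired deep features from two modalities.
    A simplified sketch, not the paper's exact MCA formulation.
    """
    N, d = X1.shape
    rng = np.random.default_rng(seed)
    m = np.concatenate([X1.mean(axis=0), X2.mean(axis=0)])  # stacked means [m1; m2]
    Xc = np.hstack([X1, X2]) - m                            # centered pairs, (N, 2d)
    U = 0.01 * rng.standard_normal((2 * d, q))              # stacked loadings [U1; U2]
    s2 = Xc.var()                                           # initial noise variance
    for _ in range(n_iter):
        # E-step: Gaussian posterior of the shared hidden factor h.
        Minv = np.linalg.inv(np.eye(q) + U.T @ U / s2)      # posterior covariance
        Eh = Xc @ U @ Minv / s2                             # (N, q) posterior means
        Ehh = N * Minv + Eh.T @ Eh                          # sum_n E[h_n h_n^T]
        # M-step: analytic maximum-likelihood updates -- no back-propagation,
        # which is why limited training data does not over-fit this layer.
        A = Xc.T @ Eh
        U = A @ np.linalg.inv(Ehh)
        s2 = (np.sum(Xc ** 2) - np.trace(U.T @ A)) / (N * 2 * d)
    return m[:d], m[d:], U[:d], U[d:], s2
```

At matching time, a probe feature x_k from a single modality can be projected to the posterior mean of the modal-independent factor, E[h | x_k] = (I + U_k^T U_k / s2)^{-1} U_k^T (x_k - m_k) / s2, so that gallery and probe images from different modalities are compared in the same hidden space.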
Description: Date Revised 27.02.2024; published: Print-Electronic; Citation Status: Publisher
ISSN: 1941-0042
DOI: 10.1109/TIP.2019.2894272