Mutual Component Convolutional Neural Networks for Heterogeneous Face Recognition
Published in: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - (2019), 23 Jan. |
Main author: | |
Other authors: | |
Format: | Online article |
Language: | English |
Published: | 2019 |
In collection: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society |
Subjects: | Journal Article |
Abstract: | Heterogeneous face recognition (HFR) aims to identify a person across different facial modalities, such as visible and near-infrared images. The main challenges of HFR lie in the large modality discrepancy and insufficient training samples. In this paper, we propose the Mutual Component Convolutional Neural Network (MC-CNN), a modal-invariant deep learning framework, to tackle these two issues simultaneously. Our MC-CNN incorporates a generative module, i.e. the Mutual Component Analysis (MCA) [1], into modern deep convolutional neural networks by viewing MCA as a special fully-connected (FC) layer. Based on deep features, this FC layer is designed to extract modal-independent hidden factors, and is updated according to a maximum-likelihood analytic formulation instead of back propagation, which naturally prevents over-fitting on limited data. In addition, we develop an MCA loss to update the network for modal-invariant feature learning. Extensive experiments show that our MC-CNN significantly outperforms several fine-tuned baseline models. Our methods achieve state-of-the-art performance on the CASIA NIR-VIS 2.0, CUHK NIR-VIS, and IIIT-D Sketch datasets. |
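The abstract's central idea is modeling deep features from two modalities as observations of a shared, modal-independent hidden factor, with that factor estimated analytically rather than by back propagation. The toy sketch below illustrates this with an alternating least-squares fit of a shared latent factor from paired visible/near-infrared feature vectors; it is my own simplification, not the paper's MC-CNN code, and all names, dimensions, and the ALS update are hypothetical stand-ins for the closed-form maximum-likelihood update the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy deep features for paired samples in two modalities (e.g. VIS / NIR),
# generated from a shared hidden factor plus modality-specific loadings.
n, d, k = 200, 64, 16                        # samples, feature dim, hidden dim
h_true = rng.normal(size=(n, k))             # shared modal-independent factors
x_vis = h_true @ rng.normal(size=(k, d)) + 0.1 * rng.normal(size=(n, d))
x_nir = h_true @ rng.normal(size=(k, d)) + 0.1 * rng.normal(size=(n, d))

def shared_factor_fit(x1, x2, k, iters=20):
    """Alternating closed-form (least-squares) estimation of a shared
    hidden factor h and per-modality loadings A1, A2 -- a crude analogue
    of updating an MCA-style layer analytically instead of by backprop."""
    h = rng.normal(size=(x1.shape[0], k))
    for _ in range(iters):
        # Update loadings given h (closed-form least squares).
        A1 = np.linalg.lstsq(h, x1, rcond=None)[0]      # (k, d)
        A2 = np.linalg.lstsq(h, x2, rcond=None)[0]      # (k, d)
        # Update h given loadings, using both modalities jointly.
        A = np.concatenate([A1, A2], axis=1)            # (k, 2d)
        x = np.concatenate([x1, x2], axis=1)            # (n, 2d)
        h = np.linalg.lstsq(A.T, x.T, rcond=None)[0].T  # (n, k)
    return h, A1, A2

h_est, A1, A2 = shared_factor_fit(x_vis, x_nir, k)
# Relative reconstruction error of the VIS features from the shared factor.
rel_err = np.linalg.norm(h_est @ A1 - x_vis) / np.linalg.norm(x_vis)
print(h_est.shape, rel_err < 0.2)
```

In the paper's framework the analogous closed-form update sits inside the network as a special FC layer; here the point is only that the shared-factor estimate comes from an analytic solve, not gradient descent.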
Description: | Date revised: 27.02.2024. Published: Print-Electronic. Citation status: Publisher |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2019.2894272 |