Recurrent Face Aging with Hierarchical AutoRegressive Memory

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - 41(2019), 3, 12 March, pages 654-668
Main author: Wang, Wei (Author)
Other authors: Yan, Yan; Cui, Zhen; Feng, Jiashi; Yan, Shuicheng; Sebe, Nicu
Format: Online article
Language: English
Published: 2019
Access to collection: IEEE Transactions on Pattern Analysis and Machine Intelligence
Subjects: Journal Article
Description
Abstract: Modeling the aging process of human faces is important for cross-age face verification and recognition. In this paper, we propose a Recurrent Face Aging (RFA) framework which takes a single image as input and automatically outputs a series of aged faces. The hidden units in the RFA are connected autoregressively, allowing the framework to age the person by referring to the previously generated aged faces. Due to the lack of labeled face data of the same person captured over a long range of ages, traditional face aging models split the ages into discrete groups and learn a one-step face transformation for each pair of adjacent age groups. Since human face aging is a smooth progression, it is more appropriate to age the face by passing through smooth transitional states; in this way, intermediate aged faces between the age groups can be generated. Toward this goal, we employ a recurrent neural network whose recurrent module is a hierarchical triple-layer gated recurrent unit that functions as an autoencoder: the bottom layer of the module encodes the input to a latent representation, and the top layer decodes the representation into the corresponding aged face. The experimental results demonstrate the effectiveness of our framework.
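
The recurrent module described in the abstract (a hierarchical triple-layer GRU acting as an autoencoder, with the bottom layer encoding the input to a latent representation, the top layer decoding it back to an aged face, and hidden states carried autoregressively across age steps) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the class name RecurrentAgingSketch, the layer sizes, the number of age steps, and the use of feature vectors in place of full face images are assumptions made for brevity.

import torch
import torch.nn as nn

class RecurrentAgingSketch(nn.Module):
    # Illustrative three-layer GRU stack: encode -> transition -> decode.
    def __init__(self, feat_dim=512, latent_dim=256):
        super().__init__()
        self.encoder = nn.GRUCell(feat_dim, latent_dim)       # bottom layer: face feature -> latent
        self.transition = nn.GRUCell(latent_dim, latent_dim)  # middle layer: smooth age transition
        self.decoder = nn.GRUCell(latent_dim, feat_dim)       # top layer: latent -> aged-face feature

    def forward(self, face_feat, num_steps=4):
        batch = face_feat.size(0)
        h_enc = face_feat.new_zeros(batch, self.encoder.hidden_size)
        h_trans = face_feat.new_zeros(batch, self.transition.hidden_size)
        h_dec = face_feat.new_zeros(batch, self.decoder.hidden_size)
        aged = []
        current = face_feat
        for _ in range(num_steps):
            h_enc = self.encoder(current, h_enc)       # encode the current face into a latent state
            h_trans = self.transition(h_enc, h_trans)  # carry the latent through a transitional state
            h_dec = self.decoder(h_trans, h_dec)       # decode to the next aged-face feature
            aged.append(h_dec)
            current = h_dec                            # autoregressive: next step sees the previous aged face
        return aged

# Usage sketch: 8 face feature vectors aged over 4 steps.
# model = RecurrentAgingSketch()
# aged_sequence = model(torch.randn(8, 512), num_steps=4)  # list of 4 tensors of shape (8, 512)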
Description: Date Revised 20.11.2019
Published: Print-Electronic
Citation Status: PubMed-not-MEDLINE
ISSN: 1939-3539
DOI: 10.1109/TPAMI.2018.2803166