Learning Dynamic Relationships for Facial Expression Recognition Based on Graph Convolutional Network
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 30(2021) from: 03., pages 7143-7155
Format: Online article
Language: English
Published: 2021
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Abstract: Facial action units (AUs) analysis plays an important role in facial expression recognition (FER). Existing deep spectral convolutional networks (DSCNs) have achieved encouraging performance for FER based on a set of facial local regions and a predefined graph structure. However, these regions do not have close relationships to AUs, and DSCNs cannot model the dynamic spatial dependencies of these regions for estimating different facial expressions. To tackle these issues, we propose a novel double dynamic relationships graph convolutional network (DDRGCN) to learn the strength of the edges in the facial graph by a trainable weighted adjacency matrix. We construct facial graph data from 20 regions of interest (ROIs) guided by different facial AUs. Furthermore, we devise an efficient graph convolutional network in which the inherent dependencies of vertices in the facial graph can be learned automatically during network training. Notably, the proposed model has only 110K parameters and a 0.48MB model size, which is significantly less than most existing methods. Experiments on four widely used FER datasets demonstrate that the proposed dynamic relationships graph network achieves superior results compared to existing light-weight networks, not only in accuracy but also in model size and speed.
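The abstract's central idea is a graph convolution over 20 AU-guided ROI vertices in which the weighted adjacency matrix is itself a trainable parameter rather than a fixed, predefined graph. The following is a minimal sketch of one such layer, not the paper's actual implementation: the random initialization, layer sizes, and variable names are illustrative assumptions, and the adjacency here is merely symmetrized and normalized to show the forward pass that training would optimize.

```python
import numpy as np

rng = np.random.default_rng(0)

# 20 AU-guided ROIs as graph vertices (per the abstract); feature sizes are assumed.
N, F_in, F_out = 20, 16, 8

# Trainable weighted adjacency: during training this would be a learnable
# parameter updated by backprop; here it is randomly initialized for illustration.
A = rng.random((N, N))
A = (A + A.T) / 2          # symmetrize so edge strength is shared both ways
A = A + np.eye(N)          # add self-loops

# Symmetric normalization D^{-1/2} A D^{-1/2}, standard for GCN layers.
d = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(d, d))

W = rng.standard_normal((F_in, F_out)) * 0.1   # layer weight matrix
X = rng.standard_normal((N, F_in))             # per-ROI feature vectors

# One graph-convolution step: H = ReLU(A_norm @ X @ W)
H = np.maximum(A_norm @ X @ W, 0.0)
print(H.shape)  # (20, 8)
```

Because `A` is learned rather than predefined, the edge strengths between ROIs can adapt per the training objective, which is the "dynamic relationships" property the abstract emphasizes.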
Description: Date Completed 13.08.2021; Date Revised 13.08.2021; published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2021.3101820