HeadDiff: Exploring Rotation Uncertainty With Diffusion Models for Head Pose Estimation


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society. - 1992. - 33 (2024), dated: 13., pages 1868-1882
First author: Wang, Yaoxing (author)
Other authors: Liu, Hao, Feng, Yaowei, Li, Zhendong, Wu, Xiangjuan, Zhu, Congcong
Format: Online article
Language: English
Published: 2024
Access to the parent work: IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society
Keywords: Journal Article
Description
Abstract: In this paper, we propose a probabilistic regression diffusion model for head pose estimation, dubbed HeadDiff, which addresses rotation uncertainty, especially when faces are captured in the wild. Unlike conventional image-to-pose methods, which cannot explicitly establish the rotational manifold of head poses, our HeadDiff recovers the pose rotation via the diffusion process and, in parallel, refines the mapping iteratively. Specifically, we first formulate head pose estimation as a reverse diffusion process, defining a paradigm of progressive denoising on the manifold that explores the uncertainty by decomposing the large image-to-pose gap into intermediate steps. Moreover, HeadDiff is equipped with an isotropic Gaussian distribution that encodes the incoherence information in our rotation representation. Finally, we learn the facial relationship of nearest neighbors with a cycle-consistent constraint for pose estimation that is robust to diverse shape variations. Experimental results on multiple datasets demonstrate that our proposed method outperforms existing state-of-the-art techniques without auxiliary data. (An illustrative sketch of the iterative denoising idea follows the record below.)
Description: Date Revised 13.03.2024
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2024.3372457
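
The abstract describes casting head pose estimation as a reverse diffusion process that refines a pose estimate through many small denoising steps conditioned on the input face. The snippet below is a minimal sketch of that general idea under stated assumptions, not HeadDiff's actual method: it assumes a plain DDPM-style sampler over a 3-D Euler-angle pose conditioned on a precomputed image feature, with a toy MLP denoiser (`PoseDenoiser`), a linear beta schedule, and arbitrary hyperparameters, none of which are taken from the paper, whose rotation representation, manifold handling, and network architecture differ.

```python
# Minimal, illustrative reverse-diffusion sampler for pose regression.
# Assumptions (not from the paper): 3-D Euler-angle poses, a linear DDPM beta
# schedule, and a toy MLP denoiser conditioned on a precomputed image feature.
import torch
import torch.nn as nn


class PoseDenoiser(nn.Module):
    """Toy epsilon-prediction network: (noisy pose, timestep, image feature) -> noise."""

    def __init__(self, feat_dim=128, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + 1 + feat_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, pose_t, t, feat):
        # Crude scalar timestep embedding; a real model would use sinusoidal features.
        t_emb = t.float().unsqueeze(-1) / 1000.0
        return self.net(torch.cat([pose_t, t_emb, feat], dim=-1))


@torch.no_grad()
def sample_pose(model, feat, steps=50):
    """Reverse diffusion: start from Gaussian noise and iteratively denoise to a pose."""
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    pose = torch.randn(feat.shape[0], 3)          # x_T ~ N(0, I)
    for t in reversed(range(steps)):
        t_batch = torch.full((feat.shape[0],), t, dtype=torch.long)
        eps = model(pose, t_batch, feat)          # predicted noise at step t
        # DDPM posterior mean: x_{t-1} = (x_t - beta_t / sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_t)
        pose = (pose - betas[t] / (1.0 - alpha_bars[t]).sqrt() * eps) / alphas[t].sqrt()
        if t > 0:
            pose = pose + betas[t].sqrt() * torch.randn_like(pose)
    return pose                                   # (batch, 3) pose estimates


if __name__ == "__main__":
    model = PoseDenoiser()
    image_feature = torch.randn(4, 128)           # stand-in for a backbone feature vector
    print(sample_pose(model, image_feature).shape)  # torch.Size([4, 3])
```

The design point the sketch illustrates is the one emphasized in the abstract: instead of regressing a pose in a single image-to-pose forward pass, the estimate is refined over many small denoising steps, which is how the large gap between image and pose is decomposed into intermediate steps.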