SGaze: A Data-Driven Eye-Head Coordination Model for Realtime Gaze Prediction

Detailed Description

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 25(2019), 5, 18 May, pages 2002-2010
First Author: Hu, Zhiming (Author)
Other Authors: Zhang, Congyi; Li, Sheng; Wang, Guoping; Manocha, Dinesh
Format: Online article
Language: English
Published: 2019
Access to the parent work: IEEE transactions on visualization and computer graphics
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: We present a novel, data-driven eye-head coordination model that can be used for realtime gaze prediction in immersive HMD-based applications without any external hardware or eye tracker. Our model (SGaze) is computed by generating a large dataset that corresponds to different users navigating in virtual worlds under different lighting conditions. We perform statistical analysis on the recorded data and observe a linear correlation between gaze positions and head rotation angular velocities. We also find that there exists a latency between eye movements and head movements. SGaze can work as a software-based realtime gaze predictor: we formulate a time-related function between head movement and eye movement and use it for realtime gaze position prediction. We demonstrate the benefits of SGaze for gaze-contingent rendering and evaluate the results with a user study.
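
The abstract describes a linear relation between gaze position and head rotation angular velocity, evaluated with a temporal offset between eye and head movement. The Python sketch below illustrates what such a predictor could look like under those two observations; the coefficients, the latency value, and the function names are illustrative placeholders and not the paper's fitted parameters.

import numpy as np

# Illustrative sketch only: SGaze's actual coefficients and latency are fitted
# from its recorded eye-head dataset; the constants below are placeholders.
ALPHA_X, BETA_X = 0.12, 0.0    # assumed linear coefficients, horizontal axis
ALPHA_Y, BETA_Y = 0.10, 0.0    # assumed linear coefficients, vertical axis
LATENCY_S = 0.1                # assumed eye-head temporal offset, in seconds

def predict_gaze(timestamps, yaw_vel, pitch_vel, t_now):
    """Predict a gaze offset (relative to the view center) from head angular
    velocity sampled LATENCY_S before the current time, mirroring the linear,
    time-shifted relation described in the abstract."""
    t_query = t_now - LATENCY_S
    # Interpolate the recorded head angular velocity history at the shifted time.
    w_yaw = np.interp(t_query, timestamps, yaw_vel)
    w_pitch = np.interp(t_query, timestamps, pitch_vel)
    gaze_x = ALPHA_X * w_yaw + BETA_X
    gaze_y = ALPHA_Y * w_pitch + BETA_Y
    return gaze_x, gaze_y

Given per-frame head tracking data from the HMD, a predictor of this form runs in constant time per frame, which is what makes a purely software-based realtime gaze predictor plausible without an eye tracker.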
Description: Date Completed 10.02.2020
Date Revised 10.02.2020
published: Print-Electronic
Citation Status MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2019.2899187