Precueing Object Placement and Orientation for Manual Tasks in Augmented Reality

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics. - 1996. - Vol. 28 (2022), No. 11 (1 Nov.), pp. 3799-3809
Main Author: Liu, Jen-Shuo (Author)
Other Authors: Tversky, Barbara; Feiner, Steven
Format: Online Article
Language: English
Published: 2022
Access to the parent work: IEEE Transactions on Visualization and Computer Graphics
Subjects: Journal Article; Research Support, U.S. Gov't, Non-P.H.S.
Description
Summary: When a user is performing a manual task, AR or VR can provide information about the current subtask (cueing) and upcoming subtasks (precueing) that makes them easier and faster to complete. Previous research on cueing and precueing in AR and VR has focused on path-following tasks requiring simple actions at each of a series of locations, such as pushing a button or just visiting. We consider a more complex task, whose subtasks involve moving to and picking up an item, moving that item to a designated place while rotating it to a specific angle, and depositing it. We conducted two user studies to examine how people accomplish this task while wearing an AR headset, guided by different visualizations that cue and precue movement and rotation. Participants performed best when given movement information for two successive subtasks and rotation information for a single subtask. In addition, participants performed best when the rotation visualization was split across the manipulated object and its destination.
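This record contains only the abstract, not the paper's implementation. Still, the best-performing condition the abstract reports (movement cues for two successive subtasks, a rotation cue for the current subtask only, split between the manipulated object and its destination) can be sketched as cue-selection logic. The Python below is purely illustrative; Subtask, build_guidance, and every field name are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Subtask:
    """One pick-and-place step: fetch an item, carry it to a target pose."""
    item_id: str
    pickup_pos: tuple[float, float, float]   # where the item starts
    place_pos: tuple[float, float, float]    # where it must be deposited
    place_angle_deg: float                   # required orientation at the destination

def build_guidance(subtasks: list[Subtask], current: int,
                   move_lookahead: int = 2, rot_lookahead: int = 1) -> dict:
    """Select which subtasks get movement cues and which get rotation cues.

    move_lookahead=2 and rot_lookahead=1 mirror the best-performing
    condition described in the abstract: movement information for two
    successive subtasks, rotation information for a single subtask.
    """
    guidance = {"movement_cues": [], "rotation_cues": []}
    for offset in range(move_lookahead):
        i = current + offset
        if i >= len(subtasks):
            break
        # Movement cue/precue: a path from pickup to placement location.
        guidance["movement_cues"].append({
            "subtask": i,
            "path": (subtasks[i].pickup_pos, subtasks[i].place_pos),
            "is_precue": offset > 0,  # precues could be rendered less salient
        })
    for offset in range(rot_lookahead):
        i = current + offset
        if i >= len(subtasks):
            break
        # Rotation cue split across the two loci the abstract describes:
        # one part shown on the manipulated object, one at its destination.
        guidance["rotation_cues"].append({
            "subtask": i,
            "on_object": {"target_angle_deg": subtasks[i].place_angle_deg},
            "at_destination": {"target_angle_deg": subtasks[i].place_angle_deg},
        })
    return guidance
```

With the defaults, a renderer consuming this structure would draw a movement cue for the current subtask, a movement precue for the next one, and a single rotation cue split across the object and its destination.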
Description: Date Completed 31.10.2022
Date Revised 15.11.2022
Published: Print-Electronic
Citation Status: MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2022.3203111