LEADER |
01000caa a22002652 4500 |
001 |
NLM377421707 |
003 |
DE-627 |
005 |
20240916232850.0 |
007 |
cr uuu---uuuuu |
008 |
240911s2024 xx |||||o 00| ||eng c |
024 |
7 |
|
|a 10.1109/TVCG.2024.3456368
|2 doi
|
028 |
5 |
2 |
|a pubmed24n1535.xml
|
035 |
|
|
|a (DE-627)NLM377421707
|
035 |
|
|
|a (NLM)39255119
|
040 |
|
|
|a DE-627
|b ger
|c DE-627
|e rakwb
|
041 |
|
|
|a eng
|
100 |
1 |
|
|a Zhao, Lixiang
|e verfasserin
|4 aut
|
245 |
1 |
0 |
|a SpatialTouch
|b Exploring Spatial Data Visualizations in Cross-reality
|
264 |
|
1 |
|c 2024
|
336 |
|
|
|a Text
|b txt
|2 rdacontent
|
337 |
|
|
|a Computermedien
|b c
|2 rdamedia
|
338 |
|
|
|a Online-Ressource
|b cr
|2 rdacarrier
|
500 |
|
|
|a Date Revised 16.09.2024
|
500 |
|
|
|a published: Print-Electronic
|
500 |
|
|
|a Citation Status Publisher
|
520 |
|
|
|a We propose and study a novel cross-reality environment that seamlessly integrates a monoscopic 2D surface (an interactive screen with touch and pen input) with a stereoscopic 3D space (an augmented-reality HMD) to jointly host spatial data visualizations. This approach combines the best of two conventional methods of displaying and manipulating spatial 3D data, enabling users to fluidly explore diverse visual forms using tailored interaction techniques. Providing such effective 3D data exploration techniques is pivotal for conveying intricate spatial structures, often at multiple spatial or semantic scales, across various application domains that require diverse visual representations for effective visualization. To understand user reactions to our new environment, we began with an elicitation user study in which we captured their responses and interactions. We observed that users adapted their interaction approaches based on perceived visual representations, with natural transitions in spatial awareness and actions while navigating across the physical surface. Our findings then informed the development of a design space for spatial data exploration in cross-reality. We thus developed cross-reality environments tailored to three distinct domains: 3D molecular structure data, 3D point cloud data, and 3D anatomical data. In particular, we designed interaction techniques that account for the inherent features of interaction in both spaces, facilitating various forms of input, including mid-air gestures, touch interactions, pen interactions, and combinations thereof, to enhance the users' sense of presence and engagement. We assessed the usability of our environment with biologists, focusing on its use for domain research. In addition, we evaluated our interaction transition designs with virtual- and mixed-reality experts to gather further insights. As a result, we provide design suggestions for the cross-reality environment, emphasizing interaction with diverse visual representations and seamless interaction transitions between 2D and 3D spaces.
|
650 |
|
4 |
|a Journal Article
|
700 |
1 |
|
|a Isenberg, Tobias
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Xie, Fuqi
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Liang, Hai-Ning
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Yu, Lingyun
|e verfasserin
|4 aut
|
773 |
0 |
8 |
|i Enthalten in
|t IEEE transactions on visualization and computer graphics
|d 1996
|g PP(2024) vom: 10. Sept.
|w (DE-627)NLM098269445
|x 1941-0506
|7 nnns
|
773 |
1 |
8 |
|g volume:PP
|g year:2024
|g day:10
|g month:09
|
856 |
4 |
0 |
|u http://dx.doi.org/10.1109/TVCG.2024.3456368
|3 Volltext
|
912 |
|
|
|a GBV_USEFLAG_A
|
912 |
|
|
|a SYSFLAG_A
|
912 |
|
|
|a GBV_NLM
|
912 |
|
|
|a GBV_ILN_350
|
951 |
|
|
|a AR
|
952 |
|
|
|d PP
|j 2024
|b 10
|c 09
|