Parallax360 : Stereoscopic 360° Scene Representation for Head-Motion Parallax

We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience, and hence sense of immersion, achieved with our approach compared to widely-used stereoscopic panoramas.
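The flow-based blending mentioned in the abstract can be pictured with a small sketch. The snippet below is an illustrative approximation only, not the authors' implementation: it assumes a single precomputed pairwise motion field `flow_ab` between two neighbouring captured views, scales it linearly to approximate motion toward an intermediate viewpoint, warps both images toward that viewpoint, and blends the warps. The names `backward_warp` and `flow_based_blend` are hypothetical.

```python
# Minimal sketch of flow-based blending between two neighbouring views,
# assuming a precomputed pairwise motion field flow_ab (H x W x 2, (dx, dy)
# in pixels) that maps pixels of view A to their positions in view B.
import numpy as np
from scipy.ndimage import map_coordinates

def backward_warp(image, flow):
    """Sample `image` at coordinates displaced by `flow` (bilinear lookup)."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    coords = [ys + flow[..., 1], xs + flow[..., 0]]   # row, column order
    return np.stack(
        [map_coordinates(image[..., c], coords, order=1, mode='nearest')
         for c in range(image.shape[2])],
        axis=-1)

def flow_based_blend(img_a, img_b, flow_ab, alpha):
    """Synthesise a view a fraction `alpha` of the way from view A to view B.

    The intermediate flows are approximated by linearly scaling the pairwise
    motion field; each source image is warped toward the target view before
    the two warps are blended, which suppresses ghosting compared with
    blending the unwarped images directly.
    """
    warp_a = backward_warp(img_a, -alpha * flow_ab)
    warp_b = backward_warp(img_b, (1.0 - alpha) * flow_ab)
    return (1.0 - alpha) * warp_a + alpha * warp_b

if __name__ == "__main__":
    # Toy example: a gradient image shifted 4 px to the right, constant flow.
    h, w = 64, 64
    img_a = np.dstack([np.tile(np.linspace(0.0, 1.0, w), (h, 1))] * 3)
    img_b = np.roll(img_a, 4, axis=1)
    flow_ab = np.zeros((h, w, 2))
    flow_ab[..., 0] = 4.0                    # every pixel moves +4 px in x
    mid = flow_based_blend(img_a, img_b, flow_ab, alpha=0.5)
    print(mid.shape)                         # (64, 64, 3)
```

Warping before blending is the point the abstract makes: blending the unwarped images would double-expose misaligned content, whereas the scaled motion field brings both views into approximate alignment with the target viewpoint first.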

Detailed Description

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 24(2018), 4, 15 Apr., pages 1545-1553
First author: Luo, Bicheng (Author)
Other authors: Xu, Feng; Richardt, Christian; Yong, Jun-Hai
Format: Online article
Language: English
Published: 2018
Access to the parent work: IEEE transactions on visualization and computer graphics
Keywords: Journal Article; Research Support, Non-U.S. Gov't
LEADER 01000naa a22002652 4500
001 NLM28199238X
003 DE-627
005 20231225032911.0
007 cr uuu---uuuuu
008 231225s2018 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2018.2794071  |2 doi 
028 5 2 |a pubmed24n0939.xml 
035 |a (DE-627)NLM28199238X 
035 |a (NLM)29543172 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Luo, Bicheng  |e verfasserin  |4 aut 
245 1 0 |a Parallax360  |b Stereoscopic 360° Scene Representation for Head-Motion Parallax 
264 1 |c 2018 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 20.06.2019 
500 |a Date Revised 20.06.2019 
500 |a published: Print 
500 |a Citation Status MEDLINE 
520 |a We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience, and hence sense of immersion, achieved with our approach compared to widely-used stereoscopic panoramas 
650 4 |a Journal Article 
650 4 |a Research Support, Non-U.S. Gov't 
700 1 |a Xu, Feng  |e verfasserin  |4 aut 
700 1 |a Richardt, Christian  |e verfasserin  |4 aut 
700 1 |a Yong, Jun-Hai  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g 24(2018), 4 vom: 15. Apr., Seite 1545-1553  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnns 
773 1 8 |g volume:24  |g year:2018  |g number:4  |g day:15  |g month:04  |g pages:1545-1553 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2018.2794071  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 24  |j 2018  |e 4  |b 15  |c 04  |h 1545-1553