Adaptive Light Estimation using Dynamic Filtering for Diverse Lighting Conditions

High dynamic range (HDR) panoramic environment maps are widely used to illuminate virtual objects to blend with real-world scenes. However, in common applications for augmented and mixed-reality (AR/MR), capturing 360° surroundings to obtain an HDR environment map is often not possible using consumer-level devices. We present a novel light estimation method to predict 360° HDR environment maps from a single photograph with a limited field-of-view (FOV). We introduce the Dynamic Lighting network (DLNet), a convolutional neural network that dynamically generates the convolution filters based on the input photograph sample to adaptively learn the lighting cues within each photograph. We propose novel Spherical Multi-Scale Dynamic (SMD) convolutional modules to dynamically generate sample-specific kernels for decoding features in the spherical domain to predict 360° environment maps. Using DLNet and data augmentations with respect to FOV, an exposure multiplier, and color temperature, our model shows the capability of estimating lighting under diverse input variations. Compared with prior work that fixes the network filters once trained, our method maintains lighting consistency across different exposure multipliers and color temperatures, and maintains robust light estimation accuracy as FOV increases. The surrounding lighting information estimated by our method ensures coherent illumination of 3D objects blended with the input photograph, enabling high-fidelity augmented and mixed reality supporting a wide range of environmental lighting conditions and device sensors.
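The core idea the abstract describes, dynamic filtering, is that the convolution kernels are predicted from the input sample at inference time rather than being fixed after training. Below is a minimal NumPy sketch of that idea, not the authors' DLNet or SMD modules: the names `predict_kernel` and `dynamic_conv` and the statistic used to modulate the kernel are illustrative assumptions, meant only to show how a per-sample kernel can be generated and applied.

```python
import numpy as np

def dynamic_conv(feature_map, kernel):
    """Apply a 3x3 convolution (valid padding, single channel) with a
    kernel that was predicted from the input sample itself."""
    h, w = feature_map.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(feature_map[i:i + 3, j:j + 3] * kernel)
    return out

def predict_kernel(photo):
    """Toy stand-in for a learned filter-generating branch: derives a
    3x3 kernel from a global statistic of the input photograph, so each
    sample gets its own kernel (hypothetical, not the paper's network)."""
    base = np.ones((3, 3)) / 9.0        # neutral mean-filter prior
    center = np.zeros((3, 3))
    center[1, 1] = photo.std()          # sample statistic modulates the kernel
    k = base + center
    return k / k.sum()                  # normalize to preserve brightness

rng = np.random.default_rng(0)
photo = rng.random((8, 8))              # stand-in for a limited-FOV input photo
kernel = predict_kernel(photo)          # kernel depends on this sample
lit = dynamic_conv(photo, kernel)
print(lit.shape)                        # → (6, 6)
```

A fixed-filter network would use the same `kernel` for every input; here a flat image (zero variance) yields the uniform prior while a high-contrast image yields a center-weighted kernel, which is the sample-adaptive behavior the paper exploits for varying exposure and color temperature.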

Detailed Description

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 27(2021), 11, 15 Nov., pages 4097-4106
First author: Zhao, Junhong (Author)
Other authors: Chalmers, Andrew, Rhee, Taehyun
Format: Online article
Language: English
Published: 2021
Access to the parent work: IEEE transactions on visualization and computer graphics
Keywords: Journal Article
LEADER 01000naa a22002652 4500
001 NLM329922157
003 DE-627
005 20231225210155.0
007 cr uuu---uuuuu
008 231225s2021 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2021.3106497  |2 doi 
028 5 2 |a pubmed24n1099.xml 
035 |a (DE-627)NLM329922157 
035 |a (NLM)34449381 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Zhao, Junhong  |e verfasserin  |4 aut 
245 1 0 |a Adaptive Light Estimation using Dynamic Filtering for Diverse Lighting Conditions 
264 1 |c 2021 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 28.10.2021 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a High dynamic range (HDR) panoramic environment maps are widely used to illuminate virtual objects to blend with real-world scenes. However, in common applications for augmented and mixed-reality (AR/MR), capturing 360° surroundings to obtain an HDR environment map is often not possible using consumer-level devices. We present a novel light estimation method to predict 360° HDR environment maps from a single photograph with a limited field-of-view (FOV). We introduce the Dynamic Lighting network (DLNet), a convolutional neural network that dynamically generates the convolution filters based on the input photograph sample to adaptively learn the lighting cues within each photograph. We propose novel Spherical Multi-Scale Dynamic (SMD) convolutional modules to dynamically generate sample-specific kernels for decoding features in the spherical domain to predict 360° environment maps. Using DLNet and data augmentations with respect to FOV, an exposure multiplier, and color temperature, our model shows the capability of estimating lighting under diverse input variations. Compared with prior work that fixes the network filters once trained, our method maintains lighting consistency across different exposure multipliers and color temperature, and maintains robust light estimation accuracy as FOV increases. The surrounding lighting information estimated by our method ensures coherent illumination of 3D objects blended with the input photograph, enabling high fidelity augmented and mixed reality supporting a wide range of environmental lighting conditions and device sensors 
650 4 |a Journal Article 
700 1 |a Chalmers, Andrew  |e verfasserin  |4 aut 
700 1 |a Rhee, Taehyun  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g 27(2021), 11 vom: 15. Nov., Seite 4097-4106  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnns 
773 1 8 |g volume:27  |g year:2021  |g number:11  |g day:15  |g month:11  |g pages:4097-4106 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2021.3106497  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 27  |j 2021  |e 11  |b 15  |c 11  |h 4097-4106