Real-Time Shadow Detection From Live Outdoor Videos for Augmented Reality

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - Vol. 28 (2022), No. 7, 17 July, pages 2748-2763
Main Author: Liu, Yanli (Author)
Other Authors: Zou, Xingming; Xu, Songhua; Xing, Guanyu; Wei, Housheng; Zhang, Yanci
Format: Online Article
Language: English
Published: 2022
Access to the parent work: IEEE transactions on visualization and computer graphics
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: Simulating shadow interactions between real and virtual objects is important for augmented reality (AR), in which accurately and efficiently detecting real shadows from live videos is a crucial step. Most existing methods can process only scenes captured from a fixed viewpoint. In contrast, this article proposes a new framework for shadow detection in live outdoor videos captured under moving viewpoints. The framework splits each frame into a tracked region, which is the region tracked from the previous video frame through optical flow analysis, and an emerging region, which is newly introduced into the scene due to the moving viewpoint. The framework then extracts features based on the intensity profiles surrounding the boundaries of candidate shadow regions. These features are used by a Bayesian learning module both to correct erroneous shadow boundaries in the tracked region and to detect shadow boundaries in the emerging region. To remove spurious shadows, spatial layout constraints are further considered for emerging regions. The experimental results demonstrate that the proposed framework outperforms state-of-the-art shadow tracking and detection algorithms on a variety of challenging cases in real time, including shadows on backgrounds with complex textures, nonplanar shadows, fast-moving shadows with changing topologies, and shadows cast by nonrigid objects. Quantitative experiments show that the method outperforms the best existing method, achieving a 33.3% increase in the average F-measure on a self-collected database. Coupled with an image-based shadow-casting method, the proposed framework generates realistic shadow interaction results. This capability will be particularly beneficial for supporting AR applications.
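
The abstract describes a pipeline whose first step splits each frame into a tracked region (propagated from the previous frame via optical flow) and an emerging region (newly exposed by the moving viewpoint). The following is a minimal sketch of only that splitting step, using OpenCV's Farneback dense optical flow as a stand-in; the function split_regions, the appearance-mismatch test, and the threshold err_thresh are illustrative assumptions, not the authors' actual implementation.

# Sketch (not the paper's code): split the current frame into tracked and
# emerging regions by warping the previous frame's shadow mask with dense
# optical flow. Names and thresholds are hypothetical.
import cv2
import numpy as np

def split_regions(prev_gray, curr_gray, prev_shadow_mask, err_thresh=8.0):
    """Return (tracked_mask, emerging_mask) for the current frame.

    prev_gray, curr_gray : uint8 grayscale frames
    prev_shadow_mask     : uint8 mask (255 = shadow) from the previous frame
    """
    h, w = curr_gray.shape

    # Dense flow from the current frame back to the previous one, so every
    # current pixel knows where it came from in the previous frame.
    flow = cv2.calcOpticalFlowFarneback(
        curr_gray, prev_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Sample the previous shadow mask at the flow-displaced coordinates.
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    tracked_mask = cv2.remap(prev_shadow_mask, map_x, map_y,
                             interpolation=cv2.INTER_NEAREST,
                             borderValue=0)

    # Pixels whose source falls outside the previous frame, or whose warped
    # appearance disagrees strongly with the previous frame, are treated as
    # newly exposed by the moving viewpoint (emerging region).
    outside = (map_x < 0) | (map_x >= w) | (map_y < 0) | (map_y >= h)
    warped_prev = cv2.remap(prev_gray, map_x, map_y, cv2.INTER_LINEAR)
    mismatch = np.abs(curr_gray.astype(np.float32)
                      - warped_prev.astype(np.float32)) > err_thresh
    emerging = outside | mismatch
    emerging_mask = (emerging * 255).astype(np.uint8)

    # Shadow labels are only propagated inside the tracked region; boundaries
    # there would be refined, and boundaries in the emerging region detected,
    # by the feature-based Bayesian module described in the abstract.
    tracked_mask[emerging] = 0
    return tracked_mask, emerging_mask

A caller would keep the refined shadow mask of frame t-1, invoke split_regions on consecutive grayscale frames, and pass the two regions on to the boundary intensity-profile features and the Bayesian learning stage that the abstract describes.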
Description: Date Completed 30.05.2022
Date Revised 27.06.2022
Published: Print-Electronic
Citation Status: MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2020.3041100