Mesh-guided optimized retexturing for image and video



Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 14(2008), 2, 15 March, pages 426-439
Main Author: Guo, Yanwen (Author)
Other Authors: Sun, Hanqiu; Peng, Qunsheng; Jiang, Zhongding
Format: Online article
Language: English
Published: 2008
Access to parent work: IEEE transactions on visualization and computer graphics
Subjects: Journal Article
Description
Summary: This paper presents an approach for replacing the textures of specified regions in input images and video using stretch-based mesh optimization. The retexturing results show distortion and shading effects consistent with the underlying geometry and lighting conditions. For replacing textures in a single image, two important steps are developed: a stretch-based mesh parametrization incorporating the recovered normal information is derived to imitate the perspective distortion of the region of interest, and a Poisson-based refinement process is exploited to account for texture distortion at fine scale. The luminance of the input image is preserved through color transfer in the YCbCr color space. Our approach is independent of the replaced textures: once the input image is processed, any new texture can be applied to efficiently generate the retexturing results. For video retexturing, we propose key-frame-based texture replacement that extends and generalizes the image retexturing. Our approach repeatedly propagates the replacement result of a key frame to the remaining frames. We develop a local motion optimization scheme to deal with the inaccuracies and errors of robust optical flow when tracking moving objects. Visibility shifting and texture drifting are effectively alleviated using a graph-cut segmentation algorithm and a global optimization that smooths the trajectories of the tracked points over the temporal domain. Our experimental results show that the proposed approach can generate visually pleasing results for both images and video.
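The abstract states that the luminance of the input image is preserved through color transfer in the YCbCr color space, but the record gives no formulation. The sketch below is a minimal illustration of one plausible reading, not the authors' implementation: the already mesh-warped texture supplies the chrominance (Cb/Cr), while the original image's luminance, normalized by its mean over the region, modulates the texture's luminance so that shading carries over. The function names, the BT.601 conversion, and the shading-ratio heuristic are all illustrative assumptions.

import numpy as np

# BT.601 full-range RGB <-> YCbCr conversion (standard formulas; the record
# does not specify the exact color pipeline beyond "YCbCr color space").
def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycbcr):
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1] - 128.0, ycbcr[..., 2] - 128.0
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 255.0)

def retexture_region(image, warped_texture, mask):
    """Composite an already mesh-warped texture into the masked region while
    keeping the shading of the original image (hypothetical helper).

    image          : HxWx3 RGB input frame, values in 0..255
    warped_texture : HxWx3 new texture, already distorted to fit the region
    mask           : HxW boolean region of interest
    """
    img_ycc = rgb_to_ycbcr(image.astype(np.float64))
    tex_ycc = rgb_to_ycbcr(warped_texture.astype(np.float64))

    # Take the texture's chrominance as-is; modulate its luminance by the
    # ratio of the image's luminance to its regional mean, which serves here
    # as a crude proxy for the lighting variation over the surface.
    out = tex_ycc.copy()
    region_mean = img_ycc[..., 0][mask].mean() + 1e-6
    shading = img_ycc[..., 0] / region_mean
    out[..., 0] = np.clip(tex_ycc[..., 0] * shading, 0.0, 255.0)

    result = image.astype(np.float64).copy()
    result[mask] = ycbcr_to_rgb(out)[mask]
    return result.astype(np.uint8)

Because the step operates only on the decomposed channels, any new texture can be swapped in without reprocessing the input image, which matches the texture-independence claim in the summary.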
Description: Date Completed 02.04.2008
Date Revised 03.11.2009
Published: Print
Citation Status: PubMed-not-MEDLINE
ISSN:1941-0506
DOI:10.1109/TVCG.2007.70438