Multi-Image Blind Super-Resolution of 3D Scenes


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 26 (2017), No. 11, dated 01 Nov., pages 5337-5352
First Author: Punnappurath, Abhijith (Author)
Other Authors: Nimisha, Thekke Madam, Rajagopalan, Ambasamudram Narayanan
Format: Online Article
Language: English
Published: 2017
Access to parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Description
Abstract: We address the problem of estimating the latent high-resolution (HR) image of a 3D scene from a set of non-uniformly motion blurred low-resolution (LR) images captured in burst mode using a hand-held camera. Existing blind super-resolution (SR) techniques that account for motion blur are restricted to fronto-parallel planar scenes. We initially develop an SR motion blur model to explain the image formation process in 3D scenes. We then use this model to solve for the three unknowns: the camera trajectories, the depth map of the scene, and the latent HR image. We first compute the global HR camera motion corresponding to each LR observation from patches lying on a reference depth layer in the input images. Using the estimated trajectories, we compute the latent HR image and the underlying depth map iteratively using an alternating minimization framework. Experiments on synthetic and real data reveal that our proposed method outperforms the state-of-the-art techniques by a significant margin.
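The alternating minimization described in the abstract fixes one unknown while updating the other. The sketch below illustrates only the HR-image update under a simplified fronto-parallel forward model (per-frame motion-blur kernel followed by decimation); the camera-trajectory and depth-map estimation steps of the paper are not reproduced, the blur kernels are assumed known and normalized, and all function names (forward, adjoint, update_hr_image) are illustrative placeholders rather than the authors' code.

import numpy as np
from scipy.ndimage import convolve, zoom

def forward(x, kernel, scale):
    # Blur the HR estimate with this frame's motion-blur kernel, then decimate.
    return convolve(x, kernel, mode="reflect")[::scale, ::scale]

def adjoint(r, kernel, scale, hr_shape):
    # Approximate adjoint of `forward`: zero-fill upsampling, then correlation
    # (convolution with the flipped kernel).
    up = np.zeros(hr_shape)
    up[::scale, ::scale] = r
    return convolve(up, kernel[::-1, ::-1], mode="reflect")

def update_hr_image(lr_frames, kernels, scale, n_iters=200, step=0.5):
    # Gradient descent on sum_k ||D B_k x - y_k||^2 with the kernels held fixed;
    # kernels are assumed to sum to 1, so the chosen step size is stable.
    hr_shape = tuple(s * scale for s in lr_frames[0].shape)
    x = zoom(lr_frames[0], scale, order=1)  # bilinear initialization
    for _ in range(n_iters):
        grad = np.zeros(hr_shape)
        for y, k in zip(lr_frames, kernels):
            grad += adjoint(forward(x, k, scale) - y, k, scale, hr_shape)
        x -= step * grad / len(lr_frames)
    return x

In the full method, an outer loop would alternate this image update with a depth-map update and use the per-frame camera trajectories, rather than a single blur kernel per frame, to build the blur operator.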
Description: Date Completed 30.07.2018
Date Revised 30.07.2018
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2017.2723243