Design and estimation of coded exposure point spread functions


Detailed Description

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 34(2012), no. 10, 15 Oct., pages 2071-7
Lead author: McCloskey, Scott (Author)
Other authors: Ding, Yuanyuan; Yu, Jingyi
Format: Online article
Language: English
Published: 2012
Access to parent work: IEEE transactions on pattern analysis and machine intelligence
Keywords: Journal Article; Research Support, U.S. Gov't, Non-P.H.S.
Description
Abstract: We address the problem of motion deblurring using coded exposure. This approach allows for accurate estimation of a sharp latent image via well-posed deconvolution and avoids lost image content that cannot be recovered from images acquired with a traditional shutter. Previous work in this area has used either manual user input or alpha matting approaches to estimate the coded exposure Point Spread Function (PSF) from the captured image. In order to automate deblurring and to avoid the limitations of matting approaches, we propose a Fourier-domain statistical approach to coded exposure PSF estimation that allows us to estimate the latent image in cases of constant velocity, constant acceleration, and harmonic motion. We further demonstrate that previously used criteria to choose a coded exposure PSF do not produce one with optimal reconstruction error, and that an additional 30 percent reduction in Root Mean Squared Error (RMSE) of the latent image estimate can be achieved by incorporating natural image statistics.
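The well-posed deconvolution that the abstract refers to can be illustrated with a minimal sketch: a binary shutter code is expanded into a motion PSF, a signal is blurred by it, and a frequency-domain Wiener-style inverse recovers the latent signal. This is a 1D illustration under assumed circular blur; the shutter code, the `snr` regularizer, and all function names here are illustrative, not the paper's actual code or method.

```python
import numpy as np

def coded_psf(code, blur_len):
    """Expand a binary shutter code into a normalized motion PSF of length blur_len."""
    psf = np.repeat(np.asarray(code, dtype=float), blur_len // len(code) + 1)[:blur_len]
    return psf / psf.sum()

def wiener_deconv(blurred, psf, snr=1e-2):
    """Frequency-domain Wiener deconvolution (assumes circular convolution)."""
    n = len(blurred)
    H = np.fft.fft(psf, n)                      # PSF transfer function
    G = np.fft.fft(blurred)                     # observed spectrum
    F = np.conj(H) / (np.abs(H) ** 2 + snr) * G # regularized inverse filter
    return np.real(np.fft.ifft(F))

rng = np.random.default_rng(0)
signal = rng.random(256)                         # synthetic latent signal
code = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]            # hypothetical fluttered-shutter code
psf = coded_psf(code, 32)
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(psf, 256)))
restored = wiener_deconv(blurred, psf)
rmse_blurred = np.sqrt(np.mean((blurred - signal) ** 2))
rmse_restored = np.sqrt(np.mean((restored - signal) ** 2))
```

A coded (broadband) PSF keeps `H` bounded away from zero across frequencies, which is what makes the inversion well posed; a flat box PSF from a traditional shutter has spectral nulls that permanently destroy those frequencies.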
Description: Date Completed 04.03.2013
Date Revised 06.12.2012
published: Print
Citation Status PubMed-not-MEDLINE
ISSN:1939-3539