Underwater Image Restoration Based on Image Blurriness and Light Absorption

Bibliographic Details

Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 26 (2017), No. 4, 26 Apr., pp. 1579-1594
First author: Peng, Yan-Tsung (Author)
Other authors: Cosman, Pamela C.
Format: Online article
Language: English
Published: 2017
Access to parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: Underwater images often suffer from color distortion and low contrast because light is scattered and absorbed when traveling through water. Such images with different color tones can be shot in various lighting conditions, making restoration and enhancement difficult. We propose a depth estimation method for underwater scenes based on image blurriness and light absorption, which can be used in the image formation model (IFM) to restore and enhance underwater images. Previous IFM-based image restoration methods estimate scene depth based on the dark channel prior or the maximum intensity prior; these priors are frequently invalidated by the lighting conditions in underwater images, leading to poor restoration results. The proposed method estimates underwater scene depth more accurately. Experimental results on restoring real and synthesized underwater images demonstrate that the proposed method outperforms other IFM-based underwater image restoration methods.
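
For illustration only, the sketch below outlines how an IFM-based restoration pipeline of this general kind can look: a rough blurriness-derived depth proxy, a background light estimated from the farthest-looking pixels, and inversion of the standard underwater image formation model I_c(x) = J_c(x) t(x) + B_c (1 - t(x)). This is a minimal sketch assuming OpenCV/NumPy, a single attenuation coefficient, and a simple sharpness-based depth proxy; the function names (sharpness_map, restore_underwater) are hypothetical, and this is not the authors' exact algorithm.

import cv2
import numpy as np

def sharpness_map(gray, ksize=9):
    # Per-pixel sharpness: local average of |image - Gaussian-blurred image|.
    # Low values indicate blurry (assumed distant) regions.
    blurred = cv2.GaussianBlur(gray, (ksize, ksize), 0)
    return cv2.blur(np.abs(gray - blurred), (ksize, ksize))

def restore_underwater(img_bgr, beta=1.0, t_min=0.1):
    img = img_bgr.astype(np.float32) / 255.0
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Blurrier regions are treated as farther away (crude depth proxy).
    sharp = sharpness_map(gray)
    depth = 1.0 - (sharp - sharp.min()) / (sharp.max() - sharp.min() + 1e-6)

    # Background light B: mean color of the 0.1% farthest-looking pixels.
    far = np.argsort(depth.ravel())[-max(1, depth.size // 1000):]
    B = img.reshape(-1, 3)[far].mean(axis=0)

    # Transmission from depth with one attenuation coefficient (a simplification;
    # the wavelength-dependent absorption discussed in the paper is omitted here).
    t = np.maximum(np.exp(-beta * depth)[..., None], t_min)

    # Invert the image formation model: J = (I - B) / t + B.
    J = (img - B) / t + B
    return np.clip(J * 255.0, 0.0, 255.0).astype(np.uint8)

# Example usage (hypothetical file names):
# out = restore_underwater(cv2.imread("underwater.jpg"))
# cv2.imwrite("restored.jpg", out)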
Description: Date Completed 30.07.2018
Date Revised 30.07.2018
Published: Print-Electronic
Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2017.2663846