Deep Dichromatic Model Estimation Under AC Light Sources



Bibliographic Details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society (1992-). Vol. 30 (2021), pp. 7064-7073
First Author: Yoo, Jun-Sang (author)
Other Authors: Lee, Chan-Ho; Kim, Jong-Ok
Format: Online Article
Language: English
Published: 2021
Access to parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: The dichromatic reflection model has been widely exploited for computer vision tasks such as color constancy and highlight removal. However, dichromatic model estimation is a severely ill-posed problem. Thus, several assumptions have commonly been made to estimate the dichromatic model, such as white light (highlight removal) and the existence of highlight regions (color constancy). In this paper, we propose a spatio-temporal deep network to estimate the dichromatic parameters under AC light sources. The minute illumination variations can be captured with a high-speed camera. The proposed network is composed of two sub-network branches. From high-speed video frames, each branch generates chromaticity and coefficient matrices, which correspond to the dichromatic image model. These two separate branches are jointly learned with spatio-temporal regularization. As far as we know, this is the first work that aims to estimate all dichromatic parameters in computer vision. To validate the model estimation accuracy, it is applied to color constancy and highlight removal. Both experimental results show that the dichromatic model can be estimated accurately via the proposed deep network.
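The chromaticity and coefficient matrices mentioned in the abstract correspond to the two terms of the dichromatic reflection model, which expresses each pixel as a sum of body (diffuse) and surface (specular) reflection. The following is a minimal illustrative sketch of that composition, not the paper's implementation; all function and variable names are assumptions introduced for illustration.

```python
import numpy as np

# Dichromatic reflection model (illustrative form, per pixel x):
#   I(x) = m_d(x) * c_d(x) + m_s(x) * c_s
# c_d : diffuse (body) chromaticity per pixel, RGB
# c_s : illuminant chromaticity, shared across the image
# m_d, m_s : scalar reflection coefficients per pixel

def dichromatic_render(m_d, m_s, c_d, c_s):
    """Compose an RGB image from dichromatic parameters.

    m_d, m_s : (H, W) coefficient maps
    c_d      : (H, W, 3) diffuse chromaticity
    c_s      : (3,) illuminant chromaticity
    """
    return m_d[..., None] * c_d + m_s[..., None] * c_s

# Toy example: a 2x2 patch with pure-red body color under a neutral light.
H, W = 2, 2
c_d = np.zeros((H, W, 3))
c_d[..., 0] = 1.0                       # red diffuse chromaticity
c_s = np.array([1/3, 1/3, 1/3])         # neutral illuminant chromaticity
m_d = np.full((H, W), 0.6)              # diffuse coefficient
m_s = np.full((H, W), 0.3)              # specular coefficient

img = dichromatic_render(m_d, m_s, c_d, c_s)
print(img[0, 0])  # [0.7 0.1 0.1]
```

Estimating the dichromatic parameters amounts to inverting this composition: recovering `m_d`, `m_s`, `c_d`, and `c_s` from observed images, which the paper does by exploiting the temporal intensity variation of AC-powered lights across high-speed frames.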
Description: Date Revised 11.08.2021
Published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2021.3100550