Color matching images with unknown non-linear encodings


Full Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - (2020), 28 Jan.
Main Author: Rodriguez, Raquel Gil (Author)
Other Authors: Vazquez-Corral, Javier; Bertalmio, Marcelo
Format: Online article
Language: English
Published: 2020
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Summary: We present a color matching method that deals with different non-linear encodings. In particular, given two different views of the same scene taken by two cameras with unknown settings and internal parameters, and encoded with unknown non-linear curves, our method corrects the colors of one of the images so that it looks as if it had been captured under the other camera's settings. Our method treats the in-camera color processing pipeline as a matrix multiplication applied to the linear image, followed by a non-linearity. This allows us to model a color stabilization transformation between the two shots by estimating a single matrix, which contains information from both of the original images, together with an extra parameter that accounts for the non-linearity. The method is fast and produces no spurious colors. It outperforms the state of the art both visually and according to several metrics, and it can handle HDR encodings and very challenging real-life examples.
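The model described in the summary (a single 3x3 matrix acting on linearized pixel values plus one parameter for the non-linearity) can be illustrated with a small fitting sketch. The sketch below is not the paper's algorithm: it assumes the unknown non-linearity is a plain gamma curve, that corresponding pixel pairs between the two views are already available (e.g. from feature matching), and the function names (fit_matrix_and_gamma, apply_transform) are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_matrix_and_gamma(src, dst, g0=2.2):
    """Fit a 3x3 matrix H and a scalar exponent g such that
    dst ~ (H @ src**g) ** (1/g), given matched pixel pairs.

    src, dst: (N, 3) arrays of corresponding RGB values in [0, 1].
    Assumes the non-linearity is a simple gamma curve (not the
    paper's exact formulation).
    """
    def residuals(p):
        H = p[:9].reshape(3, 3)
        g = p[9]
        lin = np.clip(src, 1e-6, 1.0) ** g                     # undo assumed gamma
        mapped = np.clip(lin @ H.T, 1e-6, None) ** (1.0 / g)   # apply matrix, re-encode
        return (mapped - dst).ravel()

    p0 = np.concatenate([np.eye(3).ravel(), [g0]])             # identity matrix, initial gamma
    sol = least_squares(residuals, p0)
    return sol.x[:9].reshape(3, 3), sol.x[9]

def apply_transform(img, H, g):
    """Apply the fitted matrix and gamma to a whole (H, W, 3) image in [0, 1]."""
    lin = np.clip(img, 1e-6, 1.0) ** g
    out = np.clip(lin.reshape(-1, 3) @ H.T, 1e-6, None) ** (1.0 / g)
    return out.reshape(img.shape)
```

Under these assumptions, the fitted H and g are applied to every pixel of the source image to approximate the target camera's rendering, which is the kind of single-matrix-plus-one-parameter correction the abstract describes.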
Description: Date Revised 27.02.2024
Published: Print-Electronic
Citation Status: Publisher
ISSN:1941-0042
DOI:10.1109/TIP.2020.2968766