A Multiscale Approach to Deep Blind Image Quality Assessment


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 32 (2023), dated: 04., pp. 1656-1667
Main Author: Liu, Manni (Author)
Other Authors: Huang, Jiabin, Zeng, Delu, Ding, Xinghao, Paisley, John
Format: Online Article
Language: English
Published: 2023
Parent Work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: Faithful measurement of perceptual quality is of significant importance to various multimedia applications. By fully utilizing reference images, full-reference image quality assessment (FR-IQA) methods usually achieve better prediction performance. No-reference image quality assessment (NR-IQA), also known as blind image quality assessment (BIQA), does not use a reference image, which makes it a challenging but important task. Previous NR-IQA methods have focused on spatial measures at the expense of information in the available frequency bands. In this paper, we present a multiscale deep blind image quality assessment method (BIQA, M.D.) with spatial optimal-scale filtering analysis. Motivated by the multi-channel behavior of the human visual system and the contrast sensitivity function, we decompose an image into a number of spatial frequency bands through multiscale filtering and extract features that map an image to its subjective quality score by applying a convolutional neural network. Experimental results show that BIQA, M.D. compares well with existing NR-IQA methods and generalizes well across datasets.
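The abstract's decomposition of an image into spatial frequency bands via multiscale filtering can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a simple difference-of-Gaussians pyramid (a common way to obtain band-pass channels), with `num_bands` and the base `sigma` chosen arbitrarily for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def frequency_bands(image, num_bands=4, sigma=1.0):
    """Split an image into spatial frequency bands via differences of Gaussians.

    Returns num_bands arrays: (num_bands - 1) band-pass detail layers
    at successively coarser scales, plus one low-pass residual.
    """
    bands = []
    current = image.astype(np.float64)
    for i in range(num_bands - 1):
        blurred = gaussian_filter(current, sigma=sigma * (2 ** i))
        bands.append(current - blurred)  # band-pass detail at this scale
        current = blurred
    bands.append(current)  # residual low-pass band
    return bands

# The bands telescope, so they sum back to the original image.
img = np.random.rand(32, 32)
bands = frequency_bands(img)
recon = sum(bands)
print(np.allclose(recon, img))  # True
```

In a pipeline like the one the abstract describes, each band (or a stack of all bands) would then be fed to a convolutional network that regresses the subjective quality score; the decomposition itself is the part sketched here.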
Description: Date Revised 04.04.2025
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2023.3245991