Scalable Face Image Retrieval with Identity-Based Quantization and Multireference Reranking

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - 33(2011), 10, dated 07 Oct., pages 1991-2001
First author: Wu, Zhong (Author)
Other authors: Ke, Qifa, Sun, Jian, Shum, Heung-Yeung
Format: Online article
Language: English
Published: 2011
Access to the parent work: IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords: Journal Article
Description
Abstract: State-of-the-art image retrieval systems achieve scalability by using a bag-of-words representation and textual retrieval methods, but their performance degrades quickly in the face image domain, mainly because they produce visual words with low discriminative power for face images and ignore the special properties of faces. The leading features for face recognition can achieve good retrieval performance, but these features are not suitable for inverted indexing because they are high-dimensional and global, and thus not scalable in either computational or storage cost. In this paper, we aim to build a scalable face image retrieval system. For this purpose, we develop a new scalable face representation using both local and global features. In the indexing stage, we exploit special properties of faces to design new component-based local features, which are subsequently quantized into visual words using a novel identity-based quantization scheme. We also use a very small Hamming signature (40 bytes) to encode the discriminative global feature for each face. In the retrieval stage, candidate images are first retrieved from the inverted index of visual words. We then use a new multireference distance to rerank the candidate images using the Hamming signature. On a one million face database, we show that our local features and global Hamming signatures are complementary: the inverted index based on local features provides candidate images with good recall, while the multireference reranking with the global Hamming signature leads to good precision. As a result, our system is not only scalable but also outperforms a linear scan retrieval system using the state-of-the-art face recognition feature in terms of quality.
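To make the two-stage pipeline described in the abstract concrete, the following minimal Python sketch (not the authors' code) shows how candidates might be recalled from an inverted index over quantized local features and then reranked with compact Hamming signatures. The class and helper names, the vote-count candidate scoring, the 40-byte signature check, and the rule of using the query plus the top-voted candidates as the reference set are all illustrative assumptions, not details taken from the paper.

from collections import defaultdict

SIGNATURE_BYTES = 40  # the abstract's 40-byte (320-bit) global Hamming signature

def hamming(a, b):
    # Bit-level Hamming distance between two equal-length byte strings.
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

class FaceIndex:
    def __init__(self):
        self.inverted_index = defaultdict(set)  # visual word -> ids of faces containing it
        self.signatures = {}                    # face id -> 40-byte Hamming signature

    def add(self, face_id, visual_words, signature):
        assert len(signature) == SIGNATURE_BYTES
        for word in visual_words:
            self.inverted_index[word].add(face_id)
        self.signatures[face_id] = signature

    def query(self, visual_words, signature, num_references=5, top_k=20):
        # Stage 1 (recall): fetch candidates from the inverted index, scored by
        # how many quantized local features they share with the query.
        votes = defaultdict(int)
        for word in visual_words:
            for face_id in self.inverted_index[word]:
                votes[face_id] += 1
        candidates = sorted(votes, key=votes.get, reverse=True)

        # Stage 2 (precision): multireference reranking. Here the query signature
        # plus the top-voted candidates serve as references, and every candidate
        # is reordered by its smallest Hamming distance to that reference set;
        # the reference selection and aggregation rule are simplifying assumptions.
        references = [signature] + [self.signatures[c] for c in candidates[:num_references]]
        return sorted(
            candidates,
            key=lambda fid: min(hamming(self.signatures[fid], ref) for ref in references),
        )[:top_k]

As a usage illustration, index.add(7, {12, 98, 301}, os.urandom(40)) would register a face whose local features were quantized to visual words 12, 98, and 301 with a random placeholder signature, and index.query(words, sig) would return the reranked top results.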
Description: Date Completed 11.03.2016
Date Revised 01.03.2022
published: Print-Electronic
Citation Status MEDLINE
ISSN:1939-3539
DOI:10.1109/TPAMI.2011.111