Label Consistent Matrix Factorization Hashing for Large-Scale Cross-Modal Similarity Search

Multimodal hashing has attracted much interest for cross-modal similarity search on large-scale multimedia data sets because of its efficiency and effectiveness. Recently, supervised multimodal hashing, which tries to preserve the semantic information obtained from the labels of training data, has received considerable attention for its higher search accuracy compared with unsupervised multimodal hashing.
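The full abstract (520 field below) describes the paper's idea at a high level: guide a collective matrix factorization with class labels so that same-category samples from different modalities share one latent representation, which is then quantized into binary hash codes. The following is a minimal, illustrative NumPy sketch of that general idea, not the authors' LCMFH objective or solver; the alternating ridge-regularized updates, the label-projection matrix A, the weight alpha, and the mean-thresholding binarization are all simplifying assumptions made here for illustration.

# Sketch: label-guided collective matrix factorization hashing (illustrative only).
import numpy as np

def label_guided_mf_hashing(X1, X2, L, n_bits=16, alpha=1.0, n_iter=50, seed=0):
    """Learn a shared latent representation V for two modalities X1, X2
    (features x samples), guided by a label matrix L (classes x samples),
    then binarize V into hash codes by thresholding at the per-bit mean."""
    rng = np.random.default_rng(seed)
    d1, n = X1.shape
    d2, _ = X2.shape
    c, _ = L.shape
    U1 = rng.standard_normal((d1, n_bits))   # modality-1 basis
    U2 = rng.standard_normal((d2, n_bits))   # modality-2 basis
    A  = rng.standard_normal((c, n_bits))    # label-to-latent projection
    V  = rng.standard_normal((n_bits, n))    # shared latent codes
    I  = np.eye(n_bits)
    for _ in range(n_iter):
        # Ridge-regularized least-squares updates (an assumed alternating scheme).
        U1 = X1 @ V.T @ np.linalg.inv(V @ V.T + 1e-3 * I)
        U2 = X2 @ V.T @ np.linalg.inv(V @ V.T + 1e-3 * I)
        A  = L  @ V.T @ np.linalg.inv(V @ V.T + 1e-3 * I)
        # V balances reconstruction of both modalities and label consistency.
        M = U1.T @ U1 + U2.T @ U2 + alpha * (A.T @ A) + 1e-3 * I
        V = np.linalg.inv(M) @ (U1.T @ X1 + U2.T @ X2 + alpha * A.T @ L)
    codes = (V > V.mean(axis=1, keepdims=True)).astype(np.uint8)
    return codes, (U1, U2, A)

# Toy usage: 100 samples, two modalities, 5 classes with one-hot labels.
X1 = np.random.rand(64, 100)
X2 = np.random.rand(32, 100)
L = np.eye(5)[np.random.randint(0, 5, 100)].T
codes, _ = label_guided_mf_hashing(X1, X2, L, n_bits=16)
print(codes.shape)  # (16, 100): one 16-bit code per sample

Same-category samples pull toward the same column of A's span through the alpha * A.T @ L term, which is the "label consistent" intuition the abstract refers to.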

Detailed description

Bibliographic details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 41(2019), 10, of 10 Oct., pages 2466-2479
First author: Wang, Di (author)
Other authors: Gao, Xinbo, Wang, Xiumei, He, Lihuo
Format: Online article
Language: English
Published: 2019
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM286992027
003 DE-627
005 20231225053012.0
007 cr uuu---uuuuu
008 231225s2019 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2018.2861000  |2 doi 
028 5 2 |a pubmed24n0956.xml 
035 |a (DE-627)NLM286992027 
035 |a (NLM)30059294 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Wang, Di  |e verfasserin  |4 aut 
245 1 0 |a Label Consistent Matrix Factorization Hashing for Large-Scale Cross-Modal Similarity Search 
264 1 |c 2019 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 23.09.2019 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Multimodal hashing has attracted much interest for cross-modal similarity search on large-scale multimedia data sets because of its efficiency and effectiveness. Recently, supervised multimodal hashing, which tries to preserve the semantic information obtained from the labels of training data, has received considerable attention for its higher search accuracy compared with unsupervised multimodal hashing. Although these algorithms are promising, they are mainly designed to preserve pairwise similarities. When semantic labels of training data are given, the algorithms often transform the labels into pairwise similarities, which gives rise to the following problems: (1) constructing a pairwise similarity matrix requires enormous storage space and a large amount of calculation, making these methods unscalable to large-scale data sets; (2) transforming labels into pairwise similarities loses the category information of the training data. Therefore, these methods do not enable the hash codes to preserve the discriminative information reflected by labels and, hence, the retrieval accuracies of these methods are affected. To address these challenges, this paper introduces a simple yet effective supervised multimodal hashing method, called label consistent matrix factorization hashing (LCMFH), which focuses on directly utilizing semantic labels to guide the hashing learning procedure. Considering that relevant data from different modalities have semantic correlations, LCMFH transforms heterogeneous data into latent semantic spaces in which multimodal data from the same category share the same representation. Therefore, hash codes quantified by the obtained representations are consistent with the semantic labels of the original data and, thus, can have more discriminative power for cross-modal similarity search tasks. Thorough experiments on standard databases show that the proposed algorithm outperforms several state-of-the-art methods. 
650 4 |a Journal Article 
700 1 |a Gao, Xinbo  |e verfasserin  |4 aut 
700 1 |a Wang, Xiumei  |e verfasserin  |4 aut 
700 1 |a He, Lihuo  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 41(2019), 10 vom: 10. Okt., Seite 2466-2479  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:41  |g year:2019  |g number:10  |g day:10  |g month:10  |g pages:2466-2479 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2018.2861000  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 41  |j 2019  |e 10  |b 10  |c 10  |h 2466-2479