FATE: Learning Effective Binary Descriptors with Group Fairness
Published in: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - PP(2024), from: 04 June |
---|---|
Author: | |
Other authors: | , , |
Format: | Online article |
Language: | English |
Published: | 2024 |
Access to the parent work: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society |
Subjects: | Journal Article |
Abstract: | Hashing has received significant interest in large-scale data retrieval due to its outstanding computational efficiency. Recently, numerous deep hashing approaches have emerged and obtained impressive performance. However, these approaches can carry ethical risks during image retrieval. To address this, we are the first to study the problem of group fairness within learning to hash and introduce a novel method termed Fairness-aware Hashing with Mixture of Experts (FATE). Specifically, FATE leverages the mixture-of-experts framework as the hashing network, where each expert contributes knowledge from an individual viewpoint, followed by aggregation using the gating mechanism. This strongly enhances model capability, facilitating the generation of both discriminative and unbiased binary descriptors. We also incorporate fairness-aware contrastive learning, combining sensitive labels with feature similarities to ensure unbiased hash code learning. Furthermore, an adversarial learning objective conditioned on both deep features and hash codes is employed to further eliminate group biases. Extensive experiments on several benchmark datasets validate the superiority of the proposed FATE compared with various state-of-the-art approaches. |
---|---|
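The mixture-of-experts hashing idea in the abstract (per-expert projections aggregated by a gating mechanism, then binarized) can be sketched minimally as follows. This is an illustrative toy, not the paper's FATE implementation: the linear experts, softmax gate, and all dimensions here are assumptions for demonstration only.

```python
import numpy as np

def sign_hash(x):
    # Binarize continuous outputs into {-1, +1} hash bits.
    return np.where(x >= 0, 1, -1)

class MoEHasher:
    """Toy mixture-of-experts hashing head (illustrative sketch, not FATE itself).

    Each expert is a random linear projection; a softmax gate weights the
    expert outputs before binarization, mirroring the abstract's
    "aggregation using the gating mechanism".
    """
    def __init__(self, dim, bits, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        # One linear projection per expert: (dim, bits) each.
        self.experts = [rng.standard_normal((dim, bits)) for _ in range(n_experts)]
        # Gating network: maps features to one weight per expert.
        self.gate = rng.standard_normal((dim, n_experts))

    def __call__(self, x):
        logits = x @ self.gate                       # (batch, n_experts)
        w = np.exp(logits - logits.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)            # softmax gate weights
        # Weighted sum of expert outputs, then binarization.
        mixed = sum(w[:, [i]] * (x @ E) for i, E in enumerate(self.experts))
        return sign_hash(mixed)                      # (batch, bits) in {-1, +1}

# Hash 5 random 16-d feature vectors into 8-bit binary descriptors.
features = np.random.default_rng(1).standard_normal((5, 16))
codes = MoEHasher(dim=16, bits=8, n_experts=4)(features)
```

In the paper the experts and gate are trained jointly (with the fairness-aware contrastive and adversarial objectives); here they are fixed random matrices purely to show the data flow.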
Description: | Date revised: 04.06.2024; published: Print-Electronic; citation status: Publisher |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2024.3406134 |