Mutual Learning Between Saliency and Similarity: Image Cosegmentation via Tree Structured Sparsity and Tree Graph Matching
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society (since 1992), Vol. 27 (2018), No. 9, 3 Sept., pp. 4690-4704
First author:
Other authors:
Format: Online article
Language: English
Published: 2018
Parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Abstract: This paper proposes a unified mutual learning framework based on image hierarchies, which integrates structured sparsity with tree-graph matching to address the problem of weakly supervised image cosegmentation. We focus on the interaction between two common-object properties: saliency and similarity. Most existing cosegmentation methods emphasize only one of them. The proposed method learns the prior knowledge for structured sparsity with the help of tree-graph matching, which is capable of generating object-oriented salient regions. Meanwhile, it also reduces the search space and computational complexity of tree-graph matching with the aid of structured sparsity. We intend to thoroughly exploit the hierarchically geometrical relationships of coherent objects. Experimental results compared with the state of the art on benchmark datasets confirm that the mutual learning framework is capable of effectively delineating co-existing object patterns in multiple images.
Description: Date Revised: 20.11.2019; Published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2018.2842207
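
The abstract describes an alternating scheme: tree-structured sparsity supplies salient regions that shrink the search space of tree-graph matching, and the resulting cross-image matches feed back as a saliency prior for the next sparsity step. The self-contained toy sketch below (Python with NumPy) illustrates only that interaction; the random node features, the top-k-plus-ancestors rule standing in for tree-structured sparsity, the greedy cosine matching standing in for tree-graph matching, and all names (make_tree, sparsity_step, matching_step) are simplifying assumptions, not the authors' formulation.

```python
# Toy of the alternating saliency/similarity loop sketched in the abstract.
# Every modeling choice here is an assumption made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def make_tree(n_nodes, dim=8):
    """Hypothetical image hierarchy: node i > 0 gets a random parent j < i,
    and every node carries a feature vector (e.g., a region descriptor)."""
    parents = [-1] + [int(rng.integers(0, i)) for i in range(1, n_nodes)]
    feats = rng.normal(size=(n_nodes, dim))
    return parents, feats

def sparsity_step(parents, feats, prior, keep=4):
    """Stand-in for tree-structured sparse selection: keep the nodes with the
    highest prior-weighted feature magnitudes, then close the set under
    ancestors so the selection respects the hierarchy."""
    scores = prior * np.linalg.norm(feats, axis=1)
    selected = set(int(i) for i in np.argsort(scores)[-keep:])
    for i in list(selected):          # walk up to the root from each kept node
        j = parents[i]
        while j != -1:
            selected.add(j)
            j = parents[j]
    return sorted(selected)

def matching_step(feats_a, sel_a, feats_b, sel_b, thresh=0.5):
    """Stand-in for tree-graph matching: greedily pair the *selected* nodes of
    the two hierarchies by cosine similarity. Restricting candidates to the
    selected nodes is the reduced search space the abstract mentions."""
    a = feats_a[sel_a] / np.linalg.norm(feats_a[sel_a], axis=1, keepdims=True)
    b = feats_b[sel_b] / np.linalg.norm(feats_b[sel_b], axis=1, keepdims=True)
    sim = a @ b.T
    pairs = []
    while sim.max() > thresh:                     # one-to-one greedy matching
        i, j = np.unravel_index(sim.argmax(), sim.shape)
        pairs.append((sel_a[i], sel_b[j]))
        sim[i, :], sim[:, j] = -np.inf, -np.inf
    return pairs

# Two synthetic "images"; copying a few features from tree A into tree B
# plants a common object for the loop to co-discover.
pa, fa = make_tree(12)
pb, fb = make_tree(12)
fb[3:6] = fa[2:5]

prior_a, prior_b = np.ones(len(fa)), np.ones(len(fb))
for it in range(3):                               # the mutual-learning loop
    sel_a = sparsity_step(pa, fa, prior_a)
    sel_b = sparsity_step(pb, fb, prior_b)
    pairs = matching_step(fa, sel_a, fb, sel_b)
    for i, j in pairs:                            # matches raise the saliency
        prior_a[i] += 1.0                         # prior for the next round
        prior_b[j] += 1.0
    print(f"iter {it}: matched node pairs {pairs}")
```

Restricting the similarity matrix to nodes kept by the sparsity step is what shrinks the matching search space in this sketch; in the paper that role is played by the learned structured-sparsity support rather than a fixed top-k rule.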