Class Agnostic Image Common Object Detection
Published in: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - (2019), 09 Jan. |
---|---|
Format: | Online article |
Language: | English |
Published: | 2019 |
Parent work: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society |
Keywords: | Journal Article |
Abstract: | Learning the similarity of two images is an important problem in computer vision with many potential applications. Most previous work generates image similarities in three ways: global feature distance computation, local feature matching, and image concept comparison. However, the task of directly detecting class-agnostic common objects from two images has not been studied before; it goes one step further by capturing image similarity at the region level. In this paper, we propose an end-to-end image Common Object Detection Network (CODN) to detect class-agnostic common objects in two images. The proposed method consists of two main modules: a locating module and a matching module. The locating module generates candidate proposals for each of the two images. The matching module learns the similarities of candidate proposal pairs across the two images and refines the bounding boxes of the candidate proposals. The learning procedure of CODN is implemented in an integrated way, and a multi-task loss is designed to guarantee both region localization and common object matching. Experiments are conducted on the PASCAL VOC 2007 and COCO 2014 datasets, and the results validate the effectiveness of the proposed method. |
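The abstract describes a two-stage pipeline: proposal features from the two images are compared pairwise, and a multi-task loss combines a matching term with a box-refinement term. The sketch below illustrates that joint objective in spirit only; it is not the paper's implementation. The cosine-similarity matching head, the binary cross-entropy matching loss, the smooth-L1 box loss, and the weight `alpha` are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def cosine_similarity_matrix(feats_a, feats_b):
    """Pairwise cosine similarity between proposal features of two images.

    feats_a: (Na, D) features for image A's proposals
    feats_b: (Nb, D) features for image B's proposals
    Returns an (Na, Nb) similarity matrix (illustrative matching head).
    """
    a = feats_a / np.linalg.norm(feats_a, axis=1, keepdims=True)
    b = feats_b / np.linalg.norm(feats_b, axis=1, keepdims=True)
    return a @ b.T

def multi_task_loss(sim, match_labels, box_pred, box_target, alpha=1.0):
    """Toy multi-task objective in the spirit of CODN's joint loss:
    binary cross-entropy over proposal-pair match labels plus a
    smooth-L1 box refinement term, weighted by `alpha` (an assumption).
    """
    eps = 1e-8
    p = 1.0 / (1.0 + np.exp(-sim))  # match probability per proposal pair
    bce = -np.mean(match_labels * np.log(p + eps)
                   + (1.0 - match_labels) * np.log(1.0 - p + eps))
    diff = np.abs(box_pred - box_target)
    smooth_l1 = np.mean(np.where(diff < 1.0, 0.5 * diff ** 2, diff - 0.5))
    return bce + alpha * smooth_l1
```

Training on both terms at once is what lets the matching signal influence box refinement: pairs judged likely matches also drive the localization gradient, rather than the two tasks being learned in isolation.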
Description: | Date Revised: 27.02.2024; Published: Print-Electronic; Citation Status: Publisher |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2019.2891124 |