Deep Bilateral Filtering Network for Point-Supervised Semantic Segmentation in Remote Sensing Images

Semantic segmentation methods based on deep neural networks have achieved great success in recent years. However, training such deep neural networks relies heavily on a large number of images with accurate pixel-level labels, which requires a huge amount of human effort, especially for large-scale remote sensing images. In this paper, we propose a point-based weakly supervised learning framework called the deep bilateral filtering network (DBFNet) for the semantic segmentation of remote sensing images. Compared with pixel-level labels, point annotations are usually sparse and cannot reveal the complete structure of the objects; they also lack boundary information, thus resulting in incomplete prediction within the object and the loss of object boundaries. To address these problems, we incorporate the bilateral filtering technique into deeply learned representations in two respects. First, since a target object contains smooth regions that always belong to the same category, we perform deep bilateral filtering (DBF) to filter the deep features by a nonlinear combination of nearby feature values, which encourages the nearby and similar features to become closer, thus achieving a consistent prediction in the smooth region. In addition, the DBF can distinguish the boundary by enlarging the distance between the features on different sides of the edge, thus preserving the boundary information well. Experimental results on two widely used datasets, the ISPRS 2-D semantic labeling Potsdam and Vaihingen datasets, demonstrate that our proposed DBFNet can achieve a highly competitive performance compared with state-of-the-art fully-supervised methods. Code is available at https://github.com/Luffy03/DBFNet
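For orientation, the "nonlinear combination of nearby feature values" the abstract refers to is the classic bilateral filter: each output value is a weighted average of its neighbors, where the weight combines spatial closeness with similarity in value, so smooth regions are averaged while edges are preserved. The sketch below is a minimal, generic NumPy implementation of that textbook operation, not the paper's learned DBF module; the function name and parameters are illustrative.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.5):
    """Brute-force bilateral filter on a 2-D array.

    Each output pixel is a normalized, weighted sum of its neighbors.
    The weight is the product of a spatial Gaussian (grid distance)
    and a range Gaussian (value distance), so averaging happens within
    smooth regions but not across strong edges.
    """
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            num, den = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        # spatial weight: penalize distance in the grid
                        ws = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
                        # range weight: penalize difference in value,
                        # which is what keeps edges sharp
                        wr = np.exp(-((img[ny, nx] - img[y, x]) ** 2) / (2.0 * sigma_r ** 2))
                        num += ws * wr * img[ny, nx]
                        den += ws * wr
            out[y, x] = num / den
    return out
```

On a step-edge image with a small `sigma_r`, pixels on each side of the edge are averaged only with their own side, so the edge survives filtering; the paper applies the same principle to deep feature maps rather than raw pixels.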

Detailed description

Bibliographic details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - 31(2022), day 23, pages 7419-7434
First author: Wu, Linshan (author)
Other authors: Fang, Leyuan, Yue, Jun, Zhang, Bob, Ghamisi, Pedram, He, Min
Format: Online article
Language: English
Published: 2022
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM34930825X
003 DE-627
005 20231226042317.0
007 cr uuu---uuuuu
008 231226s2022 xx |||||o 00| ||eng c
024 7 |a 10.1109/TIP.2022.3222904  |2 doi 
028 5 2 |a pubmed24n1164.xml 
035 |a (DE-627)NLM34930825X 
035 |a (NLM)36417727 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Wu, Linshan  |e verfasserin  |4 aut 
245 1 0 |a Deep Bilateral Filtering Network for Point-Supervised Semantic Segmentation in Remote Sensing Images 
264 1 |c 2022 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 02.12.2022 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Semantic segmentation methods based on deep neural networks have achieved great success in recent years. However, training such deep neural networks relies heavily on a large number of images with accurate pixel-level labels, which requires a huge amount of human effort, especially for large-scale remote sensing images. In this paper, we propose a point-based weakly supervised learning framework called the deep bilateral filtering network (DBFNet) for the semantic segmentation of remote sensing images. Compared with pixel-level labels, point annotations are usually sparse and cannot reveal the complete structure of the objects; they also lack boundary information, thus resulting in incomplete prediction within the object and the loss of object boundaries. To address these problems, we incorporate the bilateral filtering technique into deeply learned representations in two respects. First, since a target object contains smooth regions that always belong to the same category, we perform deep bilateral filtering (DBF) to filter the deep features by a nonlinear combination of nearby feature values, which encourages the nearby and similar features to become closer, thus achieving a consistent prediction in the smooth region. In addition, the DBF can distinguish the boundary by enlarging the distance between the features on different sides of the edge, thus preserving the boundary information well. Experimental results on two widely used datasets, the ISPRS 2-D semantic labeling Potsdam and Vaihingen datasets, demonstrate that our proposed DBFNet can achieve a highly competitive performance compared with state-of-the-art fully-supervised methods. Code is available at https://github.com/Luffy03/DBFNet 
650 4 |a Journal Article 
700 1 |a Fang, Leyuan  |e verfasserin  |4 aut 
700 1 |a Yue, Jun  |e verfasserin  |4 aut 
700 1 |a Zhang, Bob  |e verfasserin  |4 aut 
700 1 |a Ghamisi, Pedram  |e verfasserin  |4 aut 
700 1 |a He, Min  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on image processing : a publication of the IEEE Signal Processing Society  |d 1992  |g 31(2022) vom: 23., Seite 7419-7434  |w (DE-627)NLM09821456X  |x 1941-0042  |7 nnns 
773 1 8 |g volume:31  |g year:2022  |g day:23  |g pages:7419-7434 
856 4 0 |u http://dx.doi.org/10.1109/TIP.2022.3222904  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 31  |j 2022  |b 23  |h 7419-7434