ICNet: Information Conversion Network for RGB-D Based Salient Object Detection.
- Source :
- IEEE Transactions on Image Processing. 2020, Vol. 29, p4873-4884. 12p.
- Publication Year :
- 2020
Abstract
- RGB-D based salient object detection (SOD) methods leverage the depth map as valuable complementary information for better SOD performance. Previous methods mainly exploit the correlation between the RGB image and the depth map in three fusion domains: input images, extracted features, and output results. However, these fusion strategies cannot fully capture the complex correlation between the RGB image and the depth map. Moreover, these methods do not fully explore the cross-modal complementarity and the cross-level continuity of information, and they treat information from different sources without discrimination. In this paper, to address these problems, we propose a novel Information Conversion Network (ICNet) for RGB-D based SOD that employs a Siamese structure with an encoder-decoder architecture. To fuse high-level RGB and depth features in an interactive and adaptive way, we propose a novel Information Conversion Module (ICM), which contains concatenation operations and correlation layers. Furthermore, we design a Cross-modal Depth-weighted Combination (CDC) block to discriminate the cross-modal features from different sources and to enhance RGB features with depth features at each level. Extensive experiments on five commonly tested datasets demonstrate the superiority of our ICNet over 15 state-of-the-art RGB-D based SOD methods, and validate the effectiveness of the proposed ICM and CDC block. [ABSTRACT FROM AUTHOR]
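The abstract describes the CDC block only at a high level: depth features produce per-location weights that modulate and enhance the RGB features at each level. The exact layer composition is not given here, so the following is a minimal NumPy sketch of one plausible depth-weighted combination; the function names and the sigmoid gating are illustrative assumptions, not the paper's verified formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def depth_weighted_combination(rgb_feat, depth_feat):
    """Hypothetical CDC-style fusion (assumed form, not from the paper):
    derive a confidence weight map from the depth features via a sigmoid
    gate, then enhance the RGB features with the weighted depth features."""
    weight = sigmoid(depth_feat)           # per-element weight in (0, 1)
    return rgb_feat + weight * depth_feat  # RGB enhanced by weighted depth

# Toy feature maps shaped (channels, height, width).
rgb = np.ones((4, 8, 8))
depth = np.zeros((4, 8, 8))
fused = depth_weighted_combination(rgb, depth)
print(fused.shape)  # (4, 8, 8)
```

With all-zero depth features the weighted depth term vanishes and the RGB features pass through unchanged, which illustrates the intended behavior: the depth branch contributes only where it carries signal.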
- Subjects :
- *INFORMATION networks
- *DECODING algorithms
- *FEATURE extraction
- *VIDEO coding
Details
- Language :
- English
- ISSN :
- 1057-7149
- Volume :
- 29
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Image Processing
- Publication Type :
- Academic Journal
- Accession number :
- 170078310
- Full Text :
- https://doi.org/10.1109/TIP.2020.2976689