Enhancing feature fusion with spatial aggregation and channel fusion for semantic segmentation
- Source :
- IET Computer Vision, Vol 15, Iss 6, Pp 418-427 (2021)
- Publication Year :
- 2021
- Publisher :
- Wiley, 2021.
-
Abstract
- Semantic segmentation is crucial to autonomous driving, as it provides accurate recognition and localization of surrounding scenes for the street-scene understanding task. Many existing segmentation networks fuse high-level and low-level features to boost segmentation performance. However, simple fusion may yield only a limited performance improvement because of the gap between high-level and low-level features. To alleviate this limitation, we propose spatial aggregation and channel fusion to bridge the gap. Our implementation, inspired by the attention mechanism, consists of two steps: (1) Spatial aggregation relies on the proposed pyramid spatial context aggregation module to capture spatial similarities and enhance the spatial representation of high-level features, making the subsequent fusion more effective. (2) Channel fusion relies on the proposed attention-based channel fusion module to weight channel maps at different levels to enhance the fusion. In addition, a complete network with a U-shape structure is constructed. A series of ablation experiments demonstrates the effectiveness of our designs, and the network achieves an mIoU of 81.4% on the Cityscapes test dataset and 84.6% on the PASCAL VOC 2012 test dataset.
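The attention-based channel fusion described in step (2) can be illustrated with a minimal sketch. This is not the authors' implementation: the pooling, the single learned linear layer (`w`, `b`), and the sigmoid gating are assumptions about one common way such channel-weighting is realized, using plain numpy for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention_fusion(low, high, w, b):
    """Hedged sketch of attention-based channel fusion.

    low, high : low-level and high-level feature maps, each (C, H, W),
                assumed already resized to the same spatial resolution.
    w, b      : hypothetical learned parameters of a linear layer,
                shapes (2C, 2C) and (2C,).

    A channel descriptor is pooled from the concatenated features,
    mapped through the linear layer, and squashed to per-channel
    weights that rescale each channel map before the two levels
    are summed back together.
    """
    feats = np.concatenate([low, high], axis=0)   # (2C, H, W)
    desc = feats.mean(axis=(1, 2))                # global average pooling -> (2C,)
    weights = sigmoid(w @ desc + b)               # per-channel attention weights in (0, 1)
    weighted = feats * weights[:, None, None]     # rescale every channel map
    c = low.shape[0]
    return weighted[:c] + weighted[c:]            # fuse back to (C, H, W)
```

In a trained network `w` and `b` would be learned end-to-end; here they only serve to show the data flow of weighting channel maps before fusion.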
- Subjects :
- Feature fusion
Fusion
Channel (digital image)
Computer science
Pattern recognition
Spatial aggregation
Segmentation
Computer Vision and Pattern Recognition
Artificial intelligence
Software
Details
- Language :
- English
- ISSN :
- 1751-9632 and 1751-9640
- Volume :
- 15
- Issue :
- 6
- Database :
- OpenAIRE
- Journal :
- IET Computer Vision
- Accession number :
- edsair.doi.dedup.....cb21f1494ed18a6043c71f909fc621a7