Scaled gated networks.
- Source :
- World Wide Web. Jul 2022, Vol. 25, Issue 4, p1583-1606. 24p.
- Publication Year :
- 2022
Abstract
- Gating transformations demonstrate great potential in recent deep convolutional neural network design, enriching the feature representation and alleviating noisy signals by modeling inter-channel dependencies with learnable parameters. However, the use of scaling approaches to reduce the redundancy of hand-crafted attention mechanisms has rarely been investigated. This paper proposes a novel scaled gated convolution that enables attention-enhanced CNNs to overcome the trade-off between performance and redundancy. Our scaled gated convolution is a simple and effective alternative to both vanilla convolution and attention-enhanced convolutions, and can be applied to modern CNNs in a plug-and-play manner. Exhaustive experiments demonstrate that stacking scaled gated convolutions in baselines significantly improves performance on a broad range of visual recognition tasks, including image recognition, object detection, instance segmentation, keypoint detection, and panoptic segmentation, while obtaining a better trade-off between performance and attentive redundancy. [ABSTRACT FROM AUTHOR]
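The abstract describes channel gating: learnable parameters model inter-channel dependencies and rescale each channel's response. The paper's exact scaled gated convolution is not given in this record; the sketch below is a generic squeeze-and-excitation-style channel gate in NumPy, with all weight shapes and the reduction ratio `r` chosen for illustration only.

```python
import numpy as np

def channel_gate(x, w1, w2):
    """Illustrative channel gating (not the paper's exact formulation):
    squeeze each channel by global average pooling, pass the descriptor
    through a two-layer bottleneck, and rescale channels with a sigmoid
    gate in (0, 1)."""
    z = x.mean(axis=(1, 2))                  # squeeze: (C,) channel descriptor
    h = np.maximum(w1 @ z, 0.0)              # bottleneck + ReLU: (C//r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ h)))      # sigmoid gate per channel: (C,)
    return x * g[:, None, None]              # rescale each (H, W) slice

# Hypothetical shapes for demonstration.
rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1  # squeeze weights (assumed init)
w2 = rng.standard_normal((C, C // r)) * 0.1  # excite weights (assumed init)
y = channel_gate(x, w1, w2)
```

Because the gate is a sigmoid, every channel is attenuated rather than amplified; the "scaling" the paper proposes would additionally prune or rescale such hand-crafted attention to cut its redundancy.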
Details
- Language :
- English
- ISSN :
- 1386-145X
- Volume :
- 25
- Issue :
- 4
- Database :
- Academic Search Index
- Journal :
- World Wide Web
- Publication Type :
- Academic Journal
- Accession number :
- 158179524
- Full Text :
- https://doi.org/10.1007/s11280-021-00968-2