
Convolutional Networks with Dense Connectivity.

Authors :
Huang, Gao
Liu, Zhuang
Pleiss, Geoff
Maaten, Laurens van der
Weinberger, Kilian Q.
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Dec 2022, Vol. 44 Issue 12, p8704-8716. 13p.
Publication Year :
2022

Abstract

Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with $L$ layers have $L$ connections—one between each layer and its subsequent layer—our network has $\frac{L(L+1)}{2}$ direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, encourage feature reuse and substantially improve parameter efficiency. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring fewer parameters and less computation to achieve high performance. [ABSTRACT FROM AUTHOR]
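The sketch below illustrates the dense-connectivity pattern described in the abstract: within a block, every layer takes the channel-wise concatenation of all preceding feature-maps as input. It is a minimal illustration in PyTorch, not the authors' reference implementation; the growth rate, number of layers, and layer composition are assumed values chosen for clarity.

```python
# Minimal sketch of DenseNet-style dense connectivity, based only on the
# abstract: each layer receives the concatenated feature-maps of all
# preceding layers. growth_rate, num_layers, and kernel sizes are
# illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    def __init__(self, in_channels: int, growth_rate: int = 12, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Each layer sees the block input plus the feature-maps
            # produced by all earlier layers in the block.
            channels = in_channels + i * growth_rate
            self.layers.append(
                nn.Sequential(
                    nn.BatchNorm2d(channels),
                    nn.ReLU(inplace=True),
                    nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1, bias=False),
                )
            )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Concatenating all preceding feature-maps along the channel axis
            # yields the L(L+1)/2 direct connections noted in the abstract.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


if __name__ == "__main__":
    block = DenseBlock(in_channels=16)
    y = block(torch.randn(1, 16, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32]): 16 + 4 * 12 output channels
```

Because each layer adds only `growth_rate` new channels while reusing everything computed before it, the block grows its representation without relearning redundant feature-maps, which is the parameter-efficiency argument made in the abstract.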

Details

Language :
English
ISSN :
0162-8828
Volume :
44
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
160650721
Full Text :
https://doi.org/10.1109/TPAMI.2019.2918284