
Regular Polytope Networks.

Authors :
Pernici, Federico
Bruni, Matteo
Baecchi, Claudio
Del Bimbo, Alberto
Source :
IEEE Transactions on Neural Networks & Learning Systems; Sep 2022, Vol. 33, Issue 9, p4373-4387, 15p
Publication Year :
2022

Abstract

Neural networks are widely used as models for classification in a large variety of tasks. Typically, a learnable transformation (i.e., the classifier) is placed at the end of such models, returning a value for each class used for classification. This transformation plays an important role in determining how the generated features change during the learning process. In this work, we argue that this transformation not only can be fixed (i.e., set as nontrainable) with no loss of accuracy and with a reduction in memory usage, but it can also be used to learn stationary and maximally separated embeddings. We show that the stationarity of the embedding and its maximally separated representation can be theoretically justified by setting the weights of the fixed classifier to values taken from the coordinate vertices of the three regular polytopes available in $\mathbb{R}^{d}$, namely, the $d$-Simplex, the $d$-Cube, and the $d$-Orthoplex. These regular polytopes have the maximal amount of symmetry that can be exploited to generate stationary features angularly centered around their corresponding fixed weights. Our approach improves and broadens the concept of a fixed classifier, recently proposed by Hoffer et al., to a larger class of fixed classifier models. Experimental results confirm the theoretical analysis, the generalization capability, the faster convergence, and the improved performance of the proposed method. Code will be publicly available. [ABSTRACT FROM AUTHOR]
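The construction the abstract describes admits a compact implementation: the fixed classifier's weight matrix is simply the vertex coordinates of the chosen regular polytope, one row per class. Below is a minimal NumPy sketch of that idea; the function names, the centering of the simplex, and the unit normalization of the rows are illustrative assumptions of this sketch, not the authors' released code.

```python
import numpy as np
from itertools import product

def d_simplex(d):
    """Vertices of a regular d-Simplex in R^d (d + 1 classes).

    Standard construction: the d basis vectors e_1..e_d plus one
    extra vertex with equal coordinates, then centered at the origin
    so all vertices are symmetric, and row-normalized.
    """
    extra = (1.0 - np.sqrt(d + 1.0)) / d * np.ones(d)
    verts = np.vstack([np.eye(d), extra])              # (d+1, d)
    verts -= verts.mean(axis=0)                        # center at origin
    return verts / np.linalg.norm(verts, axis=1, keepdims=True)

def d_orthoplex(d):
    """Vertices of a d-Orthoplex in R^d (2d classes): +/- e_i."""
    eye = np.eye(d)
    return np.vstack([eye, -eye])                      # (2d, d), unit norm

def d_cube(d):
    """Vertices of a d-Cube in R^d (2^d classes): all +/-1 patterns."""
    signs = np.array(list(product([-1.0, 1.0], repeat=d)))
    return signs / np.sqrt(d)                          # (2^d, d), unit norm

# Example: a fixed (non-trainable) classifier for 10 classes using the
# 9-Simplex; logits are dot products of embeddings with the fixed weights.
W = d_simplex(9)                                       # (10, 9)
features = np.random.randn(4, 9)                       # batch of embeddings
logits = features @ W.T                                # (4, 10) class scores
```

A quick sanity check of the maximal-separation property: with `d_simplex(d)`, the pairwise cosine similarity between any two weight rows is $-1/d$, the minimum achievable for $d + 1$ unit vectors in $\mathbb{R}^{d}$. In a training framework, `W` would be registered as a constant (e.g., a frozen layer) so only the feature extractor is learned.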

Details

Language :
English
ISSN :
2162-237X
Volume :
33
Issue :
9
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
158869790
Full Text :
https://doi.org/10.1109/TNNLS.2021.3056762