1. Activation thresholds and expressiveness of polynomial neural networks
- Authors
Finkel, Bella; Rodriguez, Jose Israel; Wu, Chenxi; and Yahl, Thomas
- Subjects
Computer Science - Machine Learning, Computer Science - Neural and Evolutionary Computing, Mathematics - Algebraic Geometry, Statistics - Machine Learning
- Abstract
Polynomial neural networks have been implemented in a range of applications and present an advantageous framework for theoretical machine learning. A polynomial neural network of fixed architecture and activation degree gives an algebraic map from the network's weights to a set of polynomials. The image of this map is the space of functions representable by the network. Its Zariski closure is an affine variety known as a neurovariety. The dimension of a polynomial neural network's neurovariety provides a measure of its expressivity. In this work, we introduce the notion of the activation threshold of a network architecture, which expresses when the dimension of a neurovariety achieves its theoretical maximum. In addition, we prove expressiveness results for polynomial neural networks with equi-width architectures.
- Comment
13 pages
- Published
2024
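
To illustrate the objects described in the abstract, the sketch below (hypothetical code, not taken from the paper) builds a polynomial neural network with monomial activation sigma(x) = x^r, following the common convention that no activation follows the final linear layer, and numerically estimates the dimension of its neurovariety as the generic rank of the Jacobian of the map from weights to network outputs at many random inputs. The widths, activation degree, number of sample points, and all function names are illustrative assumptions, not quantities from the paper.

```python
# Minimal sketch, assuming monomial activation x**r applied after every layer
# except the last. The neurovariety dimension is estimated as the generic rank
# of the Jacobian of the weights -> outputs-at-sample-points map; with enough
# generic sample points this matches the rank of the weights -> coefficients map.
import jax
import jax.numpy as jnp


def init_weights(key, widths):
    """Random weight matrices for the given layer widths (illustrative choice)."""
    keys = jax.random.split(key, len(widths) - 1)
    return [jax.random.normal(k, (widths[i + 1], widths[i]))
            for i, k in enumerate(keys)]


def forward(flat_w, widths, r, x):
    """Evaluate the network: alternate linear layers with the activation t -> t**r."""
    mats, idx = [], 0
    for i in range(len(widths) - 1):
        n = widths[i + 1] * widths[i]
        mats.append(flat_w[idx:idx + n].reshape(widths[i + 1], widths[i]))
        idx += n
    h = x
    for i, W in enumerate(mats):
        h = W @ h
        if i < len(mats) - 1:  # no activation after the final linear layer
            h = h ** r
    return h


def neurovariety_dim_estimate(widths, r, n_points=200, seed=0):
    """Generic rank of the Jacobian of the weights -> outputs map at random inputs."""
    k1, k2 = jax.random.split(jax.random.PRNGKey(seed))
    flat_w = jnp.concatenate([w.ravel() for w in init_weights(k1, widths)])
    xs = jax.random.normal(k2, (n_points, widths[0]))
    f = lambda w: jnp.concatenate([forward(w, widths, r, x) for x in xs])
    J = jax.jacfwd(f)(flat_w)
    return jnp.linalg.matrix_rank(J)


# Example: an equi-width architecture (2, 2, 2) with activation degree r = 2.
print(neurovariety_dim_estimate((2, 2, 2), r=2))
```

Comparing the printed rank with the total number of weights indicates whether the architecture attains the maximal possible neurovariety dimension at this activation degree, which is the kind of question the paper's activation threshold addresses.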