Perceptron Theory Can Predict the Accuracy of Neural Networks
- Source :
- IEEE Transactions on Neural Networks and Learning Systems; 2024, Vol. 35, Issue 7, pp. 9885-9899, 15 pp.
- Publication Year :
- 2024
Abstract
- Multilayer neural networks set the current state of the art for many technical classification problems. However, these networks remain essentially black boxes when it comes to analyzing them and predicting their performance. Here, we develop a statistical theory for the one-layer perceptron and show that it can predict the performance of a surprisingly large variety of neural networks with different architectures. A general theory of classification with perceptrons is developed by generalizing an existing theory for analyzing reservoir computing models and connectionist models for symbolic reasoning known as vector symbolic architectures. Our statistical theory offers three formulas that leverage the signal statistics with increasing detail. The formulas are analytically intractable but can be evaluated numerically. The description level that captures the most detail requires stochastic sampling methods. Depending on the network model, the simpler formulas already yield high prediction accuracy. The quality of the theory's predictions is assessed in three experimental settings: a memorization task for echo state networks (ESNs) from the reservoir computing literature, a collection of classification datasets for shallow randomly connected networks, and the ImageNet dataset for deep convolutional neural networks. We find that the second description level of the perceptron theory can predict the performance of types of ESNs that could not be described previously. Furthermore, the theory can predict the performance of deep multilayer neural networks when applied to their output layer. While other methods for predicting neural network performance commonly require training an estimator model, the proposed theory requires only the first two moments of the distribution of the postsynaptic sums in the output neurons. Moreover, the perceptron theory compares favorably to other methods that do not rely on training an estimator model.
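- The abstract states that the prediction needs only the first two moments of the postsynaptic sums in the output neurons. The sketch below is a rough illustration of that idea, not the paper's actual formulas: it assumes Gaussian, independent postsynaptic sums with per-class moments `mu[c, j]` and `sigma[c, j]` (hypothetical names), equal class priors, and numerically integrates the probability that the correct output unit attains the largest sum.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm


def predicted_accuracy(mu, sigma):
    """Estimate classification accuracy from the first two moments of the
    postsynaptic sums of the output neurons (illustrative sketch only).

    mu[c, j], sigma[c, j]: mean and standard deviation of output unit j's
    postsynaptic sum when the true class is c (estimated from data).
    Assumes Gaussian sums that are independent across output units.
    """
    n_classes = mu.shape[0]
    total = 0.0
    for c in range(n_classes):
        others = [j for j in range(n_classes) if j != c]

        def integrand(x, c=c, others=others):
            # Density of the correct unit's sum at x, times the probability
            # that every competing unit's sum falls below x.
            p = norm.pdf(x, loc=mu[c, c], scale=sigma[c, c])
            for j in others:
                p *= norm.cdf(x, loc=mu[c, j], scale=sigma[c, j])
            return p

        p_correct, _ = quad(integrand, -np.inf, np.inf)
        total += p_correct
    # Average over classes, assuming equal class priors.
    return total / n_classes


# Toy usage: three classes with well-separated means -> accuracy near 1.
mu = np.eye(3) * 5.0
sigma = np.ones((3, 3))
print(predicted_accuracy(mu, sigma))
```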
Details
- Language :
- English
- ISSN :
- 2162-237X and 2162-2388
- Volume :
- 35
- Issue :
- 7
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Neural Networks and Learning Systems
- Publication Type :
- Periodical
- Accession number :
- ejs66946675
- Full Text :
- https://doi.org/10.1109/TNNLS.2023.3237381