1. Density of states in neural networks: an in-depth exploration of learning in parameter space
- Author
Mele, Margherita, Menichetti, Roberto, Ingrosso, Alessandro, and Potestio, Raffaello
- Subjects
Condensed Matter - Statistical Mechanics
- Abstract
Learning in neural networks critically hinges on the intricate geometry of the loss landscape associated with a given task. Traditionally, most research has focused on finding specific weight configurations that minimize the loss. In this work, born from the cross-fertilization of machine learning and theoretical soft matter physics, we introduce a novel, computationally efficient approach to examine the weight space across all loss values. Employing the Wang-Landau enhanced sampling algorithm, we explore the neural network density of states - the number of network parameter configurations that produce a given loss value - and analyze how it depends on specific features of the training set. Using both real-world and synthetic data, we quantitatively elucidate the relation between data structure and network density of states across different sizes and depths of binary-state networks.
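The density-of-states idea described in the abstract can be illustrated with a minimal sketch of flat-histogram Wang-Landau sampling on a toy binary-weight perceptron. Everything here is an illustrative assumption, not the paper's actual setup: the data, network size, single-weight-flip moves, flatness threshold, and stopping rule are all chosen for brevity. The "energy" is the integer training loss (number of misclassified patterns), and the algorithm estimates `log g(E)`, the log of the number of weight configurations at each loss value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: a binary-weight perceptron on random +/-1 data.
N, P = 20, 10                        # number of weights, number of patterns
X = rng.choice([-1, 1], size=(P, N))
y = rng.choice([-1, 1], size=P)

def loss(w):
    """Integer 'energy': number of misclassified training patterns."""
    return int(np.sum(np.sign(X @ w) != y))

log_g = np.zeros(P + 1)              # running estimate of log g(E)
hist = np.zeros(P + 1)               # visit histogram for the flatness check
f = 1.0                              # Wang-Landau modification factor (log scale)

w = rng.choice([-1, 1], size=N)
E = loss(w)

for _round in range(200):            # hard cap so the sketch always terminates
    if f <= 1e-3:
        break
    for _ in range(20000):
        i = rng.integers(N)
        w[i] *= -1                   # propose a single weight flip
        E_new = loss(w)
        # Accept with prob min(1, g(E)/g(E_new)) -> flat histogram over E
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            E = E_new
        else:
            w[i] *= -1               # reject: undo the flip
        log_g[E] += f
        hist[E] += 1
    visited = hist > 0
    # If the histogram over visited levels is roughly flat, refine f
    if hist[visited].min() > 0.8 * hist[visited].mean():
        hist[:] = 0
        f /= 2.0

# Normalize so that sum_E g(E) equals the total count of configurations, 2**N
log_Z = log_g.max() + np.log(np.sum(np.exp(log_g - log_g.max())))
log_g = log_g - log_Z + N * np.log(2)
```

After convergence, `np.exp(log_g[E])` approximates how many of the `2**N` weight configurations attain loss `E`, which is the quantity the abstract refers to as the neural network density of states.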
- Published
- 2024