1. Scalable Thermodynamic Second-order Optimization
- Authors
Kaelan Donatella, Samuel Duffield, Denis Melanson, Maxwell Aifer, Phoebe Klett, Rajath Salegame, Zach Belateche, Gavin Crooks, Antonio J. Martinez, and Patrick J. Coles
- Subjects
Computer Science - Emerging Technologies; Computer Science - Machine Learning
- Abstract
Many hardware proposals have aimed to accelerate inference in AI workloads. Less attention has been paid to hardware acceleration of training, despite the enormous societal impact of rapid training of AI models. Physics-based computers, such as thermodynamic computers, offer an efficient means to solve key primitives in AI training algorithms. Optimizers that normally would be computationally out of reach on digital hardware (e.g., due to expensive matrix inversions) could be unlocked with physics-based hardware. In this work, we propose a scalable algorithm for employing thermodynamic computers to accelerate a popular second-order optimizer called Kronecker-factored approximate curvature (K-FAC). Our asymptotic complexity analysis predicts increasing advantage with our algorithm as $n$, the number of neurons per layer, increases. Numerical experiments show that even under significant quantization noise, the benefits of second-order optimization can be preserved. Finally, we predict substantial speedups for large-scale vision and graph problems based on realistic hardware characteristics.
- Comment
17 pages, 5 figures
- Published
2025
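The abstract's "expensive matrix inversions" refer to the per-layer solves at the heart of K-FAC: the layer's Fisher block is approximated as a Kronecker product $A \otimes G$ of small covariance matrices, and the preconditioned update solves against each factor separately. The following is a minimal NumPy sketch of that standard K-FAC step on a toy dense layer, not the paper's thermodynamic algorithm; all names (`a`, `g`, `eps`, the damping value) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dense layer: weight of shape (n_out, n_in), a batch of samples.
n_in, n_out, batch = 8, 4, 32
a = rng.normal(size=(batch, n_in))    # layer inputs (activations)
g = rng.normal(size=(batch, n_out))   # backpropagated pre-activation gradients
grad_W = g.T @ a / batch              # Euclidean gradient, shape (n_out, n_in)

# Kronecker factors of the layer's Fisher block, with damping eps
# (a hypothetical value) added for numerical stability.
eps = 1e-3
A = a.T @ a / batch + eps * np.eye(n_in)    # input covariance, (n_in, n_in)
G = g.T @ g / batch + eps * np.eye(n_out)   # gradient covariance, (n_out, n_out)

# K-FAC preconditioned update: since A and G are symmetric,
# (A ⊗ G)^{-1} vec(grad_W) = vec(G^{-1} grad_W A^{-1}),
# so one large inversion becomes two small per-factor solves.
update = np.linalg.solve(G, grad_W) @ np.linalg.inv(A)
```

The point of the factorization is visible in the last line: instead of inverting an $(n_{\text{in}} n_{\text{out}}) \times (n_{\text{in}} n_{\text{out}})$ curvature matrix, K-FAC solves two systems of size $n_{\text{out}}$ and $n_{\text{in}}$; these small solves are the primitives a physics-based computer could accelerate.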