Biologically-inspired neuronal adaptation improves learning in neural networks.
- Author
- Kubo, Yoshimasa; Chalmers, Eric; Luczak, Artur
- Subjects
- *NEUROPLASTICITY, *MACHINE learning, *ARTIFICIAL neural networks, *MULTILAYER perceptrons, *CONVOLUTIONAL neural networks
- Abstract
- Since humans still outperform artificial neural networks on many tasks, drawing inspiration from the brain may help to improve current machine learning algorithms. Contrastive Hebbian learning (CHL) and equilibrium propagation (EP) are biologically plausible algorithms that update weights using only local information (without explicitly calculating gradients) and still achieve performance comparable to conventional backpropagation. In this study, we augmented CHL and EP with Adjusted Adaptation, inspired by the adaptation effect observed in neurons, in which a neuron's response to a given stimulus is adjusted after a short time. We added this adaptation feature to multilayer perceptrons and convolutional neural networks trained on MNIST and CIFAR-10. Surprisingly, adaptation improved the performance of these networks. We discuss the biological inspiration for this idea and investigate why neuronal adaptation could be an important brain mechanism to improve the stability and accuracy of learning. [ABSTRACT FROM AUTHOR] (An illustrative sketch of an EP-style local update with an adaptation term follows this record.)
- Published
- 2023
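The abstract refers to local, gradient-free weight updates (EP/CHL) combined with a neuronal adaptation effect. The NumPy sketch below is only an illustration of what such an equilibrium-propagation-style local update plus a simple activity-dependent adaptation term might look like; it is not the authors' code, and the names and forms used (`beta`, `adapt_rate`, the hard-sigmoid `rho`, the linear fatigue term) are assumptions made here for clarity.

```python
# Illustrative sketch (assumptions, not the paper's implementation) of an
# equilibrium-propagation-style local weight update with a simple
# activity-dependent adaptation term.
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # Hard-sigmoid activation commonly used in EP-style models
    return np.clip(s, 0.0, 1.0)

# Toy setup: activities of a pre- and post-synaptic layer at the fixed
# points of the free phase and the weakly clamped ("nudged") phase.
n_pre, n_post = 8, 4
W = 0.1 * rng.standard_normal((n_pre, n_post))
s_pre_free, s_post_free = rng.random(n_pre), rng.random(n_post)
s_pre_nudged, s_post_nudged = rng.random(n_pre), rng.random(n_post)

beta, lr = 0.5, 0.05  # nudging strength and learning rate (assumed values)

# EP-style local update: contrast of pre/post co-activations between the
# nudged and free phases; only locally available quantities are used.
dW = (np.outer(rho(s_pre_nudged), rho(s_post_nudged))
      - np.outer(rho(s_pre_free), rho(s_post_free))) / beta
W += lr * dW

# Simple adaptation term (assumed form): each unit's response is reduced in
# proportion to its recent activity, loosely mimicking neuronal adaptation.
adapt_rate = 0.1
adaptation = adapt_rate * s_post_free            # running "fatigue" per unit
s_post_adapted = rho(s_post_free - adaptation)   # adjusted response
```

The design choice here mirrors the abstract's point: the weight change depends only on the pre- and post-synaptic activities of the two relaxation phases, and adaptation merely adjusts each unit's subsequent response based on its own recent activity, so no global gradient signal is required.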