
Neuromorphic Architecture Optimization for Task-Specific Dynamic Learning

Authors:
Madireddy, Sandeep
Yanguas-Gil, Angel
Balaprakash, Prasanna
Source:
Proceedings of the International Conference on Neuromorphic Systems 2019. ACM, New York, NY, USA, Article 5, 5 pages
Publication Year:
2019

Abstract

The ability to learn and adapt in real time is a central feature of biological systems. Neuromorphic architectures demonstrating such versatility can greatly enhance our ability to efficiently process information at the edge. A key challenge, however, is to understand which learning rules are best suited for specific tasks and how their relevant hyperparameters can be fine-tuned. In this work, we introduce a conceptual framework in which the learning process is integrated into the network itself. This allows us to cast meta-learning as a mathematical optimization problem. We employ DeepHyper, a scalable asynchronous model-based search framework, to simultaneously optimize the choice of meta-learning rules and their hyperparameters. We demonstrate our approach on two datasets, MNIST and FashionMNIST, using a network architecture inspired by the learning center of the insect brain. Our results show that optimal learning rules can be dataset-dependent even across similar tasks. This dependency underscores the importance of versatility and flexibility in learning algorithms, and it illuminates experimental findings in insect neuroscience showing a heterogeneity of learning rules within the insect mushroom body.
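The abstract frames the joint choice of a learning rule and its hyperparameters as a single optimization problem. The sketch below illustrates that framing in the simplest possible way, using random search as a stand-in for DeepHyper's asynchronous model-based search; the rule names, hyperparameter ranges, and the `evaluate` objective are all hypothetical placeholders, not the paper's actual search space (a real run would train the insect-brain-inspired network on MNIST or FashionMNIST and return validation accuracy).

```python
import random

# Hypothetical joint search space: a categorical plasticity-rule choice
# plus continuous hyperparameter ranges (illustrative values only).
SEARCH_SPACE = {
    "rule": ["hebbian", "oja", "anti_hebbian"],
    "learning_rate": (1e-4, 1e-1),
    "decay": (0.0, 0.5),
}

def sample_config(rng):
    """Draw one candidate from the joint rule/hyperparameter space."""
    return {
        "rule": rng.choice(SEARCH_SPACE["rule"]),
        "learning_rate": rng.uniform(*SEARCH_SPACE["learning_rate"]),
        "decay": rng.uniform(*SEARCH_SPACE["decay"]),
    }

def evaluate(config):
    """Placeholder objective. In the paper's setting this would train
    the network with the chosen rule and return task accuracy; here a
    synthetic score keeps the sketch self-contained and runnable."""
    base = {"hebbian": 0.80, "oja": 0.85, "anti_hebbian": 0.70}
    return (base[config["rule"]]
            - abs(config["learning_rate"] - 0.01)
            - 0.1 * config["decay"])

def random_search(n_trials=50, seed=0):
    """Return the best-scoring configuration over n_trials samples."""
    rng = random.Random(seed)
    candidates = [sample_config(rng) for _ in range(n_trials)]
    return max(candidates, key=evaluate)

best = random_search()
print(best)
```

Because the rule choice and its hyperparameters are sampled jointly, the search can discover that different rules prefer different hyperparameter regimes, which is the dataset-dependence the abstract reports; a model-based searcher like DeepHyper replaces uniform sampling with a surrogate model that proposes promising candidates.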

Details

Database:
arXiv
Journal:
Proceedings of the International Conference on Neuromorphic Systems 2019. ACM, New York, NY, USA, Article 5, 5 pages
Publication Type:
Report
Accession number:
edsarx.1906.01668
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3354265.3354270