
Heterogeneous quantization regularizes spiking neural network activity

Authors:
Moyal, Roy
Mama, Kyrus R.
Einhorn, Matthew
Borthakur, Ayon
Cleland, Thomas A.
Publication Year: 2024

Abstract

The learning and recognition of object features from unregulated input has been a longstanding challenge for artificial intelligence systems. Brains are adept at learning stable representations given small samples of noisy observations; across sensory modalities, this capacity is aided by a cascade of signal conditioning steps informed by domain knowledge. The olfactory system, in particular, solves a source separation and denoising problem compounded by concentration variability, environmental interference, and unpredictably correlated sensor affinities. To function optimally, its plastic network requires statistically well-behaved input. We present a data-blind neuromorphic signal conditioning strategy whereby analog data are normalized and quantized into spike phase representations. Input is delivered to a column of duplicated spiking principal neurons via heterogeneous synaptic weights; this regularizes layer utilization, yoking total activity to the network's operating range and rendering internal representations robust to uncontrolled open-set stimulus variance. We extend this mechanism by adding a data-aware calibration step whereby the range and density of the quantization weights adapt to accumulated input statistics, optimizing resource utilization by balancing activity regularization and information retention.
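The abstract outlines two mechanisms: a data-blind step in which a normalized analog value drives a column of duplicated spiking neurons through heterogeneous synaptic weights, so that input intensity is re-expressed as which neurons spike and at what phase, and a data-aware calibration step that adapts the weight range to accumulated input statistics. The sketch below is a minimal NumPy illustration of that idea; the function names, the log-spaced weights, the phase-binning rule, and the quantile-based rescaling are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np


def make_column(n_dup=16, w_lo=0.2, w_hi=2.0):
    """Heterogeneous synaptic weights for a column of n_dup duplicated neurons.

    Log spacing is an illustrative choice, not taken from the paper.
    """
    return np.geomspace(w_lo, w_hi, n_dup)


def spike_phases(x, weights, theta=1.0, n_phases=8, delta=0.25):
    """Quantize a normalized analog input x into spike phases.

    Each duplicated neuron receives x scaled by its own synaptic weight;
    neurons driven above threshold spike, with stronger drive mapping to an
    earlier phase bin. Non-spiking neurons return -1, so weak inputs recruit
    only part of the column (activity regularization).
    """
    drive = weights * x
    phases = np.full(weights.shape, -1, dtype=int)
    active = drive >= theta
    bins = np.minimum((drive[active] - theta) // delta, n_phases - 1).astype(int)
    phases[active] = (n_phases - 1) - bins  # earlier phase = stronger drive
    return phases


def calibrate(weights, samples, target_fraction=0.5, theta=1.0):
    """Data-aware calibration sketch: rescale the weight range so that, for a
    high-quantile input estimated from accumulated samples, roughly
    target_fraction of the column is driven above threshold, balancing
    activity regularization against information retention.
    """
    q = np.quantile(samples, 0.9)  # representative strong input
    return weights * theta / np.quantile(weights * q, 1.0 - target_fraction)


if __name__ == "__main__":
    weights = make_column()
    stream = np.abs(np.random.default_rng(0).normal(size=1000))  # accumulated inputs
    weights = calibrate(weights, stream)
    print(spike_phases(1.2, weights))
```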

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2409.18396
Document Type: Working Paper