
Improving Network Training on Resource-Constrained Devices via Habituation Normalization.

Authors :
Lai, Huixia
Zhang, Lulu
Zhang, Shi
Source :
Sensors (14248220). Dec 2022, Vol. 22, Issue 24, p9940. 16p.
Publication Year :
2022

Abstract

As a technique for accelerating and stabilizing training, batch normalization (BN) is widely used in deep learning. However, BN cannot effectively estimate the mean and the variance of samples when training or fine-tuning with small batches of data on resource-constrained devices, which leads to a decrease in the accuracy of the deep learning model. In the fruit fly olfactory system, an algorithm based on the "negative image" habituation model can filter redundant information and improve numerical stability. Inspired by this circuit mechanism, we propose a novel normalization method, habituation normalization (HN). HN first eliminates the "negative image" obtained by habituation and then calculates the statistics for normalization, which solves the accuracy degradation of BN when the batch size is small. The experimental results show that HN can speed up neural network training and improve model accuracy on vanilla LeNet-5, VGG16, and ResNet-50 on the Fashion-MNIST and CIFAR-10 datasets. Compared with four standard normalization methods, HN maintains stable, high accuracy across different batch sizes, which shows that HN has strong robustness. Finally, applying HN to a deep learning-based EEG signal application system indicates that HN is suitable for network fine-tuning and neural network applications under limited computing power and memory. [ABSTRACT FROM AUTHOR]
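The two-step idea in the abstract — subtract a habituated "negative image" of the input, then normalize with statistics computed on the residual — can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the paper defines the precise habituation update, and here the "negative image" `h` is simply assumed to be an exponential moving average of batch means (the names `habituation_normalize`, `decay`, and `eps` are illustrative).

```python
import numpy as np

def habituation_normalize(x, h, decay=0.9, eps=1e-5):
    """Hedged sketch of habituation normalization (HN).

    x : (batch, features) activations
    h : (features,) habituation state, the "negative image"
        (assumed here to track an exponential moving average
        of inputs; the paper's update rule may differ)
    """
    # Step 1: subtract the habituated "negative image",
    # filtering out the redundant, repeated component of the input.
    residual = x - h

    # Step 2: compute normalization statistics on the residual,
    # as in standard feature normalization.
    mean = residual.mean(axis=0)
    var = residual.var(axis=0)
    y = (residual - mean) / np.sqrt(var + eps)

    # Update the habituation state toward the current batch.
    h_new = decay * h + (1.0 - decay) * x.mean(axis=0)
    return y, h_new
```

Because `h` accumulates information across batches, the statistics in step 2 depend less on any single small batch, which is consistent with the abstract's claim that HN stays accurate when the batch size is small.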

Details

Language :
English
ISSN :
14248220
Volume :
22
Issue :
24
Database :
Academic Search Index
Journal :
Sensors (14248220)
Publication Type :
Academic Journal
Accession number :
161002806
Full Text :
https://doi.org/10.3390/s22249940