
Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks.

Authors :
Kleyko D
Kheffache M
Frady EP
Wiklund U
Osipov E
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2021 Aug; Vol. 32 (8), pp. 3777-3783. Date of Electronic Publication: 2021 Aug 03.
Publication Year :
2021

Abstract

The deployment of machine learning algorithms on resource-constrained edge devices is an important challenge from both theoretical and applied points of view. In this brief, we focus on resource-efficient randomly connected neural networks known as random vector functional link (RVFL) networks, since their simple design and extremely fast training time make them very attractive for solving many applied classification tasks. We propose to represent input features via the density-based encoding known in the area of stochastic computing, and to use the operations of binding and bundling from the area of hyperdimensional computing to obtain the activations of the hidden neurons. Using a collection of 121 real-world data sets from the UCI machine learning repository, we empirically show that the proposed approach achieves higher average accuracy than the conventional RVFL. We also demonstrate that the readout matrix can be represented using only integers in a limited range with minimal loss of accuracy. In this case, the proposed approach operates only on small n-bit integers, which results in a computationally efficient architecture. Finally, through hardware field-programmable gate array (FPGA) implementations, we show that such an approach consumes approximately 11 times less energy than the conventional RVFL.
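The operations named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a thermometer-style density code (the number of ones proportional to the feature value), XOR for binding, and majority-rule thresholding for bundling, which are common choices in stochastic and hyperdimensional computing. The random key vectors stand in for the random input-to-hidden connections of an RVFL network.

```python
import numpy as np

def density_encode(x, n_bits=8):
    """Density (thermometer) code: a feature x in [0, 1] becomes an
    n_bits-long binary vector whose number of ones is proportional to x."""
    k = int(round(x * n_bits))
    v = np.zeros(n_bits, dtype=np.int8)
    v[:k] = 1
    return v

def bind(a, b):
    """Binding of two binary hypervectors via elementwise XOR."""
    return np.bitwise_xor(a, b)

def bundle(vectors):
    """Bundling via elementwise majority rule over a list of binary vectors."""
    s = np.sum(vectors, axis=0)
    return (s > len(vectors) / 2).astype(np.int8)

# Hypothetical hidden-neuron activation: bind each encoded feature with a
# random binary key (one per input feature), then bundle across features.
rng = np.random.default_rng(0)
features = [0.25, 0.5, 1.0]
keys = [rng.integers(0, 2, size=8, dtype=np.int8) for _ in features]
activation = bundle([bind(density_encode(x), k)
                     for x, k in zip(features, keys)])
```

Because every step uses only small-integer arithmetic (XOR, counting, a threshold), the same pipeline maps naturally onto the integer-only FPGA architecture the abstract describes.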

Details

Language :
English
ISSN :
2162-2388
Volume :
32
Issue :
8
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession number :
32833655
Full Text :
https://doi.org/10.1109/TNNLS.2020.3015971