
On the Representational Power of Restricted Boltzmann Machines for Symmetric Functions and Boolean Functions.

Authors :
Gu, Linyan
Huang, Jianfeng
Yang, Lihua
Source :
IEEE Transactions on Neural Networks & Learning Systems. May2019, Vol. 30 Issue 5, p1335-1347. 13p.
Publication Year :
2019

Abstract

Restricted Boltzmann machines (RBMs) are used to build deep-belief networks, which are widely regarded as among the first effective deep learning neural networks. This paper studies the ability of RBMs to represent distributions over $\{0,1\}^{n}$ via softplus/hardplus RBM networks. It is shown that any distribution whose density depends only on the number of 1's in the input can be approximated with arbitrarily high accuracy by an RBM of size $2n+1$, which improves on a previous result by reducing the required size from $n^{2}$ to $2n+1$. A theorem for representing partially symmetric Boolean functions by softplus RBM networks is established. On this basis, the representational power of RBMs for distributions whose mass represents a Boolean function is compared with that of threshold circuits and polynomial threshold functions. It is shown that a distribution over $\{0,1\}^{n}$ whose mass represents a Boolean function can be computed with a given margin $\delta$ by an RBM of size and parameters bounded by polynomials in $n$ if and only if it can be computed by a depth-2 threshold circuit with size and parameters bounded by polynomials in $n$. [ABSTRACT FROM AUTHOR]
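The symmetric-distribution result above can be made concrete with a small numerical sketch. A standard fact about RBMs with binary hidden units is that the unnormalized log-probability of a visible vector $v$ is $b \cdot v + \sum_{j} \mathrm{softplus}(c_j + W_j \cdot v)$, a "softplus network" in $v$. The toy sizes, random parameters, and weight-tying scheme below are illustrative assumptions, not the paper's construction: when every hidden unit's weights are identical across visible units (and visible biases are equal), the score depends only on the number of 1's in $v$, i.e., the distribution is symmetric.

```python
import numpy as np
from itertools import product

def rbm_log_unnormalized(v, b, c, W):
    """Log of the unnormalized RBM marginal over visible units:
    b.v + sum_j softplus(c_j + W_j . v), with binary hidden units."""
    pre = c + W @ v
    # np.logaddexp(0, x) == log(1 + exp(x)) == softplus(x)
    return b @ v + np.sum(np.logaddexp(0.0, pre))

# Hypothetical toy RBM: tie weights so each hidden unit sees the same
# weight on every visible unit -> the energy depends only on sum(v).
rng = np.random.default_rng(0)
n, m = 4, 3                       # 4 visible units, 3 hidden units
b = np.full(n, 0.1)               # identical visible biases
c = rng.normal(size=m)            # arbitrary hidden biases
W = np.repeat(rng.normal(size=(m, 1)), n, axis=1)  # row-constant weights

# Group scores by the number of 1's in the input.
scores = {}
for bits in product([0, 1], repeat=n):
    v = np.array(bits, dtype=float)
    k = int(v.sum())
    scores.setdefault(k, set()).add(round(rbm_log_unnormalized(v, b, c, W), 10))

# Every input with the same number of 1's receives the same score,
# so the induced distribution is symmetric in the sense of the paper.
assert all(len(s) == 1 for s in scores.values())
```

This only shows that weight-tied RBMs induce symmetric distributions; the paper's contribution is the converse direction with an explicit $2n+1$ size bound, which this sketch does not reproduce.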

Details

Language :
English
ISSN :
2162-237X
Volume :
30
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
136117571
Full Text :
https://doi.org/10.1109/TNNLS.2018.2868809