
A step function based recursion method for 0/1 deep neural networks.

Authors :
Zhang, Hui
Zhou, Shenglong
Li, Geoffrey Ye
Xiu, Naihua
Wang, Yiju
Source :
Applied Mathematics & Computation. Mar 2025, Vol. 488.
Publication Year :
2025

Abstract

The deep neural network with step function activation (0/1 DNN) is a fundamental composite model in deep learning with high efficiency and robustness to outliers. However, because the 0/1 DNN model is discontinuous and lacks subgradient information, prior research has largely focused on designing continuous functions to approximate the step activation and on developing continuous optimization methods. In this paper, by introducing two sets of network node variables into the 0/1 DNN and exploiting the composite structure of the resulting model, the 0/1 DNN is decomposed into a unary optimization model associated with the step function and three derivational optimization subproblems associated with the other variables. For the unary optimization model and two of the derivational subproblems, we present closed-form solutions; for the third derivational subproblem, we propose an efficient proximal method. Based on this decomposition, a globally convergent step function based recursion method for the 0/1 DNN is developed. The efficiency and performance of the proposed algorithm are validated through theoretical analysis as well as illustrative numerical examples on classifying the MNIST, FashionMNIST, and Cifar10 datasets.

• We analyse the relevant optimality conditions for 0/1 deep neural networks.
• We develop a globally convergent step function based recursion method for solving 0/1 deep neural networks.
• We evaluate the proposed method on three classification datasets: MNIST, FashionMNIST, and Cifar10. [ABSTRACT FROM AUTHOR]
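To make the decomposition idea concrete, below is a minimal Python sketch, not the paper's formulation: the objective solved by `prox_step`, and all function and variable names here, are illustrative assumptions. It shows a forward pass through step-activated layers and a closed-form minimizer of a scalar quadratic-plus-step subproblem of the kind such recursion methods isolate.

```python
# Minimal sketch (assumptions, not the authors' exact method): a 0/1 DNN
# applies the Heaviside step activation after each affine layer, and the
# step-function subproblem admits an exact, closed-form solution.
import numpy as np

def step(t):
    """Heaviside step activation: 1 where t > 0, else 0."""
    return (t > 0).astype(float)

def forward(x, weights, biases):
    """Forward pass of a 0/1 DNN: affine map followed by step activation."""
    a = x
    for W, b in zip(weights, biases):
        a = step(W @ a + b)
    return a

def prox_step(z, lam):
    """Closed-form minimizer of f(u) = 0.5*(u - z)**2 + lam*step(u),
    a hypothetical stand-in for the unary step-function subproblem.
    For z <= 0, u = z incurs no step penalty and is optimal; for z > 0,
    compare keeping u = z (cost lam) against snapping to u = 0
    (cost 0.5*z**2) and take the cheaper option."""
    z = np.asarray(z, dtype=float)
    keep = (z <= 0) | (0.5 * z**2 > lam)
    return np.where(keep, z, 0.0)

# Usage example with random layer sizes (illustrative only).
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
weights = [rng.standard_normal((3, 4)), rng.standard_normal((2, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(2)]
print(forward(x, weights, biases))                      # binary activations
print(prox_step(np.array([-1.0, 0.3, 2.0]), lam=0.5))   # [-1.  0.  2.]
```

The two-way case split in `prox_step` illustrates why step-function subproblems can be solved exactly despite the discontinuity: the objective is piecewise quadratic, so only a finite number of candidate minimizers need to be compared.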

Details

Language :
English
ISSN :
0096-3003
Volume :
488
Database :
Academic Search Index
Journal :
Applied Mathematics & Computation
Publication Type :
Academic Journal
Accession number :
181036446
Full Text :
https://doi.org/10.1016/j.amc.2024.129129