
DL-Reg: A deep learning regularization technique using linear regression.

Authors :
Dialameh, Maryam
Hamzeh, Ali
Rahmani, Hossein
Dialameh, Safoura
Kwon, Hyock Ju
Source :
Expert Systems with Applications, Aug 2024, Vol. 247.
Publication Year :
2024

Abstract

Regularization is an essential aspect of deep learning, as it mitigates the risk of overfitting in deep neural networks. This study presents a novel deep learning regularization method, referred to as DL-Reg, which effectively reduces the nonlinearity of deep networks by enforcing linearity to a certain extent. The proposed method incorporates a linear constraint into the objective function of deep neural networks, defined as the error of a linear mapping from the inputs to the outputs of the model. Specifically, DL-Reg imposes this linear constraint on the network, scaled by a regularization factor, thereby preventing the network from overfitting. The effectiveness of DL-Reg is evaluated by training state-of-the-art deep network models on several benchmark datasets. The experimental results demonstrate that the proposed regularization method provides significant improvements over existing regularization techniques and enhances the performance of deep neural networks, particularly when dealing with small-sized training datasets. The main code of DL-Reg, written in PyTorch, is available at: https://github.com/m2dgithub/DL-Reg.git.
• Regularizing deep networks using a linear regression model.
• Allowing supervised deep networks to be trained with semi-supervised learning.
• Increasing the performance of deep networks on small-sized datasets.
• Conducting extensive experiments to certify the significance of the proposed method. [ABSTRACT FROM AUTHOR]
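For intuition, the following is a minimal, hypothetical PyTorch sketch of a DL-Reg-style penalty, assuming the regularizer is the squared residual of a least-squares linear fit (with bias) from the flattened inputs to the network outputs, added to the task loss with a factor lambda_reg; the function name dl_reg_penalty and the surrounding training-step names are illustrative only, and the authors' repository should be consulted for the exact formulation.

import torch

def dl_reg_penalty(x: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
    # Error of the best linear map (with bias) from the flattened inputs to the
    # network outputs; minimizing it pushes the network toward linear behaviour.
    x_flat = x.flatten(start_dim=1)                                # (batch, d_in)
    ones = torch.ones(x_flat.size(0), 1, dtype=x_flat.dtype, device=x_flat.device)
    x_aug = torch.cat([x_flat, ones], dim=1)                       # append a bias column
    w = torch.linalg.pinv(x_aug) @ y_pred                          # least-squares linear map
    residual = x_aug @ w - y_pred                                  # error of the linear fit
    return residual.pow(2).mean()

# Illustrative training step (model, criterion, optimizer, x, targets assumed to exist):
#   y_pred = model(x)
#   loss = criterion(y_pred, targets) + lambda_reg * dl_reg_penalty(x, y_pred)
#   loss.backward()
#   optimizer.step()

In this reading, lambda_reg plays the role of the regularization factor described in the abstract: larger values penalize deviations from the linear fit more strongly and thus constrain the network's nonlinearity more.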

Details

Language :
English
ISSN :
0957-4174
Volume :
247
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
176407623
Full Text :
https://doi.org/10.1016/j.eswa.2024.123182