
Optimizing dense feed-forward neural networks.

Authors :
Balderas, Luis
Lastra, Miguel
Benítez, José M.
Source :
Neural Networks. Mar 2024, Vol. 171, p229-241. 13p.
Publication Year :
2024

Abstract

Deep learning models have been widely used during the last decade due to their outstanding learning and abstraction capacities. However, one of the main challenges any scientist faces when using deep learning models is establishing the network's architecture. Because of this difficulty, data scientists usually build overly complex models; as a result, most of them are computationally intensive, impose a large memory footprint, generate huge costs, contribute to climate change, and are hard to deploy on computationally limited devices. In this paper, we propose a novel method for constructing dense feed-forward neural networks based on pruning and transfer learning. Its performance has been thoroughly assessed on classification and regression problems. Without any accuracy loss, our approach can reduce the number of parameters by more than 70%. Moreover, when the pruning parameter is chosen carefully, most of the refined models outperform the original ones. Furthermore, we have verified that our method not only identifies a better network architecture but also facilitates knowledge transfer between the original and refined models. The results obtained show that our construction method helps in the design of models that are not only more efficient but also more effective. [ABSTRACT FROM AUTHOR]
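The abstract does not spell out the pruning criterion used in the paper. As a rough illustration of the general idea behind parameter pruning — keeping only the largest-magnitude weights so that the parameter count drops by a target fraction — here is a minimal magnitude-pruning sketch in plain Python. The function name, the `keep_ratio` parameter, and the magnitude criterion are our own illustrative assumptions, not taken from the paper:

```python
import random

def magnitude_prune(weights, keep_ratio=0.3):
    """Illustrative sketch (not the paper's method): keep only the
    `keep_ratio` fraction of weights with the largest magnitude and
    zero out the rest, returning (pruned_weights, keep_mask)."""
    k = max(1, round(keep_ratio * len(weights)))
    # Magnitude of the k-th largest entry serves as the pruning threshold.
    threshold = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    mask = [abs(w) >= threshold for w in weights]
    pruned = [w if m else 0.0 for w, m in zip(weights, mask)]
    return pruned, mask

# Toy usage: prune a random flat weight vector to 30% of its parameters,
# mirroring the >70% compression figure reported in the abstract.
random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(2000)]
pruned, mask = magnitude_prune(weights, keep_ratio=0.3)
sparsity = 1.0 - sum(mask) / len(mask)  # fraction of weights removed
```

In the paper's setting the surviving weights of the original model would then initialize the refined model (the transfer-learning step), rather than training it from scratch.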

Details

Language :
English
ISSN :
0893-6080
Volume :
171
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
175032178
Full Text :
https://doi.org/10.1016/j.neunet.2023.12.015