
Training feedforward neural networks with Bayesian hyper-heuristics.

Authors :
Schreuder, A.N.
Bosman, A.S.
Engelbrecht, A.P.
Cleghorn, C.W.
Source :
Information Sciences. Jan 2025, Vol. 686.
Publication Year :
2025

Abstract

Training feedforward neural networks (FFNNs) can benefit from an automated process in which the best heuristic to train the network is sought out automatically by a high-level, probabilistic heuristic. This research introduces a novel population-based Bayesian hyper-heuristic (BHH) for training FFNNs. The performance of the BHH is compared to that of ten popular low-level heuristics, each with a different search behaviour. The chosen heuristic pool consists of classic gradient-based heuristics as well as meta-heuristics (MHs). The empirical process is executed on fourteen datasets comprising classification and regression problems with varying characteristics. The BHH is shown to train FFNNs well and to provide an automated method for finding the best heuristic at various stages of the training process.

• A novel Bayesian hyper-heuristic (BHH) is developed and shown to efficiently train feedforward neural networks (FFNNs).
• The BHH shows statistically significant performance on multiple problems, comparable to the best heuristics.
• The BHH produces good results with a diverse set of low-level heuristics across multiple problems.
• The BHH automates heuristic selection for FFNN training, reducing manual trial and error.
• The BHH can utilise a priori¹ knowledge for low-level heuristic selection on specific problems.

¹ Latin phrase, meaning "from what comes before".

[ABSTRACT FROM AUTHOR]
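The abstract describes a high-level probabilistic mechanism that repeatedly selects, from a pool of low-level heuristics, the one most likely to improve training. The paper's own BHH is not reproduced here; as a minimal sketch of the general idea, the following assumes a Beta-Bernoulli model with Thompson sampling, where each heuristic's posterior probability of improving the loss is updated after every application (class and method names are hypothetical):

```python
import random

class BayesianHeuristicSelector:
    """Sketch of probabilistic heuristic selection (not the paper's BHH).

    Each low-level heuristic keeps a Beta(alpha, beta) posterior over its
    probability of improving the network's loss. At each training stage the
    selector samples from every posterior and applies the heuristic with
    the highest draw (Thompson sampling).
    """

    def __init__(self, heuristics):
        self.heuristics = list(heuristics)
        # One Beta posterior per heuristic, starting from a uniform prior.
        # An informative prior here would play the role of a priori knowledge.
        self.alpha = {h: 1.0 for h in self.heuristics}
        self.beta = {h: 1.0 for h in self.heuristics}

    def select(self):
        # Draw one sample per heuristic; pick the heuristic whose sampled
        # improvement probability is largest.
        draws = {h: random.betavariate(self.alpha[h], self.beta[h])
                 for h in self.heuristics}
        return max(draws, key=draws.get)

    def update(self, heuristic, improved):
        # Bernoulli outcome: did applying the heuristic reduce the loss?
        if improved:
            self.alpha[heuristic] += 1.0
        else:
            self.beta[heuristic] += 1.0

# Usage: alternate select/update inside the training loop.
selector = BayesianHeuristicSelector(["sgd", "adam", "pso", "de"])
chosen = selector.select()
selector.update(chosen, improved=True)
```

As training progresses, heuristics that keep improving the loss accumulate mass in their posteriors and are selected more often, which mirrors the abstract's claim that the best heuristic can differ at different stages of training.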

Details

Language :
English
ISSN :
0020-0255
Volume :
686
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
179463827
Full Text :
https://doi.org/10.1016/j.ins.2024.121363