Boosting the training of neural networks through hybrid metaheuristics
- Authors
- Al-Betar, Mohammed Azmi, Awadallah, Mohammed A., Doush, Iyad Abu, Alomari, Osama Ahmad, Abasi, Ammar Kamal, Makhadmeh, Sharif Naser, and Alyasseri, Zaid Abdi Alkareem
- Subjects
- *METAHEURISTIC algorithms, *POLLINATION, *PARTICLE swarm optimization, *SEARCH algorithms, *BENCHMARK problems (Computer science)
- Abstract
In this paper, the learning process of the multilayer perceptron (MLP) neural network is boosted using hybrid metaheuristic optimization algorithms. Normally, the learning process in MLP requires suitable settings of its weight and bias parameters. In the original version of MLP, the gradient descent algorithm is used as the learner, which suffers from two chronic problems: local minima and slow convergence. In this paper, six versions of memetic algorithms (MAs) are proposed to replace the gradient descent learning mechanism of MLP, where adaptive β-hill climbing (AβHC) as a local search algorithm is hybridized with six population-based metaheuristics: the hybrid flower pollination algorithm, hybrid salp swarm algorithm, hybrid crow search algorithm, hybrid grey wolf optimization (HGWO), hybrid particle swarm optimization, and hybrid JAYA algorithm. This is to show the effect of the proposed MA versions on the performance of MLP. To evaluate the proposed MA versions for MLP, 15 classification benchmark problems of different sizes and complexities are used. The AβHC algorithm is invoked in the improvement loop of any MA version with a probability controlled by the Br parameter, which is investigated to monitor its effect on the behavior of the proposed MA versions. The Br setting that obtains the most promising results is then used to configure the hybrid MAs. The results show that the proposed MA versions outperform the original algorithms. Moreover, HGWO outperforms all other MA versions on almost all the datasets. In a nutshell, MAs are a good choice for training MLP to produce results with high accuracy. [ABSTRACT FROM AUTHOR]
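The memetic scheme the abstract describes — a population-based metaheuristic evolving MLP weight vectors, with a β-hill-climbing local search invoked with probability Br — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithms: the toy dataset, network size, DE-style population move, and all parameter values are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy 2-class dataset (sizes are illustrative, not from the paper).
X = rng.normal(size=(40, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

N_HIDDEN = 5
# Flattened parameter count of a 4-5-1 MLP: W1, b1, W2, b2.
DIM = 4 * N_HIDDEN + N_HIDDEN + N_HIDDEN * 1 + 1

def mlp_loss(w):
    """Mean squared error of a 4-5-1 MLP whose parameters are flattened in w."""
    i = 0
    W1 = w[i:i + 4 * N_HIDDEN].reshape(4, N_HIDDEN); i += 4 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN].reshape(N_HIDDEN, 1); i += N_HIDDEN
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out.ravel() - y) ** 2))

def beta_hill_climb(w, loss, steps=20, eta=0.1, beta=0.1):
    """Simplified β-hill climbing: Gaussian perturbation, then random reset
    of a fraction β of the genes; keep the candidate only if it improves."""
    best, best_f = w.copy(), loss(w)
    for _ in range(steps):
        cand = best + rng.normal(scale=eta, size=best.shape)
        mask = rng.random(best.shape) < beta
        cand[mask] = rng.uniform(-1.0, 1.0, size=int(mask.sum()))
        f = loss(cand)
        if f < best_f:
            best, best_f = cand.copy(), f
    return best, best_f

def memetic_train(pop_size=10, generations=30, br=0.3):
    """Generic memetic loop: a population move stands in for any of the six
    metaheuristics; local search fires with probability br (the Br parameter)."""
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, DIM))
    fit = np.array([mlp_loss(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1-style trial vector (an illustrative stand-in move).
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            trial = a + 0.5 * (b - c)
            if rng.random() < br:  # invoke AβHC-style local search with probability Br
                trial, f = beta_hill_climb(trial, mlp_loss)
            else:
                f = mlp_loss(trial)
            if f < fit[i]:  # greedy replacement
                pop[i], fit[i] = trial, f
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

w_best, f_best = memetic_train()
```

In this sketch, Br plays exactly the role the abstract investigates: raising it spends more evaluations on local refinement, lowering it leaves more budget for global exploration by the population move.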
- Published
- 2023