
A hybrid genetic-particle swarm optimization algorithm for multi-constraint optimization problems.

Authors :
Duan, Bosong
Guo, Chuangqiang
Liu, Hong
Source :
Soft Computing - A Fusion of Foundations, Methodologies & Applications. Nov 2022, Vol. 26 Issue 21, p11695-11711. 17p.
Publication Year :
2022

Abstract

This paper presents a new hybrid genetic-particle swarm optimization (GPSO) algorithm for solving multi-constrained optimization problems. Unlike the traditional GPSO algorithm, which runs the genetic algorithm (GA) and particle swarm optimization (PSO) in series, the proposed algorithm combines PSO and GA in a parallel architecture so as to exploit both the efficiency of PSO and the global search ability of GA. PSO serves as the main body and runs alone in the initial stage of optimization, with GA not participating. When the global best value (gbest) does not change over several successive generations, the swarm is assumed to have fallen into a local optimum; at that point GA replaces PSO and performs selection, crossover and mutation operations to update the particles and help them escape the local optimum. In addition, the GPSO adopts an adaptive inertia weight, adaptive mutation parameters and multi-point crossover between particles and their personal best values (pbest) to improve its optimization ability. Finally, the paper uses a nonlinear constrained problem (Himmelblau's nonlinear optimization problem) and three structural optimization problems (the pressure vessel design problem, the welded beam design problem and the gear train design problem) as test functions and compares the proposed GPSO with the traditional GPSO, the dingo optimization algorithm, the whale optimization algorithm and the grey wolf optimizer. Performance is evaluated using indexes such as the best value, mean value, median value, worst value, standard deviation, running time and convergence speed. The comparison results show that the proposed GPSO has clear advantages in finding the optimal value, in convergence speed and in time overhead. [ABSTRACT FROM AUTHOR]
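The abstract describes the hybrid's control flow rather than giving equations, so the Python sketch below illustrates one plausible reading of it: PSO runs as the main loop with a linearly decreasing inertia weight, and a GA step (selection, multi-point crossover with pbest, adaptive mutation) takes over whenever gbest has stalled for a fixed number of generations. The stall limit, acceleration coefficients, mutation schedule and the unconstrained placeholder objective are all assumptions for illustration; the paper's constraint handling for the multi-constrained benchmarks is not reproduced here.

# Illustrative sketch of a parallel GA/PSO hybrid as outlined in the abstract.
# Parameter values (stall_limit, acceleration coefficients, mutation rate)
# are assumptions for demonstration, not the authors' settings.
import numpy as np

def sphere(x):
    # Placeholder unconstrained objective; the paper's benchmarks would add
    # constraint handling (e.g. penalty terms) on top of this.
    return float(np.sum(x ** 2))

def gpso(obj, dim=10, n_particles=30, iters=200, stall_limit=10,
         bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([obj(p) for p in x])
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    stall = 0  # generations since gbest last improved

    for t in range(iters):
        if stall < stall_limit:
            # PSO phase: adaptive (linearly decreasing) inertia weight.
            w = 0.9 - 0.5 * t / iters
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
        else:
            # GA phase, triggered once gbest has stalled.
            # Selection: the worse half of the swarm copies randomly chosen better particles.
            order = np.argsort(pbest_f)
            donors, losers = order[: n_particles // 2], order[n_particles // 2:]
            x[losers] = x[rng.choice(donors, size=losers.size)]
            # Multi-point crossover between each particle and its personal best.
            mask = rng.random(x.shape) < 0.5
            x = np.where(mask, pbest, x)
            # Adaptive mutation: probability decays over the run.
            mut = rng.random(x.shape) < 0.1 * (1.0 - t / iters)
            x = np.where(mut, rng.uniform(lo, hi, x.shape), x)
            v = np.zeros_like(x)  # restart velocities after the jump
            stall = 0

        # Evaluate, update personal and global bests, and track stagnation.
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = int(np.argmin(pbest_f))
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
            stall = 0
        else:
            stall += 1
    return gbest, gbest_f

if __name__ == "__main__":
    best_x, best_f = gpso(sphere)
    print("best objective found:", best_f)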

Details

Language :
English
ISSN :
1432-7643
Volume :
26
Issue :
21
Database :
Academic Search Index
Journal :
Soft Computing - A Fusion of Foundations, Methodologies & Applications
Publication Type :
Academic Journal
Accession number :
159440858
Full Text :
https://doi.org/10.1007/s00500-022-07489-8