
Scalable genetic programming by gene-pool optimal mixing and input-space entropy-based building-block learning

Authors :
Virgolin, M. (Marco)
Witteveen, C. (Cees)
Alderliesten, T. (Tanja)
Bosman, P.A.N. (Peter)
Publication Year :
2017

Abstract

The Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) is a recently introduced model-based EA that has been shown to be capable of outperforming state-of-the-art alternative EAs in terms of scalability when solving discrete optimization problems. One of the key aspects of GOMEA's success is a variation operator that is designed to extensively exploit linkage models by effectively combining partial solutions. Here, we bring the strengths of GOMEA to Genetic Programming (GP), introducing GP-GOMEA. Under the hypothesis of having little problem-specific knowledge, and in an effort to design easy-to-use EAs, GP-GOMEA requires no parameter specification. On a set of well-known benchmark problems we find that GP-GOMEA outperforms standard GP while being on par with more recently introduced, state-of-the-art EAs. We furthermore introduce Input-space Entropy-based Building-block Learning (IEBL), a novel approach to identifying and encapsulating relevant building blocks (subroutines) into new terminals and functions. On problems with an inherent degree of modularity, IEBL can contribute to compact solution representations, providing a large potential for knock-on effects in performance. On the difficult, but highly modular Even Parity problem, GP-GOMEA+IEBL obtains excellent scalability, solving the 14-bit instance in less than 1 hour.
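The variation operator mentioned in the abstract, Gene-pool Optimal Mixing (GOM), can be illustrated on a toy discrete problem. The sketch below is not from the record: it shows the general GOM idea (copy the genes of one linkage set from a random donor, keep the change only if fitness does not worsen) on a OneMax problem with a univariate linkage model. Function names, the toy fitness, and the univariate model are illustrative assumptions, not the paper's actual GP variant.

```python
import random

def onemax(genome):
    """Toy fitness to maximize: the number of ones in the genome."""
    return sum(genome)

def gene_pool_optimal_mixing(solution, population, linkage_sets, fitness=onemax):
    """One GOM pass over a solution (illustrative sketch).

    For each linkage set, copy the corresponding genes from a randomly
    chosen donor in the population, and keep the change only if the
    fitness does not worsen. Fitness is therefore monotonically
    non-decreasing over the pass.
    """
    best = list(solution)
    best_fit = fitness(best)
    for subset in linkage_sets:
        donor = random.choice(population)
        trial = list(best)
        for i in subset:
            trial[i] = donor[i]
        trial_fit = fitness(trial)
        if trial_fit >= best_fit:  # accept non-worsening changes only
            best, best_fit = trial, trial_fit
    return best, best_fit

# Demo on a small random population of 10-bit genomes.
random.seed(0)
n = 10
population = [[random.randint(0, 1) for _ in range(n)] for _ in range(20)]
# Univariate linkage model: each gene forms its own linkage set.
linkage = [[i] for i in range(n)]
improved, fit = gene_pool_optimal_mixing(population[0], population, linkage)
```

In GP-GOMEA this principle is applied to tree nodes rather than bit positions, and the linkage model is learned from the population instead of being fixed, but the accept-if-not-worse mixing loop is the core mechanism.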

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1251875280
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1145/3071178.3071287