A Genetic Algorithm with Tree-structured Mutation for Hyperparameter Optimisation of Graph Neural Networks
- Source: CEC
- Publication Year: 2021
- Publisher: arXiv, 2021.
Abstract
- In recent years, graph neural networks (GNNs) have gained increasing attention because of their excellent capability for processing graph-related problems. In practice, hyperparameter optimisation (HPO) is critical for GNNs to achieve satisfactory results, but the process is costly because evaluating different hyperparameter settings requires training many GNNs. Many approaches have been proposed for HPO with the aim of identifying promising hyperparameters efficiently. In particular, the genetic algorithm (GA) has been explored for HPO; it treats the GNN as a black-box model whose outputs can only be observed once a set of hyperparameters is given. However, because GNN models are sophisticated and evaluating hyperparameters on GNNs is expensive, GA requires advanced techniques to balance exploration and exploitation of the search and to make the optimisation effective under limited computational resources. We therefore propose a tree-structured mutation strategy for GA to alleviate this issue. We also review recent HPO work, which leaves room for the tree-structured idea to develop, and we hope our approach can further improve these HPO methods in the future.
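- The black-box GA view of HPO described in the abstract can be illustrated with a minimal sketch. This is not the paper's tree-structured mutation strategy; the hyperparameter space, the `evaluate` objective, and the uniform mutation/crossover operators below are hypothetical placeholders for a generic GA loop.

```python
# Minimal sketch of a genetic algorithm for black-box hyperparameter optimisation.
# Assumptions: the search space, the evaluate() objective, and the uniform mutation
# are illustrative stand-ins; the paper's tree-structured mutation is not reproduced.
import random

# Hypothetical GNN hyperparameter space (name -> candidate values).
SPACE = {
    "hidden_dim": [16, 32, 64, 128],
    "num_layers": [1, 2, 3],
    "learning_rate": [1e-3, 5e-3, 1e-2],
    "dropout": [0.0, 0.3, 0.5],
}

def evaluate(config):
    """Black-box objective: train a GNN with `config` and return validation accuracy.
    Replaced by a random stand-in score here so the sketch runs without a GNN framework."""
    return random.random()

def random_config():
    return {k: random.choice(v) for k, v in SPACE.items()}

def mutate(config, rate=0.25):
    """Uniform mutation: resample each hyperparameter with probability `rate`."""
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in config.items()}

def crossover(a, b):
    """Uniform crossover: pick each hyperparameter from one of the two parents."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def ga_hpo(pop_size=10, generations=5):
    population = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children    # elitism: keep the best parents
    return max(population, key=evaluate)

if __name__ == "__main__":
    print("Best configuration found:", ga_hpo())
```

- Each call to `evaluate` corresponds to training a full GNN, which is why the paper argues that the mutation operator must spend the limited evaluation budget carefully.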
- Subjects :
- Hyperparameter
FOS: Computer and information sciences
Computer Science - Machine Learning (cs.LG)
Computer Science - Neural and Evolutionary Computing (cs.NE)
Graph neural networks
Genetic algorithm
Mutation (genetic algorithm)
Evolutionary computation
Tree (data structure)
Machine learning
Artificial intelligence
Computer science
Details
- Database: OpenAIRE
- Journal: CEC
- Accession number: edsair.doi.dedup.....b2c8fe961b0a7fa990c02da356b97a3b
- Full Text: https://doi.org/10.48550/arxiv.2102.11995