
Graph Metanetworks for Processing Diverse Neural Architectures

Authors:
Lim, Derek
Maron, Haggai
Law, Marc T.
Lorraine, Jonathan
Lucas, James
Publication Year:
2023

Abstract

Neural networks efficiently encode learned information within their parameters. Consequently, many tasks can be unified by treating neural networks themselves as input data. When doing so, recent studies demonstrated the importance of accounting for the symmetries and geometry of parameter spaces. However, those works developed architectures tailored to specific networks such as MLPs and CNNs without normalization layers, and generalizing such architectures to other types of networks can be challenging. In this work, we overcome these challenges by building new metanetworks - neural networks that take weights from other neural networks as input. Put simply, we carefully build graphs representing the input neural networks and process the graphs using graph neural networks. Our approach, Graph Metanetworks (GMNs), generalizes to neural architectures where competing methods struggle, such as multi-head attention layers, normalization layers, convolutional layers, ResNet blocks, and group-equivariant linear layers. We prove that GMNs are expressive and equivariant to parameter permutation symmetries that leave the input neural network functions unchanged. We validate the effectiveness of our method on several metanetwork tasks over diverse neural network architectures.

Comment: 29 pages. v2 updated experimental results and details
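The core construction the abstract describes lends itself to a small illustration. Below is a minimal sketch, not the authors' implementation: an MLP's parameters are turned into a graph whose nodes are neurons and whose edges carry the corresponding weights and biases, and that graph is processed with a simple message-passing GNN. The helper names `mlp_to_graph` and `SimpleEdgeGNNLayer` are illustrative assumptions; the actual GMN construction covers many more layer types (attention, normalization, convolutions, etc.).

```python
# Sketch only: encode an MLP as a parameter graph and run message passing over it.
import torch
import torch.nn as nn


def mlp_to_graph(mlp: nn.Sequential):
    """Build (edge_index, edge_attr, num_nodes) from an MLP's linear layers.

    Each neuron (input, hidden, or output unit) becomes a node; each weight
    W[i, j] becomes a directed edge j -> i, with the weight value and the
    target neuron's bias as edge features.
    """
    linears = [m for m in mlp if isinstance(m, nn.Linear)]
    sizes = [linears[0].in_features] + [l.out_features for l in linears]
    offsets, total = [], 0
    for s in sizes:                      # node-index offset of each neuron layer
        offsets.append(total)
        total += s

    src, dst, attr = [], [], []
    for layer_idx, lin in enumerate(linears):
        W, b = lin.weight.detach(), lin.bias.detach()
        in_off, out_off = offsets[layer_idx], offsets[layer_idx + 1]
        for i in range(lin.out_features):
            for j in range(lin.in_features):
                src.append(in_off + j)
                dst.append(out_off + i)
                attr.append([W[i, j].item(), b[i].item()])
    edge_index = torch.tensor([src, dst], dtype=torch.long)   # shape [2, E]
    edge_attr = torch.tensor(attr, dtype=torch.float)         # shape [E, 2]
    return edge_index, edge_attr, total


class SimpleEdgeGNNLayer(nn.Module):
    """One round of message passing where messages depend on edge features."""

    def __init__(self, node_dim: int, edge_dim: int):
        super().__init__()
        self.msg = nn.Linear(node_dim + edge_dim, node_dim)
        self.upd = nn.Linear(2 * node_dim, node_dim)

    def forward(self, x, edge_index, edge_attr):
        src, dst = edge_index
        messages = torch.relu(self.msg(torch.cat([x[src], edge_attr], dim=-1)))
        agg = torch.zeros_like(x).index_add_(0, dst, messages)  # sum over incoming edges
        return torch.relu(self.upd(torch.cat([x, agg], dim=-1)))


# Usage: featurize a small MLP, run two message-passing rounds, then pool the
# node states into a single permutation-invariant embedding of the input network.
mlp = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
edge_index, edge_attr, n = mlp_to_graph(mlp)
x = torch.zeros(n, 16)  # initial node features (e.g., layer/position encodings)
layers = nn.ModuleList([SimpleEdgeGNNLayer(16, 2) for _ in range(2)])
for layer in layers:
    x = layer(x, edge_index, edge_attr)
embedding = x.mean(dim=0)
```

Because neuron permutations within a hidden layer correspond to graph isomorphisms of this parameter graph, a permutation-equivariant GNN over it respects the parameter symmetries the abstract refers to.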

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2312.04501
Document Type:
Working Paper