
PAC-Bayesian Meta-Learning: From Theory to Practice

Authors:
Rothfuss, Jonas
Josifoski, Martin
Fortuin, Vincent
Krause, Andreas
Publication Year:
2022
Publisher:
arXiv, 2022.

Abstract

Meta-learning aims to accelerate learning on new tasks by acquiring useful inductive biases from related data sources. In practice, the number of tasks available for meta-learning is often small, yet most existing approaches rely on an abundance of meta-training tasks, making them prone to overfitting. How to regularize the meta-learner to ensure generalization to unseen tasks is a central question in the literature. We provide a theoretical analysis using the PAC-Bayesian framework and derive the first bound for meta-learners with unbounded loss functions. Crucially, our bounds allow us to derive the PAC-optimal hyper-posterior (PACOH), the closed-form solution of the PAC-Bayesian meta-learning problem. This avoids reliance on nested optimization and gives rise to an optimization problem amenable to standard variational methods that scale well. Our experiments show that, when instantiating PACOH with Gaussian processes and Bayesian neural networks as base learners, the resulting methods are more scalable and yield state-of-the-art performance, both in predictive accuracy and in the quality of uncertainty estimates. Finally, thanks to the principled treatment of uncertainty, our meta-learners can also be successfully employed for sequential decision problems.

Comment: 50 pages
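The closed-form hyper-posterior the abstract refers to is a Gibbs distribution over priors, where each candidate prior is weighted by its (generalized) marginal likelihood summed across the meta-training tasks. A minimal toy sketch of that idea, assuming a 1D Gaussian task environment, a discrete grid of candidate Gaussian priors, and a uniform hyper-prior — all of these are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Toy task environment (illustrative values): each task's parameter
# theta_i ~ N(2.0, 0.5^2), and observations y ~ N(theta_i, 1.0).
n_tasks, m, sigma = 8, 5, 1.0
true_thetas = rng.normal(2.0, 0.5, n_tasks)
tasks = [rng.normal(th, sigma, m) for th in true_thetas]

def log_marginal(y, mu, tau):
    # Log marginal likelihood of task data y under the prior N(mu, tau^2):
    # integrating out theta gives y ~ N(mu * 1, sigma^2 I + tau^2 * 11^T).
    n = len(y)
    cov = sigma**2 * np.eye(n) + tau**2 * np.ones((n, n))
    return multivariate_normal.logpdf(y, mean=np.full(n, mu), cov=cov)

mus = np.linspace(-1.0, 4.0, 11)  # discrete grid of candidate prior means
tau = 0.5                         # prior std, fixed for simplicity
lam = 1.0                         # temperature of the Gibbs hyper-posterior

# Gibbs-form hyper-posterior over candidate priors, schematically
# Q(P) ∝ HyperPrior(P) * exp((1/lam) * sum_i log Z(S_i, P)),
# with a uniform hyper-prior over the grid.
scores = np.array([sum(log_marginal(y, mu, tau) for y in tasks) for mu in mus])
log_q = scores / lam
q = np.exp(log_q - log_q.max())
q /= q.sum()

best_mu = mus[np.argmax(q)]
print(f"hyper-posterior mode: mu = {best_mu:.2f}")  # should sit near 2.0
```

Because everything here is Gaussian and the candidate set is discrete, the hyper-posterior is computed exactly; the paper's methods instead use variational approximations over continuous prior families (GP or BNN priors).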

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....242d4aed96cd3a76cf7b02ef6f27af2c
Full Text:
https://doi.org/10.48550/arxiv.2211.07206