
Learning Gaussian Processes by Minimizing PAC-Bayesian Generalization Bounds

Authors:
Reeb, David
Doerr, Andreas
Gerwinn, Sebastian
Rakitsch, Barbara
Source:
Advances in Neural Information Processing Systems 31 (Proceedings of the NeurIPS Conference 2018), https://papers.nips.cc/paper/7594-learning-gaussian-processes-by-minimizing-pac-bayesian-generalization-bounds
Publication Year:
2018

Abstract

Gaussian Processes (GPs) are a generic modelling tool for supervised learning. While they have been successfully applied to large datasets, their use in safety-critical applications is hindered by the lack of good performance guarantees. To this end, we propose a method to learn GPs and their sparse approximations by directly optimizing a PAC-Bayesian bound on their generalization performance, instead of maximizing the marginal likelihood. Besides its theoretical appeal, we find in our evaluation that our learning method is robust and yields significantly better generalization guarantees than other common GP approaches on several regression benchmark datasets.

Comment: 11 pages main text, 12 pages appendix. v2: minor changes, new NeurIPS style file. Final camera-ready version submitted to NeurIPS 2018
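As a rough illustration of the idea in the abstract (and not the authors' implementation), the sketch below trains GP hyperparameters by gradient descent on a McAllester-style PAC-Bayes bound, empirical risk plus sqrt((KL + ln(2*sqrt(n)/delta))/(2n)), rather than on the negative log marginal likelihood. The RBF kernel, the loss bounded in [0, 1] via 1 - exp(...), the choice of KL term (between the GP posterior and prior at the training inputs), and the plain gradient-descent loop are all simplifying assumptions made here for illustration; the paper itself also handles sparse GP approximations and works with tighter bound formulations.

```python
import jax, jax.numpy as jnp

def rbf_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential kernel on 1-D inputs (illustrative assumption)
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * jnp.exp(-0.5 * d2 / lengthscale ** 2)

def kl_gaussians(mu_q, cov_q, cov_p):
    # KL( N(mu_q, cov_q) || N(0, cov_p) ) in closed form
    n = mu_q.shape[0]
    cov_p_inv = jnp.linalg.inv(cov_p)
    return 0.5 * (jnp.trace(cov_p_inv @ cov_q)
                  + mu_q @ cov_p_inv @ mu_q
                  - n
                  + jnp.linalg.slogdet(cov_p)[1]
                  - jnp.linalg.slogdet(cov_q)[1])

def pac_bayes_objective(params, x, y, delta=0.05):
    # params are log-hyperparameters to keep them positive
    lengthscale, variance, noise = jnp.exp(params)
    n = x.shape[0]
    K = rbf_kernel(x, x, lengthscale, variance)
    Ky = K + noise * jnp.eye(n)
    # GP posterior over f at the training inputs
    mu_post = K @ jnp.linalg.solve(Ky, y)
    cov_post = K - K @ jnp.linalg.solve(Ky, K) + 1e-6 * jnp.eye(n)
    # Empirical risk under a loss bounded in [0, 1] (assumption)
    emp_risk = jnp.mean(1.0 - jnp.exp(-0.5 * (y - mu_post) ** 2 / noise))
    # KL between GP posterior and GP prior at the training inputs
    kl = kl_gaussians(mu_post, cov_post, K + 1e-6 * jnp.eye(n))
    # McAllester-style PAC-Bayes upper bound on the true risk
    return emp_risk + jnp.sqrt((kl + jnp.log(2.0 * jnp.sqrt(n) / delta)) / (2.0 * n))

# Toy usage: minimize the bound w.r.t. the log-hyperparameters.
key = jax.random.PRNGKey(0)
x = jnp.linspace(-3, 3, 50)
y = jnp.sin(x) + 0.1 * jax.random.normal(key, (50,))
params = jnp.zeros(3)  # log(lengthscale), log(variance), log(noise)
grad_fn = jax.jit(jax.grad(pac_bayes_objective))
for _ in range(200):
    params = params - 0.05 * grad_fn(params, x, y)
print("bound value:", pac_bayes_objective(params, x, y))
```

The final objective value is itself a high-probability upper bound on the generalization risk, which is what makes this training criterion attractive for settings that need performance guarantees.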

Details

Database:
arXiv
Journal:
Advances in Neural Information Processing Systems 31 (Proceedings of the NeurIPS Conference 2018), https://papers.nips.cc/paper/7594-learning-gaussian-processes-by-minimizing-pac-bayesian-generalization-bounds
Publication Type:
Report
Accession Number:
edsarx.1810.12263
Document Type:
Working Paper