
Scalable Multi-Output Gaussian Processes with Stochastic Variational Inference

Authors :
Jiang, Xiaoyu
Georgaka, Sokratia
Rattray, Magnus
Alvarez, Mauricio A.
Publication Year :
2024

Abstract

The Multi-Output Gaussian Process (MOGP) is a popular tool for modelling data from multiple sources. A typical choice for building a MOGP covariance function is the Linear Model of Coregionalization (LMC), which parametrically models the covariance between outputs. The Latent Variable MOGP (LV-MOGP) generalises this idea by modelling the covariance between outputs with a kernel applied to latent variables, one per output, yielding a flexible MOGP that generalises efficiently to new outputs with few data points. However, the computational complexity of the LV-MOGP grows linearly with the number of outputs, making it unsuitable for problems with many outputs. In this paper, we propose a stochastic variational inference approach for the LV-MOGP that allows mini-batches over both inputs and outputs, making the computational complexity per training iteration independent of the number of outputs.
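The key computational idea — subsampling both outputs and inputs and rescaling to obtain an unbiased estimate of a double sum — can be sketched as follows. This is a minimal toy illustration, not the paper's implementation; the array `terms` is a hypothetical stand-in for per-(output, input) objective contributions, and all names and batch sizes are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: P outputs, N inputs; the full objective is a double sum over
# all (output, input) pairs. terms[p, n] stands in for the contribution of
# output p at input n (hypothetical, for illustration only).
P, N = 1000, 500
terms = rng.normal(size=(P, N))

full_sum = terms.sum()

def minibatch_estimate(batch_p=32, batch_n=64):
    """Unbiased estimate of the full double sum from mini-batches over
    both outputs and inputs. Cost per call is O(batch_p * batch_n),
    independent of the total number of outputs P."""
    p_idx = rng.choice(P, size=batch_p, replace=False)  # sample outputs
    n_idx = rng.choice(N, size=batch_n, replace=False)  # sample inputs
    sub = terms[np.ix_(p_idx, n_idx)]
    # Rescale so the expectation over random batches equals the full sum.
    return (P / batch_p) * (N / batch_n) * sub.sum()

# Averaging many noisy estimates approaches the full sum.
avg_estimate = np.mean([minibatch_estimate() for _ in range(2000)])
```

Using the full batch (`batch_p=P`, `batch_n=N`) recovers the exact sum, while small batches trade variance for per-iteration cost that no longer depends on the number of outputs — the property the abstract highlights.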

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2407.02476
Document Type :
Working Paper