Four algorithms to construct a sparse kriging kernel for dimensionality reduction.
- Authors
- Blanchet-Scalliet, Christophette; Helbert, Céline; Ribaud, Mélina; Vial, Céline
- Subjects
- DIMENSION reduction (Statistics); KRIGING; MACHINE learning; GAUSSIAN processes; TENSOR products; ALGORITHMS; COMPUTER programming; PARAMETER estimation
- Abstract
In the context of computer experiments, metamodels are widely used to represent the output of computer codes. Among these models, Gaussian process regression (kriging) is very efficient; see e.g. Snelson (Flexible and efficient Gaussian process models for machine learning. ProQuest LLC, Ann Arbor, MI. Thesis (Ph.D.)–University of London, University College London, London, 2008). In high dimension, that is, with a large number of input variables but few observations, the estimation of the parameters with a classical anisotropic kriging can be completely inaccurate. Because there are as many range parameters as input variables, the optimization space becomes too large compared to the available information. One way to overcome this drawback is to use an isotropic kernel that depends on only one parameter; however, this model is too restrictive. The aim of this paper is twofold. Our first objective is to propose a smooth kernel with as few parameters as warranted. We introduce a kernel which is a tensor product of a few isotropic kernels, each built on a well-chosen subgroup of variables. The main difficulty is to find the number and the composition of the groups. Our second objective is to propose algorithmic strategies to overcome this difficulty. Four forward strategies are proposed. They all start with the simplest isotropic kernel and stop when the best model according to the BIC criterion is found. They all show very good accuracy results on simulation test cases, but one of them is more efficient. Tested on a real data set, our kernel shows very good prediction results. [ABSTRACT FROM AUTHOR]
- Published
- 2019
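The kernel described in the abstract, a tensor product of a few isotropic kernels over disjoint groups of input variables, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the partition into groups, the range values, and the choice of a squared-exponential isotropic component (the paper may use another family, e.g. Matérn) are all assumptions.

```python
import numpy as np

def sparse_kriging_kernel(x, y, groups, ranges):
    """Tensor product of isotropic kernels over disjoint variable groups.

    k(x, y) = prod_g exp(-||x_g - y_g||^2 / (2 * theta_g^2))

    groups : list of index lists partitioning the input dimensions
    ranges : one range parameter theta_g per group -- far fewer
             parameters than a fully anisotropic kernel (one per input)
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    k = 1.0
    for idx, theta in zip(groups, ranges):
        sq_dist = np.sum((x[idx] - y[idx]) ** 2)  # squared distance on the group
        k *= np.exp(-sq_dist / (2.0 * theta ** 2))  # isotropic factor
    return k

# Example: 5 inputs split into 2 groups -> only 2 range parameters
# to estimate, instead of 5 for an anisotropic kernel.
groups = [[0, 1, 2], [3, 4]]
ranges = [1.5, 0.4]
x = np.zeros(5)
y = np.ones(5)
print(sparse_kriging_kernel(x, x, groups, ranges))  # 1.0 at zero distance
print(sparse_kriging_kernel(x, y, groups, ranges))
```

With one group containing all variables this reduces to the isotropic kernel the forward strategies start from; with one group per variable it recovers the anisotropic kernel, so the group structure interpolates between the two extremes the abstract contrasts.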