
A Continuous Relaxation for Discrete Bayesian Optimization

Authors:
Michael, Richard
Bartels, Simon
González-Duque, Miguel
Zainchkovskyy, Yevgen
Frellsen, Jes
Hauberg, Søren
Boomsma, Wouter
Publication Year: 2024

Abstract

Optimizing efficiently over discrete data with only few available target observations is a challenge in Bayesian optimization. We propose a continuous relaxation of the objective function and show that inference and optimization can be computationally tractable. In particular, we consider the setting where very few observations are available and evaluation budgets are strict, motivated by the optimization of protein sequences for expensive-to-evaluate bio-chemical properties. The advantages of our approach are two-fold: the problem is treated in the continuous setting, and available prior knowledge over sequences can be incorporated directly. More specifically, we utilize available and learned distributions over the problem domain to weight the Hellinger distance, which yields a covariance function. We show that the resulting acquisition function can be optimized with either continuous or discrete optimization algorithms, and we empirically assess our method on two bio-chemical sequence optimization tasks.
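As a rough illustration of the covariance construction described above, the following minimal Python sketch builds a kernel from an exponentiated squared Hellinger distance between relaxed (categorical) representations of discrete sequences. This is a sketch under simplifying assumptions, not the authors' implementation: the distribution-based weighting of the Hellinger distance mentioned in the abstract is omitted, and the label-smoothing relaxation and all function names are hypothetical.

```python
import numpy as np

def hellinger_distance(p, q):
    """Hellinger-type distance between two relaxed sequences.

    p, q: arrays of shape (L, A) holding per-position categorical
    distributions over an alphabet of size A (e.g. 20 amino acids),
    each row summing to 1. Squared per-position Hellinger distances
    are summed over positions before taking the root.
    """
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def hellinger_kernel(P, Q, lengthscale=1.0):
    """Gram matrix with entries k(p, q) = exp(-H(p, q)^2 / lengthscale^2).

    Since H^2 is a squared Euclidean distance in the sqrt-probability
    embedding, this exponentiated form is a valid covariance function.
    """
    K = np.empty((len(P), len(Q)))
    for i, p in enumerate(P):
        for j, q in enumerate(Q):
            K[i, j] = np.exp(-(hellinger_distance(p, q) ** 2) / lengthscale ** 2)
    return K

def relax(seq, A=4, eps=0.05):
    """Hypothetical continuous relaxation: label-smoothed one-hot rows."""
    onehot = np.eye(A)[seq]
    return (1.0 - eps) * onehot + eps / A

# Two toy discrete sequences over an alphabet of size 4.
rng = np.random.default_rng(0)
x1 = rng.integers(0, 4, size=8)
x2 = rng.integers(0, 4, size=8)
K = hellinger_kernel([relax(x1)], [relax(x2)])
print(K)  # 1x1 covariance between the two relaxed sequences
```

Because the relaxed sequences live in a continuous space of per-position distributions, a Gaussian-process acquisition function built on such a kernel can be optimized with standard continuous optimizers, while rounding back to one-hot rows recovers discrete candidates.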

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2404.17452
Document Type: Working Paper