
A Robbins-Monro Sequence That Can Exploit Prior Information For Faster Convergence

Authors :
Liu, Siwei
Ma, Ke
Goetz, Stephan M.
Publication Year :
2024

Abstract

We propose a new method to improve the convergence speed of the Robbins-Monro algorithm by introducing prior information about the target point into the Robbins-Monro iteration. We achieve the incorporation of prior information without the need for a -- potentially wrong -- regression model, which would also entail additional constraints. We show that this prior-information Robbins-Monro sequence is convergent for a wide range of prior distributions, even wrong ones, including a Gaussian, a weighted sum of Gaussians (e.g., in a kernel density estimate), as well as bounded arbitrary distribution functions greater than zero. We furthermore analyse the sequence numerically to understand its performance and the influence of its parameters. The results demonstrate that the prior-information Robbins-Monro sequence converges faster than the standard one, especially during the first steps, which are particularly important for applications where the number of function measurements is limited, and when the noise of observing the underlying function is large. We finally propose a rule to select the parameters of the sequence.

Comment: 26 pages, 5 figures
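For context, the standard Robbins-Monro iteration that the paper builds on seeks a root of a function observable only through noisy measurements, via the update theta_{n+1} = theta_n - a_n * Y_n with step sizes a_n = c/n. The sketch below is a minimal illustration of that classic scheme, not the paper's prior-information variant; the function, noise model, and parameter values are hypothetical.

```python
import numpy as np

def robbins_monro(noisy_f, theta0, c=1.0, n_steps=1000, seed=None):
    """Classic Robbins-Monro iteration: seek theta with E[noisy_f(theta)] = 0.

    Illustrative sketch only; the paper modifies this update by
    incorporating a prior distribution over the target point.
    """
    rng = np.random.default_rng(seed)
    theta = theta0
    for n in range(1, n_steps + 1):
        y = noisy_f(theta, rng)       # noisy observation of the function
        theta = theta - (c / n) * y   # a_n = c/n satisfies the classic step-size conditions
    return theta

# Hypothetical example: root of f(theta) = theta - 2, observed with Gaussian noise
noisy_f = lambda th, rng: (th - 2.0) + rng.normal(scale=0.5)
estimate = robbins_monro(noisy_f, theta0=0.0, n_steps=2000, seed=0)
```

The step sizes a_n = c/n satisfy the Robbins-Monro conditions (their sum diverges while the sum of their squares converges), which is what guarantees convergence despite the observation noise; the paper's contribution is accelerating the early iterations, where c/n steps are still coarse, by exploiting a prior.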

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2401.03206
Document Type :
Working Paper