
Connecting Hamilton--Jacobi partial differential equations with maximum a posteriori and posterior mean estimators for some non-convex priors

Authors:
Darbon, Jérôme
Langlois, Gabriel P.
Meng, Tingwei
Publication Year: 2021

Abstract

Many imaging problems can be formulated as inverse problems expressed as finite-dimensional optimization problems. These optimization problems generally consist of minimizing the sum of a data fidelity term and a regularization term. In [23,26], connections between these optimization problems and (multi-time) Hamilton--Jacobi partial differential equations were proposed under convexity assumptions on both the data fidelity and regularization terms. In particular, under these convexity assumptions, representation formulas for a minimizer can be obtained. From a Bayesian perspective, such a minimizer can be interpreted as a maximum a posteriori estimator. In this chapter, we consider a certain class of non-convex regularizations and show that similar representation formulas for the minimizer can also be obtained. This is achieved by leveraging min-plus algebra techniques originally developed for solving certain Hamilton--Jacobi partial differential equations arising in optimal control. Connections between viscous Hamilton--Jacobi partial differential equations and Bayesian posterior mean estimators with Gaussian data fidelity terms and log-concave priors were highlighted in [25]. We also present similar results for certain Bayesian posterior mean estimators with Gaussian data fidelity terms and certain non-log-concave priors using an analogue of the min-plus algebra techniques.
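
For context, the LaTeX sketch below records the standard form of these connections in the convex, quadratic-data-fidelity case. The notation (J for the regularizer, t for the fidelity parameter, epsilon for the posterior temperature) and the exact scalings are assumptions of this sketch, not necessarily those used in the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Illustrative sketch only: the symbols and scalings below are assumptions
% of this note, not taken verbatim from the paper.

\section*{MAP estimators and first-order Hamilton--Jacobi equations}
With a quadratic (Gaussian) data fidelity, a regularizer $J$, and a parameter $t>0$,
the MAP estimator and the associated value function read
\begin{align*}
  u_{\mathrm{MAP}}(x,t) &\in \operatorname*{arg\,min}_{u\in\mathbb{R}^n}
    \Big\{ J(u) + \tfrac{1}{2t}\|x-u\|_2^2 \Big\}, \\
  S(x,t) &= \min_{u\in\mathbb{R}^n}
    \Big\{ J(u) + \tfrac{1}{2t}\|x-u\|_2^2 \Big\}.
\end{align*}
The value function $S$ is the Hopf--Lax formula and solves the Hamilton--Jacobi equation
\begin{equation*}
  \partial_t S(x,t) + \tfrac{1}{2}\|\nabla_x S(x,t)\|_2^2 = 0, \qquad S(x,0) = J(x),
\end{equation*}
and, when the minimizer is unique and $S(\cdot,t)$ is differentiable at $x$,
\begin{equation*}
  u_{\mathrm{MAP}}(x,t) = x - t\,\nabla_x S(x,t).
\end{equation*}

\section*{Min-plus structure for certain non-convex regularizers}
If $J = \min_{i} J_i$ for finitely many convex functions $J_i$, then interchanging minima gives
\begin{equation*}
  S(x,t) = \min_{i} S_i(x,t),
\end{equation*}
where each $S_i$ is the Hopf--Lax solution with convex initial data $J_i$; the
representation formula above can then be applied branchwise.

\section*{Posterior mean estimators and viscous Hamilton--Jacobi equations}
For a posterior density proportional to
$\exp\!\big(-\tfrac{1}{\varepsilon}\big(J(u)+\tfrac{1}{2t}\|x-u\|_2^2\big)\big)$,
the smoothed value function
\begin{equation*}
  S_\varepsilon(x,t) = -\varepsilon \log\!\left( (2\pi\varepsilon t)^{-n/2}
    \int_{\mathbb{R}^n} \exp\!\Big(-\tfrac{1}{\varepsilon}\Big(J(u)+\tfrac{1}{2t}\|x-u\|_2^2\Big)\Big)\,du \right)
\end{equation*}
solves, via the Cole--Hopf transform, the viscous Hamilton--Jacobi equation
\begin{equation*}
  \partial_t S_\varepsilon + \tfrac{1}{2}\|\nabla_x S_\varepsilon\|_2^2
    = \tfrac{\varepsilon}{2}\,\Delta_x S_\varepsilon, \qquad S_\varepsilon(x,0)=J(x),
\end{equation*}
and the posterior mean estimator admits the representation
\begin{equation*}
  u_{\mathrm{PM}}(x,t) = \mathbb{E}[u \mid x] = x - t\,\nabla_x S_\varepsilon(x,t).
\end{equation*}

\end{document}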

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2104.11285
Document Type: Working Paper