
Physics-Informed Guided Disentanglement in Generative Networks.

Authors :
Pizzati F
Cerri P
de Charette R
Source :
IEEE transactions on pattern analysis and machine intelligence [IEEE Trans Pattern Anal Mach Intell] 2023 Aug; Vol. 45 (8), pp. 10300-10316. Date of Electronic Publication: 2023 Jun 30.
Publication Year :
2023

Abstract

Image-to-image translation (i2i) networks suffer from entanglement effects in the presence of physics-related phenomena in the target domain (such as occlusions, fog, etc.), which lowers translation quality, controllability, and variability. In this paper, we propose a general framework to disentangle visual traits in target images. Primarily, we build upon a collection of simple physics models, guiding the disentanglement with a physical model that renders some of the target traits while the remaining ones are learned. Because physics allows explicit and interpretable outputs, our physical models (optimally regressed on the target) allow generating unseen scenarios in a controllable manner. Secondarily, we show the versatility of our framework by extending it to neural-guided disentanglement, where a generative network is used in place of a physical model when the latter is not directly accessible. Altogether, we introduce three disentanglement strategies, guided by either a fully differentiable physics model, a (partially) non-differentiable physics model, or a neural network. The results show that our disentanglement strategies dramatically improve performance, both qualitatively and quantitatively, in several challenging image translation scenarios.
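For intuition, the sketch below illustrates the differentiable-physics guidance idea in a minimal form: a standard Koschmieder-style fog model renders the physics-explained trait on top of the generator output, so the network only needs to learn the remaining traits. This is an assumption-laden illustration, not the authors' implementation; the function names, signatures, and the PyTorch-style generator interface are hypothetical.

```python
import torch

def fog_model(clear_img, depth, beta, airlight):
    """Koschmieder atmospheric scattering: I = J * t + A * (1 - t),
    with transmission t = exp(-beta * depth). Renders the physics-explained
    fog trait on an otherwise clear image."""
    t = torch.exp(-beta * depth)                # per-pixel transmission map
    return clear_img * t + airlight * (1.0 - t)

def guided_translation(generator, source_img, depth, beta, airlight):
    """Hypothetical composition: the i2i generator learns only the traits
    not covered by physics; the differentiable fog model adds the rest,
    so gradients can flow through both parts during training."""
    learned = generator(source_img)             # learned (non-physics) traits
    return fog_model(learned, depth, beta, airlight)
```

Because the fog parameters (beta, airlight) appear explicitly, they can be regressed on target data and varied at inference time to generate unseen fog intensities in a controllable way, which is the interpretability benefit the abstract refers to.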

Details

Language :
English
ISSN :
1939-3539
Volume :
45
Issue :
8
Database :
MEDLINE
Journal :
IEEE transactions on pattern analysis and machine intelligence
Publication Type :
Academic Journal
Accession number :
37028384
Full Text :
https://doi.org/10.1109/TPAMI.2023.3257486