
On the Generalization of PINNs outside the training domain and the Hyperparameters influencing it

Authors :
Bonfanti, Andrea
Santana, Roberto
Ellero, Marco
Gholami, Babak
Publication Year :
2023

Abstract

Physics-Informed Neural Networks (PINNs) are neural network architectures trained to emulate solutions of differential equations without requiring solution data. They are currently ubiquitous in the scientific literature due to their flexibility and promise. However, little of the available research offers practical studies aimed at a better quantitative understanding of these architectures and how they function. In this paper, we perform an empirical analysis of the behavior of PINN predictions outside their training domain. The primary goal is to investigate the scenarios in which a PINN can provide consistent predictions outside the training area. We then assess whether the algorithmic setup of PINNs can influence their potential for generalization and showcase the respective effects on the predictions. The results of this study yield insightful and at times counterintuitive perspectives that can be highly relevant for architectures that combine PINNs with domain decomposition and/or adaptive training strategies.
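To make the data-free training setup concrete, the following is a minimal illustrative sketch (not the paper's code) of the physics-informed idea: the model is fitted purely from the ODE residual at collocation points inside a training domain, with no solution data. For a closed-form solve, a linear random-feature model stands in for the deep network a real PINN would use; the target problem `u'(x) = -u(x), u(0) = 1` and all names are assumptions chosen for illustration.

```python
import numpy as np

# Sketch of physics-informed training on u'(x) = -u(x), u(0) = 1,
# whose exact solution is exp(-x). Only the ODE residual supervises
# the fit -- no solution data, as in the PINN setting.

rng = np.random.default_rng(0)
n_feat = 16
w = rng.normal(0.0, 2.0, n_feat)        # fixed random feature weights
b = rng.normal(0.0, 1.0, n_feat)

xs = np.linspace(0.0, 1.0, 64)          # collocation points (training domain)

def features(x):
    phi = np.tanh(np.outer(x, w) + b)   # phi_j(x) = tanh(w_j x + b_j)
    dphi = (1.0 - phi**2) * w           # d/dx tanh(w x + b) = w sech^2
    return phi, dphi

# Trial form u(x) = 1 + x * f(x), with f = Phi @ c, enforces u(0) = 1
# exactly, so the only training signal is the physics residual u' + u.
phi, dphi = features(xs)
x = xs[:, None]
# residual = u' + u = 1 + ((1 + x)*phi + x*dphi) @ c  -- linear in c,
# so minimizing the squared residual is a least-squares problem.
A = (1.0 + x) * phi + x * dphi
c, *_ = np.linalg.lstsq(A, -np.ones_like(xs), rcond=None)

u = 1.0 + xs * (phi @ c)
err = np.max(np.abs(u - np.exp(-xs)))
print(f"max error on the training domain: {err:.2e}")
```

Evaluating the fitted `u` at points outside `[0, 1]` is exactly the extrapolation scenario the paper studies: nothing in the residual loss constrains the model there, so accuracy beyond the collocation interval depends on the architecture and training setup.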

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2302.07557
Document Type :
Working Paper